{"id":2280,"date":"2023-02-07T00:00:00","date_gmt":"2023-02-07T00:00:00","guid":{"rendered":"https:\/\/www.nextias.com\/current_affairs\/uncategorized\/07-02-2023\/deepfakes-voice\/"},"modified":"2023-02-07T00:00:00","modified_gmt":"2023-02-07T00:00:00","slug":"deepfakes-voice","status":"publish","type":"post","link":"https:\/\/www.nextias.com\/ca\/current-affairs\/07-02-2023\/deepfakes-voice","title":{"rendered":"Deepfakes Voice"},"content":{"rendered":"<p><span style=\"font-size:13pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\"><strong><u>In News<\/u><\/strong><\/span><\/span><\/span><\/p>\n<ul>\n<li style=\"list-style-type:disc\"><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\">A social media platform used speech synthesis to make deep fakes of celebrities.\u00a0<\/span><\/span><\/span>\n<ul>\n<li style=\"list-style-type:circle\"><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\">These deep fake audios made racist, abusive, and violent comments.<\/span><\/span><\/span><\/li>\n<\/ul>\n<\/li>\n<\/ul>\n<p><span style=\"font-size:13pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\"><strong><u>About<\/u><\/strong><\/span><\/span><\/span><\/p>\n<ul>\n<li style=\"list-style-type:disc\"><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\"><strong>Deepfakes:<\/strong><\/span><\/span><\/span>\n<ul>\n<li style=\"list-style-type:circle\"><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\">Deep fake is a type of Artificial Intelligence (AI) used to create convincing images, audio, and video hoaxes.<\/span><\/span><\/span><\/li>\n<li style=\"list-style-type:circle\"><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span 
style=\"color:#000000\">Deepfakes use deep learning AI to replace the likeness of one person with another in video and other digital media.\u00a0<\/span><\/span><\/span><\/li>\n<li style=\"list-style-type:circle\"><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\">The most common method relies on the use of <\/span><\/span><\/span><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\"><strong>deep neural networks<\/strong><\/span><\/span><\/span><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\"> involving <\/span><\/span><\/span><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\"><strong>autoencoders<\/strong><\/span><\/span><\/span><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\"> that employ a face-swapping technique.<\/span><\/span><\/span><\/li>\n<li style=\"list-style-type:circle\"><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\">Although deepfakes could be used in positive ways, such as in art, expression, accessibility, and business, they have mainly been weaponized for malicious purposes.<\/span><\/span><\/span><\/li>\n<li style=\"list-style-type:circle\"><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\">Deepfakes can harm individuals, businesses, society, and democracy, and can accelerate the already declining trust in the media.<\/span><\/span><\/span><\/li>\n<\/ul>\n<\/li>\n<li style=\"list-style-type:disc\"><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\"><strong>Deepfake voice:<\/strong><\/span><\/span><\/span>\n<ul>\n<li style=\"list-style-type:circle\"><span 
style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\">Deepfake voice, also called a <\/span><\/span><\/span><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\"><strong>synthetic voice<\/strong><\/span><\/span><\/span><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\">, uses AI to generate a clone of a <\/span><\/span><\/span><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\"><strong>person\u2019s voice.<\/strong><\/span><\/span><\/span><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\"> The voice can accurately replicate the tone and accent of the target person.<\/span><\/span><\/span><\/li>\n<li style=\"list-style-type:circle\"><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\"><strong>Synthetic voices<\/strong><\/span><\/span><\/span><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\"> are used to advance business, entertainment, and other legitimate purposes, whereas <\/span><\/span><\/span><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\"><strong>deepfake voice<\/strong><\/span><\/span><\/span><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\">s are usually associated with copying human voices to fool someone.<\/span><\/span><\/span><\/li>\n<li style=\"list-style-type:circle\"><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\">Creating deepfakes requires high-end computers with powerful graphics cards, often leveraging cloud computing power.\u00a0<\/span><\/span><\/span><\/li>\n<li 
style=\"list-style-type:circle\"><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\">Deepfakes can also be used to carry out espionage activities. Doctored videos can be used to blackmail government and defence officials into divulging state secrets.<\/span><\/span><\/span><\/li>\n<\/ul>\n<\/li>\n<\/ul>\n<p><span style=\"font-size:13pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\"><strong><u>Concerns about using Deepfake voice<\/u><\/strong><\/span><\/span><\/span><\/p>\n<ul>\n<li style=\"list-style-type:disc\"><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\"><strong>Lack of Regulations: <\/strong><\/span><\/span><\/span><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\">Laws pertaining to their use do not exist in many countries, and authorities are still working to establish proper regulations for producing and using artificially synthesized voices.<\/span><\/span><\/span><\/li>\n<li style=\"list-style-type:disc\"><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\"><strong>Ethical Concerns:<\/strong><\/span><\/span><\/span><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\"> It can enable impersonation, identity theft, and defamation.\u00a0<\/span><\/span><\/span>\n<ul>\n<li style=\"list-style-type:circle\"><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\">Deepfakes are widely used in the political arena <\/span><\/span><\/span><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\"><strong>to mislead voters, manipulate facts, and spread fake news<\/strong><\/span><\/span><\/span><span 
style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\">.\u00a0<\/span><\/span><\/span><\/li>\n<\/ul>\n<\/li>\n<li style=\"list-style-type:disc\"><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\"><strong>Breach of Public Trust: <\/strong><\/span><\/span><\/span><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\">Erosion of public trust will promote a culture of factual relativism, unraveling the increasingly strained fabric of democracy and civil society.<\/span><\/span><\/span><\/li>\n<li style=\"list-style-type:disc\"><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\"><strong>Easy Availability:<\/strong><\/span><\/span><\/span><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\"> Clear recordings of people\u2019s voices are getting easier to obtain through recorders, online interviews, and press conferences.<\/span><\/span><\/span>\n<ul>\n<li style=\"list-style-type:circle\"><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\">Voice capture technology is also improving, making the data fed to AI models more accurate and leading to more believable deepfake voices.<\/span><\/span><\/span><\/li>\n<\/ul>\n<\/li>\n<\/ul>\n<p><span style=\"font-size:13pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\"><strong><u>Ways to detect Deepfake voice<\/u><\/strong><\/span><\/span><\/span><\/p>\n<ul>\n<li style=\"list-style-type:disc\"><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\">Research labs use watermarks and blockchain technologies to detect deepfakes, but the tech designed to outsmart deepfake detectors is 
constantly evolving.<\/span><\/span><\/span><\/li>\n<li style=\"list-style-type:disc\"><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\">Multifactor authentication (MFA) and anti-fraud solutions can also reduce deepfake risks.\u00a0<\/span><\/span><\/span><\/li>\n<li style=\"list-style-type:disc\"><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\">Callback functions of call centres can end suspicious calls and request an outbound call to the account owner for direct confirmation.<\/span><\/span><\/span><\/li>\n<\/ul>\n<p><span style=\"font-size:13pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\"><strong><u>Legislations to deal with Deepfakes<\/u><\/strong><\/span><\/span><\/span><\/p>\n<ul>\n<li style=\"list-style-type:disc\"><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\">Currently, very few provisions under the Indian Penal Code (IPC) and the <\/span><\/span><\/span><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\"><strong>Information Technology Act, 2000 <\/strong><\/span><\/span><\/span><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\">can be potentially invoked to deal with the malicious use of deepfakes.<\/span><\/span><\/span><\/li>\n<li style=\"list-style-type:disc\"><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\">\u00a0<\/span><\/span><\/span><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\"><strong>Section 500 of the IPC <\/strong><\/span><\/span><\/span><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\">provides punishment for 
defamation.<\/span><\/span><\/span><\/li>\n<li style=\"list-style-type:disc\"><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\"><strong>Sections 67 and 67A of the Information Technology Act <\/strong><\/span><\/span><\/span><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\">punish the publication or transmission of sexually explicit material in electronic form.<\/span><\/span><\/span><\/li>\n<li style=\"list-style-type:disc\"><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\"><strong>The Representation of the People Act, 1951<\/strong><\/span><\/span><\/span><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\">, includes provisions prohibiting the creation or distribution of false or misleading information about candidates or political parties during an election period.<\/span><\/span><\/span><\/li>\n<\/ul>\n<p><span style=\"font-size:13pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\"><strong><u>Way Ahead<\/u><\/strong><\/span><\/span><\/span><\/p>\n<ul>\n<li style=\"list-style-type:disc\"><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\">In India, the legal framework related to AI is insufficient to adequately address the various issues that have arisen due to AI algorithms. 
The Union government should introduce separate legislation regulating the nefarious use of deepfakes and the broader subject of AI.<\/span><\/span><\/span><\/li>\n<\/ul>\n<p><span style=\"font-size:13pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\"><strong><u>Source<\/u><\/strong><\/span><\/span><\/span><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#000000\">: <\/span><\/span><\/span><a href=\"https:\/\/www.thehindu.com\/sci-tech\/technology\/explained-voice-deepfakes-how-are-they-are-used\/article66476423.ece#:~:text=How%20are%20voice%20deepfakes%20created,weeks%2C%20depending%20on%20the%20process.\" style=\"text-decoration:none\" target=\"_blank\" rel=\"noopener\"><span style=\"font-size:12pt\"><span style=\"font-family:'Book Antiqua',serif\"><span style=\"color:#1155cc\"><strong><u>The Hindu<\/u><\/strong><\/span><\/span><\/span><\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>In News A social media platform used speech synthesis to make deep fakes of celebrities.\u00a0 These deep fake audios made racist, abusive, and violent comments. About Deepfakes: Deep fake is a type of Artificial Intelligence (AI) used to create convincing images, audio, and video hoaxes. 
Deepfakes use deep learning AI to replace the likeness of [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":2281,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[21],"tags":[114,26,115,33],"class_list":["post-2280","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-current-affairs","tag-cyber-crime-security","tag-gs-3","tag-gs4","tag-science-technology"],"acf":[],"jetpack_featured_media_url":"https:\/\/wp-images.nextias.com\/cdn-cgi\/image\/format=auto\/ca\/uploads\/2023\/07\/2664123Screenshot_6.png","_links":{"self":[{"href":"https:\/\/www.nextias.com\/ca\/wp-json\/wp\/v2\/posts\/2280","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.nextias.com\/ca\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.nextias.com\/ca\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.nextias.com\/ca\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.nextias.com\/ca\/wp-json\/wp\/v2\/comments?post=2280"}],"version-history":[{"count":0,"href":"https:\/\/www.nextias.com\/ca\/wp-json\/wp\/v2\/posts\/2280\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.nextias.com\/ca\/wp-json\/wp\/v2\/media\/2281"}],"wp:attachment":[{"href":"https:\/\/www.nextias.com\/ca\/wp-json\/wp\/v2\/media?parent=2280"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.nextias.com\/ca\/wp-json\/wp\/v2\/categories?post=2280"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.nextias.com\/ca\/wp-json\/wp\/v2\/tags?post=2280"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}