PTI’s Claims

Former PM Imran Khan’s party, Pakistan Tehreek-e-Insaf (PTI), has levelled a fresh series of accusations against the Pakistan Muslim League-Nawaz (PML-N). PTI alleged that PML-N is employing foreign technology companies to create “deepfake” videos of Imran Khan in a bid to tarnish his public image. Taking to Twitter, former human rights minister and PTI member Shireen Mazari claimed that PML-N is resorting to its “nefarious tactics” to produce “disgusting videos” of Khan with the help of foreign “tech companies”.

What is a deepfake? Deepfakes (a portmanteau of “deep learning” and “fake”) are synthetic media in which a person in an existing image or video is replaced with someone else’s likeness. While the act of faking content is not new, deepfakes leverage powerful techniques from machine learning and artificial intelligence to manipulate or generate visual and audio content with a high potential to deceive. The main machine learning methods used to create deepfakes are based on deep learning and involve training generative neural network architectures, such as autoencoders or generative adversarial networks (GANs).
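To make the GAN idea mentioned above concrete, here is a minimal, illustrative sketch in Python (PyTorch): a generator learns to mimic a toy one-dimensional data distribution while a discriminator tries to tell real samples from generated ones. Everything here, the toy Gaussian “data”, the network sizes, and the learning rates, is a hypothetical stand-in for the face-image pipelines real deepfake tools use, not code from any actual system.

```python
# Minimal sketch of adversarial training, the core idea behind deepfakes:
# a generator G learns to produce samples that a discriminator D cannot
# distinguish from real data. Toy 1-D data stands in for face images.
import torch
import torch.nn as nn

torch.manual_seed(0)

def real_batch(n=64):
    # "Real" data: samples from a Gaussian N(4.0, 1.25). In a deepfake
    # system this would be a batch of face images instead.
    return torch.randn(n, 1) * 1.25 + 4.0

G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))  # generator: noise -> sample
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))  # discriminator: sample -> logit

opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for step in range(2000):
    # Discriminator step: label real data 1 and generated data 0.
    real = real_batch()
    fake = G(torch.randn(64, 8)).detach()  # detach: don't update G here
    loss_d = bce(D(real), torch.ones(64, 1)) + bce(D(fake), torch.zeros(64, 1))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # Generator step: try to fool the discriminator into outputting 1.
    fake = G(torch.randn(64, 8))
    loss_g = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()

with torch.no_grad():
    samples = G(torch.randn(1000, 8))
print(f"generated mean={samples.mean():.2f}, std={samples.std():.2f} (target 4.00 / 1.25)")
```

In a real face-swapping system the same adversarial loop operates on images rather than numbers, and is typically combined with autoencoders trained to encode one person’s face and decode it as another’s.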
Like any powerful technology, deepfakes lend themselves to abuse. As the 21st century’s answer to Photoshopping, deepfakes use a form of artificial intelligence called deep learning to make images of fake events, hence the name.
Many deepfakes are pornographic. The AI firm Deeptrace found 15,000 deepfake videos online in September 2019, a near doubling over nine months. As new techniques allow unskilled people to make deepfakes with a handful of photos, fake videos are likely to spread beyond the celebrity world to fuel obscene content. As Danielle Citron, a professor of law at Boston University, puts it: “Deepfake technology is being weaponised against women.”

Deepfake technology can also create convincing but entirely fictional photos from scratch. A non-existent Bloomberg journalist, “Maisy Kinsley”, who had a profile on LinkedIn and Twitter, was probably a deepfake. Another LinkedIn fake, “Katie Jones”, claimed to work at the Center for Strategic and International Studies but is thought to be a deepfake created for a foreign spying operation.

Audio can be deepfaked too, to create “voice skins” or “voice clones” of public figures. In March 2019, the chief of a UK subsidiary of a German energy firm paid nearly £200,000 into a Hungarian bank account after being phoned by a fraudster who mimicked the German CEO’s voice. The company’s insurers believe the voice was a deepfake, though the evidence is unclear. Similar scams have reportedly used recorded WhatsApp voice messages.
SYED TAHIR RASHDI
SINDH