Rising Extortion Threats: Deepfakes Targeting Minors and Crypto Investors


Generative AI's ability to create lifelike images has caught the attention of the FBI, which warns that criminals are using deepfakes to extort victims. According to an FBI public service announcement, victims, including both minor children and non-consenting adults, have had their photos or videos altered into explicit content.

In 2020 alone, law enforcement agencies received over 7,000 reports of online extortion targeting minors, and the FBI has seen a rise in “sextortion scams” using deepfakes since April. These scams use AI to create video or audio depicting false events that are increasingly difficult to identify as fraudulent, thanks to tools like Midjourney 5.1 and OpenAI’s DALL-E 2.

One recent example of a malicious deepfake is a viral video portraying Elon Musk, created to scam crypto investors. The video used footage from previous interviews, edited to fit the scam’s narrative. Not all deepfakes are made with harmful intent, however. A deepfake of Pope Francis wearing a fashionable jacket went viral earlier this year, and AI-generated deepfakes have also been used to virtually “bring back” murder victims.

The FBI’s recommendations include not paying any ransom, since giving in to criminals’ demands does not guarantee the deepfake will not be posted anyway. The agency also stresses caution when sharing personal information and content online, using privacy features such as private accounts, and monitoring children’s online activity. It further suggests watching for unusual behavior from past acquaintances and regularly searching for personal and family information online.

Other organizations, such as the U.S. Federal Trade Commission (FTC), have also warned about the fraudulent use of deepfakes. The FTC has described criminals using audio deepfakes to trick victims into sending money by making it sound as if a loved one has been kidnapped. These scammers need only a short audio clip of a family member’s voice to create an authentic-sounding recording.

“Artificial intelligence is no longer a far-fetched idea out of a sci-fi movie. We’re living with it, here and now. A scammer could use AI to clone the voice of your loved one,” the FTC said in a consumer alert from March. The FBI has not yet responded to Decrypt’s request for comment. As AI technology continues to advance, users must remain informed and vigilant in protecting themselves and their personal information from potential malicious actors.

Source: Decrypt
