Rise Of AI Sextortion: FBI Sounds Alarm On Deepfake Threats
The FBI warns of an alarming rise in the use of artificial intelligence to create fake explicit videos for sextortion.
The FBI has issued a warning about the escalating use of artificial intelligence (AI) to produce fake sexually explicit videos for sextortion schemes targeting minors and unsuspecting adults.
In a recently published alert, the FBI disclosed that the use of AI to generate fabricated videos depicting individuals engaged in sexually explicit activity has surged in recent months. Victims, including minors and non-consenting adults, have reported that their images or videos were manipulated into explicit content. The perpetrators then distribute this altered media on social media platforms and pornographic websites to harass the victims or further their sextortion schemes.
The FBI has noted an increase in sextortion victims reporting the use of fabricated images or videos derived from their social media posts, other web content, or footage captured during video chats. The offenders typically demand either monetary payment or real sexually explicit images and videos. Software and cloud-based services that enable the creation of deepfake videos are widely available, ranging from freely accessible open-source tools to subscription-based accounts. Recent advancements in AI have significantly improved the quality of these tools, allowing realistic videos of a person to be created from a single image of their face. Although some deepfake software may claim to include safeguards against misuse, these protections are often circumvented, and unrestricted illicit services are available on underground markets.
To produce sexually explicit images that resemble the victim, scammers frequently harvest the victim's photos from social media platforms or other sources. They then disseminate the manipulated images on social media, public forums, or pornographic websites. Many victims, including minors, remain unaware that their images have been copied, altered, and circulated until someone else brings the content to their attention. The actors behind these schemes either send the images directly to the victims for sextortion or harassment, or wait for victims to discover the manipulated content online. Once these images are in circulation, victims face significant challenges in stopping their continued spread or removing them from the internet.
In response to these developments, the FBI urges individuals to take precautions to prevent their images from being exploited for deepfakes. Seemingly innocent images and videos posted or shared online can provide malicious actors with a vast pool of content to exploit for criminal activity. The convergence of advanced content-creation technology and the ready availability of personal images online creates new avenues for perpetrators to locate and target victims, leaving them susceptible to embarrassment, harassment, extortion, financial loss, or ongoing victimisation.