FBI Alerts Public to AI-Generated Deepfake Scams Exploiting Social Media Images
The FBI has issued a warning that scammers are using AI technology to produce sexually explicit deepfake images and videos to extort money from victims, a scheme commonly known as "sextortion."
This alarming threat exploits innocent photos shared on public social media accounts. Using sophisticated image and video editing software, malicious actors can manipulate these images to generate pornographic content featuring the victim's face.
"The FBI continues to receive reports from victims, including minor children and non-consenting adults, whose photos or videos were altered into explicit content," the agency said in the alert.
"The photos or videos are then publicly circulated on social media or pornographic websites, for the purpose of harassing victims or sextortion schemes."
As a result, the FBI is warning the public about the danger of posting photos and videos of themselves online.
"Although seemingly innocuous when posted or shared, the images and videos can provide malicious actors an abundant supply of content to exploit for criminal activity."
While the FBI did not provide specific numbers, the agency's alert was prompted by a significant increase in sextortion schemes targeting minors.
In these cases, online predators often pose as attractive individuals to deceive teenage boys into sharing explicit photos.
Subsequently, the scammer threatens to publicly distribute the compromising content unless a ransom is paid.
In today's alert, the FBI noted recent sextortion schemes have also involved the use of deepfakes.
"As of April 2023, the FBI has observed an uptick in sextortion victims reporting the use of fake images or videos created from content posted on their social media sites or web postings, provided to the malicious actor upon request, or captured during video chats," the agency said.
In some cases, the predators will also use the deepfakes to pressure a victim into sending them "real sexually-themed images or videos."
The increasing prevalence of malicious deepfakes has raised concerns and could prompt an expansion of laws prohibiting their use. Currently, only a handful of states, including Virginia and California, have enacted bans specifically targeting deepfake pornography.
However, in an effort to combat non-consensual deepfakes at the national level, Representative Joe Morelle (D-NY) recently introduced federal legislation that would criminalize their creation and distribution.