The Federal Bureau of Investigation said the images of the victims appeared “real” and that in some cases children were targeted.
The Federal Bureau of Investigation (FBI) warned Americans that criminals are increasingly using Artificial Intelligence (AI) to create sexually explicit images in order to intimidate and extort victims.
In an alert circulated this week, the FBI said it had recently observed an increase in extortion victims who reported being targeted using manipulated versions of innocent images taken from online posts, private messages or video chats.


“The photos are then sent directly to the victims by malicious actors for extortion or sexual harassment,” the warning said, adding: “Once circulated, victims can face significant challenges in preventing the continued sharing of the manipulated content or having it removed from the internet.”
The FBI said the images appeared “real” and that, in some cases, children were targeted. The bureau did not go into detail about the programs used to generate the sexual images, but noted that technological advances “continuously improve the quality, customization and accessibility” of content creation enabled by Artificial Intelligence (AI). It did not respond to queries seeking further details about the phenomenon.
The manipulation of innocuous images to create sexually explicit ones is almost as old as photography itself, but the release of open-source AI tools has made the process easier than ever.
The results are often indistinguishable from real-life photographs, and in recent years a number of websites and social media channels have sprung up that specialize in creating and sharing AI-generated sexual images.
By Raphael Satter.
Source: Ambito