The manipulated images were then spread to other users on the Telegram app, according to an investigation carried out by Sensity AI, a company that specialises in detecting deepfakes and other forms of “malicious visual media”.
The report assessed images shared in public up to the end of July 2020, and noted that some of the targets “appeared to be underage”.
In a 12-page report seen by The Independent, the company outlined how a new “deepfake ecosystem” evolved on Telegram using an “AI-powered bot that allows users to photo-realistically ‘strip naked’ clothed images of women.”
The research revealed that anyone can send the bot a photo through the Telegram mobile or web app and receive a nude image back within minutes. The “service” is free, though users can pay a base fee of 100 rubles (approximately $1.50) for perks such as removing the watermark from the “stripped” photos or skipping the processing queue, said the report.
Sensity AI highlighted a strong geographical skew among users in these channels, with roughly 70 per cent coming from Russia and neighbouring countries, and noted that the primary targets were women from Argentina, Italy, Russia and the US.
While deepfakes have previously been used mainly against celebrities and politicians, Sensity’s poll of the Telegram bot’s users found that 63 per cent were primarily “interested to undress” women they “know in real life”.
The findings also alluded to the broader threats that the bot presented. “Specifically, individuals’ stripped images can be shared in private or public channels beyond Telegram as part of public shaming or extortion based attacks,” the authors said.
Sensity said that it has disclosed the findings to Telegram but has yet to receive a response. It said it has also contacted relevant law enforcement agencies.