Russian Hackers Are Using Fake AI “Nudify” Sites to Steal Data


“They are looking for people who are doing borderline shady things to start with.”

Scamming the Scammers

Multiple sites masquerading as “nudify” services, which use AI to turn clothed photographs into fake and often nonconsensual nudes, have been linked to a notorious Russian hacker collective that was believed to be defunct.

As 404 Media reports, Zach Edwards of the cybersecurity firm Silent Push said that the Russian group Fin7 appears to be behind several websites that use variations of the name “AINude.ai” to trick their mostly male victims into unknowingly handing over their data.

“The deepfake AI software may have an audience of mostly men with a decent amount who use other AI software or have crypto accounts,” Edwards told 404. “There’s a specific type of audience who wants to be on the bleeding edge of creepy (while ignoring new laws around deepfakes), and who are proactively searching out deepfake AI nude software.”

Edwards and his colleagues found that these Fin7-linked AI sites served “infostealer” malware, which the sites claimed was necessary to “nudify” images.

As its name suggests, infostealer malware harvests data from infected machines and sends it off to servers controlled by the hackers. Using that data, bad actors like Fin7 can threaten to release personal information — unless, of course, their victims pay up.

Mighty Fall

While this scheme is relatively run-of-the-mill for shady porn sites — to which the AI nude sites also link — perhaps what’s most shocking about Silent Push’s findings is that the Russian hackers in question are supposed to be defunct.

Last year, the US Department of Justice went as far as to declare that Fin7, an unusually professional outfit that ran fake security fronts and had operatives in both Russia and Ukraine, was “no more” after three of its hackers were charged and sentenced to prison.

As this news makes clear, that declaration was premature. Still, the scheme’s conspicuous Dropbox links hosting the malware files seem far less sophisticated than Fin7’s previous work, which involved setting up entire shell companies to pull off its scams.

“They are looking for people who are doing borderline shady things to start with,” Edwards told 404, “and then having malware ready to serve to those people who are proactively hunting for something shady.”

At the end of the day, it’s hard to say who is worse: those almost certainly trying to nonconsensually nudify other people’s images, or those trying to rip the creeps off.

More on deepfakes: Google Caught Taking Money to Promote AI Apps That Create Nonconsensual Nudes
