Gross.
False Profit
Google is taking money to promote AI apps that produce nonconsensual deepfake nudes, new reporting from 404 Media reveals.
As caught by 404, searches for terms like “undress apps” and “best deepfake nudes” display paid advertisements for websites offering services like “NSFW AI Image Generator” and other readily available AI tools that can be used to create explicit imagery of real people without their consent.
Google has drawn widespread criticism over its failure to curb the proliferation of AI deepfakes of real people in its search results, which have historically been ridiculously easy — as in, one-search-query-and-one-click-away-level easy — to find on the search giant’s platform. In response to this criticism, Google just last week announced that it would expand existing search policies to “help people affected” by the “non-consensual sexually explicit fake content” that crops up in its search pages.
But 404’s reporting reveals that Google’s deepfake problem also exists on its ad side, where the search giant is actively profiting from promoted posts advertising some of the exact same AI services that help bad actors make invasive and nonconsensual explicit content in the first place.
Pro Active
Google has reportedly taken action to delist the specific advertisements and websites flagged by 404’s journalists, with a spokesperson for Google telling the outlet that services designed “to create synthetic sexual or nude content are prohibited from advertising through any of our platforms or generating revenue through Google Ads.”
Per 404, the spokesperson added that the search giant is “actively investigating this issue and will permanently suspend advertisers who violate our policy, removing all their ads from our platforms.” But according to 404, the spokesperson didn’t address questions regarding why advertisers were allowed to pay Google to promote links for search terms like “undress app” to begin with, which seems to be the core of the issue.
Deepfaked porn is harmful to its victims. And with easily accessible AI deepfake tools in public hands, a new wave of troubling instances involving the creation of nonconsensual fake nudes — particularly cases in middle and high schools, where school systems and law enforcement are struggling with how to police it — continues to make news.
As for Google, the website where most people go to find stuff, it clearly has a long way to go in its effort to mitigate the rising tide of deepfakes on the web — not to mention the searchability of the tools plainly designed to make them.
More on deepfakes: Schoolchildren Sentenced for AI-Generating Nudes of Their Classmates