Trust Issues
Spilling your hopes, secrets, and fantasies to your AI girlfriend? You might want to reconsider.
In a new report, experts at the Mozilla Foundation warn that AI companion bots — including the popular app Replika — are plagued by deeply concerning privacy pitfalls and murky data use policies.
“So-called ‘AI soulmates’ are giving Mozilla the ick when it comes to how much personal information they collect,” reads the Mozilla report, “especially given the lack of transparency and user control over how this data is protected from abuse.”
In other words, bots designed to give humans an outlet for intimacy are data-hoarding privacy minefields, and the companies behind them may well be profiting from your most intimate disclosures.
“To be perfectly blunt, AI girlfriends and boyfriends are not your friends,” Mozilla researcher Misha Rykov said in a statement (emphasis his). “Although they are marketed as something that will enhance your mental health and well-being, they specialize in delivering dependency, loneliness, and toxicity, all while prying as much data as possible from you.”
Eep! And we thought human-to-human relationships had trust issues.
Bad Date
The researchers’ findings are striking, not to mention extremely creepy.
Per the report, Mozilla researchers put 11 AI companion bots through their paces, ultimately finding that “ten of the 11 chatbots failed to meet Mozilla’s Minimum Security Standards, such as requiring strong passwords or having a way to manage security vulnerabilities.” So not only do the apps fall flat on the security front, but they also generally fail to give users meaningful control over their privacy preferences.
Among other concerns, including poor protections against use by minors, the experts also discovered a staggering number of data trackers across the various apps. These trackers were caught sending users’ in-app data to an array of third-party outfits, including Facebook, the Google-owned DoubleClick, and a range of marketing and advertising firms, with some observed shipping data to exchanges in India and China.
But the researchers aren’t just worried about outcomes like targeted shopping ads here. Again, companion bots are explicitly designed to build intimate relationships with their users. Given the sensitivity of the information shared, it stands to reason that bad actors could exploit it in genuinely dangerous ways.
“One of the scariest things about the AI relationship chatbots is the potential for manipulation of their users,” Jen Caltrider, the director of Mozilla’s *Privacy Not Included project, said in a statement. “What is to stop bad actors from creating chatbots designed to get to know their soulmates and then using that relationship to manipulate those people to do terrible things, embrace frightening ideologies, or harm themselves or others?”
“This is why we desperately need more transparency,” Caltrider added, “and user control in these AI apps.”
More on AI companions: Experts Say AI Girlfriend Apps Are Training Men to Be Even Worse