Google’s Search AI Recommends Changing Your Car’s Blinker Fluid, Which Is a Made Up Thing That Does Not Exist


We’ll miss you, internet.

Internet of Pieces

Google’s AI search wants you to change your car’s “blinker fluid.” No, dear reader: blinker fluid does not exist.

Last week, at its yearly I/O conference, Google doubled down on its AI-powered vision for the future of search — a vision that, as it stands, basically involves embedding an AI-paraphrased regurgitation of search results at the top of a user’s results page.

“Sometimes you want a quick answer, but you don’t have time to piece together all the information you need,” the company’s Head of Search Liz Reid wrote in an I/O-accompanying blog post. “Search will do the work for you with AI Overviews.”

But Google’s AI Overviews, previously known as Search Generative Experience, are still far from reliable.

Case in point: as one X-formerly-Twitter user pointed out over the weekend, AI Overview will respond to the search entry “blinker not making sound” — which would ideally return helpful, expert-penned posts or videos that help googlers figure out why their car’s blinker isn’t working — with the advice to “replace the blinker fluid.” Which, again, is not a real thing.

When we tested the search query for ourselves, we got the same terrible advice.

Blunder Road

The blinker fluid advice is one of many instances of AI Overview failing to provide correct information, often due to terrible sourcing. To wit: recently, Redditors prompting the feature with the search “food names end with um” noticed that the AI search function will return the woefully incorrect response of “Applum, Bananum, Strawberrum, Tomatum, and Coconut” — which was stolen from an obviously ironic answer to the same question posted years ago in a Quora forum.

When we tested the similar query “fruits that end with um” for kicks, AI Overview told us that the Japanese fruit “Umeboshi” ends with “um,” troublingly citing an incorrect AI-generated answer to the same question from another AI chatbot, Poe.

Speaking of bad sourcing, as Jalopnik points out, Google’s blinker fluid gaffe is actually derived from a common joke. Those who know their way around vehicles are well aware that blinker fluid doesn’t exist, so telling a less-car-savvy person that they need to “replace blinker fluid” is an old inside quip. You can even buy empty blinker fluid bottles as a gag; indeed, the “source” that AI Overview cites is a comment in a wildly random travelers’ forum in which the commenter includes a clearly ironic photo of one of these empty gag-gift bottles alongside the phony advice that one “should replace” the fake fluid “every 2 years or so…..”

Great things happening in the infosphere. The future of search is here, we guess — but it’s still very much under construction, and it might break the internet in the process.

More on Google AI being wrong about stuff: During Huge Demo, Google’s AI Provides Idiotic Advice about Photography That Would Destroy Film

