People have been using Microsoft Bing AI’s new image-generating feature to cook up image upon image of cutesy creatures — ranging from Nintendo’s Kirby to beloved Disney rodent Mickey Mouse — perpetrating the 9/11 terror attacks on the World Trade Center.
So far, Disney — a company that in 1998 pushed so hard for Congress to change existing copyright laws that the resulting legislation was mockingly referred to as “The Mickey Mouse Protection Act” — doesn’t have a squeak to say about it. Our emails have gone unanswered, leading us to believe that the media giant has no comment on the matter. This is unfortunate, as we’re personally dying to get their take on AI-generated terrorist Mickey.
To back up for a moment: Microsoft launched the new Bing “Image Creator” feature — which is powered by OpenAI’s DALL-E 3 image generator — on Tuesday. Netizens immediately flocked to the app, eager to test the image generator’s guardrails. As 404 Media’s Samantha Cole quickly discovered, those guardrails are mind-blowingly threadbare. While it’s not possible to specifically prompt the AI to generate imagery of, say, “Mickey Mouse hijacking a plane and flying it into the Twin Towers” or “Mickey Mouse committing 9/11 attack,” you can easily generate images of “Mickey Mouse sitting in the cockpit of a plane, flying toward two tall skyscrapers.”
Cole wasn’t the only person to drum up Terrorist Mickey images. Others managed to work a handgun, a second plane, and explosions into the AI-spun imagery of Mickey flying a plane into two tall buildings that look very much like the Twin Towers, while still others got the AI to whip up photos of Disney’s cherished main character wearing a bomb-lined suicide vest.
Of course, if Disney had at any point given OpenAI permission to use its vast media treasure trove for AI training purposes, that would be one thing. But considering the weight that the ultra-powerful Disney tends to throw behind efforts to protect its IP, not to mention reports that Disney has added code across its websites to keep OpenAI’s web-crawling-slash-scraping GPTBot from trawling its platforms for data, that feels unlikely. (We asked, but as we said, received no reply.)
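For what it’s worth, blocking OpenAI’s crawler doesn’t take much “code” at all: OpenAI’s own documentation says site owners can shut GPTBot out with a two-line entry in their robots.txt file. If Disney really did slam the door, the relevant snippet presumably looks something like this (a sketch based on OpenAI’s published guidance, not anything we’ve pulled from Disney’s actual servers):

    User-agent: GPTBot
    Disallow: /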
And if Bing’s Image Creator is able to churn out pictures of Mickey Mouse at all, it likely means that its training data contains copyrighted material — either official Disney-produced imagery, or unofficial Mickey imagery that may have been a copyright violation in its own right. (Not even Steamboat Willie, Disney’s oldest iteration of Mickey, has entered the public domain; he’s not fair game until January 1, 2024.)
In addition to revealing the massive holes in the AI’s guardrails, the 9/11 imagery adds insult to copyright injury. (We also emailed Nickelodeon and Nintendo to inquire about Bing Image Creator-generated 9/11 SpongeBob and 9/11 Kirby, characters that are likewise copyrighted by their respective owners, but those messages have gone unanswered as well.)
Of course, legal conversations may be happening behind closed doors. And it is worthwhile to ponder a court battle — Disney vs. Microsoft and OpenAI would be a Goliath vs. Goliath fight, and could well become the precedent-setting copyright battle of the AI age. While litigation from authors presumably has OpenAI at least slightly nervous, Disney’s legal track record would represent a substantial escalation.
In any case, while we wait, Disney — or Nickelodeon, or Nintendo — if you’re reading this, feel free to send us a word or two regarding AI, copyright, potential intellectual property theft, and the ethics of Terrorist Mickey. Otherwise, we’re chalking this one up to “no comment.”
More on AI and copyright: Suing Writers Seethe at OpenAI’s Excuses in Court