Is AI a “tool,” or a “creature”?
Creature Feature
Mercurial OpenAI CEO Sam Altman recently sat for an interview with The Advocate, and the whole thing is worth a read — but the most fascinating moment came when the publication asked Altman about the biggest misconceptions about AI.
Altman seized the opportunity to describe what he thinks is the biggest example, and it’s an interesting one: that many people are confused about whether AI is a “creature” or a “tool.” In his words:
The biggest misconception about AI I think is there’s confusion about whether AI is thought of as a tool or a creature. It’s a better movie plot if it’s a creature in a sci-fi movie, for example. If you use ChatGPT, it’s clearly a tool.
There’s still risks with tools, of course, but they’re of a different shape and a different kind of profile. And I think the popular misconception of AI as sci-fi is very, very different from people who have been using it as a tool for a long time.
And by the way, I think it’s great that what we’re building is like a tool, because if you give humans better tools, they do these amazing things to surprise you on the upside and that builds all this new value for all of us.
Tool Age
The more closely you read the passage, the harder it is to parse exactly what Altman was saying. He doesn’t actually come right out and say that AI writ large falls into the tool category, though he does specifically say that he sees OpenAI’s flagship consumer-facing product ChatGPT as one. But then he says that “what we’re building” more generally is a tool, implying that his vision for the company’s tech is less on the creature side.
And in a strict sense, that’s probably fair. For now, AI is more a pile of data and math that produces statistically probable outputs than the novel organism implied by the word “creature.”
Let’s be real, though: it’s a convenient narrative at a moment of mounting concern over whether OpenAI and the broader AI industry are poised to replace jobs. The idea of AI as a tool that helps people do their jobs better, like a personal computer, is much less scary than the idea of AI as an autonomous entity that can do your job instead of you.
That hasn’t always been the story he’s told, though. Not so many months ago, Altman was predicting that AI would soon be able to replace the median human worker, resulting in mass job loss. In fact, he couldn’t resist hyping a perfect example in the Advocate’s followup question: personalized AI-powered tutors.
The tech often has a “creature” type of feel, too. OpenAI’s breakout product, ChatGPT, is explicitly designed to act like a conversational interlocutor, and that strategy is being widely imitated by its competitors. For better or worse, many people are now fostering parasocial relationships with AI. And there’s a lot of buzz about AI “agents” that would be able to act with relative autonomy — projects that in many cases will undoubtedly be powered by OpenAI’s tech.
Are those types of applications creatures or tools, to use Altman’s terminology? It’s tough to say — and if we know one thing about Altman, it’s that before we’ve even properly grappled with the old paradigm, he’ll be ushering in a new one.
More on Altman: Sam Altman Under SEC Investigation for Potentially Misleading Investors