There was nothing in Drew Ortiz’s author biography at Sports Illustrated to suggest that he was anything other than human.
“Drew has spent much of his life outdoors, and is excited to guide you through his never-ending list of the best products to keep you from falling to the perils of nature,” it read. “Nowadays, there is rarely a weekend that goes by where Drew isn’t out camping, hiking, or just back on his parents’ farm.”
The only problem? Outside of Sports Illustrated, Drew Ortiz doesn’t seem to exist. He has no social media presence and no publishing history. And even more strangely, his profile photo on Sports Illustrated is for sale on a website that sells AI-generated headshots, where he’s described as “neutral white young-adult male with short brown hair and blue eyes.”
Ortiz isn’t the only AI-generated author published by Sports Illustrated, according to a person involved with the creation of the content who asked to be kept anonymous to protect them from professional repercussions.
“There’s a lot,” they told us of the fake authors. “I was like, what are they? This is ridiculous. This person does not exist.”
“At the bottom [of the page] there would be a photo of a person and some fake description of them like, ‘oh, John lives in Houston, Texas. He loves yard games and hanging out with his dog, Sam.’ Stuff like that,” they continued. “It’s just crazy.”
The AI authors’ writing often sounds like it was written by an alien; one Ortiz article, for instance, warns that volleyball “can be a little tricky to get into, especially without an actual ball to practice with.”
According to a second person involved in the creation of the Sports Illustrated content who also asked to be kept anonymous, that’s because it’s not just the authors’ headshots that are AI-generated. At least some of the articles themselves, they said, were churned out using AI as well.
“The content is absolutely AI-generated,” the second source said, “no matter how much they say that it’s not.”
After we reached out with questions to the magazine’s publisher, The Arena Group, all the AI-generated authors disappeared from Sports Illustrated’s site without explanation. Our questions received no response.
The AI content marks a staggering fall from grace for Sports Illustrated, which in past decades won numerous National Magazine Awards for its sports journalism and published work by literary giants ranging from William Faulkner to John Updike.
But now that it’s under the management of The Arena Group, parts of the magazine seem to have devolved into a Potemkin Village in which phony writers are cooked up out of thin air, outfitted with equally bogus biographies and expertise to win readers’ trust, and used to pump out AI-generated buying guides that are monetized by affiliate links to products that provide a financial kickback when readers click them.
Do you know anything about The Arena Group’s use of AI-generated content? Shoot us an email at [email protected]. We can keep you anonymous.
Making the whole thing even more dubious, these AI-generated personas are periodically scrubbed from existence in favor of new ones.
Sometime this summer, for example, Ortiz disappeared from Sports Illustrated’s site entirely, his profile page instead redirecting to that of a “Sora Tanaka.” Again, there’s no online record of a writer by that name — but Tanaka’s profile picture is for sale on the same AI headshot marketplace as Ortiz, where she’s listed as “joyful asian young-adult female with long brown hair and brown eyes.”
“Sora has always been a fitness guru, and loves to try different foods and drinks,” read Tanaka’s bio. “Ms. Tanaka is thrilled to bring her fitness and nutritional expertise to the Product Reviews Team, and promises to bring you nothing but the best of the best.”
But Tanaka didn’t last, either. Eventually she also disappeared, replaced by yet another profile that carried no headshot at all, which Sports Illustrated deleted along with the other AI-generated content after we reached out.
It wasn’t just author profiles that the magazine repeatedly replaced. Each time an author was switched out, the posts they supposedly penned would be reattributed to the new persona, with no editor’s note explaining the change in byline.
None of the articles credited to Ortiz or the other names contained any disclosure about the use of AI or that the writer wasn’t real, though they did eventually gain a disclaimer explaining that the content was “created by a 3rd party,” and that the “Sports Illustrated editorial staff are not involved in the creation of this content.”
Do you know anything about that “3rd party,” or how the content was created? Email us at [email protected]. We can keep you anonymous.
Though Sports Illustrated’s AI-generated authors and their articles disappeared after we asked about them, similar operations appear to be alive and well elsewhere in The Arena Group’s portfolio.
Take TheStreet, a financial publication cofounded by Jim Cramer in 1996 that The Arena Group bought for $16.5 million in 2019. Like at Sports Illustrated, we found authors at TheStreet with highly specific biographies detailing seemingly flesh-and-blood humans with specific areas of expertise — but with profile photos traceable to that same AI face website. And like at Sports Illustrated, these fake writers are periodically wiped from existence and their articles reattributed to new names, with no disclosure about the use of AI.
Sometimes TheStreet’s efforts to remove the fake writers can be sloppy. On its review section’s title page, for instance, the site still proudly flaunts the expertise of AI-generated contributors who have since been deleted, linking to writer profiles it describes as ranging “from stay-at-home dads to computer and information analysts.” This team, the site continues, “is comprised of a well-rounded group of people who bring varying backgrounds and experiences to the table.”
People? We’re not so sure.
The “stay-at-home dad” linked in that sentence above, for instance, is a so-called “Domino Abrams” — “a pro at home cleaning and maintenance,” at least until he was expunged from the site — whose profile picture can again be found on that same site that sells AI-generated headshots.
Or look at “Denise McNamara,” the “information analyst” that TheStreet boasted about — “her extensive personal experience with electronics allows her to share her findings with others online” — whose profile picture is once again listed on the same AI headshot marketplace. Or “Nicole Merrifield,” an alleged “first grade teacher” who “loves helping people,” but whose profile is again from that AI headshot site. (At some point this year, Abrams, McNamara, and Merrifield were replaced by bylines whose profile pictures aren’t for sale on the AI headshot site.)
Basic scrutiny shows that the quality of the AI authors’ posts is often poor, with bizarre-sounding language and glaring formatting discrepancies.
This article about personal finance by the AI-generated Merrifield, for example, starts off with the sweeping libertarian claim that “your financial status translates to your value in society.”
After that bold premise, the article explains that “people with strong financial status are revered and given special advantages everywhere around the world,” and launches into a numbered list of how you can “improve your finance status” for yourself. Each number on what should be a five-point list, though, is just number one. Mistakes happen, but we can’t imagine that anyone who can’t count to five would give stellar financial advice.
Abysmal-quality AI content, though, shouldn’t be surprising at The Arena Group.
Back in February, when the company first started publishing AI-generated health advice at its magazine Men’s Journal, we found that its first story was riddled with errors, prompting it to issue a massive correction.
Before that, when The Arena Group first announced its foray into AI, its CEO Ross Levinsohn promised in an interview with The Wall Street Journal that its quality would be outstanding.
“It’s not about ‘crank out AI content and do as much as you can,’” he told the newspaper early this year. “Google will penalize you for that and more isn’t better; better is better.”
Needless to say, neither fake authors who are suddenly replaced with different names nor deplorable-quality AI-generated content with no disclosure amounts to anything resembling good journalism, and to see it published by a once-iconic magazine like Sports Illustrated is disheartening. Bylines exist for a reason: they give credit where it’s due, and just as importantly, they let readers hold writers accountable.
The undisclosed AI content is a direct affront to the fabric of media ethics, in other words, not to mention a perfect recipe for eroding reader trust. And at the end of the day, it’s just remarkably irresponsible behavior that we shouldn’t see anywhere — let alone normalized by a high-visibility publisher.
The Arena Group is hardly alone, either. As powerful generative AI tools have debuted over the past few years, many publishers have quickly attempted to use the tech to churn out monetizable content.
In almost every case, though, these efforts to cut out human journalists have backfired embarrassingly.
We caught CNET and Bankrate, both owned by Red Ventures, publishing barely-disclosed AI content that was filled with factual mistakes and even plagiarism; in the ensuing storm of criticism, CNET issued corrections to more than half its AI-generated articles. G/O Media also published AI-generated material on its portfolio of sites, resulting in embarrassing bungles at Gizmodo and The A.V. Club. We caught BuzzFeed publishing slapdash AI-generated travel guides. And USA Today and other Gannett newspapers were busted publishing hilariously garbled AI-generated sports roundups that one of the company’s own sports journalists described as “embarrassing,” saying they “shouldn’t ever” have been published.
If any media organization finds a way to engage with generative AI in a way that isn’t either woefully ill-advised or actively unethical, we’re all ears. In the meantime, forgive us if we don’t hold our breath.
More on AI-generated journalism: USA Today Updates Every AI-Generated Sports Article to Correct “Errors”