Who Wrote Shy Girl and Why It Matters

Now heading for the pulping machine: the UK edition of Shy Girl.

Last week, the book world learned that publishing house Hachette would not be releasing Shy Girl in the United States amid allegations that author Mia Ballard used AI extensively to write it. It is the latest twist for a book originally self-published, then acquired and published by Hachette UK last year, and it has readers and industry watchers worried about fraying trust and what publishers will overlook in the name of profit. The fallout exposes a widening gap between publishers' public anti-AI posture and their private incentives, along with a due-diligence regimen that hasn't kept pace: little to no provenance checking before acquisition, no tool-use disclosures from authors or editors, and no audits of edit history.

The allegations against Ballard didn't appear out of nowhere; sharp-eyed readers began raising concerns last year, suggesting the book read more like AI than human prose. Quality aside, how did Shy Girl land such a lucrative deal with one of the largest publishing houses on the planet? Either Hachette suspected and proceeded anyway, betting hype would overshadow scrutiny, or its systems weren't built to detect, or even define, what counts as permissible AI use.

What’s equally striking is the acquisition process that’s become the new normal. Self‑pub used to be the end of the line; now editors scour Reddit and #BookTok for heat. Buzz means profit, especially in commercial fiction and romance, and speed beats vetting. Given that velocity, it's not surprising that little editing happens post‑acquisition. Earlier self‑pub breakouts like Colleen Hoover and Amanda Hocking certified the new playbook: capture momentum first, backfill editorial later. As author and industry watcher Lincoln Michel puts it: “The incentive in such cases is to take over the sales as quickly as possible… This [Shy Girl] scandal shows the risks to that approach.”

Stepping back from the immediate crisis involving Ballard, big publishers are trying to capitalize on AI when it’s convenient for them. “Trade publishers are (somewhat) quietly adopting AI tools themselves, dancing delicately around whether this conflicts with their public-facing principles,” writes industry critic Thad McIlroy. Hachette UK's own stated position specifically opposes "machine creativity" while encouraging what it calls responsible operational uses. Starting in 2024, several publishing houses began licensing books for AI training. Hachette is not among them, and its marketing FAQ page states that “HBG will not license your work for training a LLM without your [the author’s] consent, and on terms consistent with your publishing agreement’s provision for electronic rights licensing.” Even so, Hachette may well have buried LLM-related language in contract boilerplate that authors simply don't read.

Following the allegations concerning Shy Girl in the Times, Hachette pulled the book, citing a “lengthy investigation in recent weeks” that led to their decision. Unsold copies in the UK are being pulped.

Mia Ballard, for her part, says this has been nothing short of a nightmare: “This controversy has changed my life in many ways and my mental health is at an all-time low, and my name is ruined for something I didn’t even personally do,” she wrote to the Times. She is considering pursuing litigation. In a now-deleted comment beneath a video takedown of her book, Ballard claims a freelance editor used AI tools without her consent.

Did Hachette act on principle after a “lengthy investigation,” or dump a hot potato once the Times called? Either way, the incentives prioritize speed over vetting and shift risk onto authors. If any part of Ballard’s account is true, what remains are procedural questions of liability, editorial oversight, and the line between acceptable tool use and undisclosed composition.

The question of what constitutes AI use cuts to the quick. Ghostwriters, for example, have long been accepted and legitimate members of the publishing ecosystem. If authors must affirm their work is AI-free, will they also have to disclose when they’ve worked with a hired hand? Currently, some do, some don’t. Discretion has always been part of the arrangement, and I generally work with authors who want that respected. If we’ve always accepted that books have invisible collaborators, what is the line being drawn here, and who gets to draw it? Even Ballard blamed any AI in her work on a freelance editor. The distinction, I believe, is in whether that help has a pulse, and even then, the arrangement can roil readers. Recall that a few years ago, actress Millie Bobby Brown worked with ghostwriter Kathleen McGurl for her debut novel. The book was a bestseller, but readers seemed upset to learn that it wasn’t Brown opening a vein at the keyboard and pouring herself into her work.

AI is now part of the writing world. Authors, editors, researchers, and publishers are using AI in various ways as a tool, not a substitute, though some use it to generate work from whole cloth. Add to that the rise of slop (some of which peddles deadly misinformation) that now thoroughly clogs the marketplace, and you're left with a chaos masterpiece.

Much of the current uproar stems from readers feeling duped, that the relationship between author and reader was compromised. “Novelists wrestle with the human condition,” writes author Andrea Bartz in a New York Times op-ed. “If you remove the flesh and blood from the equation—or plant seeds of doubt about whether what they are consuming is authentically human—the writer-reader relationship falls apart.”

Ballard maintains that she didn’t use AI, and detectors are notorious for generating false positives. Will this case force authors to pay to certify that their work is AI-free? However this plays out, we’re still left with traditional publishing chasing trends when convenient and shirking the covenant that draws readers to books at all: that a human mind is meeting another in paper and ink.