Radius News

Publishing Note: Will AI Turn Publishing Upside Down?

07/25/2023

Generative artificial intelligence is the hot topic of 2023 and beyond as companies and employees scramble to keep up with how it will shape their businesses, their content, and their livelihoods.

Thad McIlroy’s article, “AI Is About to Turn Book Publishing Upside-Down” (Publishers Weekly, June 2, 2023), begins with this bold claim:

I believe that every function in trade book publishing today can be automated with the help of generative AI. And, if this is true, then the trade book publishing industry as we know it will soon be obsolete.

This claim rests on a premise about AI and two qualifications. The premise is that AI is here to stay and we’d all better get used to it. The qualifications condition the scope of the claim: (1) AI is more than ChatGPT, and (2) the claim applies to “good enough” publishing (i.e., not the highest-quality, top-of-the-line publishing, but what a large majority of the book-buying public will accept). Curiously, McIlroy published this article on his own blog under a different title, “A Manifesto, of Sorts,” tipping his hand with respect to AI and book publishing. Also of note, in a follow-up post (“AI and Book Publishing: Another Step on the Journey,” June 9, 2023), McIlroy reacts to an apparent plagiarism of that very article, possibly created using AI. That leaves us to decide whether or not we agree that AI really is about to turn the book publishing industry upside down.

On the one hand, I agree that the technologies giving rise to AI have been and will continue to be disruptive across many industries, including publishing. Yes, AI looks like it will be with us for the foreseeable future. McIlroy’s rundown of its impacts on different sectors of publishing points to obvious examples of ways AI can be used in the industry. He refers back to the invention of the printing press and, more recently, the introduction of desktop computers, the World Wide Web, and publishing software as comparable thresholds that introduced dramatic change, not only to publishing but also to human culture more generally. On the other hand, as an application of technology, AI will only be as disruptive as the technology’s creators and users, and those who resist its use, allow it to be. This isn’t a contest between Luddites and technophiles, where one side or the other wins, because whichever side you are on, the technology exists; that bell can’t be unrung. Yet AI is still nascent enough for human beings to have choices about its development and uses.

If we accept the premise that AI is here to stay, then we must address three issues raised by McIlroy. First is to evaluate the validity of the “good enough” proviso. Second is to take up the more philosophical question of considering the implications and potential impacts of AI on humanity. Last is to state whether we embrace or reject the claim that AI will turn the book publishing industry upside down, and why.

Good Enough or Not?

On its face, it makes sense to posit “good enough” as the bar for acceptance of AI in book publishing. Can AI produce a book that a sufficient percentage of customers accepts as good enough to purchase? Maybe, in time. What Spotify did to the music industry, what the technology underlying online purchases did to the retail industry, AI could do to the publishing industry. McIlroy surveys the different sectors of the publishing industry, noting places where AI is having or likely will have an impact. A new technology is like a newborn bird. It hatches and spends its first days as a nestling, lacking feathers and needing food. Before long, it grows feathers and becomes a fledgling. After the parent birds feed, feed, feed that fledgling, they prod it to get out of the nest, start walking and hopping, then stretch its wings and take flight. No one would expect nestling technology to perform like it will as full-fledged adult technology. Nor would readers expect books published using nascent AI technology to be on par with what they are used to buying. But, as McIlroy points out, forty years ago, when desktop publishing technology disrupted the publishing industry, early results on the printed page were not spectacular. In relatively little time, though, two complementary things happened. One was that the actual products improved, reflecting advances in the technology and processes. The second was that people adapted their expectations to the output of the technology. High-end printing done on offset presses was and still is superior to digital laser and inkjet printing. Even so, gradually, paying customers have come to accept books printed using these newer technologies.

But “good enough” pertains to what’s inside the covers of the book, as well as what the cover looks like and how clean the printing itself is. Here’s the rub for people within and outside the book publishing industry. At the moment, in my opinion, AI-generated content fails to pass the scratch-and-sniff test; it isn’t good enough. The question isn’t whether that will change, but how fast. When will AI consistently produce original content that a majority of readers either accept or actually embrace and value? How soon will people stop comparing AI-generated content to content created by humans?

AI: Friend or Foe?

On the philosophical front, as with most things, AI probably won’t turn out to be all good or all bad. The pragmatist in me wants to weigh the benefits of AI against its costs, evaluating it much more broadly than in simple financial terms. If editors can employ a GPT or other LLM tool to improve their writing and work more efficiently, great. Potentially, AI could benefit all sectors of the book publishing industry, some sooner and more completely than others. The supply chain, for instance, is ripe for AI-driven efficiency gains. Efficiency, saving time or effort, is not all that matters when it comes to using AI in book publishing, though. Plus, measuring what is “good enough,” or what leads to that threshold for customers, depends on the metrics you employ.

I concur with McIlroy that we need not paint ourselves into a philosophical corner by fabricating a straw-man panacea-versus-apocalypse scenario (what he calls promise and peril) around AI. Even so, I don’t agree with the libertarian tack that “you can only understand the perils surrounding a new technology after you fully appreciate the opportunities that it affords.” We need not stand idly by while AI proliferates (i.e., introduces opportunities), then look around and decide whether or not we like what we see happening. That’s because the issues surrounding AI far exceed business, financial, or other practical matters related to book publishing. They relate to human agency, bias (conscious, unconscious, and systemic), and fallout, which shouldn’t be defined by “the opportunities [AI] affords.” Let me be clear that the human component is inherently neither promise nor peril. People will draw different conclusions or judgments about how AI either enhances or hinders human agency. It is people, after all, who decide how to employ AI and to what ends, until the technology itself begins to act on its own volition. Therefore, the question of human agency should inform the decisions about how to build and utilize AI and how to evaluate its impacts on book publishing.

We can indeed imagine at least a portion of the good and the bad that will accompany AI before it matures into whatever it will become, and we should. A statement from the Center for AI Safety makes the stakes perfectly clear:

Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war.

Alarmist as this may sound, pay attention to the people making the statement. These aren’t fringe kooks peddling conspiracy theories but the very people and companies inventing and developing the technologies on which AI is built. Without detouring too far afield, we must remind ourselves that those technologies are created by human beings. Programmers are human and programming is a human activity, though computers can do programming too. Human bias enters that programming through the diversity of the individuals doing the work, the training and knowledge they possess, the conscious and unconscious stereotypes and language choices they make, the designs and user interfaces they create, the data they feed into their AI programs, and so on. AI technology is defined by the people who build the hardware, program the software, and decide what data to input. Therefore, it is not hard to envision how their choices will affect the output and results of whatever AI is asked to do.

One peril of AI’s use in the book publishing industry has to do with copyright and creativity. Consider, for instance, the recent lawsuit filed against OpenAI by authors claiming that AI companies used copyrighted material without permission to train their machines (“Lawsuit says OpenAI violated US authors’ copyrights to train AI chatbot,” Blake Brittain, June 29, 2023). Plagiarism is another manifestation of this issue. When AI is used to write a book manuscript, but the foundational data behind the AI is the copyrighted creative work of known authors, and permission was perhaps never obtained to use that material in this way, has the AI infringed on the writers’ copyrights? Who can claim ownership? When AI indiscriminately creates content that uses the exact words of a human author without crediting that author or securing permission to use the author’s words and ideas, is the AI, or are the people who own or created the AI, liable for plagiarism?

When addressing the friend-or-foe question, we would be wise to consider the value proposition. What value does AI add? What liabilities or drawbacks does AI bring with it? Do the pluses outweigh the minuses? What intended and unintended consequences may follow the adoption of AI technology in book publishing? The doers will want to jump in, play around with AI, and see what happens. The thinkers will want to ponder the ramifications of AI across the waterfront of the industry. As with social media, some will be all-in early adopters, others will refuse to touch the stuff, and people’s acceptance of AI will fall across a wide spectrum. AI may be nice when it works and a pain in the neck when it doesn’t, or when it works poorly. So people, and the publishing industry, will cherry-pick how they adopt and where they utilize the technology.

Upside Down or Not?

Having spent the past seven years at Radius Book Group, fully immersed in author-centered publishing (a.k.a. hybrid publishing), I am quite familiar with what disrupting the publishing industry looks like. Our entrepreneurial approach to publishing welcomes alternative strategies that help authors publish books successfully. On that score, I am inclined to say, “Hooray for AI!” It offers tools and technologies I can experiment with and potentially employ to benefit my authors. Whether or not I use AI directly, partners in my publishing ecosystem may use it, benefiting Radius even if we have no idea they are doing so. At the same time, Radius Book Group’s reputation is built on superior quality: the level of excellence displayed by the people we employ and work with, the quality of the books we publish, the stature and substance of the authors we publish, our forward-thinking approach, and our unassailable integrity. We aim to be the best independent publishing house and to publish the most influential authors and most impactful books anywhere. Therefore, if “good enough” is the bar, we are neither impressed nor interested in lowering our standards to that level.

While it may make a magazine article sound catchy to say AI will turn book publishing upside down, McIlroy’s assertion that the current trade book publishing industry will become obsolete is far more noteworthy. Upside down and obsolete are not, in my opinion, synonymous. Nor will the automation of functions alone bring about obsolescence. It’s not so much through automation as through added capabilities that the industry will evolve a new form. Taking a long view of AI, perhaps in time we may look back and mark the present as a watershed moment for book publishing. Decades from now, the industry won’t be so much upside down compared to what is currently the case, because that framing presumes the current industry will still be around, just oriented differently. Rather, book publishing as we know it will be superseded by an industry of an altogether different order, what we might call “creative engagement.” Elements of present practices, what we recognize as book publishing (i.e., processes and products), will persist. Yet, as with many industries, barriers and lines of demarcation between discrete domains will blur and vanish, replaced by wholly new amalgams yet to emerge. A single creative engagement, what we might think of as a book manuscript right now, will take many forms, will be consumed and experienced through all available technologies and channels of distribution, will become a co-creative or collective act, and will be commodified in ways that haven’t yet been invented. Books, those finite artifacts we love to hold and read and collect on library shelves and store on digital tablets, will still be available. They will have loads of company, however, that their original authors never dreamed of. And we’ll be left to wonder, “What will they think of next?”
