AI threatens authors by flooding the market with cheap imitations
“Why are you so worried about artificial intelligence? If your work is good, people will still buy it!”
That’s the basic argument I hear from people who think I’m reacting too strongly to generative AI (artificial intelligence that generates content). I should stop sweating it, they say. “Why not let others take advantage of it if they want to? It’s no skin off your nose. Besides, it’s fun!”
Well, I have plenty of fun doing my own writing without any help from something with the word “artificial” in its name, thank you very much. I also prefer butter to margarine, and the only reason I use artificial sweetener is that I’m diabetic, and too much sugar could kill me or cause neuropathy. I think that’s a pretty good reason.
So let’s get this out of the way right off the bat: If you have a disability that puts you at a disadvantage when it comes to writing, I have nothing against using AI to level the playing field.
But here’s the rub: For the rest of us, it doesn’t level the playing field—it tilts it toward the AI user.
Slush pile
In the era of self-publishing, writers are able to (and do) flood the market with their own works, bypassing the traditional publishing companies that served as “gatekeepers” between author and reader. They were, at best, imperfect gatekeepers, deciding what should and shouldn’t see the light of day, often using poor discretion and making rash decisions.
As someone who’s been published traditionally and who publishes his own work, I’m all in favor of self-publishing. It puts creative control in the hands of the author. However, it has the unfortunate side effect of flooding the market. When everyone from the novice to the bestseller can publish their own works, there’s a lot more from which to choose. Amazon becomes a living slush pile, and the reader has to do the work that previously fell to the gatekeeper (the traditional publisher): deciding what’s worthy and what isn’t.
The result, predictably, is that readers—who have jobs of their own and no time to do an acquisition editor’s job on top of it—rely on things such as marketing, cover art, trends, and books that “go viral” in order to make buying decisions. Those of us who don’t have a big marketing budget are at an immediate disadvantage.
The one advantage I do have as an author is the ability to churn out a large number of titles in a short period of time. This keeps my name in front of readers, and being prolific (26 titles in three years) is a great talent to have in a world driven by instant gratification and “what have you done for me lately?”
AI robs me of that lone advantage. By generating content for “authors,” it enables them to churn out the same number of books in a fraction of the time—using work that is not their own. They are then able to flood the market and, if they have a marketing budget bigger than mine (something that is probably the norm rather than the exception), make their derivative works more visible than my original works.
Checkmate.
Class dismissed
Put simply, AI gives them an unfair advantage. It tilts the playing field.
Imagine being in a class that’s graded on a curve in which everyone else has an “open-book” final exam, while you are instructed to answer questions from memory. They all get higher scores than you do, so you flunk. AI is like that. It doesn’t reflect an author’s ability, any more than such an open-book exam reflects a student’s grasp of the subject. (The use of open-book tests has, frankly, always mystified me for this very reason.)
Another analogy: Allowing one team to put a sixth player on the court during a basketball game. No one would allow that, yet a writer putting a robotic “sixth player” in the game is somehow OK? I just don’t buy it.
If you think it’s harmless, think again. Automation has already put thousands of employees out of work by replacing human cashiers with self-checkout stations in places like Walmart. And look what’s happened to newspapers, once gatekeepers in their own right, now reduced to a shadow of their former selves by another AI-driven technology: targeted internet advertising. If you think the result—being without an independent watchdog holding public officials accountable—is fine and dandy, I urge you to take another look at the polarized chaos that now passes for politics and social interaction.
I haven’t even mentioned the threat AI poses to intellectual property rights. But others have. Authors Paul Tremblay and Mona Awad have sued OpenAI, the maker of ChatGPT, for using their novels to “train” its generative AI programs. Comedian Sarah Silverman and two other authors have done the same. If what they allege is true, this is intellectual theft. And if AI programmers can steal from others, they can steal from me.
Apples and oranges
Generative AI isn’t like a photograph in relation to a painting or an airplane compared to a bicycle. Those inventions enabled human beings to do genuinely new and different things they hadn’t been capable of before. Generative AI isn’t new or different; it’s derivative. It doesn’t break new ground; it mimics legitimate artistic expression and seeks to pass itself off as the same. It is, in a word, phony: an imposter seeking to supplant the real thing.
And it’s not like looking something up in a thesaurus; it’s a program that actually finishes (and suggests) full sentences for you. That, to me, is a big problem, because you’re no longer writing the book; you’re collaborating with a program that has been “trained” on the works of other authors such as Tremblay and Awad. So unless you want to share a byline with ChatGPT and whichever authors’ works it has data-mined, I suggest you avoid using it.
(Personally, I think it might be fun to write a book with Tremblay in particular, but I would have the decency to ask him to collaborate.)
I’m hardly alone in my concern about generative AI. A Reuters/Ipsos poll in May found that more than 6 in 10 Americans—not just authors and artists—are worried about the adverse effects of AI on the future of humanity.
What’s that, again, about AI being “fun”?
My response
When it comes down to it, I’m not worried about Skynet taking over the world, but I am worried about AI threatening jobs and the future of free, creative artistic expression.
In response, I plan to include the following notation in my works going forward, and urge other authors to craft a similar statement.
“The contents of this volume and all other works by Stephen H. Provost are entirely the work of the author, with the exception of direct quotations, attributed material used with permission, and items in the public domain. No artificial intelligence (“AI”) programs were used to generate content in the creation of this or any of the author’s works.”
Or as Queen so succinctly put it on their best albums: “No synthesizers!”
Stephen H. Provost is the author of 50 books, covering topics ranging from highway history to shopping centers in America, as well as fantasy, adventure, and science fiction novels. He is also the founder of the ACES of Northern Nevada online bookshop portal. His books are available on Amazon. Banner image: Jonathan Harris as Dr. Zachary Smith and the Robot from the original “Lost in Space” (public domain photograph).