Are publishing designers ready to collaborate with AI?
It’s just over a year since The Bookseller reported that Bloomsbury had used an AI-generated wolf on the cover of Sarah J Maas’ House of Earth and Blood. So, what have we learnt since, and will designers in publishing soon be put out to pasture?
Some people think our corner of the media can resist the tech by relying on good faith: introduce AI transparency statements and all will be well. Yet the forecasting site Metaculus gives a 66% probability that, by 2030, a book written by a large language model will make the New York Times bestseller list. Let’s not mention those who might use LLMs without admitting to it.
The backlash against Bloomsbury centred on the idea that it deprived an illustrator of a good payday. But a near-identical image to the wolf might have been found on Shutterstock, credited to a human. Shutterstock is heavily relied upon by publishing design departments, which fork out £99 a month for 350 images, presumably with less than a quid of that going to the illustrator each time. An AI image generator now sits at the heart of the Shutterstock website. Any images it spews out are branded with an AI stamp, and the company claims to block submissions created outside its "responsible" model. Shutterstock would have us believe it’s possible to demarcate human design, Good AI and Bad AI. But can it really prevent savvy operators from circumventing the filter and swamping the site with AI imagery?
A former colleague and I used to speak of a future in which polite robots would wheel into the office and hand us our P45s. It turns out this thing doesn’t fit that mould. It is vast, ugly and hard to comprehend. Rumours are circulating that legal departments at some of the big publishers are trying to keep it at bay by blocking updates to Adobe software.
There are well-documented ethical concerns about the source material used in training AI, but as Dr Andres Guadamuz, reader in intellectual property law at Sussex University, points out, a common misconception is that AI image generation is "akin to putting together a collage of pre-existing images". The truth is less exciting. Models are trained to convert a field of noise into a visual that "statistically resembles" images the model looked at during its training. The rate at which models are improving is frightening, and before long it will presumably be impossible to work out which generative models were born from problematic data sets, and which weren’t.
The AI wolf fitted nicely into Maas’ paperback series. Human design input meant it combined effectively with the typography, and the book has sold well. AI stock images will continue to be seen on book covers, and Photoshop’s AI tools, especially Generative Expand, are very useful in a busy art department. But how close are we to an AI tool creating a book design from scratch that will make design buffs take notice? Will my future as an art director be spent sifting through slush piles of AI design?
It’s not ready yet. When the Bradford Literary Festival admitted in 2023 that its posters were AI-generated, the use of AI itself was almost overlooked amid the outrage that a real-life design firm had been commissioned to do the work, then handed the gig to AI rather than doing it themselves. To me, the crime wasn’t that AI had been used, but that the design submitted was awful, with no human input.
There is undoubtedly a market for stuff untouched by AI. It doesn’t feel absurd to think book covers of the future will come with FSC-style flashes stating "This Book Was Made By Humans". But, as AI software adapts and copyright issues settle, I’ll be intrigued to see what AI-human collaboration looks like. These models, says Guadamuz, "are learning to understand the relationship between language and images", which is what I’ve spent the past 15 years trying to do. Let’s attempt to embrace it, because if it’s me versus them, I’m betting on the other guy.