Nicola Solomon, a special adviser and former chair of the Creators’ Rights Alliance (CRA), and the former chief executive of the Society of Authors (SoA), says the UK government is being “beguiled by big tech” into “giving away” authors’ work to train generative artificial intelligence (AI).
Solomon, a solicitor with in-depth knowledge of the publishing industry and its many associated legal areas, from copyright and defamation to privacy, data protection and contract, was awarded an OBE in the 2025 New Year Honours for services to literature and the creative industries.
This accolade does not mean she is going to be giving the government a break anytime soon, however. She told The Bookseller: “I’ve worked for creators, often individuals, to help empower them, and to help their working life to be a bit better so they can continue to inspire us. If the government is prepared to recognise that services to creators are a good thing, then it ought to think that their work is important enough to put some legislation in place that makes the lives of creators better, so they can continue to create. That’s the spirit in which I’m accepting the OBE.”
The CRA is a 500,000-strong cross-industry members’ body. It is part of the Creative Rights in AI Coalition, alongside the Publishers Association, the SoA, the Authors’ Licensing and Collecting Society (ALCS) and 30 other member bodies.
In December, the UK government launched a consultation process “on proposals to give creative industries and AI developers clarity over copyright laws”. The government’s proposals suggest introducing “an exception to copyright law for AI training for commercial purposes while allowing rights holders to reserve their rights, so they can control the use of their content”, plus transparency rules.
Effectively, this proposal puts the onus on creators to “opt out” of such usage. “It’s very disappointing. What’s happened is [the government] has been beguiled by big tech. Big tech can afford to have policy and lobbyist people in Whitehall, and [creators] can’t,” Solomon told The Bookseller.
The CRA and others are gathering responses from members to meet the government’s 25th February consultation deadline, but Solomon’s view is already damning.
“I’d like to think that the government doesn’t understand the effect of its proposals on creators, but I find that astonishing, because we’ve been making this case for a very long time,” she said. “Before it put out the consultation [on AI], it called together organisations right across the creative industries and said, ‘What do you think about this? Will an opt-out provision work?’ And we all said, ’No. It doesn’t work, and you’re giving away creators’ rights.’”
Solomon points out that what the government is suggesting goes against the Berne Convention for the Protection of Literary and Artistic Works. She also describes the Copyright Act as “a gold standard system that works”, so the idea of making an exception for AI is “insulting”.
Some large language model (LLM) AI tools have allegedly already been trained on large tranches of pirated books, yet tech companies sometimes argue that this does not amount to copyright infringement.
Solomon said: “Copyright law has always been there. I find it extraordinary that tech companies even try to suggest that [using pirated works to train LLMs] isn’t an infringement. We’ve asked and asked in government meetings with big tech for them to provide us with something which explains to us why they believe it isn’t copyright infringement. They have never done it. I’m a copyright specialist. If big tech could tell me why the copyright law doesn’t apply then we would respond, but they don’t even attempt to. Essentially the argument is: we’ve forced our way into your house. We’ve taken it. Stop moaning.”
Licensing work, as the government’s AI proposals suggest, would offer some potential remuneration to creators, and Solomon isn’t averse to licensing. But she says the opt-out being proposed is not good enough, because creators will find any infringements of their copyright hard to enforce.
“I don’t think there’s anything wrong in licensing models. In fact, I think licensing models are absolutely the way forward, but they must be freely entered into by creators who are prepared for their work to be used. One problem is that we can’t even know what [big tech] has used [to train LLMs already],” Solomon said.
“I think the government is suggesting opt-outs because it’s looking over at Europe. But it’s not at all clear what an opt-out means in the European Act, or how that’s going to work out. Also, if you look at European law, they have the Copyright Directive, which very unfortunately we missed entering into because it came exactly at the time of Brexit, although the government had, at that time, agreed that we would sign up to it. The Copyright Directive includes the transparency triangle, which is essential for creators. The transparency triangle insists that anyone using copyright work should make available how it’s being used and how much money they’ve made from it, and that the creator will be equitably remunerated for such uses and have the right for those rights to revert if the work isn’t being exploited. These are essential safeguards which, together with effective enforcement mechanisms and a ‘touch once only’ system for opting out, would be a minimum baseline before an opt-out could even be considered.”
When it comes to books and AI, Solomon says publishers should be leading the way in protecting creators and ensuring that they receive a fair share of any remuneration from uses of their work. Having been chief executive of the SoA for 13 years, Solomon sees parallels with the earlier fight over contractual rights to e-books. “As we all know, contracts are incredibly variable. My view is that publishers should start from the basis that they don’t own the rights [to license material for training AI],” she said. “When I started at the Society of Authors 14 years ago we were looking into this with e-books. We looked at the contracts, and in most contracts the rights granted didn’t include e-books. Publishers need to work with authors and their representatives to agree fair licensing models.”
A Department for Culture, Media and Sport spokesperson said: “Currently the application of copyright law to AI is disputed and uncertainty is holding back both the creative and AI sectors from realising their full potential. This status quo cannot continue which is why we are consulting on proposals for a way forward.
“We will not rush into decisions without being confident that we have a practical and effective plan that gives creators increased transparency and control over how their work is used by AI firms, and improves their ability to be paid for its use.”