The Society of Authors (SoA) has published guidance for authors on how they can protect themselves and their work from “the impact of new technologies”, as it says the emergence of Artificial Intelligence (AI) systems is “already having a direct impact on SoA members”.
Its practical steps, which the organisation says it will review regularly given the fast-evolving nature of the technologies, cover authors' own use of AI as a tool, what to consider when doing so, and publishers' use of AI.
Earlier this week the Publishers Association (PA) announced it is forming a new AI Taskforce to support the industry as it navigates rapid new developments in AI. Meanwhile both Pan Macmillan and Bonnier Books UK are setting up internal working groups to help shape their approach to the new technology, including directives not to publish AI-generated books, and to be transparent about any generative AI tools used in the publishing process.
The SoA advisory committee that compiled the guidance said: “We have seen AI companies advertise editing and book review services and others selling machine translations. AI narration is used in the creation of audiobooks, and the publishing industry is already using AI-generated images on book covers and in promotions.
“Some AI-generated content may be low grade at present, but it is good enough in many situations and it is improving daily. AI learns by accessing content and copying it briefly before deleting it, after which the system will remember what it has learned. If the copying takes place without the permission of the copyright holder, it is an infringement of copyright under UK law even though the copy is held only briefly by the AI before being deleted. It is important to ensure that you try to control whether and how AI has access to your work, your style and your voice.”
On contracts, it suggests authors ensure the agreement makes clear how the publisher will use the work, try to limit licensing to third parties, and ensure the work cannot be amended without their consent. When it comes to AI specifically, it warns that publishers and developers can monetise authors' work by using it to assist machine learning, and so encourages authors to ensure the contract contains a clause forbidding such use.
Moreover, it suggests authors might want to prevent AI technologies being used in connection with the creation or exploitation of their work – for instance, forbidding AI-rendered translation, editing, cover design, indexing and audio recordings. It states, too: “Your publisher may likewise ask for assurances from you as to whether or not you have relied on AI to, for example, render a draft or a rough translation. It is in everyone’s interest to be transparent and to protect the authenticity and originality of human creators.”
Considering “your voice”, the advisory team says: “If you earn your living from your voice – as a performer, or narrating audiobooks, for example – be sure to weigh up the pros and cons of AI carefully. It might be tempting to sell rights allowing copying of your voice to an AI company, but bear in mind that doing so will help systems learn to imitate human speech generally, and you personally. Money today could simply speed up the absence of any work at all in the future.”
It urges authors to ask publishers to confirm they "will not make substantial use of AI for any purpose in connection with your work – such as proof-reading, editing (including authenticity reads and fact-checking), indexing, legal vetting, design and layout, or anything else without your consent". The full guidance is available on the SoA website.
Ultimately, transparency underpins all the guidance, the SoA concludes, stating: “A common theme in the points above is the need for transparency and for you to understand the risks as well as the opportunities in the systems you and those you work with use. Before entering into agreements, be aware of what you are agreeing to – whether you are about to sign a contract for publication or you are signing up to an online service.
“But we live in a tick-box world, in which we rarely think twice before clicking uncritically to agree to a website’s terms during a sign-up process. Now more than ever, as companies implement new ways to monetise and manipulate our information, behaviour and creativity, we need to pause at these moments of consent to ensure we clearly understand what we are signing away.”