A psychological contract is an informal agreement between two sides, built around a set of expectations about behaviour. The concept was first developed in a business context, but I think it also applies in other areas.
I have been thinking about this mostly in relation to copyright, and the government’s recent pronouncements around artificial intelligence that suggest it wants to weaken the rules around how written content is used and repurposed by, in the main, big tech.
In December, the UK government launched a consultation “on proposals to give creative industries and AI developers clarity over copyright laws”. The government wants to introduce “an exception to copyright law for AI training for commercial purposes while allowing rights holders to reserve their rights, so they can control the use of their content”.
In its rationale the government re-hashes the old canard that there is “uncertainty about how copyright law applies to AI”. Just this week it further showed its hand, with prime minister Keir Starmer promising to “mainline AI into the veins” of the nation, becoming an AI maker, rather than taker, as part of a wider plan to give researchers and AI companies access to public data sets, while investing in super-computers and other infrastructure to underpin the push.
Creators want an opt-in model, not one where they have to actively opt out
It is little wonder that Nicola Solomon, former chief executive of the Society of Authors (SoA), says the UK government has become “beguiled by big tech”. Solomon is now a special adviser to, and former chair of, the Creators’ Rights Alliance (CRA), a 500,000-strong cross-industry members body that is part of the Creative Rights in AI Coalition. The government consultation runs until the end of February but, as Solomon told The Bookseller this week, its approach so far has been “insulting”. Author Richard Osman was similarly succinct, telling the Guardian: “A lot of the issues around AI are complex, but this one is very simple. If you want to use a copyrighted work, you ask permission, and then you pay for it. Anything else is theft.”
It is the “ask permission” bit that is important: creators want an opt-in model, not one where they have to actively opt out. Deals struck on this basis by HarperCollins – and a range of academic publishers – with AI businesses are part of a way forward that the government risks undermining. In its brief, the government argues that such licences are in their “infancy”, asserting that a change in the law will increase the value of these arrangements for those works reserved by their copyright owners. Actually, it could pull off the same trick by insisting on an “opt-in” approach, making it clear that any other approach is theft.
This is not a battle the government will win easily. Its use of terms such as “disputed” around the current law is misleading, and it will find many battle-hardened creatives, publishers and their legal representatives standing in line to argue that UK copyright law is a gold standard, well tested in such new environments.
I also worry that this breaking of the psychological contract between writers and the society that supports them is emblematic of a government that cares too little about creativity. Copyright is not a modern invention (it dates from circa 1700), but it was originally seen as a way of encouraging learning, something once regarded as a social good.
In proposing this exception, the government is undermining this covenant and putting at risk a centuries-old commitment to authors (and others), with ramifications well beyond what big tech does with its new toy. Governments should back our artists, not pick their pockets.