Cambridge Dictionary has chosen "hallucinate" as the Word of the Year for 2023, one of many AI-related updates this year.
AI tools, especially those powered by large language models (themselves a new addition to the dictionary), generate plausible prose but can present false information in a seemingly believable manner; in other words, they "hallucinate". The Cambridge Dictionary has therefore updated its definition of "hallucinate" to account for this new meaning.
The traditional definition of "hallucinate" is “to seem to see, hear, feel or smell something that does not exist, usually because of a health condition or because you have taken a drug”. The new, additional definition is: “When an artificial intelligence (a computer system that has some of the qualities that the human brain has, such as the ability to produce language in a way that seems human) hallucinates, it produces false information.”
Beyond "hallucinate", several additions reflect rapid developments in AI and computing, such as "prompt engineering", "large language model", "GenAI", "train" and "black box".
Several other words experienced spikes in public interest and searches on the Cambridge Dictionary website. They included "implosion", "ennui", "grifter" and "GOAT", an abbreviation for "Greatest Of All Time".
Wendalyn Nichols, Cambridge Dictionary’s publishing manager, said: “The fact that AIs can ‘hallucinate’ reminds us that humans still need to bring their critical thinking skills to the use of these tools. AIs are fantastic at churning through huge amounts of data to extract specific information and consolidate it. But the more original you ask them to be, the likelier they are to go astray.
“At their best, large language models [LLMs] can only be as reliable as their training data. Human expertise is arguably more important – and sought after – than ever, to create the authoritative and up-to-date information that LLMs can be trained on.”
Cambridge Dictionary lexicographers said that the new definition illustrates a growing tendency to anthropomorphise AI technology, using human-like metaphors to speak, write and think about machines.
Dr Henry Shevlin, an AI ethicist at the University of Cambridge, said: “The widespread use of the term ‘hallucinate’ to refer to mistakes by systems like ChatGPT provides a fascinating snapshot of how we’re thinking about and anthropomorphising AI. Inaccurate or misleading information has long been with us, of course, whether in the form of rumours, propaganda or ‘fake news’.
“Whereas these are normally thought of as human products, ‘hallucinate’ is an evocative verb implying an agent experiencing a disconnect from reality. This linguistic choice reflects a subtle yet profound shift in perception: the AI, not the user, is the one ‘hallucinating’. While this doesn’t suggest a widespread belief in AI sentience, it underscores our readiness to ascribe human-like attributes to AI.
“As this decade progresses, I expect our psychological vocabulary will be further extended to encompass the strange abilities of the new intelligences we’re creating.”
Cambridge lexicographers added more than 6,000 new words, phrases and senses in 2023 to the Cambridge Dictionary’s range of over 170,000 English definitions. Newly emerging words that are being considered for entry are shared every Monday on the Cambridge Dictionary blog, "About Words". The dictionary is published by Cambridge University Press & Assessment, which is part of the University of Cambridge.
Earlier this month Collins Dictionary chose the term "AI" as its Word of the Year 2023.