AI has given us “hallucinate” as word of the year. We should quarrel with this humanising definition while recognising that it evokes unprecedented times
When the Cambridge dictionary announced “hallucinate” as its word of the year this week, it was not referring to its existing definition as a human condition of “seeing, hearing, feeling or smelling something that does not exist”, but to the phenomenon of AI developing the capacity to make things up – or fake them. This is itself a somewhat hallucinatory concept, as Naomi Klein has pointed out. “Why call the errors ‘hallucinations’ at all? Why not algorithmic junk? Or glitches?” she asked. By appropriating the language of psychology, psychedelics and mysticism, she argued, the architects of generative AI had declared themselves midwives at the birth of an animate intelligence that they wanted us to believe would be an evolutionary leap for humanity.
The word of the year is a strange fixture – a parlour game crossed with a marketing opportunity that is enthusiastically played by lexicographers around the world. Anyone who remembers the Oxford dictionary’s choice for 2022 will know how outlandish the offspring can be: invited to make their own choice, 318,956 people – 93% of the overall vote – opted for “goblin mode”. Though this term (basically, slobbing out) has been around for more than a decade, its first appearance in a British newspaper, according to the research engine Factiva, was in the Observer in February last year.