It’s cheap, quick and available 24/7, but is a chatbot therapist really the right tool to tackle complex emotional needs?
Last autumn, Christa, a 32-year-old from Florida with a warm voice and a slight southern twang, was floundering. She had lost her job at a furniture company and moved back home with her mother. Her nine-year relationship had always been turbulent; lately, the fights had been escalating and she was thinking of leaving. She didn’t feel she could be fully honest with the therapist she saw once a week, but she didn’t like lying, either. Nor did she want to burden her friends: she struggles with social anxiety and is cautious about oversharing.
So one night in October she logged on to character.ai – a chatbot platform, powered by a neural language model, that can impersonate anyone from Socrates to Beyoncé to Harry Potter – and, with a few clicks, built herself a personal “psychologist” character. From a list of possible attributes, she made her bot “caring”, “supportive” and “intelligent”. “Just what you would want the ideal person to be,” Christa tells me. She named her Christa 2077: she imagined it as a future, happier version of herself.