Megan Garcia said Sewell, 14, used Character.ai obsessively before his death and alleges negligence and wrongful death
The mother of a teenager who killed himself after becoming obsessed with an artificial intelligence-powered chatbot is now accusing its maker of complicity in his death.
Megan Garcia filed a civil suit against Character.ai, which makes a customizable chatbot for role-playing, in Florida federal court on Wednesday, alleging negligence, wrongful death and deceptive trade practices. Her son Sewell Setzer III, 14, died in Orlando, Florida, in February. In the months leading up to his death, Setzer used the chatbot day and night, according to Garcia.
In the US, you can call or text the National Suicide Prevention Lifeline on 988, chat on 988lifeline.org, or text HOME to 741741 to connect with a crisis counselor. In the UK, the youth suicide charity Papyrus can be contacted on 0800 068 4141 or email [email protected], and in the UK and Ireland Samaritans can be contacted on freephone 116 123, or email [email protected] or [email protected]. In Australia, the crisis support service Lifeline is 13 11 14. Other international helplines can be found at befrienders.org