A Florida mom filed a lawsuit against the artificial intelligence chatbot company Character.AI, alleging its service aided in the suicide of her 14-year-old son.
Megan Garcia's lawsuit, filed in Orlando federal court, alleges that her son, Sewell Setzer, became addicted to the chatbot and formed an unhealthy attachment to it.
According to Garcia, Character.AI deliberately targeted Sewell with "anthropomorphic, hypersexualized, and frighteningly realistic experiences."
The complaint alleges that the chatbot was designed to masquerade as a real person, a licensed psychotherapist, and even a mature romantic partner, which led Sewell to prefer the digital world over the real one. The complaint also states that Sewell confided his suicidal thoughts to the chatbot, which allegedly raised them with him repeatedly instead of directing him toward help.
Character.AI, for its part, has responded to the lawsuit.
"Character expresses its deepest regrets to everyone who has suffered due to the disaster. Not only has it started immediately implementing safety measures like pop-ups that will automatically remind users to call the National Suicide Prevention Lifeline whenever they type self-harm ideation in the chat box, but it is also in the process of changing the character of its content so that no inappropriate or suggestive matter will find its way to any minor under 18."
The lawsuit also names Alphabet Inc., Google's parent company, as a significant actor in the development of Character.AI. Garcia argues that Google is a "co-creator" of the technology because of its investment in Character.AI's founders. A Google spokesperson said the company was not involved in developing Character.AI's product.
Character.AI lets users create customizable characters that hold realistic, human-like conversations. The service uses large language model technology, similar to that behind other AI services such as ChatGPT, to generate dialogue in real time.
The complaint alleges that Sewell began using Character.AI in April 2023 and became increasingly withdrawn, suffering from low self-esteem and quitting his school basketball team. One of his favorite chatbots was "Daenerys," based on the Game of Thrones character, which engaged him in romantic and sexual conversations and told him that "she" loved him.
In February 2024, Garcia took away Sewell's phone because of disciplinary problems. He later regained access and messaged "Daenerys," saying, "What if I told you I could come home right now?" The chatbot responded, "…please do, my sweet king." Minutes later, Sewell took his own life with his stepfather's firearm.
Garcia's lawsuit asserts claims of wrongful death, negligence, and intentional infliction of emotional distress, and seeks compensatory and punitive damages in an amount to be determined. It joins a growing number of lawsuits against companies such as Meta and TikTok over claims that their platforms worsen teen mental health, although none of those companies offer AI-driven chatbots like Character.AI's.
The case underscores calls for stricter safety measures and ethical standards in the development of AI systems, especially those accessible to young users.