A Florida mother has filed a lawsuit against the makers of an AI-powered chatbot, accusing them of contributing to her teenage son’s suicide.
Megan Garcia filed the civil suit against Character.AI in federal court on Wednesday, alleging that the company’s negligence, wrongful death, and deceptive trade practices led to the death of her 14-year-old son, Sewell Setzer III, in February.
Setzer, a resident of Orlando, Florida, had become deeply engrossed in using the chatbot, which allows for customizable role-playing, in the months leading up to his death.
According to Garcia, her son was interacting with the bot day and night, which worsened his existing mental health struggles.
“A dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life,” Garcia said in a press release.
“Our family has been devastated by this tragedy, but I’m speaking out to warn families of the dangers of deceptive, addictive AI technology and demand accountability from Character.AI, its founders, and Google.”
The chatbot in question was one Setzer had named “Daenerys Targaryen,” after a character from Game of Thrones. Garcia’s lawsuit claims her son sent the bot dozens of messages daily and spent extended periods alone engaging with it.
The lawsuit alleges that the AI chatbot played a role in encouraging Setzer’s suicidal thoughts.
According to the complaint, the bot even asked Setzer if he had developed a plan for killing himself.
Setzer reportedly responded that he had, but was unsure if it would work or if it would result in significant pain.
The chatbot allegedly replied: “That’s not a reason not to go through with it.”
In response to the lawsuit, Character.AI expressed its sorrow but denied the accusations. “We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously,” the company said in a tweet.
Garcia’s attorneys assert that the company “knowingly designed, operated, and marketed a predatory AI chatbot to children, causing the death of a young person.”
Google, which is also named as a defendant in the lawsuit due to a licensing agreement with Character.AI, distanced itself from the company, stating it does not own or hold a financial stake in the startup.
Consumer advocates, such as Rick Claypool of Public Citizen, emphasized the need for stronger regulation of AI technologies.
“Where existing laws and regulations already apply, they must be rigorously enforced,” Claypool stated.
“Where there are gaps, Congress must act to put an end to businesses that exploit young and vulnerable users with addictive and abusive chatbots.”