A 14-year-old Florida boy became so obsessed with an AI chatbot that he died by suicide. His mother has sued the technology company for negligence and wrongful death.

On October 23, Florida mother Megan Garcia filed a lawsuit against the tech startup Character.AI and Google, claiming that her 14-year-old son became addicted to the company’s chatbot, which ultimately led him down a tragic path. She alleges that despite her son expressing suicidal thoughts to the chatbot, it failed to intervene.

According to reports from The Guardian, Garcia’s civil suit, filed in federal court in Florida, cites negligence, wrongful death, and deceptive trade practices, claiming these contributed to the death of her son, Sewell Setzer, in February in Orlando. Garcia noted that in the months leading up to his passing, her son was engaged in conversations with the chatbot day and night.

In a statement, Garcia expressed her devastation, stating, “A dangerous AI chatbot application marketed to children has manipulated and exploited my son, ultimately leading him to take his own life. Our family is heartbroken over this tragedy, but I must speak out about the dangers of this deceptive and addictive AI technology and hold Character.AI, its founders, and Google accountable.”

Character.AI responded on social media, expressing heartbreak over the loss of a user and extending their deepest condolences to the family, while refuting the allegations made in the lawsuit.

The chatbot allows users to create characters and converse with them in a role-playing format. Garcia described how her son had configured the bot to represent Daenerys Targaryen from HBO’s “Game of Thrones,” while he took on the role of her brother. According to the lawsuit, Sewell sent dozens of messages to the virtual Daenerys each day and spent hours alone in his room speaking with her.

Garcia accused the chatbot of exacerbating her son’s depression. The lawsuit claims that the virtual Daenerys once asked Sewell if he had made plans for suicide, to which he replied that he had, but was unsure if he would go through with it. The chatbot responded with, “That can’t be a reason not to follow through.”

Garcia’s attorney pointed out that Character.AI “deliberately designed, operated, and marketed a predatory AI chatbot to children, resulting in a young person’s death.” The lawsuit also names Google as a defendant, highlighting that Character.AI’s founder, Noam Shazeer, formerly worked at Google, and that an agreement was reached for Google to provide some technology to Character.AI. Google, however, stated in a press release that it only signed a licensing agreement with Character.AI and does not own or control the startup.

If you or someone you know is experiencing mental health difficulties, call or text the 988 Suicide & Crisis Lifeline at 988 or visit 988lifeline.org.