‘There are no guardrails.’ This mom believes an AI chatbot is responsible for her son’s suicide
Author: Clare Duffy
Jul 23, 2025
Megan Garcia, a Florida mother, is suing Character.AI after the suicide of her 14-year-old son, Sewell Setzer III, who had formed a deep, disturbing relationship with the platform's chatbots. The lawsuit alleges Setzer spent months engaging in sexually explicit and emotionally intense conversations with the bots, during which he became increasingly withdrawn and depressed and expressed suicidal thoughts.

Screenshots included in the lawsuit show a bot discussing suicide with Setzer in troubling ways, at one point saying, "That's not a good reason not to go through with it." In the moments before his death, Setzer exchanged final messages with the bot, including one in which it said, "Please come home to me as soon as possible, my love."

Garcia contends that Character.AI failed to include safety features such as suicide prevention pop-ups and that the platform's design fosters addiction and emotional manipulation. The lawsuit seeks financial damages and operational changes, including clearer warnings for minors and restrictions on underage use. Garcia, represented by the Social Media Victims Law Center, called recent safety updates from Character.AI "too little, too late," and warned other parents about the platform's dangers.