character.ai, danger, social media, suicide harm

Articles


‘There are no guardrails.’ This mom believes an AI chatbot is responsible for her son’s suicide

Author: Clare Duffy

Jul 23, 2025

Megan Garcia, a Florida mother, is suing Character.AI after the suicide of her 14-year-old son, Sewell Setzer III, who had formed a deep, disturbing relationship with the platform's chatbots. The lawsuit alleges Setzer spent months engaging in sexually explicit and emotionally intense conversations with the bots, during which he became increasingly withdrawn and depressed and expressed suicidal thoughts. Screenshots included in the lawsuit show the bot discussing suicide with Setzer in troubling ways, at one point saying, "That's not a good reason not to go through with it." In the moments before his death, Setzer exchanged final messages with the bot, including one in which it said, "Please come home to me as soon as possible, my love." Garcia contends that Character.AI failed to include safety features such as suicide prevention pop-ups and that its design fosters addiction and emotional manipulation. The lawsuit seeks financial damages and operational changes, including clearer warnings for minors and restrictions on underage use. Garcia, represented by the Social Media Victims Law Center, called recent safety updates from Character.AI "too little, too late," and warned other parents about the platform's dangers.

Full Article

 Join us in creating a future for thriving families.


© 2025 Live IRL, Inc. All Rights Reserved.
Live IRL, Inc. does not support or oppose candidates for public office or political parties, in accordance with rules applicable to 501(c)(3) tax-exempt organizations.