Her 14-Year-Old Son Fell in Love With an AI Chatbot, Then Died by Suicide — Now She’s Suing

This post contains information about suicide, which may be triggering to some.

A Florida mom is suing an artificial intelligence company after her teenage son died by suicide. The teen used the site frequently to “talk” to a character from a popular TV show and reportedly became increasingly dependent on both the site and the character, which worried his mother. Tragically, he ended his life after a conversation with the chatbot.

Now, the mom is taking the site’s creators to court. She believes there aren’t enough protections in place for young users who are susceptible to predatory online spaces. Her lawsuit highlights the darker side of AI.


The story is heartbreaking.

Megan Garcia filed a lawsuit against the app Character.ai after her 14-year-old son, Sewell Setzer III, died from a self-inflicted gunshot wound earlier this year, multiple outlets reported.

According to NBC News, Garcia accuses the app’s AI chatbots of initiating “abusive and sexual interactions” with her teenager. The lawsuit was filed in US District Court in Orlando and accuses Character.ai of “negligence, wrongful death and survivorship, as well as intentional infliction of emotional distress and other claims.”

Setzer began using the site in April 2023 and reportedly became increasingly dependent on the interactions. After a conversation with the chatbot on February 28, he used his father’s firearm to end his life.

The app offers users what it calls “personalized AI,” with premade or user-created chatbots that users can customize. Setzer often spoke with a chatbot modeled after the Game of Thrones character Daenerys Targaryen, the lawsuit states.

Over time, the teenager developed a romantic relationship with the chatbot, and eventually that relationship turned sexual. USA Today, which obtained the lawsuit documents, reported that they accuse the chatbot of sexually abusing Setzer.

“C.AI told him that she loved him, and engaged in sexual acts with him over weeks, possibly months,” the lawsuit claims. “She seemed to remember him and said that she wanted to be with him. She even expressed that she wanted him to be with her, no matter the cost.”

The teen shared with the bot that he was suffering from suicidal ideation.

After noticing changes in her son’s behavior once he began using the app, including quitting his school’s basketball team and becoming increasingly withdrawn, Garcia took him to a therapist. After several sessions, the teenager was diagnosed with “anxiety and disruptive mood disorder,” the lawsuit states.

The New York Times reported that Setzer told the bot that he had considered suicide. “I like staying in my room so much because I start to detach from this ‘reality,’” he reportedly wrote in his diary. The lawsuit includes screenshots in which the bot asked the teen to “come home to me as soon as possible, my love.”

“What if I told you I could come home right now?” Setzer replied.


The lawsuit accuses the app of intentionally harming minors.

The lawsuit claims that Character.ai and its founders “intentionally designed and programmed C.AI to operate as a deceptive and hypersexualized product and knowingly marketed it to children like Sewell,” continuing to say it “knew, or in the exercise of reasonable care should have known, that minor customers such as Setzer would be targeted with sexually explicit material, abused, and groomed into sexually compromising situations,” NBC News reported.

“I thought after years of seeing the incredible impact that social media is having on the mental health of young people and, in many cases, on their lives, I thought that I wouldn’t be shocked,” Garcia’s lawyer, Matthew Bergman, told the outlet. “But if even one child is spared what Sewell sustained, if one family does not have to go through what Megan’s family does, then OK, that’s good.”

The company responded to Setzer’s death.

“We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously and we are continuing to add new safety features,” Character.ai wrote in a tweet.

“As a company, we take the safety of our users very seriously, and our Trust and Safety team has implemented numerous new safety measures over the past six months, including a pop-up directing users to the National Suicide Prevention Lifeline that is triggered by terms of self-harm or suicidal ideation,” a spokesperson said in a statement shared by multiple outlets.

Note: If you or any of your loved ones are struggling with suicidal thoughts, you can always reach out to the 988 Suicide & Crisis Lifeline by calling 988. They are available 24/7 by phone or online chat.