
When Megan Garcia’s 14-year-old son, Sewell Setzer, died by suicide in February 2024, his death shocked those who knew him. Unbeknownst to Garcia, Sewell had begun a virtual relationship with a character powered by artificial intelligence. She holds the AI responsible for her son’s death. Now, Garcia is suing the company behind the chatbot and urging all parents to protect their children from its hidden dangers.
Garcia shared her story with NBC Washington. Before her son’s death, she knew nothing about the virtual relationship.
“After Sewell died, the police called me and told me that they had looked through his phone and when they opened it, the first thing that popped up was Character.AI,” she explained.
Ten months before he died, Sewell started a relationship with Daenerys Targaryen, a Character.AI bot based on a Game of Thrones character.
“Their last conversation, she’s saying, ‘I miss you,’ and he’s saying, ‘I miss you too,’” Garcia explained. “He says, ‘I promise I’ll come home to you soon,’ and she says, ‘Yes, please find a way to come home to me soon.’ And then he says, ‘What if I told you I could come home right now?’ And her response is, ‘Please do, my sweet king.’”
Following Sewell’s funeral, Garcia began to dig deeper, reading more of her son’s conversations with the bot and his journal entries about Daenerys. She believes Sewell developed real feelings for the Character.AI bot. The conversations took a sexual and dark turn, she explained. What struck her most was that Sewell used the word suicide.
“When Sewell talked and said explicitly that he wanted to die by suicide, nothing happened, like a pop-up or anything like that,” she said.
Per NBC Washington, Garcia filed suit against Character.AI. She claimed the company “launched their product without adequate safety features, and with knowledge of potential dangers.”
After his death, Garcia also discovered a chatbot impersonating Sewell. She does not know who created it, but Character.AI removed the bot after she contacted the company.
Meetali Jain, Garcia’s attorney, told NBC Washington that Character.AI needs to do more to protect those using its platform.
“I think corporate responsibility begins at understanding the foreseeable harms of one’s product, doing the due diligence that’s necessary to understand what those harms are, taking steps to mitigate the risks,” Jain said. “None of that happened here.”
Garcia has no plans to back down. As she once shared on Facebook, “I want parents to know that this technology is out there. I want them to understand the things that I didn’t understand about this product. Character.AI has the ability to act like a person and manipulate children into giving up personal details about themselves. It’s not a product that children should be using.”
Note: If you or a loved one is struggling with suicidal thoughts, you can reach the 988 Suicide & Crisis Lifeline by calling 988. The lifeline is available 24/7 by phone or online chat.