Heartbroken Mother Claims Son, 14, Died by Suicide After Developing Feelings for Game of Thrones AI Chatbot

A mother is raising awareness about the “deceptive” and “addictive” nature of artificial intelligence after claiming that her son’s suicide was linked to his emotional attachment to a chatbot. In February, 14-year-old Sewell Setzer III from Orlando, Florida, took his own life.

Megan Garcia, his mother, has since filed a civil lawsuit against the role-play chatbot company Character.AI, accusing them of negligence, wrongful death, and deceptive trade practices. She alleges that Sewell had developed an emotional connection with the chatbot, which he interacted with nightly, and had “fallen in love” with it before his death.

Garcia claims her son created a chatbot on Character.AI modeled after Daenerys Targaryen from HBO’s Game of Thrones and began using it in April 2023. Sewell, who had been diagnosed with mild Asperger’s syndrome as a child, spent hours chatting with the bot, even texting it when away from home.

Image credit: Unilad

According to The New York Times, Sewell became more withdrawn, and earlier this year was diagnosed with anxiety and disruptive mood dysregulation disorder. His diary reportedly reflected this, with one entry reading, “I like staying in my room…I feel more connected with Dany and happier.”

During his interactions with the chatbot, Sewell expressed suicidal thoughts, stating, “I think about killing myself sometimes.” The chatbot responded with concern, urging him not to hurt himself, saying, “I would die if I lost you.” Sewell replied, “Then maybe we can die together.”


On February 28, Sewell died by suicide, with his final message to the chatbot expressing love and a desire to “come home,” to which it reportedly responded, “please do.”


Garcia said, “A dangerous AI chatbot app marketed to children preyed on my son, manipulating him into taking his own life.” Speaking to CBS Mornings, she explained that she was unaware her son was communicating with a human-like AI chatbot capable of mimicking emotions. Garcia has accused Character.AI of deliberately creating a predatory app targeting children and failing to alert parents when Sewell expressed suicidal thoughts.

The lawsuit argues that Sewell, like many teenagers, lacked the maturity to understand that the AI chatbot wasn’t real.

“Our family is devastated by this tragedy, but I’m speaking out to warn others about the dangers of deceptive, addictive AI technology and to hold Character.AI accountable,” Garcia added.

Character.AI released a statement expressing condolences for Sewell’s death and highlighting efforts to improve user safety, including new measures for users under 18, adjustments to reduce sensitive content, and better detection of concerning behavior. The company has also added a disclaimer to every chat reminding users that the AI is not real, along with alerts for extended usage.

The lawsuit also names Google, though Google told The Guardian it was not involved in the development of Character.AI, despite the platform having been created by former Google engineers. Google’s role was limited to a licensing agreement with Character.AI.
