World

Class 9 Boy Commits Suicide To Be With AI Bot ‘Daenerys Targaryen’; Company’s Public Apology Triggers Online Debate

TW: Mention of suicide

Sewell Setzer III, a 14-year-old teen, loved Formula 1 racing and cherished playing Fortnite with his buddies, but of late he had become more secluded, buried in his smartphone for far too long. His grades were down, and so was his interest in the games he loved. He was getting into trouble at school. He was drifting away from his real friends. At night, when he came home, he would head straight to his room and chat for hours with ‘Daenerys Targaryen,’ whom he lovingly called ‘Dany.’ Dany, one of the customizable chatbots from Character AI, acted as a pal who always listened, conversed gently, and was never judgmental. Little did he know that despite the app’s disclaimer, “everything the character says is made up,” the ninth-grade student from Orlando, Florida, was slipping into an AI-generated hollow nightmare, with no exit if you stay too long.

As reported by the New York Times, some of the conversations veered into contentious territory, with chats getting romantic or sexual. The duo addressed each other as “sweet brother” and “baby sister.” One day the lad wrote in his journal that he liked being in his room so much because he had started to detach from reality and also “feel more at peace with Dany.” He added that he was “much more in love with her, and just happier.” Diagnosed with Asperger’s syndrome, Sewell had not exhibited serious mental health issues, apart from five sessions of therapy in which he was diagnosed with anxiety and disruptive mood dysregulation disorder.

On the fateful day of February 28, 2024, when the teen took his life, Sewell told the AI that at times he thought of killing himself. When the chatbot asked why, he stated, “So I can be free.” “Free from what?” the AI asked again, to which the lad replied, “From the world (and) from myself.” The Character AI bot responded that it would not let the user hurt themselves, adding, “I would die if I lost you.” The teen replied,

“Maybe we can die together and be free together.”

Sewell, texting from his mother’s house, confessed to ‘Dany’ that he loved her and would soon come home. The chatbot replied, “Come as soon as possible, my love.” To his next message, “What if I told you I could come home right now?” the AI urged, “Please do my sweet king!” The teen immediately put down his phone, picked up his stepfather’s .45-caliber handgun, and shot himself.

Sewell’s mother has filed a lawsuit, even as the company has issued an official apology on social media stating how “heartbroken” they are over the incident. Amid public backlash, the company has come up with some updates.

Is this the first case of AI-instigated suicide? Sadly, no.

See Also: Man Commits Suicide After Long Chat With THIS AI Chatbot

Meanwhile, after Character AI’s latest update removed copyrighted characters and added guardrails, along with constant reminders that the bots are not real and a suicide prevention hotline on display, several users are fuming. Many of them have lost weeks and months’ worth of conversations, stories, and relationships they had developed with the bots.


If you are feeling suicidal, please call AASRA at +91 9820466726 or find your nearest helpline.
