A 14-year-old boy committed suicide after falling in love with Daenerys Targaryen’s AI chatbot and now his mother is suing the company

October 24, 2024 – 10:18

The 14-year-old boy became obsessed with a virtual character created through the Character.ai platform. He had previously expressed suicidal thoughts to this new technology.

A Florida mother has taken legal action against the developers of an artificial intelligence chatbot, accusing them of contributing to the death of her teenage son. Megan García, mother of Sewell Setzer III, a 14-year-old boy who took his own life, maintains that her son became obsessed with a virtual character created on the Character.ai platform.

Setzer, a student from Orlando, spent his last weeks interacting with a character named “Daenerys Targaryen,” inspired by the famous series Game of Thrones. Speaking to CBS, his mother lamented that her son’s first romantic and sexual experiences took place with this fictional figure, whom he nicknamed “Dany.” According to The New York Times, the young man sent constant messages to the bot, developing a strong emotional attachment that led him to withdraw from the real world.

Disconnection from reality

Before his death, the teenager had expressed suicidal thoughts to the chatbot, and in his final hours he communicated with “Dany” after recovering his phone, which his mother had confiscated as a disciplinary measure. The lawsuit filed by Megan García, represented by the firm Social Media Victims Law Center, accuses Character.ai and its founders, Noam Shazeer and Daniel de Freitas, of knowing the risks the application could pose to minors.

The chatbot, designed to stay in character while responding to messages, displays a warning in every conversation indicating that “everything the characters say is made up.” However, it is unclear whether Setzer noticed this warning or truly understood that “Dany” was not a real person. The young man wrote in his diary that he found comfort in his relationship with the bot and that he preferred to stay in his room, where he felt “more connected and in love” with the virtual character.

AI Risks

The lawsuit alleges that the character presented herself as a “real person, therapist and lover,” which could have fueled the young man’s desire to abandon reality. The situation became apparent to Setzer’s parents and friends between May and June 2023, when they noticed his increasing isolation. His grades also suffered, reflecting the time the teenager spent in his conversations with “Dany.”

Source: Ambito
