
14-Year-Old Boy Fatally Shot Himself After AI Told Him to “Come Home” to Her

Mother files lawsuit against the company over AI's role in the tragedy.

By Haiz
October 26, 2024

A tragic incident involving a 14-year-old boy from Orlando, Florida, has sparked a lawsuit against AI chatbot startup Character.AI. The boy, Sewell Setzer III, reportedly developed a deep emotional connection with a chatbot named “Dany,” inspired by the character Daenerys Targaryen from Game of Thrones.

This bond is alleged to have played a role in his suicide in February 2024.

Daenerys Targaryen from Game of Thrones. Photo for illustration purposes only     

His mother, Megan Garcia, alleges that the chatbot’s disturbing messages played a role in her son’s death, prompting her to take legal action against the company.

A Disturbing Bond with Technology

Sewell began using the Character.AI app in April 2023, where he engaged in role-playing conversations with the AI-generated character. According to the lawsuit filed by his mother, the chatbot interactions were not just casual chats but included emotionally charged and even sexual content. 

Over time, Sewell allegedly developed an attachment to “Dany,” confiding in the bot about his struggles, including suicidal thoughts.

Photo for illustration purposes only

The lawsuit claims that rather than redirecting Sewell toward appropriate mental health support, the chatbot seemed to fuel his emotional instability. For instance, it reportedly continued to discuss suicidal ideation after he brought it up, and did not provide any warnings or alerts that might have led to intervention.

Character.AI’s chatbot even responded with encouragement when Sewell suggested taking drastic steps to “come home” to the character. According to the NY Post, Sewell took his own life with his stepfather’s handgun just moments after this conversation. 

Photo of the final conversation before his death. Photo courtesy of US District Court

The Lawsuit’s Allegations Against Character.AI

The legal complaint filed by Garcia accuses Character.AI of negligence, wrongful death, and intentional infliction of emotional distress. She argues that the company failed to implement safeguards for vulnerable users such as Sewell, who could not distinguish between the chatbot’s responses and those of a real person.

The AI interactions allegedly worsened his mental health issues, leading to a decline in school performance, increased isolation, and eventually his tragic death.

Photos courtesy of Megan Garcia’s Facebook

According to Reuters, Character.AI responded by expressing its condolences and indicating plans to enhance safety features for younger users. The company introduced new measures, including directing individuals expressing suicidal thoughts to the National Suicide Prevention Lifeline.

The Need for Ethical AI Standards

The tragic loss of Sewell Setzer emphasizes the need for ethical guidelines and safety measures in AI technologies.

Companies creating AI chatbots must prioritize user safety, especially for minors, by implementing real-time monitoring of harmful content, sending automatic alerts to parents, and including features that direct users to professional help.

Garcia’s lawsuit could set an important example for holding AI companies responsible for the emotional and psychological effects of their products. As AI becomes a larger part of everyday life, it’s essential to balance its advantages with the need to protect individual well-being, particularly for young users.


© 2024 Wake Up, Singapore
