Character.AI: why has a lawsuit been filed - and what does it claim?



He chatted with Game of Thrones-inspired bots 😨
  • Grieving mother files lawsuit against Character.AI for “wrongful death”. 
  • Her 14-year-old son regularly interacted with the platform’s chatbots. 
  • She alleges the company did not have “adequate safety features”. 

A mother has filed a lawsuit against an AI-chatbot company following the death of her teenage son. The grieving parent has hit Character.AI, its founders and Google with a “wrongful death” claim.

Megan Garcia’s 14-year-old son Sewell Setzer III, from the US, died in February and she says he regularly interacted with Character.AI’s chatbots. In her lawsuit, she alleges the company did not have “adequate safety features”. 


Character.AI launched in November 2021, and its founders are both former Google employees who left to start the chatbot company. However, Google hired the company’s leadership team back in August of this year, according to The Verge, which is why Google has been named in the lawsuit. 

Sewell Setzer III. Photo: US District Court

Google didn’t respond to a request for comment on the lawsuit when approached by The Verge. The BBC, in a report from earlier this year, stated that the majority of Character.AI’s users are young people aged between 16 and 30. 

Mum files lawsuit against AI company after son’s death 

Sewell Setzer III was a 14-year-old boy who took his own life in February of this year, The Verge reports. His mother, in her lawsuit, says that he began interacting with chatbots modelled after characters from the hit TV series Game of Thrones, including one based on Daenerys Targaryen. 

She said he chatted with them continuously in the months before his death, and alleges that he interacted with the chatbot “seconds” before he died by suicide in February, according to The Verge. 


In the lawsuit for “wrongful death”, which was filed on October 22, Sewell’s mother is demanding a civil trial by jury. The filing includes allegations that the platform’s chatbots offered “psychotherapy without a licence”. 

His mother claims he interacted with mental-health-focused chatbots like “Therapist” and “Are You Lonely”, The Verge reports. 

What has Character.AI said? 

In a statement shared on X (formerly Twitter), Character.AI’s official account posted: “We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously and we are continuing to add new safety features.” 

The company also issued an update on its community safety measures on October 22 (the same day the lawsuit was filed) and detailed how it was introducing “new guardrails for users under the age of 18”. 


Character.AI explained: “Our goal is to offer the fun and engaging experience our users have come to expect while enabling the safe exploration of the topics our users want to discuss with Characters. Our policies do not allow non-consensual sexual content, graphic or specific descriptions of sexual acts, or promotion or depiction of self-harm or suicide.” 

The company has also promised new safety measures, including: 

  • Changes to our models for minors (under the age of 18) that are designed to reduce the likelihood of encountering sensitive or suggestive content.
  • Improved detection, response, and intervention related to user inputs that violate our Terms or Community Guidelines. 
  • A revised disclaimer on every chat to remind users that the AI is not a real person.
  • Notification when a user has spent an hour-long session on the platform, with additional user flexibility in progress.

Character.AI added: “We proactively, and in response to user reports, remove Characters that violate our Terms of Service. We also adhere to the DMCA requirements and take swift action to remove reported Characters that violate copyright law or our policies.”

The Samaritans can offer information and support to anyone affected by the content of this article. In the UK, you can call their helpline on 116 123 or email jo@samaritans.org.

The YouTube channel Code.org has a really good seven-minute video explaining how AI chatbots and large language models work.
