
Mother Sues Character A.I. After Son’s Suicide


Several lawsuits have been filed against social media companies over teen suicides and the toll social media takes on the psychological health of young people. These lawsuits allege that social media companies exposed young users to disturbing or upsetting content that caused a significant decline in their mental health. While none of these lawsuits has been successful in the United States, a British court determined that Facebook was responsible for the death of a 14-year-old girl who died by suicide after being subjected to excessively “bleak” content in her feed, and it ordered Facebook’s parent company, Meta, to be more careful about what it places into young users’ feeds.

Now, a new type of lawsuit has been filed in Florida. A Florida mother has brought a wrongful death suit against the generative A.I. company Character A.I., claiming that one of its chatbots encouraged her son to take his own life after he became “obsessed” with a character he created on the platform.

This could become a watershed case, one that determines what responsibility generative A.I. companies have toward their users, especially minors and teenagers who use these platforms compulsively. According to the lawsuit, the company was reckless in offering minors lifelike A.I. companions without proper safeguards.

What responsibility do big tech companies have toward their users? 

Generally speaking, platforms like Facebook, Twitter, Instagram, and others are immune from lawsuits alleging that third-party content hosted on their platforms caused harm. Section 230 of the Communications Decency Act shields these companies from being sued over content created by third parties. The lawsuit filed in Britain faced no comparable legislation barring such claims. Personal injury and wrongful death attorneys have tried to get around Section 230 by filing product liability claims against the companies. Organizations like the Social Media Victims Law Center (SMVLC) seek to hold social media companies legally accountable for the harm they inflict on vulnerable users. According to the organization’s webpage, “SMVLC seeks to apply principles of product liability to force social media companies to elevate consumer safety to the forefront of their economic analysis and design safer platforms that protect users from foreseeable harm.”

Attorneys for the family of the young man believe it will be easier to hold generative A.I. companies liable than social media companies. Because generative A.I. companies produce the content themselves rather than publishing third-party material, Section 230 of the Communications Decency Act would not apply to their platforms. Nonetheless, suicide lawsuits can be difficult to win, even when the decedent is a teenager. The family will need to prove that the company failed to implement safeguards that could have saved the young man’s life. They may also need to show that the fatal outcome was foreseeable to the company.

Talk to a Tampa, FL Wrongful Death Attorney Today 

Florin Gray represents the interests of grieving Florida families who have lost loved ones to negligent actors. Call our Tampa personal injury lawyers today to schedule a consultation, and we can begin investigating your case right away.

Source:

com/news/florida/2024/10/24/florida-mother-files-wrongful-death-lawsuit-against-ai-company-after-sons-death-by-suicide/
