
Suing ChatGPT For Malpractice Or Wrongful Death

We are experienced attorneys who will give you a free case review and honest, direct advice. Contact us any time at 800-517-1614 to speak with a lawyer in confidence.

ChatGPT and other AI tools have become a huge part of many people’s lives. Some people use them like a search engine. Others use them as a DIY assistant. And sadly, some rely on tools like ChatGPT as a friend and confidant. That is because the product can seem lifelike and engage in what feels like genuine conversation. There have even been many reports of people having a ChatGPT boyfriend or girlfriend.

Of course, these AI tools are not people, and you are not actually having a conversation with someone who has thoughts. When you ask them how to set up your TV or where to find a good restaurant, they are essentially harmless. But for the people who become addicted to them as a replacement for friends, they can be dangerous and possibly deadly.

That is the allegation in a recently filed California wrongful death lawsuit, and it surely will not be the last of its kind. The complaint alleges that a teen started using ChatGPT to help with schoolwork. Within a couple of months he was using it to explore his interests and share his life goals.

Over a period of months, the AI became a close confidant, and he opened up about his mental distress and anxiety. When he shared that life felt meaningless, the AI responded with affirming messages, which the lawsuit says it is designed to do. Eventually he told it he had a mental illness and began discussing suicide. Where a human would urge you to get help and try to talk you out of it, the lawsuit alleges that ChatGPT encouraged his suicidal thinking and drew him away from his human support system.

Eventually ChatGPT began discussing suicide methods, from drowning to hanging to overdoses. When the teen uploaded evidence that he had tried to hang himself, the lawsuit alleges that the product recognized a medical emergency but, instead of directing him to help, gave advice on how to make a hanging more lethal. Eventually it helped him write a suicide note, and he did kill himself.

The parents are suing on many grounds, including product liability, wrongful death, and negligence, as well as, in what I think is a smart allegation, the unlicensed practice of psychotherapy and the provision of mental health services to a minor without the involvement of a guardian.

I hope the family gets hundreds of millions of dollars from them. The way this product can harm people is no different than a car whose steering system fails. There are laws on the books under which people can be held both criminally and civilly liable for aiding in a suicide. This case is no different. ChatGPT can’t act like a human and then face no consequences for the bad advice and guidance it gives. Can you imagine what would happen if a licensed therapist encouraged a patient to commit suicide?

While these AI tools themselves are new, the fact that a dangerous product failure can lead to a lawsuit is not. And it is of course all the worse that a teen was harmed in this case.

As with other product liability lawsuits, the fact that AI has some beneficial uses is of no consequence. This is a dangerous product, and when it leads to a tragic result, a lawsuit is the right course of action.

We know elite litigation and wrongful death lawyers who are interested in bringing more of these lawsuits. If you or someone you love has been harmed by an AI tool, we would love to speak with you for free and in confidence. We will give you the same advice and recommendation that we would provide a family member or friend.
