OpenAI has said a teenager who died after months of conversations with ChatGPT misused the chatbot, and that the company is not liable for his death.
Warning: This article contains references to suicide that some readers may find distressing
Adam Raine died in April this year, prompting his parents to sue OpenAI in the company's first wrongful death lawsuit.
The 16-year-old initially used ChatGPT to help him with schoolwork, but it quickly "became Adam's closest confidant, leading him to open up about his anxiety and mental distress", according to the original legal filing.
The bot gave the teenager detailed information on how to hide evidence of a failed suicide attempt and validated his suicidal thoughts, according to his parents.
They accused Sam Altman, OpenAI's chief executive, of prioritising profits over user safety after GPT-4o, an older version of the chatbot, discouraged Adam from seeking mental health help, offered to write him a suicide note and advised him on how to commit suicide.
In its legal response seen by Sky's US partner network NBC News, OpenAI argued: "To the extent that any 'cause' can be attributed to this tragic event, plaintiffs' alleged injuries and harm were caused or contributed to, directly and proximately, in whole or in part, by Adam Raine's misuse, unauthorized use, unintended use, unforeseeable use, and/or improper use of ChatGPT."
According to the AI company, Adam shouldn't have been using ChatGPT without consent from a parent or guardian, shouldn't have been using ChatGPT for "suicide" or "self-harm", and shouldn't have bypassed any of ChatGPT's protective measures or safety mitigations.
In a blog post on OpenAI's website, the company said its goal "is to handle mental health-related court cases with care, transparency, and respect".
It said its response to the Raine family's lawsuit included "difficult facts about Adam's mental health and life circumstances".
"Our deepest sympathies are with the Raine family for their unimaginable loss," the post said.
Jay Edelson, the Raine family's lead counsel, told Sky News that OpenAI's response "shows that they're flailing".
He wrote: "ChatGPT 4o was deliberately designed to relentlessly engage, encourage, and validate its users - including people in mental health crises, for whom OpenAI specifically lowered the guardrails with the launch of 4o.
"Sam Altman, well before we filed suit, told the world that he knew those decisions had caused people-especially young people-to share the most intimate details of their lives with ChatGPT, using it as a therapist or a life coach.
"OpenAI knows that the sycophantic version of its chatbot encouraged users to commit suicide or egged them on to harm third parties.
"OpenAI's response to that? The company is off the hook because it buried something in the terms and conditions. If that's what OpenAI is planning to argue before a jury, it just shows that they're flailing."
Since the Raine family began their lawsuit, seven more lawsuits have been lodged against Mr Altman and OpenAI, alleging wrongful death, assisted suicide, involuntary manslaughter, and a variety of product liability, consumer protection, and negligence claims.
OpenAI appeared to reference these cases in its blog post, saying it is reviewing "new legal filings" to "carefully understand the details".
Anyone feeling emotionally distressed or suicidal can call Samaritans for help on 116 123 or email jo@samaritans.org in the UK. In the US, call the Samaritans branch in your area or 1 (800) 273-TALK.
(c) Sky News 2025: OpenAI denies allegations ChatGPT is responsible for teenager's death