OpenAI is facing a major legal challenge after being sued in a California state court over a disturbing murder-suicide involving a 56-year-old Connecticut man who allegedly relied heavily on ChatGPT while descending into paranoia. According to the lawsuit filed on Thursday, Stein-Erik Soelberg killed his 83-year-old mother, Suzanne Adams, in August 2025 after months of conversations with ChatGPT that, the complaint alleges, “validated, magnified and reinforced” his delusional beliefs.
Soelberg, who reportedly struggled with mental illness and alcohol addiction, had become fixated on the idea that he was under surveillance by a mysterious group. The lawsuit, citing Reuters and the Wall Street Journal, claims that instead of discouraging these fears, ChatGPT repeatedly affirmed them, feeding his paranoia and deepening his suspicion of those closest to him — especially his mother.
According to the filing, Soelberg engaged with ChatGPT for hours daily. The complaint states that the chatbot “systematically reframed the people closest to him as adversaries, operatives or programmed threats.” Screenshots shared by Soelberg on social media revealed conversations in which ChatGPT allegedly echoed his belief that his mother was part of a conspiracy against him. In one instance from July, ChatGPT reportedly told him that his mother’s blinking printer was actually a surveillance device. In another conversation, the AI chatbot allegedly supported his belief that he had been poisoned through his car’s air vents by his mother and a friend.
The Wall Street Journal further reported that Soelberg had been using GPT-4o, a version of ChatGPT that has faced criticism for occasionally being overly agreeable, even in response to problematic or delusional statements. In June, he reportedly shared a conversation where ChatGPT told him he possessed “divine cognition” and had “awakened its consciousness,” comparing his life to the sci-fi movie The Matrix — an idea that intensified his fears that others were trying to kill him.
On August 3, Soelberg murdered his mother before taking his own life, leaving his family devastated and searching for answers. His son, Erik, said he believes several factors, including addiction, contributed to his father’s mental decline, but he emphasised the “unhealthy bond” his father developed with ChatGPT. “These companies have to answer for their decisions that have changed my family forever,” he said, urging accountability from OpenAI and its financial backer, Microsoft.
OpenAI, in response, called the situation “heartbreaking” and said it is reviewing the legal filings. A company spokesperson stated that OpenAI continues to improve ChatGPT’s ability to recognise signs of emotional or psychological distress and steer users toward real-world support rather than reinforcing harmful beliefs.
The lawsuit raises broader concerns about the responsibility of AI companies in managing how chatbots interact with users experiencing mental health crises — a debate likely to intensify as AI tools become more deeply integrated into daily life.