All posts tagged: Delusions

11 Signs Someone You Care About Is In Full-Fledged AI Psychosis, According To Research

As AI chatbots become more and more present in our day-to-day lives, we’re learning that while frequent interactions with them can sometimes benefit a person’s mental health, relying on them too often and with too much intensity can come with serious risks. Because these programs are designed to be generally empathetic and always agreeable, behaviors and patterns of thinking that would likely be red flags in conversations with another human may instead be encouraged by AI companions. As people spend more of their time engaging with artificial intelligence, more evidence is emerging of “AI psychosis,” which, while not a clinical diagnosis, is described by Marlynn Wei, M.D., J.D., as a phenomenon in which “AI models have amplified, validated, or even co-created psychotic symptoms with individuals.” Even when it comes to skipping traditional research, or even Googling, in favor of asking AI to solve a problem, the isolation that can accompany leaning on digital convenience may lead to a distortion of reality we have never encountered before in all of human …

Stalking victim sues OpenAI, claims ChatGPT fueled her abuser’s delusions and ignored her warnings

After months of conversations with ChatGPT, a 53-year-old Silicon Valley entrepreneur became convinced he’d discovered a cure for sleep apnea and that powerful people were coming after him, according to a new lawsuit filed in California Superior Court in San Francisco County. He then allegedly used the tool to stalk and harass his ex-girlfriend. Now the ex-girlfriend is suing OpenAI, alleging the company’s technology enabled the acceleration of her harassment, TechCrunch has exclusively learned. She claims OpenAI ignored three separate warnings that the user posed a threat to others, including an internal flag classifying his account activity as involving mass-casualty weapons. The plaintiff, referred to as Jane Doe to protect her identity, is suing for punitive damages. She also filed for a temporary restraining order on Friday, asking the court to force OpenAI to block the user’s account, prevent him from creating new ones, notify her if he attempts to access ChatGPT, and preserve his complete chat logs for discovery. OpenAI has agreed to suspend the user’s account but has refused the rest, according to Doe’s lawyers. …

The Download: tracing AI-fueled delusions, and OpenAI warns of Microsoft risks

Their findings suggest that chatbots have a unique ability to turn a benign, delusion-like thought into a dangerous obsession. But the research struggles to answer a vital question: does AI cause delusions or merely amplify them? Read the full story to understand the answer’s enormous implications. —James O’Donnell

This story is from The Algorithm, our weekly newsletter giving you the inside track on all things AI. Sign up to receive it in your inbox every Monday.

The next era of space exploration

Our footprint in the solar system is rapidly expanding. Programs to build permanent Moon bases and find life on Mars have transitioned from science fiction to active space agency missions. The scientists behind them will not only shed new light on the cosmos, but also reveal where humanity is headed. To examine what the future holds in store, MIT Technology Review features editor Amanda Silverman will sit down on Wednesday with award-winning science journalist and author Robin George Andrews for an exclusive subscriber-only Roundtable conversation about “The Next Era of Space Exploration.” Register here to join the session at 16:00 GMT / 12:00 PM …

The hardest question to answer about AI-fueled delusions

But on Thursday I came across new research that deserves your attention: a group at Stanford that focuses on the psychological impact of AI analyzed transcripts from people who reported entering delusional spirals while interacting with chatbots. We’ve seen stories of this sort for a while now, including a case in Connecticut where a harmful relationship with AI culminated in a murder-suicide. Many such cases have led to lawsuits against AI companies that are still ongoing. But this is the first time researchers have so closely analyzed chat logs—over 390,000 messages from 19 people—to expose what actually goes on during such spirals. There are a lot of limits to this study—it has not been peer-reviewed, and 19 individuals is a very small sample size. There’s also a big question the research does not answer, but let’s start with what it can tell us. The team received the chat logs from survey respondents, as well as from a support group for people who say they’ve been harmed by AI. To analyze them at scale, they worked …

AI Delusions Are Leading to Domestic Abuse, Harassment, and Stalking

By the time the public harassment started, a woman told Futurism, she was already living in a nightmare. For months, her then-fiancé and partner of several years had been fixating on her and their relationship with OpenAI’s ChatGPT. In mid-2024, she explained, they’d hit a rough patch as a couple; in response, he turned to ChatGPT, which he’d previously used for general business-related tasks, for “therapy.” Before she knew it, she recalled, he was spending hours each day talking with the bot, funneling everything she said or did into the model and expounding pseudo-psychiatric theories about her mental health and behavior. He started to bombard the woman with screenshots of his ChatGPT interactions and copy-pasted AI-generated text, in which the chatbot can be seen armchair-diagnosing her with personality disorders and insisting that she was concealing her real feelings and behavior through coded language. The bot often laced its so-called analyses with flowery spiritual jargon, accusing the woman of engaging in manipulative “rituals.” Trying to communicate with her fiancé was like walking on “ChatGPT eggshells,” …