OpenAI warns ChatGPT is not a therapist as thousands of users discuss suicide, form emotional reliance

OpenAI said in a blog post published on Monday that about 0.15% of ChatGPT’s weekly users discuss suicidal thoughts or plans. Another challenge the Sam Altman-led company is taking on is emotional reliance, in which users form unhealthy attachments to the chatbot itself.

Govind Choudhary
Updated: 28 Oct 2025, 07:09 AM IST
Sam Altman-led OpenAI estimates that around 0.15% of ChatGPT’s weekly users discuss suicidal thoughts or plans, even as it warns that the chatbot is not a therapist, according to a blog post published on Monday. While a small fraction, the number is significant given the platform’s massive global reach.

OpenAI says the new GPT-5 model, which powers ChatGPT by default, reduces unsafe or non-compliant responses in mental-health-related chats by as much as 80%, and performs substantially better when users show signs of psychosis, mania, or emotional over-reliance on the chatbot.

ChatGPT is not a therapist

The update comes after months of work with psychiatrists and psychologists in OpenAI’s Global Physician Network, a group of nearly 300 clinicians across 60 countries. More than 170 of them directly contributed to the new system, writing and scoring responses, defining safe behaviour, and reviewing how the model handles sensitive scenarios.

Notably, the company said that the goal is not to turn ChatGPT into a therapist, but to ensure it recognises signs of distress and gently redirects users to professional or real-world support. The model now connects people more reliably to crisis helplines and occasionally nudges users to take breaks during longer or emotionally charged sessions.

How GPT-5 responds to mental-health-related queries

OpenAI’s internal testing shows that in production traffic, the GPT-5 model produced 65–80% fewer unsafe responses than previous versions when users displayed signs of mental-health distress.

The Sam Altman-led company noted that in structured evaluations graded by independent clinicians, GPT-5 cut undesirable replies by 39% to 52% compared with GPT-4o. Automated testing scored it 91–92% compliant with desired behaviour, up from 77% or lower in older models.

The system also handled lengthy or complex conversations more reliably, maintaining over 95% consistency even in multi-turn dialogues, where earlier models often faltered.

How ChatGPT tackles emotional attachment

A newer challenge OpenAI is taking on is emotional reliance, when users form unhealthy attachments to the chatbot itself. Using a new taxonomy to identify and measure that behaviour, OpenAI says GPT-5 now produces 80% fewer problematic replies in these scenarios, often steering users toward human connection instead of validating emotional dependence.

Still, OpenAI admits these mental-health conversations are rare and hard to quantify precisely. At such low prevalence (fractions of a per cent), even small variations can distort results. And experts do not always agree on what “safe” looks like: clinicians reviewing the model’s responses reached the same judgment only 71–77% of the time.
