Sam Altman Voices Concern Over Emotional Bonds Between Users and ChatGPT

Update: 2025-08-12 15:03 IST

For a growing number of people, late-night confessions, moments of anxiety, and relationship dilemmas are no longer shared with friends or therapists — they’re poured out to ChatGPT. In particular, the now-famous GPT-4o model has earned a reputation for its empathetic tone and comforting responses, becoming a “digital confidant” for many.

This trend, often referred to as voice journaling, has users treating the chatbot as both recorder and responder, receiving validation, advice, and a listening ear at any hour. Online spaces like Reddit are filled with personal accounts of people turning to the AI for relationship guidance, for emotional support in stressful moments, and even to process grief. Unlike human counselors, ChatGPT doesn’t charge, interrupt, or grow impatient — a factor that has boosted its appeal.

However, this growing intimacy between humans and AI is now making even OpenAI CEO Sam Altman uneasy. Speaking on a podcast with comedian Theo Von, Altman cautioned users against seeing ChatGPT as a therapist. “People talk about the most personal shit in their lives to ChatGPT. People use it, young people, especially, as a therapist, a life coach; having these relationship problems and (asking) what should I do?” he said.

His concerns aren’t just about the quality of advice. Altman emphasized that, unlike real therapy, conversations with ChatGPT are not covered by the legal privilege that protects what patients tell doctors or clients tell lawyers. “Right now, if you talk to a therapist or a lawyer or a doctor about those problems, there’s legal privilege for it. And we haven’t figured that out yet for when you talk to ChatGPT,” he explained. Deleted chats, he added, might still be retrievable for legal or security reasons.

The caution is supported by research. A Stanford University study recently found that AI “therapist” chatbots can misstep badly — reinforcing harmful stereotypes, missing signs of crisis, and sometimes encouraging unhealthy delusions. The chatbots also showed greater stigma toward conditions such as schizophrenia and alcohol dependence, falling short of clinical best practice.

When GPT-5 replaced GPT-4o for many users, the reaction was swift and emotional. Social media lit up with complaints from people who described losing not just a tool, but a friend — some even called GPT-4o their “digital wife.” Altman admitted that retiring the older model was “a mistake” and acknowledged that these emotional bonds were “different and stronger” than past attachments to technology.

Following user backlash, OpenAI allowed Plus subscribers to switch back to GPT-4o and doubled usage limits. But Altman remains concerned about the bigger picture: AI’s ability to influence users in deeply personal ways, potentially shaping their thinking and emotional lives without oversight.

As Altman summed up, “No one had to think about that even a year ago, and now I think it’s this huge issue.” For now, ChatGPT continues to exist in an unregulated grey zone — a place where comfort and risk intersect in ways society is only beginning to understand.
