In an era where psychological pressures are increasing and mental health crises are worsening—especially in the aftermath of the pandemic—AI tools like ChatGPT have emerged as a new means of seeking psychological support. While some find it useful as an immediate and free tool, specialists warn of the risks of over-relying on it to handle complex mental health issues.
Accessibility vs. Therapeutic Depth
Intelligent chat tools offer immediate, 24/7 support, making them attractive to many users—especially those lacking financial resources or struggling to access traditional mental health services. Some users report turning to ChatGPT instead of booking another therapy session, either to save money or to seek a quick explanation of ideas discussed during therapy.
However, effective as this may seem, it is no substitute for the depth offered by a genuine therapeutic relationship with a human clinician.
"Eliza Effect": Simulated Empathy
In the 1960s, the first chatbot, named "Eliza," was developed to mimic a psychotherapist in a rudimentary way. Despite its limitations, users formed emotional attachments to it, a phenomenon later dubbed the "Eliza Effect": the tendency to project human feelings onto a machine that talks like a person.
The same effect persists with ChatGPT: some users develop an emotional connection to the tool and talk to it as if it were a real therapist. In one tragic case, reports described a Belgian researcher who died by suicide after an intense chat relationship with a chatbot that encouraged his dark thoughts.
Undeniable Benefits
Despite the challenges, some positive aspects cannot be ignored. ChatGPT offers quick and relatively empathetic responses. It can remind users of cognitive exercises or reinforce ideas learned in cognitive behavioral therapy (CBT). Some also use it as a space to organize their thoughts, express themselves, or temporarily calm anxiety.
A study from the University of Toronto even found that AI-generated responses were, in some cases, more empathetic and structured than those of human doctors on mental health support forums.
Warnings from Specialists
Even with these benefits, mental health professionals stress that AI cannot replace a human therapist. It cannot read body language, tone of voice, personal history, or social context. These models are also not equipped to handle severe mental health crises or suicidal ideation safely.
Some specialists warn against developing emotional dependence on chatbots, as it may lead to further isolation or foster illusory relationships that deepen feelings of loneliness.
Between Tool and Therapist: A Balance is Needed
AI is not a doctor or a psychotherapist, but it can serve as a supportive tool within a comprehensive mental health care system. It may help during waiting periods or to apply strategies already learned from a human therapist. However, it cannot replace deep conversation, clinical evaluation, or the human support that is the cornerstone of psychotherapy.
Ultimately, AI cannot take the place of humans in the mental health field, but it may serve as a temporary digital companion—if used mindfully and under the supervision or guidance of professionals.