Artificial Intelligence: Does it replace the need for a psychotherapist?

As people increasingly rely on artificial intelligence tools such as ChatGPT for mental health advice, researchers have expressed growing concern. A recent study found that while these systems may appear supportive and sympathetic, they often fail to meet the basic ethical standards of genuine psychotherapy.

A team at Brown University, in collaboration with licensed mental health professionals, conducted a study that tested the performance of chatbots in scenarios simulating psychological counseling sessions.

The results revealed that the problems go beyond simple errors. Some systems mishandled sensitive situations, at times reinforcing harmful ideas or creating a false impression among users that the system genuinely understood their feelings.

The researchers analyzed experimental conversations between trained counselors and artificial intelligence systems that were asked to act as therapists using cognitive behavioral therapy. Three psychologists reviewed these conversations and identified 15 major ethical risks, the most prominent being failure to take the user’s personal context into account, generic and vague advice, and what the researchers described as “deceptive empathy”: responses that appear friendly but lack real understanding or professional responsibility.

The study also revealed other problems such as bias related to identity or culture, and the failure of some systems to manage serious psychological crises or direct users to seek specialized help.

Although the researchers do not rule out the future use of AI to support or expand access to mental health services, they stress that access to support does not necessarily mean access to genuine therapeutic care. They therefore call for clear ethical and legal standards to be established before these technologies are deployed more widely in sensitive areas such as psychotherapy.