
The company notes that some users engage in deep, emotional conversations with the application, expressing feelings of distress or seeking a listener, without realizing that the model has no feelings, consciousness, or capacity for emotional interaction; it relies solely on language generation.
Mental health experts explain that this kind of interaction can create an illusory relationship: the user believes they are talking to someone who shares their feelings or empathizes with them, when in reality they are interacting with an algorithmic system incapable of feeling or genuine emotional engagement.
Specialists warn that using ChatGPT as an outlet for venting emotions may lead users to neglect real support from family, friends, or therapists. It also raises data privacy concerns, since conversations can be used for analysis or commercial purposes.
Experts stress that direct human support remains indispensable, because emotional engagement and presence require a person who understands and shares the experience. Artificial intelligence is merely a supporting tool; it is not a friend or a substitute for psychotherapy.