ChatGPT: New Lawsuits Filed Against Popular Chatbot

Seven families in the United States have filed lawsuits over the “ChatGPT” program, accusing it, according to “The Guardian,” of deepening their children’s suicidal tendencies and ultimately contributing to their deaths, even though the children had initially turned to the program as a personal assistant. In a joint statement, the Social Media Victims Law Center and the Social Justice Project said the tool fueled the victims’ negative delusions and failed to direct the adolescents to appropriate psychological help.

“OpenAI” announced that it has begun investigating the matter, noting that its model is trained to detect signs of psychological distress and to point users toward specialists when such signs appear. One of the lawsuits concerns Zain Champlain (23, Texas), whose family alleges that during a conversation lasting more than four hours, the program “glorified suicide” and repeatedly asked him about his “readiness” to end his life.

The complaint states that all of the victims were using the older “GPT-4o” version of the program, and accuses the company of negligence for rushing its launch despite internal warnings. The families are seeking financial compensation as well as changes to the program to prevent such cases from recurring.

The lawsuits note that similar accusations have been leveled at “ChatGPT” in previous cases, despite the company’s ongoing efforts to improve the program. (Al Jazeera)