ChatGPT as a substitute for doctors: Does AI threaten the future of medical consultations?

OpenAI recently launched “ChatGPT Health,” a new addition to ChatGPT specifically designed to provide support in matters related to health and wellness.

What is “ChatGPT Health”?

The launch of this feature comes in response to the millions of health inquiries that flow to the AI-powered chatbot every day, reflecting people's growing interest in AI-based medical information.

According to OpenAI, “ChatGPT Health” aims to give users a more focused experience when dealing with health concerns, wellness topics, and various medical inquiries.

It is clear that there is a growing demand for health information that is easily accessible and interactive. However, while these tools may facilitate access to information, ensuring its accuracy, fairness, and responsible use remains a critical challenge.

New skills for healthcare providers

Dr. David Leibovitz, an AI expert in clinical medicine at Northwestern University, told Medical News Today: “ChatGPT Health can help patients come to the clinic more prepared by summarizing lab results, organizing questions, and identifying gaps in care.”

The danger, however, lies in overconfidence: patients may assume that AI-generated information is equivalent to a clinical assessment or a comprehensive review.

Therefore, healthcare providers will need to develop new skills:

* Verifying the accuracy of information provided by patients.
* Correcting misconceptions generated by AI.
* Identifying cases in which the program overlooks context that completely changes the clinical picture.

In the world of ChatGPT: advice for doctors

In his advice to healthcare providers on discussing ChatGPT with patients, Leibovitz called for "recognizing the value of the program while setting realistic expectations."

He continued: “This application helps organize your thoughts and understand the basic principles, but it cannot encompass elements that can only be assessed by a doctor, such as a clinical examination, the tone of your voice while speaking, or the way you interact with previous treatments.”

He added that it is essential not to dismiss patients who turn to these tools, because doing so may give the impression that we are not listening to them, especially as these technologies continue to evolve.

It is better, he said, to ask patients what they have learned and what concerns it has raised for them, and to use that as a starting point for dialogue. If they present inaccurate information, it should be treated as an opportunity for education and clarification, not merely as a direct correction.