A growing number of people are posting medical questions on social media as digital health care expands rapidly. For healthcare providers, responding to these queries is time-consuming and tiring. AI assistants such as ChatGPT could ease this burden by drafting effective replies that physicians then review.
The cross-sectional study randomly sampled 195 exchanges from October 2022 in which a verified physician answered a patient question on AskDocs, a publicly available forum on Reddit. The researchers then generated a chatbot response to each question by entering its full text into a fresh ChatGPT session. A panel of qualified healthcare professionals subsequently evaluated the anonymized physician and chatbot replies.
It should be noted that because each question was entered into a new session, no preceding conversation influenced the chatbot's answers. Evaluators rated both the chatbot's and the physicians' responses for quality and empathy on a scale of 1 to 5, with higher scores indicating better responses. On r/AskDocs, subreddit moderators verify the credentials of any healthcare practitioner who answers a question and display those credentials alongside the reply. To protect patients' identities and comply with HIPAA regulations, the researchers removed all identifying information from patient messages. They also compared the word counts of physician and chatbot responses, counted how often evaluators preferred the chatbot, and calculated prevalence ratios for chatbot versus physician replies on pre-specified criteria, such as responses rated less than 'adequate'.
Across 585 evaluations, evaluators preferred the chatbot's (ChatGPT's) reply over the physician's 78.6% of the time. Strikingly, ChatGPT's replies were rated substantially higher for quality and empathy even when compared with the longest physician-authored responses. The chatbot produced a far larger share of replies rated 'good' or 'very good' than the physicians did (78.5% vs. 22.1%), a prevalence ratio of 3.6 in the chatbot's favor.
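The headline arithmetic can be checked directly from the figures reported above. The sketch below assumes three evaluators rated each of the 195 exchanges (consistent with the 585 total evaluations); the percentages are taken from the study as reported, and only the division is ours.

```python
# Reproduce the study's headline numbers from its reported figures.
# Assumption (not stated explicitly here): 585 evaluations = 195 exchanges x 3 evaluators.

n_exchanges = 195                 # physician-answered questions sampled
n_evaluators = 3                  # assumed evaluators per exchange
total_evaluations = n_exchanges * n_evaluators

good_or_better_chatbot = 0.785    # chatbot replies rated 'good' or 'very good'
good_or_better_physician = 0.221  # physician replies rated 'good' or 'very good'

# Prevalence ratio: how many times more often chatbot replies
# reached 'good' or better quality compared with physician replies.
prevalence_ratio = good_or_better_chatbot / good_or_better_physician

print(total_evaluations)           # 585
print(round(prevalence_ratio, 1))  # 3.6
```

The 3.6 figure is simply the ratio of the two 'good or very good' rates, which matches the "3.6 times greater" claim in the study.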
According to the study, there is no reason to believe AI will displace doctors. It does, however, demonstrate that AI can assist physicians by helping draft more sympathetic answers to patients' inquiries. Patients expect empathy from doctors, yet this aspect of the doctor-patient relationship is frequently neglected because busy clinicians lack the time for it; they must put patients' safety and outcomes first.
AI can fill the gap. Telemedicine services, increasingly delivered via chat, are growing more popular as health care moves online. Since typing out replies is slow and tedious, AI could substantially ease the severe shortages of medical staff seen in many countries; answering messages is yet another administrative task that keeps clinicians away from face-to-face interactions with patients. AI-enhanced clinical workflows should benefit both the doctor and the patient by producing high-quality responses that are tailored to the patient's needs (such as age or mental health), written in an approachable style, and clinically verified by medical professionals.
Of course, AI must also be trustworthy, not just sympathetic. There has been significant progress in this area, too: in early 2023, the new large language model (LLM) Med-PaLM 2 passed a medical exam by accurately answering more than 80% of the questions, roughly double the score achieved in early 2022. With yet another study showing that ChatGPT can even be used to write clinical letters, AI is nearly ready to supplement doctors' practical work.