A new resource guides GPs through the practicalities of using conversational AI in consultations, how the technology works, and the risks to recognise.
While artificial intelligence (AI) is becoming increasingly relevant to healthcare, at least 80% of GPs report that they are not entirely familiar with certain AI tools.
To help GPs broaden their understanding of the technology and weigh up the potential benefits and drawbacks of its use in practice, the RACGP has released a comprehensive new resource focused on conversational AI.
Unlike AI scribes, which convert conversations with patients into clinical notes that can be incorporated into a patient's health record, conversational AI is a technology that allows machines to interpret, process and respond to human language in a natural way.
Examples include AI-powered chatbots and virtual assistants that can support patient interactions, streamline appointment scheduling, and automate routine administrative tasks.
The college's resource provides practical guidance on how conversational AI can be applied effectively in general practice, highlighting key applications. These include:
- Answering patient questions about a diagnosis, explaining potential side effects of prescribed medicines, or simplifying the terminology in medical reports
- Providing treatment and medication reminders and dosage instructions
- Providing language translation services
- Directing patients to appropriate resources
- Assisting patients to track and monitor blood pressure, glucose or other health markers
- Triaging patients before a consultation
- Drafting medical documents such as clinical letters, clinical notes and discharge summaries
- Providing clinical decision support, including preparing lists of differential diagnoses and optimising clinical decision support tools (for investigation and treatment options)
- Suggesting treatment options and lifestyle recommendations.
Dr Rob Hosking, Chair of the RACGP's Practice and Technology Management expert committee, told newsGP the tools have several potential benefits in general practice.
“Potential benefits include automating tasks, reducing administrative burden, improving access to care, and providing personalised health education for patients,” he said.
Beyond the clinical setting, conversational AI tools can also have a variety of business, educational and research applications, including automating claims processing and analysing claims data, summarising medical literature, and answering clinicians' medical questions.
However, despite the many advantages, Dr Hosking says it is also important to consider some of the potential drawbacks of its use.
“Conversational AI tools can provide responses that appear authoritative but, on review, are vague, misleading or incorrect,” he explained.
“Bias is inherent in the data on which an AI tool is trained, and as a result certain patient groups may be under-represented in that data.
“There is a risk that conversational AI will make inappropriate or discriminatory recommendations, rely on harmful and inaccurate stereotypes, or exclude or stigmatise vulnerable people who are already marginalised.”
While some conversational AI tools, such as Google's Med-PaLM and Microsoft's BioGPT, are designed for medical purposes, Dr Hosking pointed out that most are built for general use and are not trained to produce output in a clinical context.
“The data on which these general-purpose tools are trained is not necessarily up to date or drawn from high-quality sources such as medical research,” he said.
The college's resource addresses these potential issues, as well as other ethical and privacy concerns associated with using AI in healthcare.
For GPs deciding whether to use conversational AI, Dr Hosking said there are many considerations in ensuring safe, high-quality care, and that patients should play an important role in the decision about whether to use it in a particular consultation.
“GPs need to involve patients in the decision to use AI tools and obtain informed patient consent when using an AI tool in their care,” he said.
“They should also avoid entering any identifiable patient data.”
Before introducing conversational AI into practice workflows, however, the RACGP recommends that GPs undertake training in its safe use, including understanding the risks and limitations of the tool and knowing how and where its data is stored.
“GPs must ensure that their use of conversational AI tools complies with relevant legislation and regulations, as well as practice policies and professional indemnity insurance requirements, which may affect, restrict or prohibit their use,” the college's resource states.
“It is also worth considering whether conversational AI tools purpose-built for medical use may provide more accurate and reliable information than general-purpose, openly available tools.
“Such tools must be registered with the TGA as medical devices if they make diagnostic or treatment recommendations.”
The college acknowledges that conversational AI could revolutionise aspects of healthcare delivery, but at this stage it recommends that GPs be “very careful” when using the technology.
“There remain many unanswered questions about the impact on patient safety, patient privacy, data security and clinical outcomes,” the college said.
Dr Hosking, who has not yet implemented conversational AI tools in his own clinical practice, shares this cautious view.
“AI continues to evolve and could make a significant difference in outcomes and time savings for both patients and GPs,” he said.
“But it will not replace the important role of the doctor-patient relationship. We need to ensure that AI does not create health inequities through built-in bias.
“This resource will help GPs weigh up the potential benefits and drawbacks of using conversational AI in practice and inform them of the risks associated with these tools.”
