What Doctors Need to Know About ChatGPT and Other AI Tools



For many physicians, the rise of artificial intelligence (AI) tools such as ChatGPT can feel unnerving, even threatening, especially as patients start using these tools to get answers about their health. But it need not be. Experts say that every tool carries risks, and while AI is in many ways still in its early stages, these tools are here to stay. Physicians who educate themselves and learn how to use them can stay ahead of the curve and enhance their practices.


“The important thing to remember is that AI is a very powerful tool, and like a hammer that can be used to hit people, it can also be used to build houses,” says Matt Hollingsworth, cofounder of Carta Healthcare, an AI company that streamlines clinical data collection. “The fact that people run around hitting other people over the head with a hammer is not a reason to fear the hammer itself, only that particular use of the hammer.”

Hollingsworth says these tools are not dangerous as long as users have a solid understanding of what they are using. A more important question for doctors to ask, he suggests, is, “Am I getting value from this?”

What is ChatGPT and should doctors use it?

There is a lot of talk right now about ChatGPT, a large language model developed by OpenAI, because its free version recently became available to the general public. Simply put, Hollingsworth describes ChatGPT as “a very fancy autocomplete.” At its core, it is a predictive language model that mimics language patterns based on the text it has seen.
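To make the “autocomplete” framing concrete, the sketch below uses the open-source Hugging Face transformers library with GPT-2 as a stand-in model (an assumption for illustration; ChatGPT itself is not a downloadable model). It simply asks the model to continue a sentence with statistically likely words, which is all a predictive language model does.

```python
# A minimal sketch of "fancy autocomplete": a language model scores
# possible next tokens and emits a statistically likely continuation.
# GPT-2 is an illustrative stand-in here, not ChatGPT itself.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

prompt = "The patient presented with chest pain and was"
result = generator(prompt, max_new_tokens=20, num_return_sequences=1)

# The continuation is fluent but unverified: the model predicts words,
# it does not check facts.
print(result[0]["generated_text"])
```

The output reads like a plausible sentence because the model has seen millions of similar ones, not because it reasons about the patient.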

ChatGPT was trained on nearly all of the written information on the internet through 2021, including books, according to Adrian Zidaritz, a data scientist and founder of the Institute for Strengthening Democracy with Artificial Intelligence. As a result, it can respond to user questions with human-like answers, as if it were thinking, but it does not actually employ logic and has no ability to check facts.

“What doctors need to be aware of is that ChatGPT tends to make up information right now,” says Victor Cortes, founder and CEO of Verso, a software-as-a-service (SaaS) startup that helps medical professionals manage appointments, payments, and patients. “In some cases, asking it for information returns the title of an article, with a link, that turns out not to be real.” Because ChatGPT is trained to predict associations between word and sentence patterns, there is no way to guarantee the accuracy of its answers.

ChatGPT is not programmed to perform math or to make associations that are not language-based, so it cannot provide quantitative results on its own. But it can be combined with other kinds of software, such as Mathematica, a computational tool used by scientists in academia, and is likely to be used for more complex applications in the future, Zidaritz says.

But even at this early stage, ChatGPT can still help doctors in some ways, Hollingsworth says. “ChatGPT is already widely useful as a writing tool. Doctors have to spend a lot of time typing out correspondence with patients and insurance companies.” By asking ChatGPT to pull together the basic content of a letter and then reviewing it for accuracy and specificity, a doctor can save time.

It can also help translate specialized medical terminology for patients, Zidaritz says. “We are all frustrated by the highly technical language doctors use when describing conditions or writing medical records,” he says. “We can have ChatGPT translate this into a more human, understandable form so [the patient] can understand what the doctor is saying.”
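As a sketch of how such a request might look in practice, the snippet below calls ChatGPT through the official OpenAI Python client. The client usage is real, but the model choice, system prompt, and clinical note are illustrative assumptions, and the same pattern covers the letter-drafting use case above. Any output would still need physician review.

```python
# A minimal sketch, assuming the official `openai` Python package (v1+)
# and an OPENAI_API_KEY set in the environment. The prompt and note
# text are illustrative, not taken from the article.
from openai import OpenAI

client = OpenAI()

clinical_note = (
    "Pt c/o dyspnea on exertion; echo shows LVEF 35%. "
    "Started on ACE inhibitor and beta-blocker."
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # illustrative model choice
    messages=[
        {"role": "system",
         "content": "Rewrite clinical notes in plain, patient-friendly language."},
        {"role": "user", "content": clinical_note},
    ],
)

# The rewritten note must still be checked by the physician for accuracy.
print(response.choices[0].message.content)
```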

It could also act as a medical assistant or scribe, recording conversations between doctors and patients and summarizing them in reports, Zidaritz says.

And it could be used as a kind of virtual intern, Cortes says. “What if you need inspiration, additional research, or brainstorming to come up with potential solutions, cases, or diagnoses? ChatGPT may be a good option for that.”

Most important, physicians should be aware that despite its many mistakes, patients are already using ChatGPT to research health information and even to self-diagnose. “Patients are already using it,” Zidaritz says. “And they are not trained to distinguish [correct information from incorrect information], which can create tension between patients and doctors.”

Other AI tools and their uses

AI tools are also useful for any application related to operations, says Hollingsworth. “Inventory management, patient scheduling, nurse scheduling, vendor management, contract management, etc.”

He believes that, used properly, AI tools could cut administrative overhead in half. “Where a nurse now sits there fiddling with PeopleSoft and inventory management systems, AI tools can vastly improve that,” he says.

AI is already being used in diagnostics, such as detecting cancer in patient tissue samples in the lab, an area of ongoing research that holds promise. “For obvious reasons, these AI diagnostic tools must go through FDA approval and even undergo clinical trials,” says Hollingsworth.

AI tools could also help with insurance claims, Zidaritz says, by carrying the full administrative record through the filing and adjudication of claims and flagging mistakes along the way.

AI will not replace doctors

AI tools will continue to become more sophisticated and will transform aspects of medical practice, but they will not replace doctors, for a variety of reasons.

The most obvious reason, according to Cortes, is that “the warm relationship between doctors and patients cannot be replaced by AI. Health care is so human that AI can help with, but not replace, the work of doctors.”

Furthermore, according to Hollingsworth, “Most AI needs a well-trained babysitter: someone who is an expert in the domain and can guarantee the quality of what you get from the tool, because it is liable to do something wrong, and that person needs to be trained to catch it.”

Pau Rué, vice president of artificial intelligence at Infermedica, an AI-based medical guidance platform, suggests that doctors who are unfamiliar with these tools start getting acquainted with them. “My general recommendation is for physicians to familiarize themselves with AI tools,” he says, adding that although the technology is imperfect today, it will continue to improve.

That said, he cautions that because of potential legal concerns, these tools need clinical validation and regulatory approval to earn the trust of both physicians and patients. “Health care providers should scrutinize the output of these models, especially for adverse incidents and complaints.”

Hollingsworth recommends that physicians assess the tools they end up using on their merits, looking past buzzy new AI offerings to find solutions to their actual problems. “Shop for value, not for AI. I strongly believe that 99% of the value generated by AI will be delivered through a service layer of experts who know how to use it.”

But Cortes believes that doctors who adopt AI tools will be better positioned. “People who start using these tools will be better prepared for the new economy and new ways of doing things.”



