GPs are using artificial intelligence to record patient consultations, but how secure is personal data?

For the past 12 months, Dr. Grant Blashki has had what he calls a “medical intern” present in all of his appointments.

His intern is entirely digital: an artificial intelligence scribe that hears everything his patients say.

“It's almost surprisingly accurate,” the GP told 7.30.

“Sometimes it mishears the name of something. Sometimes it gets the diagnosis wrong.”

He says patient consent is essential when using AI scribes in clinical settings, but most patients tell him they have no problem with it.

How Heidi Health is marketed online. (heidihealth.com.au)

“I ask for consent. Sometimes people don't want me to use it, and that's absolutely fine, but almost everyone is happy with it, and it simply streamlines the work,” Dr. Blashki said.

“I'm focused on them, so that's good for the patients.”

Dr. Blashki says he has become so reliant on the scribe that he would find it hard to run appointments without it.

“I use it in almost every consultation,” he said.

“If I had to choose between forgetting my stethoscope or my scribe software, I'd keep the scribe software. It's that much a part of my job now.”

How safe is patient data?

Dr. Blashki says he deletes all transcriptions from the AI software. (Supplied: Beyond Blue)

When patients reveal intimate details about their medical histories to Dr. Blashki, the scribe is constantly collecting sensitive data.

“Maybe infectious diseases, social issues that they might not want their partners to know about – any kind of sensitive detail that could end up in a note,” he said.

“So, at the end of each consultation, we make sure we actually delete all the transcriptions from the software.”

Dr. Blashki uses software from Melbourne-based Heidi Health, one of the leading AI scribe tools used by Australian clinicians.

Heidi Health declined 7.30's request for an interview, but its CEO and co-founder, Dr. Thomas Kelly, provided written responses to questions about patient privacy.

“Heidi currently supports around 2 million visits a week across Australia, Canada, New Zealand, the US and the UK,” Dr. Kelly said.

Doctors are able to delete patient notes from Heidi Health.

“In each region, data is stored in compliance with local healthcare regulations and privacy laws.”

“Here that is the Australian Privacy Principles (APPs); in the EU it is the GDPR; in the US it is HIPAA. All data is protected to ISO 27001 and SOC 2 requirements, the highest enterprise standards that exist.

“We are audited by third parties on our data protection to ensure the security we have.”

Lyrebird Health is another AI scribe software company based in Melbourne.

It is used by GPs, surgeons, psychiatrists and paediatricians. The company says the software was used in 200,000 consultations in Australia last week.

“For any Australian customers, all data is 100 per cent stored in a sovereign Australian database. It's obviously different overseas,” Lyrebird Health CEO Kai Van Lieshout told 7.30.

Lyrebird Health CEO Kai Van Lieshout says patient notes are automatically deleted after seven days. (ABC News)

“We've never been hacked before, but it's something that's really important.”

Patient notes are automatically deleted from Lyrebird Health's system after seven days (clinicians must back up the notes if they want to keep them), but users have the option to manually extend this period to up to six months.

“For us, it's definitely, really gone,” Van Lieshout said.

“We had a doctor who needed a note we'd had, not realising it would be deleted after seven days, and there was nothing we could do.”

John Lalor, an assistant professor of IT, analytics and operations at the University of Notre Dame, warns that there is always a risk when storing digital data.

“Many of these models are very data-driven, so the more data you have, the better the model usually is,” Dr. Lalor told 7.30.

John Lalor of the University of Notre Dame says there is always a risk factor when it comes to digital data. (Supplied: University of Notre Dame)

“On the one hand, if you have more data from a patient, you can usually improve the model, but on the other hand, there is the privacy risk that the data could be exposed if it is leaked or hacked.”

He says patients and doctors need to make sure AI scribe companies are transparent about how data is stored and used.

“Make sure that they are clear about how the data is being used and how it is being stored.

“For individuals, if you're uncomfortable with something like that being used, you can talk to your doctor to see if opting out is an option, or to get more information about what happens to the data once it goes into the scribe system.”

“Magic” notes

To demonstrate how Heidi Health's AI scribe works, Dr. Blashki walked 7.30 through a mock appointment about headaches.

They discussed headaches that had come “on and off almost every morning” over the past month, and no history of migraines.

Heidi Health then processes the conversation, a step the software labels “making magic”, and generates a consultation note.

Notes generated by Heidi Health after 7.30's mock consultation with Dr. Blashki. (ABC News: Richard Sidenham)

The software also proposed “differential diagnoses”, including “tension-type headache” and “neck headache”.

“With some medical software, the AI will come up with differential diagnoses and suggestions, and doctors really have to turn their own minds to them and treat them more as suggestions than answers,” Dr. Blashki said.

Dr. Kelly said the software “seeks to be more than a literal summary” and can identify the clinical reasoning that supports a set of questions.

In response to 7.30's mock consultation, Dr. Kelly said:

“Heidi does not provide a differential diagnosis in the absence of a clinician, and it is up to the clinician to review the document for accuracy.”

Van Lieshout said Lyrebird Health does not produce potential diagnoses after consultations.

“We don't try to tell the clinician what to do, if that makes sense,” he said.

“It's: subjective – what did the patient describe? Objective – was there any form of testing, what was their blood pressure reading? What was my assessment of the diagnosis or situation? And what are the next steps?

“It breaks that conversation into those categories.”

“Essential” tool

Brett Sutton says AI scribes have become “essential” for some GPs. (AAP: James Ross)

Dr. Blashki estimates that about 50 per cent of the doctors at the Melbourne GP clinic where he works use AI scribe software for all of their consultations.

He also says he has received referral letters from specialists that appear to have been generated by AI.

“I had one letter where I thought, 'Oh, I don't think they checked this properly.' They obviously didn't have one of the diagnoses right at all,” he said.

“It's like the GPS in your car. You're still the driver; it makes a suggestion, but you need to check it.”

Former Victorian chief health officer Brett Sutton acknowledged that protecting patient data is the industry's biggest concern, but believes AI scribes have become “essential”.

“I think regulators need to make sure they're safe,” Dr. Sutton said.

“Obviously, the clinicians using it are responsible for properly recording sensitive health information and keeping it safe, in exactly the same way that other clinical notes have historically been handled.”

Watch 7.30, Mondays to Thursdays at 7:30pm on ABC iview and ABC TV.


