As ambulance leaders turn to technology, how does the NHS navigate AI's “Wild West”?



With the NHS diagnosed as “broken,” the government has made a big bet that technology is a key treatment for the ailing system, and has pledged to make it the most “AI-enabled” health system in the world.

With services struggling for funding and lacking the staff to meet patient demand, health leaders have been exploring the use of AI for some time. The evidence is already there for its use in reading patient scans. But more broadly, how do AI tools translate into emergency medicine?

Here, ambulance leaders tell The Independent about the reality of using AI in a complex, fast-paced and potentially dangerous environment.

“We have to get it right first time”

Drone guidance, demand prediction, diagnostic assistance, live language translation: these are just a few of the ways AI could be used within the UK ambulance sector.

Graham Norton, the digital transformation lead at the Northern Ambulance Alliance, believes that AI will become a daily tool for ambulance staff.

“There’s absolutely no reason why AI shouldn’t become a routine part of everyday activity across the ambulance sector, and it should,” he said.

Norton and Johnny Sammut, director of digital services at the Welsh Ambulance Services NHS Trust, both agree that AI has great potential to help healthcare workers in increasingly challenging environments.

However, the pair say it comes with a heavy safety warning.

“Where we’re different [in ambulance services, compared to the rest of the health service] is that it’s real life and death, and in many cases we can’t even see the patient, let alone get eyes on them. So that’s not to say there isn’t big enthusiasm [for AI] and huge possibility. But we have to get it right first time,” Norton says.

Elsewhere in the NHS, such as in diagnostic services, AI is used to read patient scans. But if concerns are flagged, those readings are usually then reviewed by a medical professional, creating a safety net.

However, Norton warned: “If you are using AI at the emergency care level, for example on 999 and 111 calls, then because of the nature of what you are trying to do, you don’t have the same level of safety net.”

Addressing health inequality

Yorkshire Ambulance Service is currently one of the few trusts testing AI within the service, with the main focus on trialling a safe AI transcription tool.

These are so-called “ambient AI” tools, which listen to, record and transcribe notes for ambulance crews at the scene or for call handlers. Norton said such devices could also be used to translate for non-English-speaking patients, using Google Translate-style tools.

“If AI can help with translation and transcription, it could address real health inequalities. There are genuine health inequalities for people who do not speak English as their first language,” he said.

Meanwhile, in Wales, Sammut said the service is already seeing “immediate time-saving benefits” from using AI to reduce the burden on staff.

Last month, the trust soft-launched a 111 online virtual agent, an AI chat-style feature that gives patients a way to have a conversation about their symptoms.

In a completely different application, Sammut said work is under way to link AI-enabled drones with Hazardous Area Response Teams, the teams that respond to complex, major emergencies.

“This provides situational awareness from the sky, especially in complex and dangerous scenes. AI is now built into the technology, and those drones have things like intelligent tracking. Together with thermal and non-thermal imaging, that will allow AI to survey and track specific areas of a scene.”

The service also hopes to develop AI that can help forecast ambulance demand, and to support paramedics in the field by interpreting abnormalities in a patient’s skin.

“The risk of not doing this [using AI] is much bigger [than doing it]. When we think about the NHS we are in today, the level of burden on our staff and the funding position… not chasing AI is, frankly, dangerous.”

However, in these high-risk, fast-paced settings, the ambulance executives pointed to several risks.

“Another thing that comes to mind at that point is what downstream risks you create with AI. I’m thinking from a cybersecurity perspective. So one of the very realistic concerns you have with AI is how to avoid, track and mitigate AI poisoning.

“AI poisoning is when someone feeds one of your AI models a whole pile of fake information and fake data… You know, the price of getting this wrong isn’t just money. It’s lives.”

News stories from the past two years, including major cyber attacks on the NHS and on individual hospitals, show just how vulnerable these systems can be.

From a risk-management perspective, Norton also points to the need for a way to evaluate AI providers.

The possibilities are “amazing,” he said, but the service needs to “go a little slower.” “We have to avoid the Wild West here,” he adds.
