Canada’s provincial DPAs discuss approaches to guidance on medical AI scribes

Artificial intelligence tools for note-taking and transcription are becoming commonplace in health care settings. Data protection authorities across Canada’s provinces are monitoring this proliferation and taking proactive steps, with several issuing guidance for their jurisdictions on the use of these transcription tools.

The potential benefits of AI transcription in health care are significant and far-reaching, with a 150-physician pilot program led by the Ontario Medical Association estimating that AI note-taking or scribing tools could save clinicians more than 10 hours per week. However, those efficiency gains for professionals can come at the expense of sensitive patient data if the tools are not properly implemented.

In a breakout session at the IAPP Canada Symposium 2026, DPA representatives from British Columbia, Ontario, and Newfoundland and Labrador walked through their respective guidance documents, highlighting how statutory variations in the protection of sensitive health data shape each province’s recommendations.

Where regulators landed on their guidance

The Information and Privacy Commissioner of Ontario was the first provincial regulator to issue guidance on AI scribes. The guidance applies both to AI developers and to organizations looking to source and deploy AI scribes. IPC senior health policy adviser Nicole Minutti said Ontario’s guidance “focuses on the core functionality” of AI scribes, whose primary use is transcribing clinicians’ patient notes.

However, Minutti added that the guide was also developed to reflect how advances in AI can be further integrated into other medical applications. The potential for AI in clinical practice could evolve to the point where tools perform tasks such as electronic patient referrals, laboratory test recommendations, and broader “clinical and decision support.”

“While we remain vigilant in this direction, we needed to focus our guidance on core capabilities,” Minutti said. “Wherever possible, we have integrated into our guidance and addressed some of the concerns that are emerging across these broader capabilities.”

In British Columbia, the Information and Privacy Commissioner issued its own AI scribe guidance in January. IPC policy analyst Sarah McIntosh said British Columbia’s guidance is aimed at organizations procuring AI scribes, to ensure the tools they adopt comply with the province’s privacy laws.

“There are many important considerations for these tools, including human rights and equity, but when we drafted these guidelines, we were determined to focus on privacy and access oversight,” McIntosh said. “Our focus was not on preventing the use of these tools, but rather on providing guidance on how to implement them in a legally compliant manner. This also promotes trust among the people whose personal information is collected by these tools, so we hope this guidance will be a practical roadmap and useful checklist for organizations.”

Newfoundland and Labrador’s Information and Privacy Commissioner followed Ontario and British Columbia in issuing AI scribe guidance, drawing on those earlier guides to develop its own recommendations. J. Ruth Marks, access and privacy analyst at the IPC NL, said the office does not have formal order-making authority but is responsible for overseeing the province’s health information privacy law, the Personal Health Information Act.

IPC NL’s guidance focuses specifically on the relationship between health information custodians and AI scribe vendors.

“(PHIA) requires certain types of contracts when using the services of information managers, and we also considered that custodians might have some other kind of relationship with an AI vendor that we hadn’t anticipated,” Marks said. “We wanted custodians to think about their relationships with vendors, because it’s tied to consent. If custodians consider an AI vendor to be an agent or an information custodian, they don’t need explicit consent to use that tool, but the consent, even if it’s implicit, must be knowledgeable.”

Legal boundaries

Some provinces, such as Newfoundland and Labrador and Ontario, have sector-specific laws governing how health information is used. This is not the case in British Columbia, where sensitive health data held by private entities is treated like any other personal data under the province’s Personal Information Protection Act.

Ontario’s Minutti said the enforcement powers of each province’s DPA range widely, from issuing recommendations to taking enforcement action and making remedial orders. Despite the disparate powers, she noted that a “fragmented network of instruments” is emerging to regulate how the medical sector and AI intersect. These instruments range from federal requirements under the Personal Information Protection and Electronic Documents Act to provincial statutes, court orders, and administrative tribunal decisions.

“While the use of artificial intelligence in the medical field is sometimes seen as lawless or unregulated, it is in fact highly regulated,” Minutti said. “I don’t think enough is said about how many gaps and overlaps there are in AI regulation, especially the overlaps that exist in this area.”

McIntosh said British Columbia’s lack of dedicated health information legislation has created a scenario where most primary care practices are subject to PIPA, while public entities such as hospitals are covered by a separate law, the Freedom of Information and Protection of Privacy Act. As a result, the two legal frameworks “don’t always work well together,” she said, further complicating how AI scribes can be integrated into medical settings.

“Having that fragmented system in our province, with different privacy and access rules applying depending on which clinical setting you’re in, is not really an ideal environment for the health sector,” McIntosh said.

She added that if the province’s legislators pass a health information bill bringing both hospitals and primary care clinics under one law, they would also have an opportunity to build a modern regulatory framework for AI integration in health care.

“For more than a decade, my office has been pushing for a standalone health information law, (because) consistent accountability, authorization, and interoperability of health information are critical,” McIntosh said. “Our provincial legislators have a real opportunity to take the lead by not only uniting all stakeholders under one law, but also by enacting a modern health information law built for modern health information needs, including AI.”
