When instances of inappropriate use by corrections staff were identified, a privacy risk assessment was conducted.
Photo: RNZ / Bless Tom
Corrections staff have been warned about the use of artificial intelligence tools after some staff were found to be using them to generate formal reports.
Corrections said it takes misuse of its technology “extremely seriously” and has made it clear to staff that the use of AI tools outside of their approved uses is “unacceptable.”
RNZ understands there have been instances where staff have used AI to produce formal reports such as Enhanced Supervision Order Reports.
In response to questions from RNZ, Chief Probation Officer Toni Stewart said Corrections’ use of AI is currently limited to Microsoft Copilot. Other publicly available AI applications are blocked on the Corrections network.
“This ensures AI is used within an environment where privacy and security controls can be managed.”
Do you know more? Email sam.sherwood@rnz.co.nz
Staff use of Copilot was governed by an AI policy in line with guidance from the government’s Chief Digital Officer.
“This policy makes clear that no personal information, including identifying details, health or medical information, or details about people in the custody of Corrections, may be entered into Copilot Chat.”
Stewart said Copilot’s adoption rate has remained “relatively low” since its introduction at Corrections in November 2025, with about 30% of Corrections staff using the tool.
“Copilot is intended to be used only as an assistance tool to create and moderate content that does not contain confidential information. Corrections staff only have access to the free Microsoft Copilot chat feature, which is part of our existing Microsoft 365 licence and is a standalone chat feature that is not integrated with our system data.”
Stewart said the policy is clear that under no circumstances should Copilot Chat be used to draft, compose, analyze or generate report or assessment content that contains personal information. Staff may be subject to audits and all prompts are searchable and exportable.
“We have recently become aware of a small number of incidents where staff have used Copilot to support their work in a manner that does not comply with our AI policies and guidance.
“As soon as we became aware of these cases, we took action to make it clear that any use of Copilot outside of its approved uses is unacceptable.”
Where inappropriate use was identified, a privacy risk assessment was conducted.
“Our leaders, especially those in community corrections where staff produce a large number of reports, are actively engaged in ongoing dialogue with staff on the appropriate use of AI.
“Staff are regularly reminded of the AI policy and other relevant guidance.”
Stewart said Corrections is “actively working” to ensure the continued use of AI is “safe, secure and appropriate.”
“Corrections has an AI Assurance Officer, a function held by the Head of Cybersecurity who is responsible for guiding the safe and secure deployment of AI, including external reporting to the Government’s Chief Digital Officer.”
“Corrections is part of the Government-wide Community of Practice on AI, managed by the Government Chief Digital Officer. We have also established an AI Working Group to provide formal governance of AI, including embedding safe and ethical AI practices across the Department and providing consistent advice on safe use.”
Stewart said misuse of the technology was taken “extremely seriously”.
“We are committed to protecting the privacy of those we work with and maintaining the professional integrity of our assessments, reports, and court documents.”
As of Friday, the privacy commissioner had not been notified, a Corrections spokesperson confirmed.
“In addition to our existing guidance, our privacy team is working with relevant working groups to provide further guidance on the use of Copilot in the community corrections space. All information entered into Copilot by Corrections staff remains within the Corrections domain.”
A spokesperson for the Office of the Privacy Commissioner (OPC) said in a statement that the Privacy Act applies to the use of personal information, including by AI tools.
The spokesperson said it is the responsibility of government agencies to understand the technology they are using and to ensure that its use meets privacy requirements.
“We understand that Corrections’ policy prohibits staff from entering personal information into Copilot Chat or using Copilot to create reports or assessments that include personal information.
“If this is correct, privacy concerns would be limited to cases where staff use Copilot in breach of Corrections policy. If Copilot is used in a manner that breaches that policy, the OPC would expect the Department of Corrections to take appropriate steps to correct this.”