On a warm April afternoon this year, a yellow school bus rolled toward the Lowertown Community House to drop off kids at the after-school program. Shortly after, the calm was shattered by reports of an active shooter in the neighborhood.
The community center, where children argued over markers as exhausted parents signed them out, quickly became a makeshift emergency shelter. Staff rushed dozens of confused children inside and locked the doors as police scrambled to find the shooter.
Minutes later, officers requested access to the center’s surveillance footage, and residents responded. Such footage now feeds into the Ottawa Police Service’s expanding suite of technology, which will soon include body cameras rolled out across the city starting in late 2025.
For Melissa Thibault-Canas, the social worker on duty, the incident crystallized a growing anxiety. She was shaken by the shooting and deeply worried for the children, but the introduction of body cameras and other AI-assisted tools added to her unease as she felt the area was becoming a closely monitored space.
“While I understand the need for safety, I am concerned that police AI will make Lowertown feel even more surveilled,” Thibault-Canas said. She worries that imperfect algorithms will misidentify residents and deepen the mistrust that already exists in Lowertown, ByWard Market, Sandy Hill and Vanier.
I understand the need for safety, but I worry that the introduction of AI in law enforcement will make Lowertown feel even more surveilled.
Melissa Thibault-Canas, Social Worker
Ottawa police are increasing their focus on tools such as facial recognition, automated video analysis and body cameras, which promise to speed up investigations and help identify suspects more quickly. Tense police relations and historic overpolicing already shape daily life in Lowertown, where about 40 per cent of residents live below the poverty line.
The deployment of these technologies raises urgent concerns about privacy, equity, and civil liberties. Even small errors in AI can disproportionately undermine the trust of marginalized communities.
Experts warn that without proper regulation, AI tools could do more harm than good.
Sharon Polsky, president of the Privacy and Access Council of Canada, said Canada is not ready for the use of AI by police.
“Artificial intelligence is not artificial or intelligent. It doesn’t think,” Polsky said. “It repeats the patterns in the data it was trained on, and if that data reflects systemic biases, the output will reflect that as well.”
Polsky points to the failure of Bill C-27, the federal government’s attempt to modernize privacy law by creating rules for AI systems, including technology used in policing. The bill died when Parliament was prorogued in January.
As a result, Canada remains without a regulatory framework for privacy protections or police use of AI. Experts say this widens the accountability vacuum.
“The bill failed because it had more gaps than safeguards,” Polsky said. “There were exemptions for government agencies, weak oversight, and no clear mechanism for people to challenge decisions made by flawed technology. That’s a dangerous combination when you’re talking about policing,” she added.
She warns that without clear laws and oversight, Canadians risk being exposed to unchecked AI surveillance, faulty identity checks and opaque handling of personal data.
“We rely on private companies for systems that touch every part of our lives, but there is little accountability, no disclosure, and no way for Canadians to challenge the decisions these tools make,” she said.
Her warning comes as the City of Ottawa increases investment in police technology.
Polsky’s concerns are echoed by other civil liberties advocates, who argue that AI could increase the risk of over-policing marginalized communities, shape how they are monitored and treated, and deepen discriminatory enforcement.
Algorithms trained on biased historical data put people from marginalized communities at greater risk.
Tamir Israel, Canadian Civil Liberties Association
The Canadian Civil Liberties Association (CCLA) has repeatedly raised concerns about the use of facial recognition and predictive policing tools across Canada.
Tamir Israel, director of the Privacy, Surveillance and Technology Program at the Canadian Civil Liberties Association, said these tools can encode and reproduce decades of discriminatory policing practices.
“Algorithms trained on biased historical data are putting people from marginalized communities at greater risk,” Israel said. “All this information that reflects historical discrimination is packaged into so-called neutral algorithms.”
With police forces across Canada currently experimenting with these tools, a biased system could affect communities nationwide.
The CCLA calls facial recognition an “indiscriminate, unreasonable and warrantless search” of citizens. The organization recommends a moratorium on the technology until Canada enacts independent oversight and transparency requirements, as well as avenues for citizens to challenge algorithmic decisions.
A 2022 report from the National Institute of Standards and Technology (NIST) found that some facial recognition systems misidentify Black and Indigenous people 10 to 100 times more often than white people. A 2025 study showed that blurry, low-resolution or poorly lit images caused a spike in false-positive rates, disproportionately affecting Black people, especially Black women.
These findings highlight the risk that racialized residents could be falsely flagged or wrongly drawn into police investigations.
“These systems don’t fail in a vacuum,” Israel said. “They are failing in ways that reflect who has historically been over-policed.”
The 30 Axon body cameras deployed by the Ottawa Police Service record interactions with the public and create a permanent record for investigation and potential accountability.
The pilot will run until early 2026, with full-scale rollout planned for 2026-2027, pending budget approval.
Israel warns that cameras do not automatically guarantee accountability.
“Body-worn cameras can promote transparency,” he said. “But when combined with AI to analyze behavior and identify individuals in real time, they create exponential privacy risks, especially for marginalized communities.”
Law enforcement officials, by contrast, say these tools could be valuable in criminal investigations. Supporters say they will speed up responses and help police interpret large amounts of data.
These technologies allow us to identify suspects faster, locate missing persons, and respond more effectively to emergencies. They are not a substitute for human judgment; they support it.
Jean-Claude Lemonde, Ottawa Police Chief Information Officer
Jean-Claude Lemonde, chief information officer for the Ottawa Police Service, said AI-assisted tools can help investigators process evidence more efficiently, especially in critical cases involving hours of video footage.
“These technologies will allow us to identify suspects faster, locate missing persons, and respond more effectively to emergencies,” Lemonde said. “They do not replace human judgment; they support it.”
Lemonde said all AI-related tools used by Ottawa police follow provincial and federal guidelines.
“We take privacy seriously and ensure that all of our tools comply with legal requirements,” he said.
Ottawa’s proposed 2026 budget sets the OPS gross budget at $484 million, with a net operating total of $414.9 million, an increase of $26.1 million, or 5.0 per cent, over 2025, the force said. The funding will be used to hire 25 new full-time positions in 2026 and fund the full-scale rollout of body cameras.
Police say the increase is part of an effort to modernize policing in Ottawa “to improve community relationships, visibility and trust.”
“None of these reforms, including AI, are aimed at reducing police violence,” said Robin Brown, a community advocate at 613-819 Black Hub. “They are intended to legitimize policing, increase funding, and enable surveillance of vulnerable communities.”
Brown said body cameras and facial recognition software are a new iteration of this surveillance. He says any reform, including AI, comes at a cost and is often used to justify further budget expansion.
Whether it’s hiring more Black officers, deploying neighborhood models, or introducing AI, police departments always need more funding. … All reforms do the same thing: justify budgets and expand surveillance capabilities.
Robin Brown, 613-819 Black Hub
“Police departments always need more funding, whether it’s adding more Black officers, deploying neighborhood models, introducing AI, whatever the reforms are,” he said. “AI costs a lot of money, and all reforms do the same thing: justify budgets and expand surveillance capabilities.”
“If they really want to build trust, they need to move a lot of money away from the police and into things that actually keep us safe: housing, employment programs, mental health services. That’s what really makes us safe,” he added.
This shift in spending is fuelling a broader debate around transparency and accountability.
For community members like Melissa Thibault-Canas, this discussion isn’t just abstract.
By allocating more funding to policing rather than critical community support, Thibault-Canas said, the city has shown that its priorities are out of sync with the needs of the community.
“We need to invest in people, not just cameras and algorithms,” Thibault-Canas said.
