From Clearview to Bodycams: The Normalization of Mass Surveillance in Canadian Law Enforcement

February 20, 2026

Shaquille Morgan

In a haste to leverage recent advancements in AI, law enforcement institutions across Canada are now integrating facial recognition technology. The latest police services to make this move include the Halton Regional Police Service, the Peel and York Regional Police, and the Edmonton Police Service (EPS), which will use facial-recognition-enabled bodycams for a pilot period.

While leveraging emerging technologies is important, facial recognition can easily lead to mass surveillance and has proven to be inaccurate when used on Black and racialized people. This is why federal and provincial governments must urgently create legislation to govern facial recognition technology.

Past Use of Facial Recognition in Canada Through Clearview AI

This may come as a surprise to some, but the use of facial recognition by Canadian law enforcement agencies is not new. Just under five years ago, reports revealed that software from facial recognition company Clearview AI was being used by multiple Canadian law enforcement agencies and organizations. The software allowed police to upload photos of suspects and victims, which were then compared against billions of images scraped from the internet and social media accounts without consent. What emerged were widespread privacy concerns, prompting Canadian privacy commissioners to call for Clearview AI to halt all operations in Canada.

At the time, despite being deemed useful by law enforcement, the use of facial recognition seemed like a gross misuse of technology. The angst surrounding its use was tied to its privacy violations, which subjected Canadian residents to unreasonable search or seizure. Put differently, all Canadians in the database were unreasonably treated as suspects (or victims) and virtually searched without cause or their knowledge.

This angst was also owed to the potential for a shift to mass surveillance. Often, tools, techniques, and practices are implemented based on their impact on outcomes. If they contribute to a desired goal, they become permanent fixtures in systems, where the scope of their use broadens and muddies over time. What this means is that facial recognition can easily be compromised by shifting government objectives, and positive intentions can lead to negative long-term outcomes and unintended consequences that the public is oblivious to.

Since the tech began scaling, the most troubling point about facial recognition has been its demonstrated inaccuracies. A 2018 study by Joy Buolamwini, a Black M.I.T. researcher, found that facial recognition technology was accurate 99 per cent of the time with white men, but wrong about 35 per cent of the time for darker-skinned women. This happened because the datasets the models were trained on consisted predominantly of lighter-skinned people. Subsequently, not only was there space for coded bias, but there was a clear indication that the technology was not made with Black and racialized people in mind. More troubling is the fact that facial recognition developers and law enforcement agencies did not find it necessary to test its limitations on diverse groups. This leads me to wonder: where would we be if Buolamwini hadn't questioned the system?

Problems with facial recognition identifying Black women in a 2018 study, and in a modern U.K. police institution.

Likely, the answer is that concerns would have centered on the violations of constitutional privacy and security rights. In Canada, because Section 8 of the Canadian Charter of Rights and Freedoms guarantees all Canadians "the right to be secure against unreasonable search or seizure", there is a broader relevance to Canadian society. This broader relevance translates to a more imminent threat warranting greater attention and urgency, because it violates the Charter.

In addition, concerns would have likely centered on how police integrated the technology. With Clearview AI, consider how there were no internal policies to regulate its use, and transparency was non-existent. So, as evidenced by today's police use of facial recognition technology, the main problem would have been framed as how police were using it, not necessarily that police were using it at all. This suggests that understanding the limitations of its use and its application to Black and racialized people would not have been a priority. And that is problematic.

Present Use of Facial Recognition in Canadian Law Enforcement

So, what’s changed?

Yes, AI has improved since the Clearview AI fiasco, and from a policy perspective precincts have implemented regulations that only permit the scanning of legally obtained images from crime scenes.

From a procedural standpoint, images from crime scenes can be compared to existing databases of mugshots taken when criminal charges are laid. The results are used to develop a list of investigative leads that trained facial recognition analysts review. This system is consistent across all precincts.
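The lead-generation workflow described above can be sketched in simplified form. This is a hypothetical illustration, not any police service's actual system: it assumes faces have already been converted to numeric embeddings (a standard technique in face recognition), ranks mugshot entries by cosine similarity to a probe image, and returns only a candidate list for a human analyst to review.

```python
import numpy as np

def generate_leads(probe_embedding, mugshot_embeddings, top_k=5):
    """Rank mugshot-database entries by cosine similarity to a probe
    (crime-scene) image embedding. The top-k results are investigative
    leads for a trained analyst to review -- not identifications."""
    # Normalize so the dot product equals cosine similarity
    probe = probe_embedding / np.linalg.norm(probe_embedding)
    db = mugshot_embeddings / np.linalg.norm(
        mugshot_embeddings, axis=1, keepdims=True
    )
    scores = db @ probe
    # Highest-scoring entries first, truncated to top_k
    ranked = np.argsort(scores)[::-1][:top_k]
    return [(int(i), float(scores[i])) for i in ranked]
```

The key design point is the human in the loop: the system narrows the search space, but the analyst, not the algorithm, decides whether a lead is worth pursuing.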

However, there's a distinction with EPS's system, whose facial-recognition-enabled bodycams will collect new footage and images during investigations or enforcement. EPS is the first police force in the world to use this type of technology — a perfect example, and foreshadowing, of how scope of use expands over time. Their bodycams will record and detect the faces of people within four metres, after which the data is sent to a cloud to be compared against a database of people of interest. If no match is found, the facial data is immediately deleted.
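The detect-compare-delete flow EPS describes can be sketched as follows. This is a minimal illustration under stated assumptions, not EPS's actual implementation: the threshold value, function names, and watchlist structure are all hypothetical, and the real system's matching logic is not public.

```python
MATCH_THRESHOLD = 0.9  # hypothetical score cut-off; real value unknown

def process_detection(face_embedding, watchlist, similarity):
    """Compare a newly detected face against a persons-of-interest
    watchlist. If no entry clears the threshold, the facial data is
    discarded immediately rather than retained."""
    best_id, best_score = None, 0.0
    for person_id, reference in watchlist.items():
        score = similarity(face_embedding, reference)
        if score > best_score:
            best_id, best_score = person_id, score
    if best_score >= MATCH_THRESHOLD:
        return {"match": best_id, "score": best_score}
    # No match: delete the captured facial data instead of storing it
    del face_embedding
    return None
```

The privacy guarantee here hinges entirely on the final branch: retention of non-matches is a policy choice encoded in software, and as the article argues, such choices can be changed later without any visible change to the hardware on an officer's chest.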

Past and present use of facial recognition technology in Canadian law enforcement institutions.

In comparison to Clearview AI, it sounds like these law enforcement agencies are taking the necessary precautions to mitigate risk. Still, we do not have a full understanding of whether they can achieve their intended result or what the unintended consequences are. So, mistakes, limitations, and misuse will be realized in real time, and some unsuspecting Canadian citizens will be on the losing side of these outcomes.

Funds are also being used for, in most cases, full integration of facial recognition technology, which represents a massive financial commitment. This comes despite numerous law enforcement agencies experiencing funding gaps. Some will argue this is why the technology is essential — it will solve cases faster and free up personnel. My rebuttal is that it is too early a commitment, as we need time to understand whether facial recognition's inaccuracies are worse than anticipated, and whether it causes societal harm. This is particularly relevant for Black and racialized people, given that there are no recent studies testing facial recognition's inaccuracies with these groups. Without independent studies, we are pushed to accept the accounts of facial recognition developers and police precincts — pushed, despite the software being created without Black and racialized people in mind, and despite the lack of wherewithal to remedy this.

On a secondary level, what is concerning is that the implementation of facial recognition technology increases the potential for a shift to mass surveillance. As its integration becomes more standardized, an inevitable process of normalization will take place. With this come subtle shifts, like what we see with EPS: a gradual transition from the use of facial recognition only on suspect photos to its integration in bodycams that record new footage.

EPS’s facial recognition bodycams are only active during investigations and eliminate irrelevant footage. As time goes on, there may be a desire to have facial recognition bodycams recording at all times and saving all footage. Certainly, we may want to believe this will never happen; but once the infrastructure is fully set up, it merely takes a shift in government personnel and beliefs for mass surveillance to become a reality. We could also see a shift where facial recognition is used to scrape information from social media and live streams. The ends will be used to justify the means and suddenly, what we escaped with Clearview AI once again becomes our reality.

The potential for mass surveillance without federal legislation in Canada.

Comprehensive Canadian Legislation as a Solution

So, what do we do? As it stands, Canada does not have a single, comprehensive law specifically dedicated to governing the use of facial recognition technology by police. Instead, its use is managed through a "patchwork" of existing statutes and guidelines. This means a collection of general laws and common law principles that were never intended specifically for facial recognition technology (but are broad enough in scope) currently regulates it by default.

Canada's current legislative patchwork for facial recognition technology.

At the federal level, this patchwork consists of Section 8 of the Charter, the Privacy Act, and the Criminal Code (specifically section 487.01). The only province with a law explicitly naming and regulating biometric data is Quebec, through the Act to Establish a Legal Framework for Information Technology. The Act requires organizations to obtain express consent before verifying a person's identity, and mandates that any biometric database be disclosed to the province's privacy regulator, the Commission d'accès à l'information (CAI), at least 60 days before deployment. The Canadian government can use this as a model for federal legislation by creating a centralized notification system for police databases and legally codifying "no-go zones" for mass surveillance. Experts suggest the federal government could adapt Quebec's "broad and liberal" interpretation of privacy to establish a national standard that requires police to prove a specific, evidence-based need for facial recognition, rather than general "usefulness", and mandates independent audits for accuracy and racial bias before any system is brought into service.

Facial recognition is here to stay. And with adoption scaling, if the Canadian government truly supports the privacy and security rights of Canadians, the time to act is now.
