Addressing the expansion of police surveillance technology is crucial to fighting structural racism


Image: Lianhao Qu/Unsplash

The killing of George Floyd in Minneapolis has led to calls to de-fund, reform or completely dismantle the police in the United States. Such discussions have also been amplified here in Canada.

With the public gaze now fixed on law enforcement agencies and their role in systemic racism, mainstream assumptions about the police are being challenged, including the view constructed by popular culture that portrays police as crime-fighting heroes who represent the thin blue line between order and chaos.

An important aspect of re-imagining the police -- though one receiving less attention in Canada -- is addressing the expansion of high-tech surveillance technologies within policing.

We live in a data-driven society. As we go about our daily lives, we leave behind digital footprints. Many are becoming aware that our searches on Google, our interactions through social media, our visits to friends and family, and our credit card purchases are collected and analyzed by corporations to provide more information about us, including where we've been and what we like.

This awareness is partly driven by people's increased knowledge of social media companies' business models that rely on algorithms to direct users to content and promote targeted advertisements. 

What may have started with corporations profiling and making decisions about "consumers" to maximize profits has evolved into governments around the world increasingly collecting and analyzing digital trails as well.

Although such practices may be used for the benefit of individuals or wider society in some applications, in others, including criminal justice and law enforcement, they have proven controversial in recent years.

The proliferation of data-driven technologies has led to what American law professor Andrew Ferguson calls "big data policing." Research reveals how such practices increase digital surveillance, pose a growing threat to civil liberties, and exacerbate bias, overreach and abuse in policing.

Although discussions about the expansion of digital surveillance technologies within law enforcement have primarily arisen in the United States and Europe, the extent of these practices in Canada remains largely unexplored. Vancouver, British Columbia, and London, Ontario, have reportedly adopted software that uses data to try to predict where and when property crime is likely to take place.

Police agencies in the U.S., including Chicago and Los Angeles, have shut down their predictive policing programs after audits revealed their discriminatory impact and practical failure. 

Police agencies remain secretive over their use of surveillance technologies that invisibly pierce through boundaries designed to protect liberty.

For instance, only after documents were leaked did police agencies in Canada admit to using Clearview AI's facial recognition software; the RCMP initially denied using it outright before conceding that it had. Lying tarnishes the public image of police and erodes trust, which becomes very difficult and time-consuming to rebuild.

Amid a privacy probe, Clearview AI indefinitely suspended its dealings with Canadian police agencies. However, some police agencies, including Peel and York, are likely still procuring facial-recognition technology. 

Similarly, when news broke about American police using the data-mining software MediaSonar to track keywords on social media, including "Black Lives Matter," documents showed Canadian police agencies were also using the software, though many chose not to publicly confirm its use.

The RCMP, too, has been involved in wide-scale monitoring of social media activity, known as Project Wide Awake, which expanded to proactively identify potential crimes online before they occur.

Police in Ontario and Saskatchewan have been using a "risk-driven tracking database," which involves sharing information on vulnerable groups of people between police, schools, health care workers and social workers to track "negative behaviour" and identify those potentially "at-risk."

The increasing interconnectedness of technologies is expected to grow through the internet of things, expanding the potential of surveillance. If left unchecked, this flow of information between devices risks finding its way to law enforcement. Smart watches, smart cars, smart appliances and devices like Google Home or Amazon Alexa provide further ways to collect and analyze information on people.

Countless studies, inquiries and commissions on Canadian policing have identified Black and Indigenous peoples as disproportionately vulnerable to police surveillance and violence. Whether it be facial recognition, social media analysis, predictive policing software or analyzing the risk of "negative behaviours," these technologies rely on algorithms.

Research has shown that algorithms harbour biases against disadvantaged groups, reinforcing structural discrimination and deepening social inequality. Recognizing their potential consequences, New Zealand recently announced it will be setting standards for how public agencies use algorithms, including requiring them to identify any biases within them.

The adoption of data-driven technologies that expand the scope and depth of surveillance by police therefore moves beyond privacy and involves questions related to our democracy, including human dignity and the right to be free from discrimination. This makes the issue a concern for all Canadians.    

The killing of George Floyd sparked renewed outrage and activism against institutional racism and police brutality. Though these technologies are often shrouded in notions of tech-neutrality, objectivity, efficiency and progress, uncovering and restructuring the silent and invisible role that surveillance technologies play within these institutions must be an important part of the path forward.

Joe Masoodi is a policy analyst on technology, cybersecurity and democracy at the Ryerson Leadership Lab at Ryerson University. He has previously conducted research at the Surveillance Studies Centre at Queen's University on surveillance, technology and policing, and has completed degrees from the Royal Military College of Canada and Queen's University. 

