
Facial recognition technology requires better oversight


People on the streets of Tokyo. Image: Jason Ortego/Unsplash

Walk around any city and your face will be caught on camera and may even be added to a facial-recognition database. That data can now be processed in real time. Regulations governing how it can be used are minimal and generally weak.

The military, law-enforcement agencies, and commercial corporations are exploiting facial recognition and artificial intelligence (AI) to collect personal data. Yet, the legal frameworks controlling how that data can be used have not kept pace with the technology.

In May 2019, San Francisco became the first U.S. city to ban the use of facial recognition by its authorities. However, the city ordinance did not prevent private companies from using facial ID in ways that people find objectionable.

In July 2019, the first independent evaluation of the use of facial recognition by London's Metropolitan Police warned that it was "highly possible" the system would be ruled unlawful if challenged in court.

The use of facial recognition technology has also faced a backlash in Canada. In May 2019, privacy and civil liberties advocates called for an immediate moratorium on the Toronto police's use of the technology, saying it had been deployed without public knowledge and without proper checks and balances.

"There is no transparency associated with this," former information and privacy commissioner of Ontario Ann Cavoukian told the CBC. "And you can't hold people accountable if you don't know what's going on…And while people may not be aware of it, there's a very high false-positive rate for facial recognition." Toronto police ran the pilot project from March 2018 to December 2018 and reported that it had been an "immediate success" in terms of identifying criminal offenders and previously unknown suspects.

As facial recognition becomes more common, there are also growing concerns about the gender and racial bias embedded in many systems. Writing in The Atlantic, Tiffany C. Li, a fellow at Yale Law School's Information Society Project, puts the onus on tech companies themselves:

"Developers need to go further and build actual privacy protections into their apps. These can include notifications on how data (or photos) are being used, clear internal policies on data retention and deletion, and easy workflows for users to request data correction and deletion. Additionally, app providers and platforms such as Apple, Microsoft, and Facebook should build in more safeguards for third-party apps."

All well and good, but preventing misuse, misappropriation, and mistaken identity requires legislation and regulation, including stronger privacy laws that address the harms inherent in these technologies. In Li's opinion:

"To deal with privacy risks in the larger data ecosystem, we need to regulate how data brokers can use the personal information they obtain. We need safeguards against the practical harms that invasions of privacy can cause; that could mean, for example, limiting the use of facial-recognition algorithms for predictive policing. We also need laws that give individuals power over data they have not voluntarily submitted."

In short, global corporations play by their own rules and require oversight. The problem is how to guarantee compliance.

Philip Lee is WACC general secretary and editor of its international journal Media Development. His publications include The Democratization of Communication (ed.) (1995); Many Voices, One Vision: The Right to Communicate in Practice (ed.) (2004); Communicating Peace: Entertaining Angels Unawares (ed.) (2008); and Public Memory, Public Media, and the Politics of Justice (ed. with Pradip N. Thomas) (2012).

WACC Global is an international NGO that promotes communication as a basic human right, essential to people's dignity and community.
