Canadian flag. Image credit: Hermes Rivera/Unsplash

“Today, anyone can own a corner of the internet. More people than ever can benefit from enhanced access to knowledge, community and collective action. But there is also the darker side. Along with a more open and accessible public square has come a less trustworthy and safe one. This represents one of the central paradoxes and challenges of our times.”

These words are taken from the final report of the Canadian Commission on Democratic Expression (CCDE), a three-year initiative designed to offer insights and policy options in support of Canada’s democracy and social cohesion.

After nine months of study and deliberations, the commission has identified a series of practical steps to enable citizens, governments, and platforms to deal with harmful speech in a free, democratic, rights-based society like Canada. The commissioners recognize “the complexity of the issues at play and offer these as a path forward and with the knowledge they will be subject to further debate and moulding.”

The commission based its work on the generally accepted principle that free speech is fundamental to a democratic society and that the internet is a means of enabling more people to participate in public discussions and debates. At the same time, it sees the rise of hatred, disinformation, conspiracies, bullying, and other harmful communications online as undermining these gains and having a corrosive impact on democratic expression in Canada.

“There is no doubt Canadians are troubled by the situation and are looking for action,” said the commission, citing a Ryerson University survey in which nearly half of 3,000 respondents reported seeing “both deliberately false information and divisive content at least once a week” on the internet. “Six out of 10 also believe that the government should require social media companies to fix the problems that they have created.”

In its own work with communities in the global south whose communication rights have been impinged upon, attacked, or denied, the World Association for Christian Communication (WACC) has noted similar concerns. In today’s digital world, it is relatively easy for governments and corporate interests to suppress social dissent and peaceful activism simply by controlling the internet and censoring its social media platforms.

Civil society is rightly concerned that policy responses often fail to put citizens first, or to find adequate and balanced means of reducing online harms and guarding against the potential for over-censorship of content.

The CCDE is calling for a statutory “duty to act responsibly” that imposes, through legislation and regulation, an affirmative requirement on platforms, including social media companies, large messaging groups, search engines, and other internet services involved in the dissemination of user-generated and third-party content. In addressing harms, the details of this duty must take account of principles such as the fundamental nature of free speech.

It wants a new regulatory body, operating within legislated guidelines, that represents the public interest and moves content moderation and platform governance beyond the exclusive preserve of private sector companies.

It calls for a social media council to serve as an accessible forum for reducing harms and improving democratic expression on the internet, along with a world-leading transparency regime to provide the necessary flow of information to the regulator and the council. This will also help researchers, journalists, and members of the public access the information required for a publicly accountable system.

There will have to be a functioning mechanism — an e-tribunal — to facilitate and expedite dispute resolution and a process for addressing complaints swiftly and lightly before they become disputes. Crucially, there must also be a mechanism to quickly remove content that presents an imminent threat to a person.

None of these recommendations is particularly contentious to observers of countries like China, Russia, Poland, the Philippines, and several others whose governments are adept at controlling the flow of information and knowledge and whose heavy-handed responses frequently infringe human rights.

Of course, understanding the “behind-the-scenes” of how the internet and digital technologies function, including data harvesting and issues of privacy, security, and surveillance, has to be a part of public education at all levels. However, the CCDE warns that such education does not provide full immunity against sophisticated systems of disinformation and hatred:

“Public trust, social cohesion and an educated and informed public represent the ultimate defence against the spread of falsehoods, conspiracies and their offspring: division, disorientation and polarization. The health of our democracy ultimately depends on citizens having the capacity, willingness and opportunity to participate in our public life.”

These recommendations are far-sighted and, if implemented, could go a long way towards reinforcing and protecting the communication rights of Canadians and providing a beneficial model for societies elsewhere in the world.

Philip Lee is WACC general secretary and editor of its international journal Media Development. His edited publications include The Democratization of Communication (1995); Many Voices, One Vision: The Right to Communicate in Practice (2004); Communicating Peace: Entertaining Angels Unawares (2008); and Public Memory, Public Media, and the Politics of Justice (ed. with Pradip N. Thomas) (2012). WACC Global is an international NGO that promotes communication as a basic human right, essential to people’s dignity and community.
