February 10, 2021
In the next few weeks, the government of Canada plans to introduce legislation designed to combat “online harms”. What exactly this will mean is still unclear, but at a minimum we can expect that our online communications will be subject to new rules as the government attempts to impose order in a sphere that has been, until now, largely free of regulation. In principle, many Canadians are likely to welcome this kind of change. CCLA does not oppose regulation per se, but as usual the devil is in the details – all of which are currently unknown. In any event, it is important to be realistic about what regulation can achieve and aware of the very real pitfalls that we need to avoid as Canada embarks on this experiment.
We all recognize that communication in the online world, particularly via social media, can have significant impacts “in real life”. Examples abound. The pandemic and U.S. election have highlighted the danger posed by misinformation – how challenging it can be to counter and how quickly it can spread. While voices for racial justice have been raised loudly in the public sphere – and significantly amplified by social media platforms in some cases – we also know that online spaces can be dangerous ones for racialized people, and that threats and hatred in all its forms (racism, misogyny, Islamophobia, antisemitism, homophobia, transphobia, etc.) can fester and flourish online.
Recognizing the problem is easy. Solving it? Not so simple.
The truth is that we already have laws that limit how we communicate and what we can communicate. It is a criminal offence to utter threats, to harass, and to promote hatred against identifiable groups. Some provincial human rights codes also restrict hate speech. Defamation laws allow people to sue those who tell harmful lies about them, damaging their reputation. But our supreme law, the Charter of Rights and Freedoms, guarantees freedom of expression and says that limits on that freedom must be reasonable and justified. So Criminal Code offences, human rights provisions and the law of defamation must all strike the right balance. Canadian courts have weighed in many times to affirm this principle and tweak rules when necessary. The balance is a delicate one.
Both our Constitution and our courts explicitly and clearly acknowledge the importance of free expression – how vital it is to democracy and to the promotion and defence of other rights. Our courts have affirmed again and again that if we value free expression, we must tolerate expression we find offensive and even harmful. What you or I might categorize as hate speech is not the same as what the law says can be restricted under hate speech laws. Under those laws, generally, only the most extreme and vile attacks on identifiable groups are subject to legal punishment. As the Supreme Court said in its most recent decision on the issue:
Hate speech legislation is not aimed at discouraging repugnant or offensive ideas. It does not, for example, prohibit expression which debates the merits of reducing the rights of vulnerable groups in society. It only restricts the use of expression exposing them to hatred as a part of that debate. It does not target the ideas, but their mode of expression in public and the effect that this mode of expression may have. (Saskatchewan (Human Rights Commission) v. Whatcott, 2013 SCC 11, para 51, emphasis added)
Our courts have also recognized the danger that might come from attempts to prohibit the spreading of “false news”. In the case that struck down this Criminal Code offence – which had been used to prosecute infamous Holocaust denier Ernst Zundel – the Supreme Court majority recognized that distinguishing truth from falsity may be obvious and easy in some cases, but much less so in others, particularly where complex social and historical facts are at issue.
All of this points to a somewhat restrained role for the law in addressing many of the harms that flow from communication, regardless of where or how that communication takes place. It suggests that a significant portion of the responsibility for addressing these harms will necessarily rest on those who consume information online – we must learn the critical thinking skills necessary to assess what is fact and what is fiction. We must respond to and counter harmful speech that may lie outside the law’s reach.
We will have to see if the federal government sees things this way. It is likely that they will create a new regulator, like the CRTC, to try to address some of the harms that occur on social media platforms. There may be obligations imposed on platforms to remove certain categories of content quickly. This is the model that has been followed in some European countries, and while there is little good evidence about how it is working, platforms that risk huge fines for failing to remove “obviously illegal” content are likely to err on the side of more censorship, not less. A new regulatory regime for social media should not attempt to reinvent the wheel and redefine what is lawful expression in Canada. It should fill gaps, and any powers conferred on a new body should be carefully circumscribed so that it cannot stifle, silence or censor lawful speech. And it must recognize and adhere to the lines in the sand that the law has already drawn.
CCLA does not oppose experimenting with narrow and clear rules that could effectively address some of the damage that social media can do and has done, but we need to recognize that we are trying to tinker with one of the most fundamental freedoms we have. The government should tread carefully, and CCLA will be watching and ready to challenge regulations that would undermine or damage this freedom.
Cara Zwibel is the Director of Fundamental Freedoms at Canadian Civil Liberties Association
For further comments, please contact us at firstname.lastname@example.org
Please keep referring to this page and to our social media platforms. We are on Instagram, Facebook, and Twitter