Gender Identity and the Human Rights Act (former Bill C-16)

Because the Learn section of TalkRights features content produced by CCLA volunteers and interviews with experts in their own words, opinions expressed here do not necessarily represent the CCLA’s own policies or positions. 

Since its tabling in 2016, Bill C-16 (officially An Act to Amend the Canadian Human Rights Act and the Criminal Code, which received royal assent in June 2017) has been the subject of great controversy. While the stated intent of the law is to add “gender identity and gender expression to the list of prohibited grounds of discrimination” and to amend the Criminal Code to “extend the protection against hate propaganda set out in that Act to any section of the public that is distinguished by gender identity or expression”, some have argued that the law, under its commendable purpose, hides an agenda with the potential to curtail fundamental freedoms.

This piece gives a summary explanation of the frameworks through which the law will be interpreted and applied.

The Canadian Human Rights Commission and Tribunal

Canada is a federal system: the federal and provincial governments are each vested with respective areas of competency over which they can enact laws. Accordingly, the Canadian Human Rights Act (CHRA) outlaws discriminatory practices committed within fields of federal competency. The CHRA creates and regulates two organs, the Canadian Human Rights Commission and the Canadian Human Rights Tribunal.

The Commission has a wide range of duties and functions, the most ubiquitous of which is receiving human rights complaints. It may appoint an investigator to examine disputes, as well as a conciliator to attempt to resolve the complaint before it goes before the Tribunal. Its administrative functions also include the drafting of policy guidelines and proposals for the harmonization of federal human rights law with provincial human rights standards across Canada.

The Tribunal takes up complaints referred to it by the Commission and exercises powers comparable to those of a superior court: it can hear cases and render binding decisions, the difference being its specialisation in federal human rights law. It is composed of at most fifteen members, four of whom, including the Chairperson and Vice-chairperson, must have been members of a provincial bar for at least ten years. Members without such qualifications must have relevant experience, expertise and interest in human rights.

The Tribunal can impose various sanctions aimed at remedying discriminatory practices. These normally include an order to cease an ongoing discriminatory practice and to take measures to ensure it does not recur. The Tribunal can also order that the rights, privileges or opportunities denied as a result of the practice be made available on the first reasonable occasion, and award compensatory damages of various forms (the full list of remedies is enumerated at s. 53(2) of the Canadian Human Rights Act, R.S.C., 1985, c. H-6).

In order for a complaint to succeed, it must be based on one of the prohibited grounds of discrimination listed in s. 3(1) of the Act. Bill C-16 adds gender identity and expression to this list. This is largely symbolic, considering that discrimination against transgender individuals had already been outlawed jurisprudentially: Canadian human rights tribunals have interpreted such discrimination as falling under the statutorily proscribed ground of “sex”. We should not underestimate, nevertheless, the symbolic value of this enactment, which strengthens the case law and the protection extended to transgender people and clearly positions gender-based harassment as illegal in the federal sphere. In doing so, the federal government brought its human rights standards in line with those of the provinces, which have all enacted this form of amendment.

In short, this is a fairly innocuous addition to the law. As alluded to earlier, critics have nonetheless opposed the amendment, claiming it can lead to “compelled speech”, which would be unconstitutional. These claims will be addressed in the next piece published in this section. For the time being, it suffices to say that the changes to the law are not major by any means. They reflect already settled practice, and in any case the Human Rights Tribunal’s encroachment on liberty is minimal: it cannot sentence anyone to imprisonment, and its decisions are aimed at remedying discriminatory practices, not channelling societal opprobrium the way a criminal court does. With this in mind, let us turn to C-16’s Criminal Code amendments.

The Criminal Code of Canada

Unlike proceedings before human rights tribunals, which are instigated and led by a private complainant, criminal prosecutions are initiated and conducted by the state through Crown prosecutors.

Bill C-16 also adds “gender identity and expression” to the list of identifiable groups under subsection 318(4) of the Criminal Code. This list forms part of a subdivision of the Code called “Hate Propaganda”, which covers advocating genocide, public incitement of hatred, and wilful promotion of hatred. As will become evident, these are narrowly defined crimes that will rarely be prosecuted.

Advocating genocide means advocating or promoting the killing of members of an identifiable group, or the deliberate infliction on the group of conditions of life calculated to bring about its physical destruction, in either case with the intent to destroy the group in whole or in part. Public incitement of hatred means communicating statements in a public place that incite hatred against an identifiable group, where such incitement is likely to lead to a breach of the peace. Wilful promotion of hatred, a similar but distinct offence, means wilfully promoting hatred against an identifiable group by communicating statements other than in private conversation.

For public incitement of hatred to be made out, several elements must be proven: (1) the statements must be communicated in a public place; (2) they must incite hatred against an identifiable group; and (3) the incitement must be likely to lead to a breach of the peace, a consequence the speaker intended, foresaw as likely, or remained wilfully blind to. Several defences exist in the case of wilful promotion of hatred. Under subsection 319(3), no person shall be convicted (a) if he establishes that the statements communicated were true; (b) if, in good faith, the person expressed or attempted to establish by an argument an opinion on a religious subject or an opinion based on a belief in a religious text; (c) if the statements were relevant to any subject of public interest, the discussion of which was for the public benefit, and if on reasonable grounds he believed them to be true; or (d) if, in good faith, he intended to point out, for the purpose of removal, matters producing or tending to produce feelings of hatred toward an identifiable group in Canada.

For charges of advocating genocide or wilful promotion of hatred to be laid, the consent of the Attorney General of Canada is required.

Will Bill C-16 Bear Fruit?: A Trans Scholar and Activist’s Perspective

In a recent paper published in the University of Toronto Law Journal, LL.M. student and trans activist Florence Ashley argues that laws such as Bill C-16, while symbolically valuable and appreciated by the transgender community, will be largely inconsequential in improving transgender people’s well-being due to systemic obstacles and foundational flaws in the legislation’s premises. She posits, among other things, that human rights lawsuits are costly, and that the widespread poverty faced by many transgender individuals (55% of transgender people have an annual income of less than $25,000, according to Sandy E James et al, The Report of the 2015 US Transgender Survey (Washington, DC: National Center for Transgender Equality, 2016) at p 56) prevents most from taking their harassers to court. She also argues, pointing to empirical evidence, that “violence against transgender people is not taken as seriously as violence against cisgender people, leading anti-trans assaults and murders to be less investigated, perpetrators to be less frequently apprehended, and accused to face reduced punishment”. Most importantly, she argues that anti-discrimination laws only respond to acts committed by an archetypal transphobe. In other words, they are premised on stereotypical ideas of transphobic behaviour, which have limited applicability given that transphobia (or “transantagonism”, as Ashley writes) is articulated through institutions and conditioned behaviours, and manifests in more subtle ways than laws like the Canadian Human Rights Act presume.

Conclusion

This piece has analysed the frameworks of the Canadian Human Rights Act and the Criminal Code, which Bill C-16 amends. The main takeaway at this point should be that these laws have very circumscribed applicability. This does not mean, of course, that possible outcomes should not be probed and debated. However, in light of the above, claims of the law’s authoritarianism seem tenuous at best.

“Cyberspace law” and web privacy violations

Because the Learn section of TalkRights features content produced by CCLA volunteers and interviews with experts in their own words, opinions expressed here do not necessarily represent the CCLA’s own policies or positions. For official publications, key reports, position papers, legal documentation, and up-to-date news about the CCLA’s work check out the In Focus section of our website.


Amidst the many cyberspace laws enacted worldwide to respond to the constant growth of new issues faced in the online world, some of the biggest crimes committed online, including those related to privacy, seem to go unpunished [1]. For instance, in 2016 Yahoo disclosed that more than 500 million of its user accounts had been hacked in 2014 and the personal data they contained sold [2]. Yet, although it is alleged that Yahoo concealed its knowledge of the massive breach and only revealed it two years later, the consequences faced by the telecom giant for this massive cybersecurity breach were financial and reputational, not legal [3]. Indeed, enacting proper legislation to regulate online behaviour is only half the battle: while new laws are created to face new problems as they arise, enforcing cyberspace law is intrinsically more difficult than enforcing “traditional” laws, for a number of reasons discussed below [4].

What is Cyberspace Law?

Cyberspace Law, or “Cyberlaw”, is a term that refers to the aggregation of legal issues arising over the Internet [5]. It regulates many areas, including e-commerce (e.g. the corporate liability of Internet providers and social media outlets), cybercrimes (crimes committed over the Internet, which in most cases affect the victim’s information privacy) and privacy concerns [6].

  1. Jurisdictional Issues

The difficulty in prosecuting online behaviour stems first and foremost from issues pertaining to jurisdiction [7]. The term “jurisdiction” refers to the authority of a sovereign state to regulate certain conduct [8]: before a law enforcement agency can investigate a cybercrime case, it must have jurisdiction over it [9]. Essentially, the challenge lies in determining which agency or court has the authority to administer justice in a particular matter [10]. This is significant because if a court lacks jurisdiction or is not competent to try a case, the proceedings are nullified, no matter how strong the evidence is or how well conducted the case may be [11].

Conflicts regarding jurisdiction can arise along several axes [12]:

  • the branch of law and type of case (e.g. criminal law vs. international law);
  • the seriousness (or “grade”) of the offence (e.g. summary offences vs. indictable offences);
  • monetary damages (different courts handle cases based on the minimum or maximum damages that can be obtained from a particular type of lawsuit); and
  • the level of government (there are separate laws, law enforcement agencies and court systems for the different levels of government, e.g. federal vs. provincial, so the difficulty is determining which laws apply to a particular case or offence).

Finally, jurisdiction, and conflicts of jurisdiction, are largely based on geographical area [13].

Geographical jurisdiction is one of the most significant factors making it hard for law enforcement authorities to regulate online behaviour: a particular law enforcement agency or court has jurisdiction only over crimes that take place in the geographic location where it has authority [14]. This generally means the location of the perpetrator, the location of the victim, or the location where the crime actually occurred [15]. On the Internet, this can be much more difficult to determine than in the real world because, as elaborated below, the anonymity the Internet affords and the digital nature of the evidence make it much harder for authorities to bring wrongdoers to justice [16]. Determining geographic jurisdiction for cybercrimes is also more difficult than for “traditional” (or “terrestrial”) crimes because the perpetrator is often not in the same city or country as the victim [17].


To determine whether an agency or court has jurisdiction, one must first determine whether a crime has actually taken place at all; in some cases, there is no law that covers the particular circumstance, and in others, the wrongful act is a civil matter, not a criminal one [18]. For instance, if you entrusted your data to a company (e.g. Yahoo or another online platform) and this company lost it, the consequences can be disastrous, yet no criminal recourse is available to the victim [19]. Second, if a criminal offence has occurred, the next step is to determine what law was violated [20].


Why is Geographic Jurisdiction Such a Big Problem When it Comes to Legislation?

Geographic jurisdiction is a big problem when it comes to regulating online behaviour because laws differ from country to country, and even from province to province [21]; an act that is illegal in one location may not be against the law in another [22]. If a perpetrator is in a location where what he is doing is not against the law, but it is a crime in the location where the victim is, it will be difficult (if not impossible) to prosecute him [23]. This is because law enforcement agencies are only authorized to enforce the law within their own jurisdictions [24]: a police officer in Ontario does not have the authority to arrest someone in Quebec, and the Royal Canadian Mounted Police (RCMP) does not have the authority to arrest someone in the United States [25].

Certain mechanisms in international law do make it possible to prosecute individuals who commit crimes in another country, but the process is difficult, expensive and time-consuming [26]. For instance, even though international law does not require countries to surrender a criminal or suspect to one another, some countries have “Mutual Legal Assistance Treaties” by which they mutually agree to do so [27]. However, those treaties generally require “double criminality”, meaning that the perpetrator’s act must be considered a crime both in the jurisdiction where the perpetrator is located and in the jurisdiction where the act was committed [28]. In addition, determining where the act was committed is, in the online context, another challenge in and of itself [29].


  2. Anonymity and Identity

Anonymity is also considered one of the biggest obstacles to global efforts to regulate online behaviour [30]. Over the Internet, concealing one’s identity is far easier than in the real world, which makes it more difficult for law enforcement authorities to track down perpetrators and uncover their identities [31]. This matters for legislation because if the identities of wrongdoers cannot be traced, any law put in place to hold individuals or businesses accountable for their actions online, no matter how well crafted or intended, cannot work [32]. At the same time, anonymity is an essential privacy protection, relied upon in particular by human rights defenders, vulnerable groups, individuals living under oppressive regimes, and everyday citizens wishing to communicate privately.


  3. Nature of the Evidence (Digital Evidence)

The digital nature of the evidence is another factor that makes cybercrimes more difficult to prosecute [33]. Digital evidence is easily lost or altered compared to “real world” evidence [34]: cybercriminals have a greater ability to erase evidence, and investigators can accidentally lose or destroy evidence simply by examining or trying to access it [35]. At the same time, digital evidence can in some circumstances be restored, and standards for establishing its evidentiary value can be developed over time.

The branch of the law comprising the rules on the presentation of facts and proof in proceedings before a court, including the rules that determine which evidence is admissible, is the law of evidence [36]. The law of evidence varies across countries and legal systems, which further explains why jurisdiction is such a problem when prosecuting wrongful behaviour online [37].

  4. Legal Conception and Expectation of Privacy

The reliance of some nations on outdated conceptions of privacy can also complicate the prosecution of cybercrimes [38]. In some countries, the legal conception of privacy is still based, in some respects, on pre-Information Age concepts and precedents that do not reflect current reality [39]. For example, in California v. Greenwood, 486 U.S. 35 (1988), the defendant’s trash, which had been left outside on the curb, was searched without a warrant, and the drug paraphernalia found inside was used as evidence to convict him [40]. The United States Supreme Court held that Greenwood’s privacy had not been invaded by the search because “there was no expectation of privacy for things left [outside] that the public could access, and therefore privacy did not apply to the trash left for the public to see” [41]. Although this ruling took place in 1988 (i.e. before the Internet started becoming accessible to the public in 1991), it still applies to current computer technology and the Internet [42]. Some scholars argue that this is significant because it means that personal information left out in a public domain, whether it is trash on the curb or a website, is viewed the same way from a legal perspective [43]. Canadian courts, however, have been developing a more nuanced view in recent cases [44].


Although many countries enact cyberlaws to keep up with the development of computer technology and the ever-changing online environment, cyberspace law is a relatively new field whose gaps, as explained above, can make applying those laws quite difficult [45]. Furthermore, the concept of a “reasonable expectation of privacy” is being debated, reframed, and problematized in light of the changing ways in which information online can be collected, shared, stored, and accessed.





[1] Shinder, Deb. 2011. “What Makes Cybercrime Laws So Difficult To Enforce”. Published January 26th 2011. Accessed January 4th 2018:

[2]  Fiegerman, Seth. 2016. “Yahoo Says 500 Million Accounts Stolen”. Published September 23rd 2016 on the CNN official website. Accessed January 8th 2017:

[3] Supra Note 2

[4] Valiquet, Dominique. 2011. “Cybercrime: Issues”. Library of Parliament Research Publications, Background Paper No. 2011-36-E. Published April 5th 2011 on the Parliament of Canada website. Accessed January 9th 2018:

[5] LaManche. 2017. “Lawyer Library: Guide to Cyberspace Law”. Published on the Legal Match official website. Accessed January 7th 2017:

[6] Supra Note 1

[7] Ajayi, E.F.G. 2016. “Challenges to Enforcement of Cyber-crimes Law and Policy”. Journal of Internet and Information Systems Vol. 6 (1). DOI: 10.5897/JIIS2015.0089. Published August 2016. Accessed January 9th 2018: Page 4.

[8] International Telecommunication Union (ITU). 2015. “Understanding Cybercrime: Phenomena, Challenges and Legal Response”. Published April 28th 2015 on the ITU official website. Accessed January 9th 2018: Page 228.

[9] Supra Note 7, page 5.

[10] Supra Note 7, page 5.

[11] Supra Note 7, page 4.

[12] Supra Note 1

[13] Supra Note 1

[14] Supra Note 7, page 5.

[15] Supra Note 1

[16] Supra Note 8,  pages 80, 227-228.

[17] Supra Note 1

[18] Supra Note 1

[19] Supra Note 2

[20] Supra Note 1

[21] Supra Note 1

[22] Supra Note 1

[23] Supra Note 8, page 235.

[24] Supra Note 7, page 5.

[25] Supra Note 1

[26] Supra Note 5

[27] Supra Note 5

[28] Supra Note 5

[29] Supra Note 5

[30] Supra Note 7, page 4.

[31] Supra Note 7, page 4.

[32] Supra Note 7, page 4.

[33] Supra Note 7, page 7.

[34] Supra Note 7, page 7.

[35] Supra Note 8, pages 227 and 228.

[36] Supra Note 7, page 7.

[37] Debesu, Kahsay and Andualem Eshetu. 2012. “Evidence in Civil and Common Law Legal Systems”. Published September 4th 2012 on the Abyssinia Law official website. Accessed January 9th 2018:

[38] Subramanian, Ramesh and Steven Sedita. 2006. “Are Cybercrime Laws Keeping Up With the Triple Convergence of Information, Innovation and Technology?” Communications of the IIMA Vol. 6, Issue 1, Article 4. Accessed January 9th 2018: Page 43.

[39] Supra Note 38, page 43.

[40] Supra Note 38, page 43.

[41] Supra Note 38, page 43.

[42] Supra Note 38, page 43.

[43] Supra Note 38, page 43.

[44] See for example R v Marakah 2017 SCC 59 or R v Jones 2017 SCC 60.

[45] Supra Note 7, pages 1-2.

Cell phones and Canadian Courts: Some Recent Cases

Because the Learn section of TalkRights features content produced by CCLA volunteers and interviews with experts in their own words, opinions expressed here do not necessarily represent the CCLA’s own policies or positions. For official publications, key reports, position papers, legal documentation, and up-to-date news about the CCLA’s work check out the In Focus section of our website.


Cell phones today contain some of our most private and personal information. This has undoubtedly increased our privacy interest in them, particularly against unwanted searches by the police. In this article, I present some leading and other recent Canadian cases regarding one’s reasonable expectation of privacy in a cell phone and the police’s ability to search cell phones without a warrant.

Legal Framework

Section 8 of the Canadian Charter of Rights and Freedoms provides that: “Everyone has the right to be secure against unreasonable search or seizure.”

The s. 8 guarantee, however, is not an absolute right against all searches: s. 8 protects an individual only against those searches deemed to be “unreasonable”. In R v Edwards, the Supreme Court of Canada offered the following two-part framework to determine whether a search conducted by the police has violated s. 8 of the Charter:[1]

  1. First, does the individual have a reasonable expectation of privacy with respect to the item in question under the totality of circumstances?
  2. Second, if the individual has a reasonable expectation of privacy, did the police conduct the search reasonably?


Cell Phones and a Reasonable Expectation of Privacy

The Canadian jurisprudence has typically found that users of cell-phones containing private and personal information have a reasonable expectation of privacy. That said, it is important to note that a determination of whether an individual has a reasonable expectation of privacy proceeds on a case-by-case basis with regard to the totality of circumstances.

  • In R. v. Polius (2009), the court found that an individual has a reasonable expectation of privacy in the contents of his or her cell phone since “[t]he information in a cell phone … may relate to aspects of life that are deeply personal.” This information includes photographs, videos, text messages, e-mail messages, call logs, etc.[2]
  • R. v. O. (T.) (2010) involved an accused who left his cell phone unattended, charging in the stairwell of an apartment building. The phone was subsequently searched by the superintendent of the building and handed to the police as containing incriminating evidence. The court found that the facts that the phone was charging and contained private information indicated that the accused intended to retrieve it, so the phone was not considered abandoned. The court concluded that the accused had a reasonable expectation of privacy in the cell phone because it contained highly personal information.[3]
  • In R. v. Artis (2016), the court noted that exclusive access to one’s cell phone is not required to establish a reasonable expectation of privacy. Accordingly, the fact that the accused shared his cell phone with another person did not in itself preclude him from maintaining a reasonable expectation of privacy in its contents.[4]

Although one has a reasonable expectation of privacy in his or her own cell phone, it was until recently an open question whether that expectation could extend to text messages sent to a recipient and contained on the recipient’s phone.

  • In R. v. Marakah (2017), the Supreme Court of Canada ruled that an individual may have a reasonable expectation of privacy in an electronic conversation in some cases, and that text messages that have been sent and received may therefore be protected against unreasonable search and seizure.[5]
  • In R. v. Jones (2017), heard as a companion case to Marakah, the court considered whether the sender of a text message has a reasonable expectation of privacy in records of that message stored in the service provider’s infrastructure, and held that such an expectation exists.[6]


Cell Phone and Police Search Incidental to Arrest

In R. v. Fearon (2014), the Supreme Court of Canada held that there is a lower expectation of privacy at the time of a lawful arrest.[7] Accordingly, the Court found that a police officer may search a cell phone incident to an arrest where:[8]

  • The arrest was lawful;
  • [T]he police have a reason based on a valid law enforcement purpose to conduct the search, and that reason is objectively reasonable. The valid law enforcement purposes in this context are:
    1. Protecting the police, the accused, or the public;
    2. Preserving evidence; or
    3. Discovering evidence, including locating additional suspects, in situations in which the investigation will be stymied or significantly hampered absent the ability to promptly search the cellphone incident to arrest;
  • The nature and the extent of the search are tailored to the purpose of the search; and
  • The police take detailed notes of what they have examined on the device and how it was searched.

The Court in Fearon cautioned that the police power to search a cell phone incident to arrest is not a “license to rummage around in the device at will.”[9] The Court also held that one’s decision not to protect his or her cell phone with a password does not “indicate any sort of abandonment of the significant privacy interests one generally will have in the contents of the phone.”[10]

In R. v. Hiscoe (2013), the Nova Scotia Court of Appeal upheld a trial judge’s finding that, given the nearly month-long delay between the initial arrest and the cell phone search, a complete “data dump” of all the information contained on the accused’s cell phone went beyond the scope of a search incident to arrest. The court upheld the trial judge’s finding of a s. 8 violation.[11]

Hiscoe was decided prior to Fearon. Nevertheless, under similar circumstances, the court in R. v. Powell (2017) also found that a complete “data dump” of all the information contained on the accused’s cell phone, conducted thirteen days after the initial arrest, was beyond the scope of a search incident to arrest.[12]

In R. v. Hiscock (2016), the court suggested that although a police officer may seize one’s cell phone incident to arrest, the accused is not required to reveal the passcode, and any detail about the password must be obtained voluntarily and consensually.[13]



In R. v. Rogers Communications Partnership (2016), the court held that Canadians have a reasonable expectation of privacy in the records of their cell phone activity. Accordingly, the court held that the Crown’s production orders compelling cell phone providers to hand over personal information of users who had used cell towers near 21 addresses given by the police for investigative purposes were overly broad and infringed s. 8 of the Charter.[14]

In addition to criminal search powers incident to arrest and search powers authorized by warrants, the Canada Border Services Agency believes it may also search one’s cell phone upon exit from or entry into the country pursuant to the Customs Act.[15] It bases this on an interpretation of s. 99(1)(a) of the Customs Act, which states that an officer may:

 (a) at any time up to the time of release, examine any goods that have been imported and open or cause to be opened any package or container of imported goods and take samples of imported goods in reasonable amounts.[16]

In R. v. Gibson (2017), the court held that for the purpose of s. 99(1)(a) of the Customs Act, “goods” includes data stored in any electronic device, including cell phones, that is in the “actual possession of or in accompanying baggage of traveller at time they arrive at border and commence dealings with customs officers.”[17] However, “goods” does not include data stored in the cloud or stored remotely on devices not in the traveller’s possession.[18] This classification of cell phone data as a “good” is being challenged by the Canadian Civil Liberties Association and others.



Cell Phones and a Reasonable Expectation of Privacy


  1. R. v. Polius (2009), 196 CRR (2d) 288 (Ont SCJ).
  2. R. v. O. (T.), 2010 ONCJ 334.
  3. R. v. Artis, 2016 ONSC 2050.
  4. R. v. Marakah, 2017 SCC 59.
  5. R. v. Jones, 2017 SCC 60.


Cell Phone and Police Search Incidental to Arrest


  1. R. v. Fearon, 2014 SCC 77.
  2. R. v. Hiscoe, 2013 NSCA 48.
  3. R. v. Powell, 2017 ONSC 6482.
  4. R. v. Hiscock (2016), 136 WCB (2d) 502 (NL Prov Ct (Crim Div)).



  1. R. v. Rogers Communications Partnership, 2016 ONSC 70.
  2. R. v. Gibson, 2017 BCPC 237.



[1] R v Edwards, [1996] 1 SCR 128 at para 45 (SCC).

[2] R. v. Polius (2009), 196 CRR (2d) 288 at para 50 (Ont. SCJ).

[3] R. v. O. (T.), 2010 ONCJ 334 at para 42, 46, [2010] O.J. No. 3717.

[4] R. v. Artis, 2016 ONSC 2050 at para 12.

[5] R v. Marakah, 2017 SCC 59.

[6] R. v. Jones, 2017 SCC 60 at para 55.

[7] R. v. Fearon 2014 SCC 77 at para 56.

[8] Citing from original, ibid at para 83.

[9] Ibid at 78.

[10] Ibid at 53.

[11] R. v. Hiscoe, 2013 NSCA 48 at 79.

[12] R. v. Powell, 2017 ONSC 6482 at para 63.

[13]  See R. v. Hiscock (2016), 136 WCB (2d) 502 at para 41 (NL Prov Ct (Crim Div)).

[14] R. v. Rogers Communications Partnership, 2016 ONSC 70 at para 31, 42, 43.

[15] Customs Act, RSC 1985 c.1 (2nd Supp) [Customs Act].

[16] Customs Act, RSC 1985 c.1 (2nd Supp), s 99(1)(a).

[17] R. v. Gibson, 2017 BCPC 237 at para 95.

[18] Ibid at para 92.



What’s a VPN? How does it work? And what…

Because the Learn section of TalkRights features content produced by CCLA volunteers and interviews with experts in their own words, opinions expressed here do not necessarily represent the CCLA’s own policies or positions. For official publications, key reports, position papers, legal documentation, and up-to-date news about the CCLA’s work check out the In Focus section of our website.


The vast majority of time spent on computers or smartphones is spent connected to the internet: browsing websites, watching online videos, using social media, and so on. An individual’s internet history can reveal some of his or her most personal information, including sexual orientation and political ideology. Despite our interest in keeping our internet activity private, much of it can be logged or monitored to varying degrees by internet service providers (ISPs), law enforcement agencies, and the very websites that we visit. Fortunately, virtual private networks (VPNs) can add an additional layer of privacy while browsing the internet. This article explains how VPNs work and explores their privacy implications.

What is a VPN and How Does It Work?

To understand how a VPN service works, it is instructive first to understand the basics of a standard internet connection.[1] A computer connects to the internet via a modem that is often provided by one of the ISPs, such as Rogers or Bell. Each ISP issues a unique series of numbers, known as an IP address, to each of its customers’ modems. An ISP is able to trace a modem’s IP address back to the customer under whom the modem is registered. An IP address can also be used by anyone to determine the general geographical location of a device that is connected to the internet.
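The point that an IP address is just a structured series of numbers can be illustrated with Python’s standard `ipaddress` module. This is purely a sketch: 8.8.8.8 is a well-known public address, and 192.168.0.10 is a typical private home-network address chosen for illustration.

```python
import ipaddress

public = ipaddress.IPv4Address("8.8.8.8")        # a well-known public address
private = ipaddress.IPv4Address("192.168.0.10")  # a typical home-network address

# An IPv4 address is really a single 32-bit number, written as four bytes.
print(int(public))         # the same address expressed as one integer
print(public.is_private)   # False: public addresses are globally routable
print(private.is_private)  # True: private addresses are only visible on the local network
```

Only the public, globally routable address is what websites and ISPs see and can trace to a rough location; private addresses never leave the home network.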

Our interactions with a website typically consist of our modem sending electronic requests to receive information that is stored on the website’s server. This process involves several steps. First, an initial request is sent from our device to our ISP which then routes the request to the web server storing the information. After receiving the request, the web server sends the information back to our IP address. The information transmitted during the entire process can be monitored and logged by the ISP.[2] The web server can also keep a record of the information that it sends to any particular IP address.
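The routing steps above can be sketched as a toy simulation. The function and variable names here are illustrative only, not any real networking API; the point is simply that on a standard connection the ISP sits in the middle and can log which site each customer contacts.

```python
# Toy simulation of a standard (non-VPN) connection: the ISP relays every
# request and can log which web server each customer's IP address contacts.

def fetch_via_isp(client_ip, site, web_servers, isp_log):
    """Route a request from the client through the ISP to a web server."""
    # The ISP sees both the requester and the destination, and can log the pair.
    isp_log.append((client_ip, site))
    # The web server answers; it also sees the client's IP address.
    return web_servers[site]

web_servers = {"example.com": "<html>hello</html>"}
isp_log = []

page = fetch_via_isp("203.0.113.42", "example.com", web_servers, isp_log)
print(page)     # the requested content
print(isp_log)  # the ISP knows exactly which site was visited, and by whom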

A VPN service essentially acts as a secure middle-man between our ISP and the website server.[3] When a VPN service is turned on, our ISP connects our device to a VPN server. Unlike a normal internet connection, all of the information that travels between the VPN server and our device is encrypted and unable to be deciphered by the ISP. This secure connection is known as a “VPN tunnel”. When you use a VPN, an encrypted request is routed by the ISP directly to the VPN server— not to the web server. After receiving the request, the VPN server sends its own request using a new IP address to retrieve the information from the website’s server. From the perspective of the web server, it is the VPN’s IP address—not the IP address of the end user— that is requesting the information. The VPN server encrypts the information received from the website’s server and sends it back to our IP address.  Not only is the ISP unable to decrypt the information, but since the encrypted information is sent directly from the VPN to our device, the ISP is also unaware of the website from which the information originated.
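The tunnelling described above can also be sketched as a toy simulation. A simple XOR cipher stands in for real VPN encryption (actual VPNs use protocols such as TLS or WireGuard), and all names are illustrative; the point is what each party can and cannot see.

```python
# Toy model of a VPN tunnel. XOR with a shared key is a stand-in for real
# encryption, used here only to show who can read what.

def toy_encrypt(text, key):
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(text.encode()))

def toy_decrypt(blob, key):
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(blob)).decode()

def fetch_via_vpn(client_ip, site, key, web_servers, isp_log):
    # The ISP only sees encrypted bytes travelling to the VPN server's address.
    tunnelled = toy_encrypt(site, key)
    isp_log.append((client_ip, "vpn.example.net", "<encrypted>"))
    # The VPN server decrypts the request and contacts the site with its own IP,
    # so the web server never sees the client's address.
    requested_site = toy_decrypt(tunnelled, key)
    content = web_servers[requested_site]
    # The response is encrypted back through the tunnel to the client.
    return toy_decrypt(toy_encrypt(content, key), key)

key = b"shared-secret"
web_servers = {"example.com": "<html>hello</html>"}
isp_log = []

page = fetch_via_vpn("203.0.113.42", "example.com", key, web_servers, isp_log)
print(page)
print(isp_log)  # the ISP sees only ciphertext headed to the VPN server
```

Compare the log with the non-VPN case: the destination website never appears in it, only the VPN server’s address.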

Privacy Implications 

There are three main privacy implications of using a VPN while browsing the internet. First, to comply with the Copyright Act, Canadian ISPs keep a temporary record of each IP address that they assign to a modem.[4] In addition, an ISP is able to maintain a record of the websites visited and the content downloaded by an IP address, although the extent to which any ISP engages in such practices is unclear, and it is unlikely that ISPs actively monitor their customers’ internet activity. Nonetheless, the information transmitted along a standard internet connection can be monitored and made available to law enforcement under a warrant. By using a VPN service, one’s internet activity will be encrypted and unable to be deciphered by the ISP. From the perspective of an ISP, the IP address assigned to a customer’s modem is simply receiving encrypted data from a VPN server. It should be noted, however, that just as an ISP is ordinarily able to monitor the information received from a web server, the VPN service is similarly able to monitor the requested information that it receives from the web server. This information is only later encrypted and sent from the VPN server back to our device. Nevertheless, many VPNs have an express policy of not recording a user’s internet activity (but make sure you read and understand that policy when choosing a VPN!).

Second, unlike their American counterparts, Canadian internet service providers are not able to share a customer’s personal information, such as internet history, with third parties without the customer’s express consent. However, laws are often subject to change, and it is entirely possible that ISPs will in the future be able to share such information without consent.[5] Since a VPN ensures that a user’s internet activity is encrypted, ISPs will be unable to sell information about a customer’s internet history.

Finally, websites themselves maintain a record of the IP addresses that visit their site. The IP address, in turn, allows these websites to track the general location of the user’s device. By using a VPN service, it is the VPN’s IP address—not the user’s own IP address—that is requesting the information from the web server, effectively masking the identity of the end user.

Just as we may expect mail to remain private from mail carriers, or phone conversations to remain private from telecom providers, there ought to be an expectation that our internet activity will remain private. Although the extent to which an ISP monitors and records one’s internet activity is unclear, the fact that ISPs are often not forthcoming and transparent about their practices is troubling. For those who are privacy-conscious and want their internet activity to remain private, a VPN is a useful tool that adds an additional layer of privacy while browsing the internet.



[1] For more information see Shuler, Rus. How Does the Internet Work? Pomeroy IT Solutions, 2002,

[2] Websites that use encryption such as HTTPS will encrypt the data that is sent between the website and the user’s device. However, unlike a VPN service, HTTPS encryption does not prevent an ISP from recording that your IP address has visited the site.

[3] For more information see Tyson , Jeff, and Stephanie Crawford . “How VPNs Work.” HowStuffWorks, 14 Apr. 2011,

[4] The current law is unclear as to whether Canadian VPN service providers are required to retain IP address logs. Many VPNs have an express policy of not maintaining a record of their customers’ IP addresses.

[5] For information about the recent American Senate vote to repeal an FCC ruling preventing American ISPs from selling their consumers’ data to third parties, see Fung, Brian. “What to Expect Now That Internet Providers Can Collect and Sell Your Web Browser History.” The Washington Post, 29 Mar. 2017,

For more information regarding the situation in Canada see Braga, Matthew. “No, Your Canadian Internet Service Provider Can’t Sell Your Information as in the U.S.” CBCnews, CBC/Radio Canada, 31 Mar. 2017,




In the News: Canada’s Privacy Commissioner Proposes Tools to…

Because the Learn section of TalkRights features content produced by CCLA volunteers and interviews with experts in their own words, opinions expressed here do not necessarily represent the CCLA’s own policies or positions. For official publications, key reports, position papers, legal documentation, and up-to-date news about the CCLA’s work check out the In Focus section of our website.


On January 26, 2018, the Office of the Privacy Commissioner of Canada (“OPC”) released a draft policy position paper on the issue of protecting one’s “online reputation.” In the draft paper, the Commissioner interprets the Personal Information Protection and Electronic Documents Act (PIPEDA) as providing individuals with the right to request that search engines “de-index” links to websites containing demonstrably inaccurate, incomplete, or outdated information about them, and as providing a qualified right to request that the owner of a commercial website modify or remove any such misinformation. This article provides a detailed summary of the two key proposals made by the OPC in the draft paper and offers some brief critical commentary.

Competing Interests: Protection of One’s Online Reputation and Freedom of Expression

Allowing unrestricted access to personal information on the internet that is inaccurate, misleading, or outdated may have a harmful and long-lasting impact on an individual’s life, especially if that misinformation is used to make decisions about the person. Many employers, for example, use internet search engines as part of their job-application screening process. That said, the OPC also recognizes that a person’s interest in protecting his or her online reputation needs to be balanced against the countervailing interest of another person’s right to freedom of expression—a value that is enshrined in the Charter.

Accordingly, by its own admission, the OPC does not seek to propose any new, independent right to protect one’s online reputation in the draft policy paper. Instead, the OPC argues that, properly interpreted, PIPEDA implies certain protections for an individual’s online reputation.


Proposal #1: De-Indexing

PIPEDA is the primary Canadian statute governing the way private sector organizations may collect, use, or disclose personal information in the course of their commercial activities. Pursuant to paragraph 4(1)(a), PIPEDA applies to “every organization in respect of personal information that […] the organization collects, uses or discloses in the course of commercial activities.” The OPC argues that the activities of search engines fall within the scope of PIPEDA.

When a user searches the name of a person, the search engine’s algorithm uses the personal information about that person that exists on the internet to create a list of websites containing the most relevant information about him or her. This, according to the OPC, constitutes the “collection, use or disclosure” of personal information within the meaning of PIPEDA. Moreover, the search engine’s primary aim in using the personal information is to generate revenue from advertisers; in other words, personal information is being used by the search engine in the course of commercial activity. Therefore, since search engines “collect, use, or disclose” personal data in the course of a commercial activity, the OPC argues that they are subject to the provisions of PIPEDA.

According to the OPC, the existing principles under PIPEDA, namely principles 4.6, 4.9, and 4.10, imply a right to de-index. De-indexing is a means by which search engines remove certain links from the search results of specific search terms (such as a person’s name), or lower the ranking of certain websites in the search results. It is important to note that de-indexing does not remove the content itself from the source website, nor does it prevent the content from being accessed by navigating directly to the website.
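The distinction between a search index and the underlying pages can be sketched with a toy model (the names and URLs below are invented for illustration; real search engines are vastly more complex):

```python
# Toy search index: queries map to ranked lists of URLs. De-indexing removes
# a URL from the results for one query; the source page itself is untouched.

pages = {
    "news.example/old-story": "outdated claim about Jane Doe",
    "blog.example/profile":   "accurate profile of Jane Doe",
}

index = {"jane doe": ["news.example/old-story", "blog.example/profile"]}

def de_index(index, query, url):
    """Remove `url` from the results for `query` without deleting the page."""
    index[query] = [u for u in index[query] if u != url]

de_index(index, "jane doe", "news.example/old-story")

print(index["jane doe"])                  # the link no longer appears in results
print("news.example/old-story" in pages)  # but the source page still exists
```

This mirrors the OPC’s point: de-indexing changes what a name search surfaces without removing the content from the web.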

The relevant principles from PIPEDA read as follows:

  • Principle 4.6: Personal information shall be as accurate, complete, and up-to-date as is necessary for the purposes for which it is to be used.
  • Principle 4.9.5: When an individual successfully demonstrates the inaccuracy or incompleteness of personal information, the organization shall amend the information as required. Depending on the nature of the information challenged, amendment involves the correction, deletion, or addition of information.
  • Principle 4.6.1: The extent to which personal information shall be accurate, complete, and up-to-date will depend upon the use of the information, taking into account the interests of the individual.

Recall that a search engine’s algorithm uses the personal information about an individual that is publicly available on the internet to render a relevant search result for the person’s name. According to the OPC, it is therefore incumbent upon the search engine under principle 4.6 to ensure that it is using accurate, complete, and up-to-date information. Moreover, where an individual is able to demonstrate that the personal information used by the search engine is inaccurate, incomplete, or outdated, principle 4.9.5 implies that the individual has a right to have the search engine amend the search result for accuracy, completeness, and currency. According to the OPC, the most readily available means of making such an amendment is to de-index the website from the search result or to lower its ranking in the search results. That said, the OPC is sensitive to the fact that not all inaccurate information on a website will warrant de-indexing. According to the OPC, principle 4.6.1 requires that search engines be sensitive to the use of the personal information and the interests of the individual. For example, where personal information on a website is inaccurate for satirical purposes, it seems unlikely that de-indexing that website from the search results would be appropriate.

Although de-indexing appears to be an innovative solution for protecting one’s online reputation in a manner that also balances the search engine’s freedom of expression (since de-indexing neither removes the website from other search results nor removes the actual content from the source website), there is a potential problem with the OPC’s interpretation of PIPEDA as it relates to search engines. In particular, it is unclear whether PIPEDA was intended to bring search engines within the ambit of these provisions. It is important to remember that search engines search through a collection of personal information that is already publicly available on the internet and then simply present links to websites in the order that the engine’s algorithm finds most relevant to the search term. It is not clear, therefore, whether the search results themselves (i.e., the very links to websites, or even the order in which the links are presented) can be measured for their accuracy or currency under principle 4.9.5 of PIPEDA. Given the interpretive challenge of finding that search engines fall within the scope of PIPEDA, it would be advisable to seek legislative clarity on the matter.


Proposal #2: Source Takedown/Amendment

In addition to a right to de-index, the OPC argues that the provisions of PIPEDA imply a qualified right to seek the modification or removal of inaccurate, incomplete, or outdated information about oneself that is contained on commercial websites. The OPC provides two situations where this proposed right would apply: (1) where a user of a website, such as a social media platform, posts information about herself that she later wishes to modify or remove; and (2) where an individual wishes to request that the owner of a commercial website remove or modify inaccurate, incomplete, or outdated information posted about her by a third party on the website.

The provisions of PIPEDA relevant to source takedown/amendment are as follows:

  • Principle 4.3.8: An individual may withdraw consent at any time, subject to legal or contractual restrictions and reasonable notice. The organization shall inform the individual of the implications of such withdrawal.
  • Principle 4.5.3: Personal information that is no longer required to fulfil the identified purposes should be destroyed, erased, or made anonymous. Organizations shall develop guidelines and implement procedures to govern the destruction of personal information.
  • Principle 4.9.5: When an individual successfully demonstrates the inaccuracy or incompleteness of personal information, the organization shall amend the information as required. Depending on the nature of the information challenged, amendment involves the correction, deletion, or addition of information.

In the first situation, where the user of a commercial website, such as a social media platform, wishes to modify or remove information about herself that she posted earlier, the OPC claims that principles 4.3.8 and 4.5.3, taken together, imply that the individual has a near-absolute right to remove the information, except to the extent that it is subject to legal or contractual restrictions.

The OPC recognizes that the second situation, where a third party posts information about an individual on a commercial website that is inaccurate, misleading, or outdated, raises more difficulties and requires balancing one’s interest in protecting his or her online reputation against the author’s freedom of expression. Nevertheless, according to the OPC, principle 4.9.5 requires, at a minimum, that there be some formal mechanism whereby an individual is able to request that such information be removed from or amended on the website. This, according to the OPC, strikes an appropriate balance with freedom of expression.

Unfortunately, the OPC does not offer any guidance as to how one’s freedom of expression should be weighed against another person’s interest in his or her online reputation, other than to claim that such a balance should be struck with regard to the public interest. More fundamentally, this balancing exercise creates a false parity between the right to freedom of expression and the right to online reputation. Whereas the former is a constitutionally enshrined right, the latter has thus far gained only limited recognition, in the context of defamation or libel.[1] Requiring organizations under PIPEDA to remove or modify content that is otherwise lawfully published may raise difficult Charter challenges.

Given the interpretive challenge of finding that search engines fall within the scope of PIPEDA, and the potential Charter challenges to source takedown/modification, it would be advisable to seek legislative clarity on the issue of protecting one’s online reputation. Perhaps it would be more appropriate to create an independent right to protect one’s reputation, or to amend PIPEDA to make such a right more explicit. Nevertheless, despite these challenges, the OPC’s draft paper is an important step in furthering the discussion on protecting one’s online reputation, an issue that is becoming increasingly contentious as more of our personal information makes its way onto the internet.


[1] See Hill v. Church of Scientology of Toronto, [1995] 2 SCR 1130 at para 121, and Grant v. Torstar Corp., [2009] 3 SCR 640 at para 51.