About the author

Dr. Michael Geist is a law professor at the University of Ottawa where he holds the Canada Research Chair in Internet and E-commerce Law. He has obtained a Bachelor of Laws (LL.B.) degree from Osgoode Hall Law School in Toronto, Master of Laws (LL.M.) degrees from Cambridge University in the UK and Columbia Law School in New York, and a Doctorate in Law (J.S.D.) from Columbia Law School.  Dr. Geist is a syndicated columnist on technology law issues with his regular column appearing in the Toronto Star, the Hill Times, and the Tyee.  Dr. Geist is the editor of several copyright books including The Copyright Pentalogy: How the Supreme Court of Canada Shook the Foundations of Canadian Copyright Law (2013, University of Ottawa Press), From “Radical Extremism” to “Balanced Copyright”: Canadian Copyright and the Digital Agenda (2010, Irwin Law) and In the Public Interest:  The Future of Canadian Copyright Law (2005, Irwin Law), the editor of several monthly technology law publications, and the author of a popular blog on Internet and intellectual property law issues.

Dr. Geist serves on many boards, including the CANARIE Board of Directors, the Canadian Legal Information Institute Board of Directors, the Canadian Internet Registration Authority, and the Electronic Frontier Foundation Advisory Board. He has received numerous awards for his work, including the Kroeger Award for Policy Leadership and the Public Knowledge IP3 Award in 2010, the Les Fowlie Award for Intellectual Freedom from the Ontario Library Association in 2009, the Electronic Frontier Foundation’s Pioneer Award in 2008, and CANARIE’s IWAY Public Leadership Award for his contribution to the development of the Internet in Canada; he was also named one of Canada’s Top 40 Under 40 in 2003. In 2010, Managing Intellectual Property named him one of the 50 most influential people on intellectual property in the world, and Canadian Lawyer named him one of the 25 most influential lawyers in Canada in 2011, 2012 and 2013.


Picking Up Where Bill C-10 Left Off: The Canadian Government’s Non-Consultation on Online Harms Legislation

July 29, 2021

The Canadian government yesterday released its plans for online harms legislation through a process billed as a consultation, but better characterized as an advisory notice, since it poses few questions, offers few options, and shows little apparent interest in hearing what Canadians think of the plans. Instead, the plans led by Canadian Heritage Minister Steven Guilbeault pick up where Bill C-10 left off, treating freedom of expression as a danger to be constrained through regulations and the creation of a bureaucratic super-structure that includes a new Digital Safety Commission, a digital tribunal to rule on content removal, and a social media regulation advisory board. When combined with plans for a new data commissioner, a privacy tribunal, and the expanded CRTC under Bill C-10, the sheer amount of new Internet governance is dizzying.

While there is clearly a need to address online harms and to ensure that Internet companies are transparent in their policies, consistent in applying those policies, and compliant with their legal obligations, this proposed legislation goes far beyond those principles. The government has indicated that these rules apply only to Internet services (dubbed Online Communications Services or OCSs), citing Facebook, YouTube, TikTok, Instagram, and Twitter as examples. It notes that there will be an exception for private communications and telecommunications such as wireless companies, Skype and WhatsApp (along with products and services such as TripAdvisor that are not OCSs). Yet during a briefing with stakeholders, officials were asked why the law should not also be extended to private communications on platforms, given that these harms may occur on private messaging as well. Since the government previously provided assurances that user generated content would be excluded from Bill C-10 only to backtrack and make it subject to CRTC regulation, the risk that it could once again remove safeguards for basic speech is very real.

The perspective on OCSs is clear from the very outset. After a single perfunctory statement on the benefits of OCSs that says little about the benefits of freedom of expression (the document does not include a single mention of the Charter of Rights and Freedoms or net neutrality), the government proceeds to outline a series of harms, including hateful content, terrorist content, content that incites violence, the sexual exploitation of children, and the non-consensual distribution of intimate images. The proposed legislation would seek to address these forms of harmful content through a myriad of takedown requirements, content filtering, complaints mechanisms, and even website blocking.

How does the government intend to address these harms?

The general obligations would include requiring OCSs to implement measures to identify harmful content and to respond to any content flagged by any user within 24 hours. The OCSs would be required to either identify the content as harmful and remove it or respond by concluding that it is not harmful. The OCSs can seek assistance from the new Digital Safety Commissioner on content moderation issues. The proposed legislation would then incorporate a wide range of reporting requirements, some of which would be subject to confidentiality restrictions, so the companies would be precluded from notifying affected individuals.
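
The proposal does not specify how services should operationalize the 24-hour window, but a minimal sketch (with hypothetical class and field names, not drawn from the document) illustrates the compliance mechanic: every flag starts a clock, and any flag without a harmful or not-harmful decision by the deadline is a potential violation.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import Optional

RESPONSE_WINDOW = timedelta(hours=24)  # the statutory response window described in the proposal

@dataclass
class FlaggedPost:
    post_id: str
    flagged_at: datetime
    decision: Optional[str] = None  # "harmful" (remove) or "not_harmful" (leave up)

    @property
    def deadline(self) -> datetime:
        return self.flagged_at + RESPONSE_WINDOW

    def overdue(self, now: datetime) -> bool:
        # A flag with no decision after the deadline is a potential compliance failure.
        return self.decision is None and now > self.deadline

# A post flagged 25 hours ago with no decision is already past the window.
post = FlaggedPost("post-123", flagged_at=datetime.now(timezone.utc) - timedelta(hours=25))
print(post.overdue(datetime.now(timezone.utc)))  # True
```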

The government envisions pro-active monitoring and reporting requirements that could have significant implications. For example, it calls for pro-active content monitoring of the five harms, granting the Digital Safety Commissioner the power to assess whether the AI tools used are sufficient. Moreover, the OCSs would face mandatory requirements to report users to law enforcement, raising the prospect of an AI identifying what it thinks is content caught by the law and generating a report to the RCMP. This represents a huge increase in private enforcement and creates the possibility of Canadians garnering police records over posts that a machine thought were captured by the law.
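
Nothing in the proposal prescribes a particular technical design; the sketch below is a purely hypothetical illustration (invented names and an arbitrary threshold) of why pairing automated detection with mandatory reporting is concerning: a single confidence score from a model can be enough to put a user into a report destined for police.

```python
import random

# All names and values here are hypothetical; this only illustrates the concern.
HARM_CATEGORIES = ["hate", "terrorist_content", "incitement", "child_exploitation", "intimate_images"]
REPORT_THRESHOLD = 0.9  # an arbitrary confidence cut-off a platform might choose

def classify(post_text: str) -> dict:
    """Stand-in for an AI classifier returning a confidence score per harm category."""
    return {category: random.random() for category in HARM_CATEGORIES}

def moderate(user_id: str, post_id: str, post_text: str) -> list:
    """Remove content and queue a law-enforcement report whenever any score clears the threshold."""
    reports = []
    for category, score in classify(post_text).items():
        if score >= REPORT_THRESHOLD:
            # The report is triggered by the model's score alone; a false positive
            # still places the user's identity in a police-bound record.
            reports.append({"user_id": user_id, "post_id": post_id,
                            "category": category, "score": round(score, 2)})
    return reports

print(moderate("user-42", "post-123", "an entirely innocuous post"))
```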

In order to enforce these rules, the public could file complaints with the Digital Safety Commissioner. The new commissioner would be empowered to hold hearings on any issue, including non-compliance or anything the Commissioner believes is in the public interest. The Digital Safety Commissioner would have broad powers to order the OCSs “to do any act or thing, or refrain from doing anything necessary to ensure compliance with any obligations imposed on the OCSP by or under the Act within the time specified in the order.” Moreover, the Commissioner would also be able to conduct inspections of companies at any time:

“The Act should provide that the Digital Safety Commissioner may conduct inspections of OCSPs at any time, on either a routine or ad hoc basis, further to complaints, evidence of non-compliance, or at the Digital Safety Commissioner’s own discretion, for the OCSP’s compliance with the Act, regulations, decisions and orders related to a regulated OCS.”

In fact, the inspection power extends to anyone, not just OCSs, if there are reasonable grounds to believe they may have information related to software, algorithms, or anything else relevant to an investigation.

The proposed legislation includes administrative monetary penalties (AMPs) for non-compliance, including the failure to block or remove content. These penalties can run as high as three percent of global revenue or $10 million. If there is a failure to abide by a compliance agreement, the AMPs can run to $25 million or five percent of global revenues. The AMPs would be referred to the new privacy tribunal for review. Given that liability for non-compliance could run into the millions, companies will err on the side of taking down content even if there are doubts that it qualifies as harmful.
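
To put the exposure in perspective, a back-of-the-envelope calculation (assuming, as is typical for such regimes, that the ceiling is the greater of the fixed amount and the revenue percentage) shows how quickly the numbers scale for large platforms; the function below is purely illustrative.

```python
def max_penalty(global_revenue: float, breach_of_compliance_agreement: bool = False) -> float:
    """Illustrative ceiling on an AMP, assuming the cap is the greater of the
    fixed amount and the percentage of global revenue described in the proposal."""
    if breach_of_compliance_agreement:
        return max(25_000_000, 0.05 * global_revenue)  # $25 million or 5% of global revenue
    return max(10_000_000, 0.03 * global_revenue)      # $10 million or 3% of global revenue

# For a platform with $50 billion in global revenue, the ordinary ceiling is $1.5 billion.
print(f"${max_penalty(50e9):,.0f}")  # $1,500,000,000
```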

If the OCS still does not comply with an order to remove certain content, the proposed legislation introduces the possibility of website blocking, with orders that all Canadian ISPs block access to the online communications service. The implications of these provisions are enormous, raising the likelihood of a country-wide blocking infrastructure within all ISPs, with the costs passed on to consumers in the form of higher Internet and wireless bills. Moreover, the proposal answers those who argue that Canada does not have the power to compel this level of content blocking on foreign services: the government says it will simply order those services blocked from the country if they fail to abide by Canadian content takedown requirements.

Where a company declines to take down content, the public can also file complaints with the new Digital Recourse Council of Canada. This regulatory body would have the power to rule that content be taken down. Hearings can be conducted in secret under certain circumstances. Layered on top of these two bodies is a Digital Safety Commission, which provides support to the Commissioner and the complaints tribunal.

Who pays for all this?

The Internet companies, of course. The proposed legislation will create new regulatory charges for OCSs doing business in Canada to cover the costs of the regulatory structure, as the companies will pay for the Digital Safety Commissioner, the Digital Recourse Council, and the Digital Safety Commission. As part of the payment requirements, the Digital Safety Commissioner can demand financial disclosures from OCSs to determine their Canadian revenues and ability to pay.

Far from constituting a made-in-Canada approach, the government has patched together some of the worst elements from around the world: 24-hour takedown requirements that afford little in the way of due process and will lead to over-broad content removals based on even questionable claims, website blocking of Internet platforms that will not abide by its content takedown requirements, a regulatory super-structure with massive penalties and inspection powers, hearings that may in some instances take place in secret, and regulatory charges that may result in less choice for consumers as services block the Canadian market. Meanwhile, core principles such as the Charter of Rights and Freedoms and net neutrality do not receive a single mention.

The government says it is taking comments until September 25th, but given the framing of the documents, this is clearly little more than a notification of the regulatory plans, not a genuine effort to craft solutions based on public feedback. For a government that was elected on promises of consultation and support for freedom of expression, the reversal in approach could hardly be more obvious.