Dr. Michael Geist is a law professor at the University of Ottawa where he holds the Canada Research Chair in Internet and E-commerce Law. He has obtained a Bachelor of Laws (LL.B.) degree from Osgoode Hall Law School in Toronto, Master of Laws (LL.M.) degrees from Cambridge University in the UK and Columbia Law School in New York, and a Doctorate in Law (J.S.D.) from Columbia Law School. Dr. Geist is a syndicated columnist on technology law issues with his regular column appearing in the Toronto Star, the Hill Times, and the Tyee. Dr. Geist is the editor of several copyright books including The Copyright Pentalogy: How the Supreme Court of Canada Shook the Foundations of Canadian Copyright Law (2013, University of Ottawa Press), From “Radical Extremism” to “Balanced Copyright”: Canadian Copyright and the Digital Agenda (2010, Irwin Law) and In the Public Interest: The Future of Canadian Copyright Law (2005, Irwin Law), the editor of several monthly technology law publications, and the author of a popular blog on Internet and intellectual property law issues.
Dr. Geist serves on many boards, including the CANARIE Board of Directors, the Canadian Legal Information Institute Board of Directors, the Canadian Internet Registration Authority, and the Electronic Frontier Foundation Advisory Board. He has received numerous awards for his work, including the Kroeger Award for Policy Leadership and the Public Knowledge IP3 Award in 2010, the Les Fowlie Award for Intellectual Freedom from the Ontario Library Association in 2009, the Electronic Frontier Foundation’s Pioneer Award in 2008, and CANARIE’s IWAY Public Leadership Award for his contribution to the development of the Internet in Canada, and he was named one of Canada’s Top 40 Under 40 in 2003. In 2010, Managing Intellectual Property named him one of the 50 most influential people on intellectual property in the world, and Canadian Lawyer named him one of the 25 most influential lawyers in Canada in 2011, 2012 and 2013.
Regulating What Canadians See Online: Why Bill C-10 Would Establish CRTC-Approved TikTok, YouTube and Instagram Feeds
The uproar over Bill C-10 has rightly focused on the government’s decision to remove safeguards for user generated content from the bill. Despite insistence from Canadian Heritage Minister Steven Guilbeault that users will not be regulated and from Prime Minister Justin Trudeau that users will not be required to make Cancon contributions, the reality is that the removal of Section 4.1 from the bill means that all user generated content is treated as a “program” under the Act and therefore subject to regulation by the CRTC.
That regulation is extensive and can include “discoverability” requirements that would allow the regulator to mandate that platforms prioritize some users’ content over others. Section 9.1(1)(b) of the bill states:
The Commission may, in furtherance of its objects, make orders imposing conditions on the carrying on of broadcasting undertakings that the Commission considers appropriate for the implementation of the broadcasting policy set out in subsection 3(1), including conditions respecting
(b) the presentation of programs for selection by the public, including the discoverability of Canadian programs;
Since the government is now treating user generated content as a program under the Act, this effectively reads that the CRTC can establish conditions respecting the presentation of user generated content for selection by the public, including the discoverability of user generated content.
This aspect of the CRTC’s powers and the government’s plans has not received much attention, but it raises the prospect of CRTC-approved feeds for services such as TikTok, YouTube, and Instagram. The government has said it plans new amendments that will address concerns about regulating user generated content, but it has also maintained that it wants to retain the discoverability requirements. Indeed, the Prime Minister specifically referenced those requirements in the House of Commons yesterday.
The government is trying to have it both ways, arguing that it doesn’t want to regulate user generated content and then proceeding to regulate it by establishing conditions on what content users may access in their social media services. This has direct implications for free expression as it will fall to a regulator to determine which speech is prioritized online. As David Fraser recently noted, “any regulation of how a platform presents expressive content to an audience implicates the content itself.”
I’ve written previously about the claims related to discoverability in Bill C-10, including the lack of evidence that there is a discoverability problem (the Yale report found very little) and the fact that finding Canadian content on a service such as Netflix only requires typing Canada into a search box. Yet beyond the ease with which Canadian content can be found on audio and video-on-demand services, no one – literally no other country – thinks that mandating domestic content requirements on user generated content platforms makes any sense whatsoever (as far as I can tell, no country does what Guilbeault and the government want to mandate. Some have pointed to Pakistan, which has extensive regulations, but those appear to primarily target content blocking rather than government-mandated content prioritization).
Guilbeault has frequently (and misleadingly) claimed that Bill C-10 is similar in approach to European Union regulation of audio-visual services. That claim is false, as I discuss in this post. But it should be noted that even the European Union approach – which involves considerable regulation – does not contemplate creating domestic content requirements for user generated content. Indeed, the directive explicitly treats audiovisual media services (such as Netflix) and video sharing platform services (such as YouTube) differently. Audiovisual media services that curate content face content requirements similar to those imposed on conventional broadcasters. Video sharing platform services face rules requiring the removal of certain illegal or harmful content, but there are no quotas and no positive obligations to prioritize some content over others.
Not only is such an approach unworkable (how would regulators even identify what counts as domestic user generated content?), but it would represent an exceptionally heavy-handed regulatory approach in which a government-appointed regulator decides what individual user generated content is prioritized in order to further “discoverability”, a term that isn’t even defined in Bill C-10. There is a need for greater transparency of the algorithms used by social media companies, but to turn the content choices in the social media feeds of millions of Canadians over to the CRTC is madness and an abdication of the government’s professed support for freedom of expression.