LawBytes Podcast – Episode 22. This episode was automatically transcribed; the transcript may contain errors.
Michael Geist:
This is Law Bytes, a podcast with Michael Geist.
Dan Albas:
Minister, number nine of your guidelines: free from hate and violent extremism. The prime minister has obviously been talking a lot about protecting Canadians from hate and violent extremism, as well as disinformation. Now, I believe no one here defends hate speech, and all Canadians deserve to feel safe in their communities and online. My question is, how will you enforce this measure? How will you monitor these platforms while also protecting free speech?
Navdeep Bains:
So free speech is absolutely essential. It’s part of our Charter rights and freedoms. This is why I became a Liberal. And this is really core to our democracy and what it means to be Canadian. But at the same time, there are clear limitations to that when it comes to hate, for example. And we see newspapers and broadcasters that hold themselves to account when it comes to not spewing that kind of hate on their platforms. So clearly, these digital platforms that have emerged also have a responsibility. We are all very aware of the 51 individuals who were killed in Christchurch, New Zealand. And that really prompted this call to action, where the prime minister was in Paris to say platforms need to step up. If they have the technology, if they have the ability to bring people together, to connect people, and they are investing in A.I. and all these different technologies, they need to deploy those technologies to prevent their platforms from being used as a means to disseminate extremism, terrorism or hate. And so what we’re trying to do as a government is really apply pressure to these platforms to hold them to account. And those platforms recognize they need to step up as well. And that’s one key mechanism of how we want to deal with this.
Michael Geist:
The debates over intermediary liability, which focus on what responsibility Internet platforms and service providers should bear for the content they host that is posted by their users, have been taking place around the world in parliaments, op-ed pages and the broader public debate. Much like the exchange you just heard between Canadian Conservative MP Dan Albas and Innovation, Science and Economic Development Minister Navdeep Bains from earlier this spring, there are no easy answers, with policy choices that have implications for freedom of expression, online harms, competition and innovation. To help sort through the policy options and their implications, I’m joined on the podcast this week by Daphne Keller, the director of Intermediary Liability at Stanford’s Center for Internet and Society. Daphne, who served as associate general counsel for Google until 2015, worked on groundbreaking intermediary liability litigation and legislation around the world while at the company, and her work at Stanford focuses on platform regulation and Internet users’ rights. She recently posted an excellent article on the Balkinization blog that provided a helpful guide to intermediary liability lawmaking, and agreed to chat about how policymakers can adjust the dials on new rules to best reflect national goals.
Michael Geist:
Daphne, thanks so much for joining me on the podcast.
Daphne Keller:
Thank you. It’s good to be here.
Michael Geist:
Great. So as you know, there’s been a lot of momentum lately towards regulating online speech and establishing more liability for the large Internet platforms. That’s an issue that we’ve been grappling with, really, I think, since the popularization of the Internet back in the 1990s. But today there seem to be more and more countries expressing concern about online harms and looking especially to the large platforms, the Googles and Facebooks and others, to do more, with real legal and regulatory threats if they don’t. So before we get into some of the challenges inherent in these kinds of “do something” demands, I thought we could set the groundwork a little bit from a couple of perspectives: both what the law says now and what the platforms have been doing. Why don’t we start with the laws, recognizing there are differences, of course, between lots of different countries. Where have we been for the last number of years, even going back a couple of decades, with respect to some of these liability questions?
Daphne Keller:
Well, a l