August 4, 2021

-Western Standard

Online harms legislation proposed by the federal government concerns free speech advocates who say the framework could limit the discourse that happens on social media platforms.

The framework proposes a Digital Safety Commission of Canada that would include three bodies: the Digital Safety Commissioner of Canada, the Digital Recourse Council of Canada (DRC), and an Advisory Board.

Together, they would police what the proposal terms Online Communications Services (OCS) such as Facebook, YouTube, TikTok, Instagram, Twitter, and Pornhub.

The ostensible goal of the framework is to eliminate hate speech, terrorist content, content that incites violence, intimate images shared without the consent of those depicted, and child sexual exploitation.

OCSs would be required to implement measures to identify harmful content and to respond within 24 hours to content flagged by any user. The OCSs would have proactive and reactive reporting requirements with confidentiality restrictions, some of which would preclude the platforms from notifying affected users.

Platforms that did not comply could face fines from the Commissioner of up to $10 million or 3% of an entity’s gross revenue, whichever is higher. Alternatively, the Commissioner could refer offences to prosecutors, in which case the fines could reach $25 million or 5% of an entity’s gross global revenue.

The Commissioner could also apply to the Federal Court for an order requiring Telecommunications Service Providers to block or filter access to a service that has repeatedly refused to remove content the Commissioner dictated. He or she would also collect and share information with other government departments and agencies. The discussion paper calls for Canada’s spy agency to have a streamlined ability to get judicial authority to receive basic subscriber information of “online threat actors.”

The Commissioner could even apply for a warrant to send inspectors into workplaces and homes to acquire documents, software, and information related to a platform’s algorithms.

Political correctness seems strongly in mind. Section 35 (a)(ii) of the proposal’s technical paper tasks the commissioner with “Engaging with and considering the particular needs of and barriers faced by groups disproportionately affected by harmful online content such as women and girls, Indigenous Peoples, members of racialized communities and religious minorities and of LGBTQ2 and gender-diverse communities and persons with disabilities.”

The DRC would receive complaints against the Commissioner’s rulings and would consist of three to five members. In appointing members to the DRC, the Governor in Council is to consider “the importance of diverse subject-matter experts” from the aforementioned minority groups. Hearings of the DRC could be held in secret if it was deemed to be in the public interest on grounds of privacy, confidential commercial interests, national security, national defence, or international relations.

In an interview, Cara Zwibel of the Canadian Civil Liberties Association expressed concerns with the proposed legislation.

“It’s got some things in it that we, of course, were hoping it would not. It’s got 24-hour takedown requirements. It allows for website blocking. So there’s a lot in there that we’re pretty concerned about and we think Canadians will be concerned about,” Zwibel said.

“The big issue with the proposal is that there’s a potential to interpret these things very broadly, and by creating these 24-hour takedown requirements, you’re incentivizing social media companies to err on the side of removal, which is, obviously, a problem for freedom of expression.”

Zwibel is also concerned that the task of policing such large volumes of content could create a bloated bureaucracy without ultimately accomplishing its goal.

“This content just moves around. People try to get it taken down off this platform, it shows up on a different one; they try to get it taken off that one, it shows up on another one. So I’m not sure about the effectiveness of these tools,” Zwibel said.

“One of the most troubling things in the proposal has to do with the mandatory sharing of information between social media companies and law enforcement…. Co-opting of private companies as forms of law enforcement [is] a concerning development that we need to pay pretty close attention to.”
