Online harms legislation proposed by the federal government concerns free speech advocates who say the framework could limit the discourse that happens on social media platforms.

The framework proposes a Digital Safety Commission of Canada that would include three bodies: the Digital Safety Commissioner of Canada, the Digital Recourse Council of Canada (DRC), and an Advisory Board.

Together, they would police what the proposal terms Online Communications Services (OCS) such as Facebook, YouTube, TikTok, Instagram, Twitter, and Pornhub.

The ostensible goal of the framework is to eliminate hate speech, terrorist content, content that incites violence, intimate images shared without the consent of those depicted, and child sexual exploitation.

OCSs would be required to implement measures to identify harmful content and to respond within 24 hours to content flagged by any user. The OCSs would have proactive and reactive reporting requirements with confidentiality restrictions, some of which would preclude the platforms from notifying affected users.

Platforms that did not comply could face fines from the Commissioner of up to $10 million or 3% of an entity’s gross revenue, whichever is higher. Alternatively, the Commissioner could refer offenses to prosecutors, in which case the fines could reach $25 million or 5% of an entity’s gross global revenue.

The Commissioner could also apply to the Federal Court for an order requiring Telecommunications Service Providers to block or filter access to a service that has repeatedly refused to remove content the Commissioner ordered taken down. The Commissioner would also collect and share information with other government departments and agencies. The discussion paper calls for Canada’s spy agency to have a streamlined ability to get judicial authority to receive basic subscriber information of “online threat actors.”

The Commissioner could even apply for a warrant to send inspectors into workplaces and homes to acquire documents, software, and information related to the platforms’ algorithms.

Political correctness appears to be a strong consideration. Section 35 (a)(ii) of the proposal’s technical paper tasks the commissioner with “Engaging with and considering the particular needs of and barriers faced by groups disproportionately affected by harmful online content such as women and girls, Indigenous Peoples, members of racialized communities and religious minorities and of LGBTQ2 and gender-diverse communities and persons with disabilities.”

The DRC would receive complaints against the Commissioner’s rulings and would consist of three to five members. The Governor in Council, in appointing members to the DRC, is to consider “the importance of diverse subject-matter experts” from the aforementioned minority groups. Hearings of the DRC could be held in secret if it was deemed to be in the public interest for privacy concerns, confidential commercial interests, national security, national defense, or international relations.

In an interview, Cara Zwibel of the Canadian Civil Liberties Association expressed concerns with the legislation.

“It’s got some things in it that we, of course, were hoping it would not. It’s got 24 hour-takedown requirements. It allows for website blocking. So there’s a lot in there that we’re pretty concerned about and we think Canadians will be concerned about,” Zwibel said.

“The big issue with the proposal is that there’s a potential to interpret these things very broadly, and by creating these 24-hour takedown requirements, you’re incentivizing social media companies to err on the side of removal, which is, obviously a problem for freedom of expression.”

Zwibel is also concerned that dealing with such large volumes of content could create a bloated bureaucracy that still fails to complete the job.

“This content just moves around. People try to get it taken down off this platform, it shows up on a different one; they try to get it taken down off that one, it shows up on another one. So I’m not sure about the effectiveness of these tools,” Zwibel said.

“One of the most troubling things in the proposal has to do with the mandatory sharing of information between social media companies and law enforcement…. Co-opting of private companies as forms of law enforcement [is] a concerning development that we need to pay pretty close attention to.”

The proposal is billed as the regulatory complement of Bill C-36. In the combination, Lisa Bildy of the Justice Centre for Constitutional Freedoms (JCCF) sees trouble.

“This is, frankly, one of the most egregious attacks on the free society in living memory. It undermines the liberal legal order, which protects freedom of expression, the marketplace of ideas, constitutional neutrality, and important legal protections like the presumption of innocence,” Bildy told the Epoch Times in a written statement.

Bildy said the proposal dovetails with other “dangerous legislation” that the JCCF is already preparing to challenge as unconstitutional.

“They all appear to be related. Bill C-36 proposes to punish a much broader range of ‘hate speech’ in disturbing ways, and Bill C-10, with the help of the new digital safety bureaucracy, will ensure that it is pulled down from the internet immediately. The whole scheme treats freedom of expression as a threat to, rather than a feature of, a liberal democracy.”

In a recent blog post, University of Ottawa law professor Michael Geist also condemned the legislation.

“Far from constituting a made-in-Canada approach, the government has patched together some of the worst from around the world,” Geist wrote.

“The government says it is taking comments until September 25th, but given the framing of the documents, it is clear that this is little more than a notification of the regulatory plans, not a genuine effort to craft solutions based on public feedback.”

Zwibel believes the expected federal election could provide another valuable opportunity for Canadians to weigh in.

“It maybe will be a topic of discussion for if there is an election, what we want to see happen with regulating the social media companies,” Zwibel said.

“There is an opportunity for Canadians to say, ‘This isn’t what we want. This isn’t something we think will be effective,’ or ‘We think it will have dangerous consequences.’”

Harding is a Western Standard correspondent based in Saskatchewan.






2021-08-04 18:00:00
