Understanding current cybersecurity challenges in law: balancing responsibilities in digital content censorship (Article 4)

Broadcasting regulations and legislation have long governed information shared in the public domain, including newspapers, radio, and television. During the COVID-19 pandemic we saw many of our family, friends, colleagues, and acquaintances shift to online forums in order to quickly and efficiently access news, real-time statistics, and entertainment media, as well as to connect with each other during long periods of isolation. Prior to the pandemic, the government had already been exploring options for applying broadcasting and media law to online forums. The global economic, social, and human impact of the COVID-19 pandemic, combined with our collective shift to doing more things online, likely acted as a significant catalyst, expediting the legislative process. Indeed, many governments have looked to expand or revise the role of broadcasting and similar legislation in response to the expanded use of the internet and the extreme level of public reliance on online sources for news and other digital media.

In our previous articles in this series, we discussed the concept of data sovereignty – as it relates to both law and cybersecurity – as a legislative challenge in our global digital world, particularly highlighting the challenge of addressing cross-border data transfers. We also examined the role of digital governance and governance strategies in relation to the concept of digital social responsibility, particularly with respect to user-generated digital content. Finally, we explored the complexities inherent in assigning jurisdictional authority for the purpose of addressing activities performed, and content hosted, online. You can view our previous articles here:

We have now arrived at the fourth article in our six-part series. In this article, we will outline the foundations of digital content censorship, present the arguments for and against the use of legislation for censorship, and describe the polarizing challenges which continue to arise in legislative reform work, using Canada as an example of the approaches being considered.

Digital content and online censorship

We previously defined digital content and discussed some of the legal challenges that arise when addressing user-generated content in our second series article. As a refresher, recall that “digital content” is an umbrella term for all shared electronic, digital, and/or online content, referring to any information which is made available for download or distribution on electronic media. Put simply, digital content encompasses all forms of media content, such as images, videos, audio files, and text-based content. In effect, this includes movies, music, ebooks, mobile games, news media, blogs, video uploads, every social media share, retweet, reblog, status update, and more.

Flowing from our definition of digital content, user-generated content (UGC) – including user-generated digital content – refers to any content that is created by individuals rather than brands. Marketable user-generated digital content is born when customers create and disseminate ideas about a product, or the firm that markets it, online. Examples can include social media posts that mention a brand or company by name, online product reviews, ratings on online forums, consumer images, videos of a product being used by a customer, etc. By engaging with user-generated content, a corporation or brand can open up larger and more numerous consumer communication channels, allowing for more extensive customer interaction. This engagement has driven the high valuation of, and demand for, user-generated content within the corporate sphere.

While we previously grouped digital content stakeholders into three primary groups, approaching the issue through a legislative lens requires us to consider four stakeholder groups:

  1. content creators
  2. content consumers
  3. content hosts
  4. the wider social community in which the content has – or may have – an impact

Censorship involves the control or suppression of speech, public communication, or other information. This may be done on the basis that such material is considered objectionable, harmful, sensitive, or “inconvenient”. Censorship can be – and frequently is – conducted by governments, private institutions and other controlling or regulating bodies. Digital or internet censorship, correspondingly, is the control or suppression of what can be accessed, published, or viewed online. 

While individuals and organizations may choose to engage in self-censorship – for moral, religious, or business reasons, to conform to societal norms, due to intimidation, or out of fear of legal or other consequences – censorship legislation expands the reach of censorship to a regional or national scope, encompassing all people to whom the legislation applies. It removes the choice from individuals in favour of a common or unified decision made on behalf of the larger society, often executed by governmental authorities.

The extent of internet censorship varies on a country-to-country basis. While some countries have moderate internet censorship, others go as far as limiting access to information such as news and suppressing discussion among citizens. Cases of internet shutdowns and censorship have also occurred in response to – or in anticipation of – events like elections, protests, riots, and general political instability.

Digital social responsibility – the view in favour of censorship legislation

Digital social responsibility extends an entity’s social obligations related to governance to include those connected to its position and influence online – that is, digital governance.

User-generated content adds an extra layer of complexity to digital governance and social responsibility, as it is the intersection of the creator, the consumer, and the host. This blurs the line between corporate or governmental social responsibility as a form of oversight and individual civil responsibility, adding uncertainty to the mix. All of this becomes even more complex when we are faced with user-generated digital content which is transmitted, uploaded, or otherwise shared between legal jurisdictions. The legal issues which intersect with and relate to user-generated content can be wide-ranging: from determination of ownership and copyright to personal privacy protection and individual freedom of expression, to issues of cyberbullying, illegal content distribution, and large-scale corporate data breaches.

The intersection of digital social responsibility and user-generated content is a challenge to navigate within the law, as evidenced by the strong polarization of opinions between those who advocate for the rights of content owners – and of those who access the content – and those who argue in favour of increased governmental oversight and responsibility to act on behalf of the prescribed interests of the larger community.

The growth of digital content hosting platforms has catalyzed reactive legislative proposals, as online enterprises continue to retain significant profit from hosting such content, and as governmental oversight committees are vilified for their lack of preventative action in addressing illegal, legally questionable, or ostensibly “immoral” digital content.

Freedom of expression and access to information – the view against censorship legislation

“Broadcasting was one of a number of areas – the professions such as teaching, medicine and the law were others – in which special pleading by powerful interest groups was disguised as high-minded commitment to some greater good.” 

– Excerpt from Margaret Thatcher’s memoir, The Downing Street Years

When we discuss ideas of rights and freedoms, we are often examining a balance between “the individual” and “the society”. In this case, we are looking at the competing obligations of “negative rights” and “positive rights” in law and social order. “Negative rights” are rights that oblige inaction – the right to be free from an action, such as interference – while “positive rights” are rights which oblige action – the right to something, such as the right to legal counsel.

Negative rights can include civil and political rights such as freedom of speech, life, private property, freedom from violent crime, protection against being defrauded, freedom of religion, habeas corpus, a fair trial, and the right not to be enslaved by another. As a “negative right”, freedom of speech (or expression) requires that the government stay out of the way of individuals exercising speech, allowing people to reach a certain degree of agreement in some circumstances and to disagree in others.

When it comes to censoring digital content, those whose rights and/or freedoms are potentially infringed by censorship regulations are primarily the content creators and the content consumers. The rights of the content hosts – the online platforms or websites which host user-generated content – are also impacted, if not infringed upon.

Legislative reformation attempts in Canada – Bill C-11

The Canadian Federal Government’s current attempt to legislate online content has taken the form of Bill C-11, otherwise known as the Online Streaming Act. Bill C-11 came about following the highly controversial introduction, disastrous evaluation, and eventual demise of its predecessor, Bill C-10, in 2021.

First introduced by now-former Canadian Heritage Minister Steven Guilbeault, Bill C-10 proposed amendments to the Canadian Broadcasting Act, including a variety of related and consequential amendments to other Acts. The Standing Committee on Canadian Heritage (CHPC) began its study of the Bill after receiving an Order of Reference in February 2021, eventually conducting a total of 44 official meetings and hearing from 142 experts and witnesses regarding the potential impacts of the Bill. The Committee put forward many proposed amendments prior to submitting the report based on its study in June of 2021, following which the Bill was adopted by the Canadian House of Commons. The Bill later died when Parliament was dissolved ahead of the federal election held on September 20, 2021.

Rising from the ashes of C-10, Canadians are now faced with Bill C-11, which – like its predecessor – expands the Broadcasting Act, which grants the CRTC regulatory powers over radio and television, to cover all audiovisual content on the internet, including content on platforms like TikTok, YouTube, Spotify, and podcast clients.

Under Bill C-11, all platforms hosting audiovisual content that are not specifically excluded must make financial contributions to producing officially recognized CanCon (“Canadian Content”). They must also make CanCon “discoverable” by filling our feeds and search results with a mandatory quota of official CanCon, or face stiff financial penalties from the CRTC. Unfortunately for our content creators, “CanCon” is currently defined by a 1980s-era points system built around legacy broadcast media, largely excluding small and digital-first Canadian content creators.

Bill C-11 also gives the CRTC unprecedented regulatory authority to monitor all online audiovisual content. This power extends to penalizing platforms and, through them, the content creators that fail to comply. While Section 4(1) establishes a limited exception from regulation for some types of online audiovisual content, most audiovisual content will still be subject to CRTC regulation under the current draft of the Bill, because the three criteria used by the CRTC are so broad (a simple sketch of how the criteria combine follows the list below). The three criteria are:

  1. Whether the content generates revenue for someone, indirectly or directly.
  2. Whether any part of the content has been broadcast on a more traditional broadcasting platform.
  3. Whether the content has been assigned a “unique identifier” under any international standards system.
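
To illustrate how broadly these criteria sweep, consider the following minimal sketch in Python. The names and structure here are our own invention for illustration, not drawn from the Bill’s text. The point is that the test is disjunctive: content escapes CRTC regulation only if none of the three criteria applies, so a single match is enough to bring it back within scope.

    # Illustrative sketch only – invented names, not the Bill's actual text.
    # Under the three criteria, online audiovisual content is captured by
    # CRTC regulation if ANY one of them applies.
    from dataclasses import dataclass

    @dataclass
    class OnlineContent:
        generates_revenue: bool           # directly or indirectly, for anyone
        aired_on_traditional_media: bool  # any part broadcast traditionally
        has_unique_identifier: bool       # tagged under an international standard

    def subject_to_crtc_regulation(item: OnlineContent) -> bool:
        """Content is exempt only if none of the criteria apply."""
        return (item.generates_revenue
                or item.aired_on_traditional_media
                or item.has_unique_identifier)

    # A typical monetized video upload trips two of the three criteria:
    video = OnlineContent(generates_revenue=True,
                          aired_on_traditional_media=False,
                          has_unique_identifier=True)
    print(subject_to_crtc_regulation(video))  # True

As the sketch makes plain, it is the “any of” structure of the test, combined with the breadth of each criterion, that leaves so little content within the Section 4(1) exception.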

Unfortunately, once again, for our content creators, most content on the internet generates revenue in some form, for someone, somewhere – and much of it carries unique identifiers. As a result, most online audiovisual content is still at risk of being regulated and taxed by the CRTC, including all the things which fall under the umbrella label of “digital content” – that is, all shared electronic, digital, and/or online content which is made available to others online.

This has raised significant concerns among not only legislators, but also non-political individuals and entities including, but not limited to, academics, researchers, students, activists, industry leaders, small business owners, journalists, bloggers, artists, musicians, writers, social media influencers, and other content creators, content consumers, and content hosts.

Significant debate has ensued. Indeed, the public controversy unfolding around these activities has likely been a heavy influence on the federal government’s decision to forgo a reading of the amendments to Bill C-11 – which were, at the time, being presented to the House of Commons for a vote – with the ongoing governmental attempts to curb the debate even being described as “draconian” by some Members of Parliament.

Conclusion

The issue of content censorship has divided communities throughout modern history, from newspapers to radio broadcasting to television broadcasting and now to our digital forums. It is neither unexpected nor unprecedented; we still hear of many cases – even in our modern world – in which reporters and journalists have been arrested, jailed, sued for libel, stalked, threatened, physically attacked, and even murdered for the things they have written, spoken, or otherwise made available in the public domain.

While the concept of content censorship and the practice of broadcasting regulation are not new to our nations, the novel component in this challenge is the forum to which censorship regulations or legislation might apply: the internet. How individual nations will eventually decide to address their content regulations remains to be seen. What is certain, however, is the pressing need for governmental transparency and citizen involvement in the creation of these novel structures.

In our next article in this series on “Understanding current cybersecurity challenges in law”, we will venture into a comparative analysis of a few of the different strategies undertaken by governments to address some of the many emergent online challenges.