Issues like the EU’s latest regulatory push or recent U.S. Supreme Court hearings may have dominated recent tech policy headlines, but less attention has been paid to serious legislative tinkering in the United Kingdom that could have global impacts. Despite high turnover in government, ministers and Parliament have made a series of significant amendments to the already highly controversial Online Safety Bill (OSB). Taken together, these changes could significantly alter the way the internet is experienced in the United Kingdom and establish deeply problematic precedents that would further embolden governments around the world.

The OSB is the result of five years of on-again, off-again government attention to the issue of “online harms,” having wound its way through a green and a white paper, the re-branding of the Department for Digital, Culture, Media and Sport, and now the creation of a new digital ministry, developments that together have spanned the tenures of four prime ministers. The original “online harms” approach, initially set out in 2019, centered on an ill-defined “duty of care” standard requiring companies to prevent certain harmful user-generated content, including so-called “lawful but awful” content. Through recent amendments, the government has simultaneously narrowed and expanded the Bill’s scope by limiting liability for failure to remove content that is not illegal, while broadening the range of priority offenses (illegal content) covered and expanding the severity, scope, and reach of enforcement powers. The Bill now sits with the sometimes feckless House of Lords, which represents the last chance for further amendments.

Of the many questions and concerns that the OSB raises, several aspects deserve attention for how they resemble tactics traditionally associated with regimes seeking to increase their leverage over tech companies in order to surveil their populations and censor information – efforts that are also often “justified” in the name of “online safety.” These provisions unnecessarily risk validating and encouraging further government repression and should be reconsidered by the House of Lords. Three aspects in particular deserve to be called out and considered carefully against the UK’s “constitutional” framework and international commitments: extraterritorial application, individual liability, and the use of digital laws to regulate offline behavior.

The Sun Will Never Set on British Online Enforcement

Governments around the world have expressed frustration for years about their inability to get the attention of (not to mention obedience from) tech companies. Until recently, the countries that typically acted most aggressively to expand leverage over tech companies were those seeking to exert more control over data and information for repressive purposes. With the OSB, the UK would join several other democratic countries that have recently added themselves to that list by expanding enforcement powers and enacting personnel localization (aka “hostage taking”) provisions. While the instinct to establish jurisdiction and enforcement power is understandable, the Peers should ask themselves if such an approach is actually, in the language of human rights, “necessary and proportionate” to the legitimate objectives that the law is seeking to address. 

The OSB’s “safety duties” apply to any internet user-to-user or search service, regardless of its location, that targets UK users or has a significant number of UK users, as well as if “there are reasonable grounds to believe that there is a material risk of significant harm to individuals in the [UK]” presented by the service. While the user-base and targeting criteria are relatively uncontroversial (if underdefined), the “material risk of significant harm” prong raises questions as to how such a determination might be made. The Bill also makes clear that certain “information offenses” will apply regardless of whether they take place in the UK “or elsewhere.” These “information offenses” include failure to comply with a requirement included in an “information notice” from Ofcom, the OSB-empowered regulator, as well as providing information in response to a notice that is “encrypted such that it is not possible for Ofcom to understand it.” While the latter offense contains a mens rea provision (“intent”), that may be cold comfort since the innate purpose of end-to-end encryption (E2EE) is to make underlying content unreadable to anyone but the sender and recipient. (It is worth noting that other provisions in the Bill have been criticized for potentially undermining the ability of private messaging services to deploy E2EE.) For good measure, the Bill also makes clear that it applies extraterritorially to any officer for any offense committed by a covered entity that “is attributable to any neglect on the part of [the] officer of the entity.”

While it may feel good and be politically savvy for the government to assert that it will reach to the ends of the earth (i.e., Silicon Valley) to hold companies accountable for harms that occur in the UK, the likelihood that it will have to go after individual employees located abroad is incredibly slim. That is in part because doing so will be unnecessary: the rest of the OSB delivers extensive, effective penalties to ensure compliance. Specifically, the Bill allows Ofcom to assess fines of up to 10 percent of global revenue on covered entities (which, based on last year’s reported revenues, would amount to fines of $120 million, $16 billion, and $28 billion for Twitter, Meta, and Google respectively), as well as to obtain court orders against third parties to deny non-compliant entities business facilities or, as a last resort, to block them within the UK.

The more likely impact of this flexing of jurisdictional muscle will be to empower compliance lawyers within companies, whose job is to minimize legal risks, however slim. These unheralded corporate actors will increasingly bring experience with financial and export-control regulations to bear on content regulations and are likely to recommend aggressive approaches to complying with the broad provisions in the OSB. It will also contribute to the ongoing competition among countries around the world, regardless of their democratic credentials, to take unnecessarily expansive approaches to their own regulatory powers. Collectively, these efforts make conflicts of law between jurisdictions more likely and could force companies to make radical decisions, such as providing significantly different versions of products and services, or even withholding them altogether, in distinct countries. This in turn will limit benefits that we take for granted, such as the way the internet has been used to foster the development of cross-border communities, trends, and commerce.

Getting a Bit Too Personal

Beyond extraterritorial application, in what appears to be another cathartic but unnecessary move, recently proposed amendments would extend personal, criminal liability to “senior managers” for persistent breaches of their duty of care to children. The text of the Bill that passed out of the House of Commons earlier this year already included provisions holding any “officer” (defined as “a director, manager, associate, secretary or other similar officer”) accountable for any information offense by the entity they work for, if it is committed with their “consent or connivance” or is “attributable to any neglect” on their part. This includes criminal liability for “failure to take all reasonable steps” to avoid presenting false or encrypted information, or to avoid destruction of documents.

On top of this, a new amendment proposed by the government would provide criminal sentences of up to two years for senior managers who fail to comply with the child protection duties set out in the Bill. This amendment was apparently made to satisfy Conservative party “back benchers” upset with the removal of “lawful but harmful” liability. Given this political context, it is not surprising but still disappointing that the government has failed to articulate a clear case for why institutional liability is insufficient or why personal, criminal sanctions are necessary. As Jacqueline Rowe of Global Partners Digital has recently pointed out, the lack of any requirement to exhaust alternative remedies before pursuing criminal sanctions is out of keeping with other countries’ approaches, the focus on content targeting children could lead to disproportionate impacts on children’s free expression rights, and the lack of definitional clarity around the conduct being criminalized raises questions about its consistency with the UK’s international commitments.

Locally based tech company staff have long been targets for non-democratic governments seeking to compel compliance with their censorship or surveillance orders. While authoritarian governments are likely to continue to engage in such “hostage taking” regardless of what the UK does, having a leading, democratic country ratify this approach undermines the moral weight of arguments made in response to such pressure.

Regulating the Analog World, One Post at a Time

Late last year, in what it framed as a safeguard for freedom of expression, the UK government removed provisions providing liability for failure to remove “lawful but harmful” content. However, since then the government has significantly expanded the list of “priority offences” covered by the Bill (and its associated, expanded enforcement mechanisms). In addition to creating new categories of illegal content, the OSB would bring a wide range of existing crimes into the regulatory and enforcement scheme that it creates. 

The Council of Europe’s Convention on Cybercrime (“Budapest Convention”) is a 20-year-old treaty that defines and seeks to harmonize the scope of “cybercrime,” as well as related criminal procedures and mechanisms for intergovernmental cooperation. The UK ratified the Convention in 2011 and has been a staunch advocate for its approach since then, including most recently as part of the UK’s pushback against the Russia-led efforts to define cybercrime more broadly through the U.N. Ad-Hoc Committee on Cybercrime. The OSB incorporates a range of cybercrimes that fall into categories covered in the Budapest Convention, such as misuse of data or computer systems, computer-related crimes, and child pornography.

In addition to these crimes that are committed through the use of computers, the government has been stuffing more and more “analog” crimes that cannot be committed online into the Bill. Most controversially, the government recently introduced amendments that declare “Modern Slavery” and “immigration offenses” as “priority offenses.” As Secretary of State for Digital, Culture, Media and Sport Michelle Donelan has explained, “[a]lthough the offences . . . cannot be carried out online . . . aiding, abetting, counselling, conspiring etc those offences by posting videos of people crossing the channel which show that activity in a positive light could be an offence that is committed online and therefore falls within what is priority illegal content.” 

As the Budapest Convention makes clear, there is nothing wrong with using digital evidence to convict people who break the law. However, it is harder to justify why a website or platform should be held legally responsible for removing or prohibiting content related to border crossing or sex trafficking. At the same time, it isn’t hard to imagine how such an expansive approach would result in companies erring on the side of caution, which could limit journalistic content, pro-immigration content, or even anti-trafficking content that gets caught up in imprecise filters. As observers in the United States have pointed out, the last effort by Congress to allow intermediary criminal liability for sex trafficking (through FOSTA/SESTA) has resulted in a range of unintended consequences.

This application of digital law to analog behavior is even more problematic considering that the OSB introduces an apparently novel and legally untested threshold for companies to remove content: “reasonable grounds to infer” illegality. This, combined with the limited defenses provided, exacerbates the risk that covered entities will choose to be very conservative about what content they allow UK users to see. This in turn means that UK users may have less opportunity to understand and participate in global conversations about sensitive topics that are mediated online, which amounts to quite a few conversations these days.

While such an approach would likely be read as the kind of “prior restraint” prohibited under the U.S. First Amendment (or “prior censorship” prohibited under Article 13 of the American Convention on Human Rights), it is not clear how this will be interpreted under UK law, which grants wider latitude for limitations on expression. What is clear is that authoritarians have long coveted similar powers to repress politically inconvenient or contentious content and would welcome the opportunity to respond to any critiques thereof with claims of hypocrisy or moral equivalence.

Time to Focus

Governments should consider how best to incentivize tech companies to be responsible for the harms that occur on, through, and as a result of their products, services, and processes. However, if governments are serious about their commitments to protecting freedom of expression, privacy, and the open, interoperable, secure, and global internet, they must act thoughtfully and responsibly. 

The provisions addressed above are neither necessary nor proportionate to the valid purposes that the OSB seeks to address. The UK regularly imposes significant monetary penalties on companies in other regulatory contexts, including on foreign tech companies, and there is no evidence that UK regulators face any systemic problems with compliance. For a country like the UK, in which all the major tech companies have staff, and which has been granted privileged access to data held by U.S.-based tech companies, to use the OSB to grab unnecessary powers is unseemly and, in the long run, counter-productive. Similarly, the extension of the Bill’s scope to regulate offline conduct is a step too far that the Peers should walk back.

There is much in the OSB that is both thoughtful and responsible, including its provisions around transparency, non-discrimination, and fairness. These aspects are in line with the kinds of human rights-compliant approaches that digital rights groups have advocated. But those are overshadowed, and at risk of being negated, by some of the more politically motivated, hyperbolic aspects. The House of Lords must take advantage of its review and opportunity to amend the Bill to focus it and strip away unnecessary and problematic provisions. This will not only result in a more efficient, less legally vulnerable regulatory framework, but also ensure that the UK government can continue to hold its head up high and assert itself as a global leader on matters of international human rights law, due process principles, and economic and technological progress.

This post is written in the author’s personal capacity.

IMAGE: In this photo illustration, a teenage child looks at a screen of age-restricted content on a laptop screen on January 17, 2023 in London, England. The Online Safety Bill aims to protect young and vulnerable viewers by introducing new rules for social media companies which host user-generated content, and for search engines, which will have tailored duties focused on minimizing the presentation of harmful search results. (Photo by Leon Neal/Getty Images)