The Online Safety Bill: is the proposed law too drastic?

With the best of intentions, the government has now published the draft Online Safety Bill (OSB). It is designed to protect free speech and enhance democracy. But at the same time, it is designed to prevent harmful content that could take the form of hate speech, cyberbullying, terrorist propaganda and child abuse. The government’s stated objective is to make the United Kingdom the safest place in the world to be online.

The OSB will almost certainly become the Online Safety Act 2021. In its present form, the OSB may look good on paper: a significant section of the public consider that the major internet service providers do not do enough to protect us from harm or even pay enough tax. But when we examine the mechanisms in the OSB that are supposed to reconcile restrictions on online speech with free speech in a democratic context, significant difficulties in striking this balance are apparent.

What are regulated services?

Not all online businesses will be affected by the OSB: it applies only to online “regulated services”. Most “user-to-user” services, in which one party shares content with other internet users, are caught, as are most “search services” (anything that utilises a search engine and is not a user-to-user service). A regulated service must have “links” to the United Kingdom, in terms of where the service and its users are located or because United Kingdom residents are its primary target audience. Most providers of email and messaging platforms will not be within the OSB’s ambit.

The burden of compliance

The OSB has a noble legislative precedent: the General Data Protection Regulation (GDPR), which similarly makes few black-and-white rules. Instead, the GDPR imposes broadly drawn principles that organisations must decide how to implement on the basis of best-practice guidance, all under the watchful eye of the Information Commissioner’s Office (ICO). The OSB likewise sets out wide-ranging objectives which regulated services must meet by following yet-to-be-devised codes of practice. The nominated regulator will be OFCOM, which will have the power to impose fines of up to the greater of £18 million or 10% of the relevant regulated service’s global annual turnover.

All in all, this is a most unsatisfactory mixture. The vague, general-objectives approach will create huge uncertainty about risk, on top of the not-unreasonable fear of swingeing fines. And this is all in addition to the need for sky-high investment in compliance mechanisms and the potential for new avenues of civil litigation.

The worry for lawyers acting for regulated service providers is that advice on risk mitigation carries its own risk; it is quite impossible to recommend a zero-risk strategy. Advisers can only tell their clients to take down allegedly harmful but otherwise lawful content that could give rise to liability.

Obligations on regulated service providers

Rather like the “data protection impact assessment” stipulated by the GDPR, the OSB requires regulated services to carry out and keep up to date “illegal content risk assessments” in relation to the audiences their content targets. It is in this specific context that the balance between eliminating truly harmful content and preserving free expression becomes very difficult to strike.

“Harmful content” is regulated content that presents a material risk of “having, or indirectly having, a significant adverse physical or psychological impact” on an adult or child of “ordinary sensibilities”. This will often be quite difficult to assess without evidence, and private litigation may well piggyback on the OSB’s wording wherever negative impacts on individuals are found to have taken place.

Regulated services will therefore always need to remove harmful content while at the same time having regard for freedom of expression and the preservation of content of “democratic importance”, even though all of these concepts are spectacularly vague and difficult to apply in practice. Regulated service providers will need to implement effective decision-making procedures for removing content that might be deemed harmful under the legislation.

In essence, a complaints procedure must be set up to ensure that any journalistic or other responsible content removed out of caution is swiftly restored and republished. Aggrieved parties, not just journalists, will complain of restrictions on their freedom to bring matters of importance to the public’s attention if understandably overcautious service providers delete content and fail to reinstate it even after the complaints procedure has been exhausted.

Is there an impact on your business?

Unless you are Google, it can be a difficult exercise to decide whether you operate a user-to-user or a search service, because of the opacity of the definitions and exceptions. But if you are OSB-regulated, the impact will be the high cost of setting up compliance mechanisms before the Bill becomes law, as well as the penalties and litigation risk if you get it wrong. The government estimates that the likely impact on online businesses could be approximately £17 billion.

That may be manageable for Big Tech service providers, but it is prohibitive for new market entrants or modest-sized providers who simply cannot raise the investment before launch or bear the risks afterwards. Precious little legal clarity will be apparent until key areas of the OSB, even after likely Parliamentary amendment, have been litigated.

Conclusions

There is much in the OSB which is unlikely to survive in its present draft form as the Bill progresses through the legislative process. Parliamentarians will be very wary of its loose concepts and costly compliance burden, and of the risk that fear of regulatory penalties and civil litigation will effectively amount to censorship.

Everyone will agree that the usual sorts of online harm – child sexual exploitation, extreme pornography, trolling, terrorist and extremist content and the online sale of drugs and weapons – must be curtailed, but the OSB may simply not work in practice. The task facing OFCOM, which is expected to draft the codes of practice, is monumental and open-ended. Whether or not OFCOM does a good job, the outcomes are likely to be extensively challenged in the courts.

Perhaps even more importantly, will society accept the restrictions on free speech? Despite the language of the proposed legislation as it now stands, the OSB is anti-democratic because it represents a setback for edgy or controversial online expression, the result being a bland internet that is not in the public interest.

How Moore Barlow can help

If you or your business need legal representation and advice in this area, please contact our expert team today.

