Online Safety Bill (UK): Compliance could cost a medium-sized business £250k per year

The Department for Digital, Culture, Media & Sport (DCMS) has published its Draft Online Safety Bill. The legislation requires providers of some digital services, defined in the bill as user-to-user services and search services, to take steps to shield children from illegal, inappropriate or harmful content, and puts safeguards in place to protect free speech online. Here’s what business leaders need to know:

  • Implementing this and similar measures in other jurisdictions will be a challenge
  • Fines will be calculated on a similar basis to those for GDPR breaches
  • Over 24,000 businesses will fall within the scope of the measures
  • Collectively, businesses will shoulder over £2bn in additional compliance costs
  • There are steps that can be taken to reduce compliance-related risks
  • Some services and content formats are likely to fall out of scope

Purpose of the Bill

Most would agree that legislation requiring digital service providers to exercise a duty of care to protect children from online harms is long overdue. Having spent the past two decades devising and implementing digital platforms involving user-to-user interactivity, I’ve seen first-hand the very real potential for children and the vulnerable to come to harm if such services are poorly policed. For many years I was the BBC’s go-to person for dealing with message-board users thought to be in vulnerable situations, and I was involved in an industry steering group working with the Foreign Office to address the use of social media for recruitment by terrorist organisations.

Many businesses already take their duty of care towards children and vulnerable people seriously, but some don’t. The fact that many of the businesses likely to fall within scope of the measures aren’t based in the UK and don’t have any assets here will make enforcement against them difficult, if not impossible, without blocking of the sort found in Russia, China, Saudi Arabia and elsewhere. Businesses that do have assets in the UK may simply decide to switch off interactive features for users here, limiting our access as a way of side-stepping the costs of compliance. The Government says the bill will make the UK the safest place in the world to use the internet, but it may inadvertently make the UK a place where user-to-user interactivity and search are limited in comparison to what is available to users elsewhere.

What is clear is that with a punitive regime of fines similar to those put in place to enforce GDPR, as well as potential custodial sentences for managers whose businesses fail to comply, the Online Safety Bill needs to be at the top of the boardroom agenda.

Impact on Global Business and the “Tech Giants”

Government messaging around the bill strongly suggests that the measures target only a handful of social media giants. In an op-ed for the Telegraph, Digital Secretary Oliver Dowden refers to global social media platforms by name or category nearly a dozen times. He also writes of curbing the powers of “a tech CEO or woke campaigner” to “silence” political discourse.

Long before this bill, many reputable brands invested in the deployment of human moderators and AI content-screening tools to identify and remove the type of content to which this bill applies. Here, the UK has a strong pedigree, with three of the world’s leading outsourced providers of these services headquartered here. The Government’s Impact Assessment takes the investments already being made by industry into account in suggesting that only 26 large businesses will be required to make significant changes to their content moderation programmes to comply, albeit with increased annual costs of £3.3m-£13.4m each (Impact Assessment: Table 19).

Nonetheless, for global digital businesses and social media channel providers, complying with the fragmented web of regulations originating in different markets is an ever-increasing challenge. We spoke with Laura Berton, an English lawyer who spent nearly a decade advising fast-moving technology businesses in Silicon Valley as a Partner at a leading global law firm before recently returning to Europe, who said:

“The controversy that started with the Online Harms White Paper over legal but harmful content illustrates how complex the introduction of the new Online Safety Bill will be for global online platforms. Beyond the complexity of having to deal with numerous overlapping new pieces of legislation across both the UK and the European Union on artificial intelligence, data privacy, data security, digital services, etc. Social media platforms will also have to confront the philosophical gap between the US and Europe, and decide how to reconcile in one global policy (and internal process) the holy grail of freedom of speech and protection of their users.”

Impact on Medium Sized Businesses

The Online Safety Bill, however, won’t just impact the giants of Silicon Valley. According to the Government’s own impact assessment, under the preferred option (2) approximately 45,000 businesses with a digital presence accessible to users in the UK will incur costs in determining whether the bill affects them, with an estimated 24,300 falling within scope of the measures. Collectively, the cost burden to business is estimated at a staggering £2.12 billion over the next ten years.

There’s a particular sting in the tail for the 236 medium-sized businesses (Impact Assessment: Table 19) which will each face additional annual content moderation costs of £255,662 before other costs of compliance are factored in.

Reducing Compliance Risks

Whilst some businesses will comply, others may well decide that the costs, together with the risk of heavy penalties (fines of up to a maximum of £18m or 10% of qualifying worldwide revenue, and criminal conviction with the possibility of custodial sentences for responsible managers), are simply too burdensome to take on. Those businesses face hard choices between:

  • Removing the functionality that brings them within scope of the bill’s measures
  • Disabling such functionality for UK users
  • Blocking all access from the UK (as some businesses did for EU users when GDPR was introduced)
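As a back-of-the-envelope illustration of the penalty exposure described above, the cap can be sketched in a few lines of code. The function name is made up for illustration, and the simple “greater of the two figures” reading is an assumption, not a legal interpretation of how the bill defines or applies qualifying worldwide revenue:

```python
def max_fine_gbp(qualifying_worldwide_revenue_gbp: float) -> float:
    """Illustrative ceiling on fines under the draft bill: the greater of
    £18m or 10% of qualifying worldwide revenue. A sketch only; what counts
    as 'qualifying worldwide revenue' is a matter for legal advice."""
    return max(18_000_000, 0.10 * qualifying_worldwide_revenue_gbp)

# A business with £500m qualifying worldwide revenue faces a ceiling of £50m;
# a £10m-revenue business still faces the £18m floor.
print(f"£{max_fine_gbp(500_000_000):,.0f}")  # £50,000,000
print(f"£{max_fine_gbp(10_000_000):,.0f}")   # £18,000,000
```

The point of the sketch is that the £18m figure acts as a floor on the maximum penalty: for any business with qualifying worldwide revenue below £180m, the potential fine ceiling is £18m regardless of size.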

Some Businesses Out of Scope

The good news is that some types of service, and some formats of user-generated content and online discussion, have been excluded from the scope of the bill, for example in-line comments on blogs and on the websites of news publishers and broadcasters. It also appears that low- or no-risk services that are unlikely to attract children, for example a financial products website requiring registration, would not be required to comply with the age-verification procedures outlined in the bill. But there are also many grey areas: for example, it isn’t entirely clear whether brands operating a presence on a third-party social media service such as Facebook could be held responsible for compliance failures that occur on their branded fan pages.

What is absolutely clear is that businesses need to review and understand the bill, determine whether they fall within scope of its measures, and understand the steps (and costs) they’ll need to take to comply.

Hume Brophy Support for Clients

Whilst the draft bill has been published, there is still time for businesses concerned about the potential costs of compliance, which may also require alterations to business models, to make their voices heard. Experts from Hume Brophy’s Digital Practice and Regulatory Affairs team, along with specialist legal partners, stand ready to assist clients seeking insight into how the Online Safety Bill (along with other legislation such as the EU’s Digital Services Act and Digital Markets Act) might impact their business.


NOTES:

The information provided in this article is for information purposes only and must not be construed as, or relied upon as, actionable legal advice.

The author of this article, Robin Hamman, is Hume Brophy’s Group Director of Digital Strategy. He holds a degree in English Law and was a Non-Residential Fellow of The Center for Internet and Society at Stanford Law School. Robin has worked with brands including the BBC, NHS England, Shell, Sony Playstation and others to introduce content moderation services aimed at reducing incidents of online harms.

Laura Berton, quoted in the article, is qualified to practise English law and spent nearly a decade advising fast-moving technology businesses in Silicon Valley as a Partner at a leading global law firm. She has recently moved to France and continues to consult with digital and technology businesses.