
We analyse the key features of the UK Online Safety Bill, the current legislation, and how the Bill aims to make the online world safer for all.

Ethical use of technology and the internet

The ethical use of technology and data is a key point of interest for consumers, providers and regulators of digital services, and the Online Safety Bill (OSB) is a significant new piece of legislation with lofty aims for the UK tech landscape.

The growing role of the internet and technology has in many ways benefited society, making communication and access to information easier. However, the prevalence of online activity also brings risks, as not all accessible content is beneficial, and it has never been easier for children to use the internet to communicate with others or access content.

There is an ongoing tension between the inherent lack of internet censorship, which facilitates freedom of speech and debate, and the need to prevent the dissemination of harmful content. If the use of the internet is to continue to have a beneficial impact on our lives, it is becoming increasingly essential that internet ethics are laid out to encourage responsible usage. The OSB marks an important step in moving the internet from an unregulated wilderness to a place where service providers are required to behave ethically and take responsibility for the content they host.

Current legislation

Currently, the Communications Act 2003, as amended by the Audiovisual Media Services Regulations 2020, governs video-sharing platforms (VSPs). VSPs are required to implement various measures to protect minors from harmful content and to protect the public from incitement to violence or hatred and from criminal content that provokes terrorism or other offences.

More generally, for businesses that host online user-generated content, the Government continues to follow the approach of the EU e-Commerce Directive (ECD). This approach is known as the ‘notice and take down’ regime, under which a service provider only risks liability if it has actual knowledge of illegal activity or information on its platform and fails to remove it. The OSB will repeal the existing legislation applicable to VSPs and will extend regulation to other online platforms.

What is the OSB and what effect will it have?

The Government published the OSB in draft form in May 2021 and is likely to put it before Parliament later this year, with its stated intention being “to make the UK the safest place in the world to be online while defending free expression”.

The OSB will impose a duty of care on businesses that provide online services which are ‘in scope’ of the OSB to prevent the spread of illegal content and activity online and to ensure that children and adults who use their services are not exposed to harmful content. In general terms, the OSB will require those ‘in scope’ businesses to implement systems which minimise the presence of illegal material (such as terrorist content) and harmful content (such as abuse) on online platforms, and will reduce the timeframe in which illegal and harmful content must be removed.

The OSB will apply to any business providing the following ‘in scope’ services to UK users (wherever the business is based):

  1. Services that host user-generated content (UGC) (for example TikTok)
  2. Services that facilitate online interaction between users (for example Twitter)
  3. Search engines (for example Google). 

The above are referred to as ‘regulated services’ for the purpose of the OSB and this article. The first two regulated services are ‘user-to-user’ regulated services since they enable users to generate, upload or share content that other users can encounter, while the third is a regulated ‘search service’.

Exemptions

A business will potentially fall within the scope of the OSB where it offers users the ability to interact with each other or to conduct searches on its systems, for example message boards for employees. However, an exemption is available for internal company message boards that are accessible only by a closed group of people who work for that business. Exemptions can also apply to business intranets, collaboration tools and database management software. The following will also not be affected by the OSB:

  • business-to-business services (for example where one business grants a software licence to another business which enables the latter business to host user-to-user content. The first business will not be subject to the OSB in connection with any UGC, as it has no control over the UGC that the second business hosts)
  • internet service providers (these providers are excluded as they lack control over UGC)
  • low-risk business use (where users’ comments relate to digital content directly published by a platform/service. This includes reviews and comments on products and services directly delivered by a business, as well as ‘below the line’ comments on articles and blogs)
  • e-mail or text messaging services
  • content published by a news publisher on its own website including user comments on that content.

As well as the types of services that are not in scope, there will also be types of content which, although sent or posted on a service which is in scope of the OSB, will be viewed as ‘excluded’. This exemption is for content which is of journalistic or democratic importance. In practice, this means that news platforms will not face an increased legal burden as a result of the OSB becoming law, with the intention of balancing the protection of individuals against the need to avoid censorship and encourage democratic political debate.

New duties under the OSB

The OSB outlines separate sets of duties for user-to-user services such as TikTok and search services such as Google.

1. User-to-user regulated services

General duties
For all providers of user-to-user regulated services the duties imposed by the OSB include:

  • undertaking risk assessments in relation to illegal content
  • maintaining systems to minimise the presence of illegal content and the length of time for which it is present
  • protecting users’ right to freedom of expression within the law and protecting users from unwarranted infringements of privacy when implementing safety policies
  • operating the platform using systems that allow users to easily report illegal and harmful content
  • making and keeping a written record of every risk assessment carried out and every step taken to comply with any duty.

In practice, this means that a business providing regulated user-to-user services will have to assess the likelihood of a user of the service encountering any illegal content, having regard to the systems the business uses to organise and present content. A business will also have to maintain a system whereby users can report illegal and harmful content so that it can be removed.

Additional duties for services likely to be accessed by children
In the case of regulated user-to-user services which are likely to be accessed by children, enhanced duties apply in addition to those outlined above:

  • children’s risk assessment: businesses must identify the number of users who are children in different age groups and the likelihood that any user in those age groups will see harmful content on the platform, as well as identifying the severity of harm that the user may suffer as a result
  • protection of children's online safety: where a children’s risk assessment of a service identifies the presence of content that is harmful to children, the business must notify Ofcom.

Category 1 services
Compliance requirements will also be enhanced if a user-to-user regulated service is a ‘Category 1’ service under the OSB. The OSB will establish three categories of regulated services: Category 1 and Category 2B for user-to-user services, and Category 2A for search services. Category 1 services are popular services with many users, and they will be required to comply with the following additional duties:

  • adults’ risk assessment: a duty to assess the user base of the service, the likely exposure of any individual adult user to harmful content and the effect this may have on that adult. In contrast to the general risk assessment duty, this requires a business to assess the risk of an adult encountering any harmful content, not just illegal content
  • protection of adults’ online safety: a duty to notify Ofcom of the kinds of content harmful to adults found on the service and how often it appears
  • rights to freedom of expression and privacy: businesses are compelled by the OSB to assess the likely effect that any safety policy would have on users’ freedom of expression
  • protection of content of democratic importance: businesses must consider the importance of free expression before deciding whether to take down any content and whether to restrict the ability of a user to share content in future, with these policies to be set out in the terms of service
  • protection of journalistic content: in addition to the duties regarding content of democratic importance, businesses will have to establish an expedited complaints system for users to appeal any removal of content on the basis that it is journalistic content
  • reporting and redress: businesses will have to go further than merely providing a system of reporting illegal and harmful content and also have to provide a means for complaining to the business if it is not complying with its safety duties under the OSB, or if the business has removed content that the user does not consider to be restricted by the OSB.

It is anticipated that most user-to-user services will fall under Category 2B, meaning that businesses will have to take proportionate steps to address illegal and harmful content and to protect children if their platform is likely to be accessed by children, but will not have to comply with the enhanced duties imposed on Category 1 services.

2. Regulated search services

General duties
All providers of regulated search services must comply with the following duties:

  • undertaking an illegal content risk assessment
  • mitigating the risk of any harm arising from any illegal content identified in the risk assessment
  • providing systems that allow users to easily report illegal and harmful content
  • protecting rights to freedom of expression, meaning that providers of search services will have to be careful that their systems to remove harmful content do not remove content that is journalistic or of democratic importance
  • keeping a written record of every risk assessment carried out and every step taken to comply with any duty.

Search services likely to be accessed by children
Where search services are likely to be accessed by children, additional duties will apply:

  • children’s risk assessment: identifying risks to children having regard to the age groups of users and the way in which that particular search service organises and presents the results of a search
  • protection of children's online safety: where a children’s risk assessment of a service identifies the presence of content that is harmful to children, a duty to notify Ofcom.

The Category 1 enhanced duties set out above in relation to user-to-user services will not apply to regulated search services.

The OSB defines content as “harmful” if the service provider has reasonable grounds to believe that there is a material risk of the content having, or indirectly having, a significant adverse physical or psychological impact on a child or adult.

Ofcom powers

Compliance with the OSB is to be overseen and enforced by Ofcom.

As noted above, one of the general duties for in-scope businesses will be to keep records of risk assessments and of any steps taken to comply with their duties. The OSB will empower Ofcom to serve an “information notice” on a business to compel disclosure of this information so that Ofcom can determine the business’s compliance with the OSB.

Ofcom will also gain new powers to issue a “technology warning notice” to a business where Ofcom considers it is failing to comply with the OSB duty to minimise the presence of the most serious illegal content such as terrorism content. Ofcom will then be able to require the removal of content or, in the case of a search engine provider, search results.

Where non-compliance persists after a ‘provisional notice of enforcement action’, Ofcom will have the power to issue fines of up to £18 million or 10% of global annual turnover, whichever is higher. The OSB will also make it a criminal offence to fail to comply with information requests from Ofcom, with responsibility falling on senior managers of the business.
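By way of illustration, the ‘whichever is higher’ fine cap amounts to a simple calculation. The sketch below is in Python; the function name and the example turnover figure are ours for illustration only and are not drawn from the OSB itself.

    def max_osb_fine(global_annual_turnover_gbp: float) -> float:
        """Illustrative sketch: the maximum fine under the draft OSB is
        the higher of a fixed £18 million cap and 10% of the business's
        global annual turnover."""
        FIXED_CAP_GBP = 18_000_000    # £18 million fixed cap
        TURNOVER_RATE = 0.10          # 10% of global annual turnover
        return max(FIXED_CAP_GBP, TURNOVER_RATE * global_annual_turnover_gbp)

    # Hypothetical example: a business with £500m global turnover faces a
    # maximum fine of £50m, since 10% of turnover exceeds the £18m cap.
    print(f"£{max_osb_fine(500_000_000):,.0f}")  # prints £50,000,000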

If the OSB is passed as drafted, it will require Ofcom to prepare codes of practice to assist businesses providing regulated services in complying with their duties of care. Businesses must comply with the codes or demonstrate that an alternative approach is equally effective. It is difficult at this stage to anticipate what the Ofcom codes will provide for, but the Government has published guidance to help businesses protect users from online harm. Although this guidance is not a substitute for the OSB, it is designed to help businesses take steps now to keep users safe and to prepare for compliance before the OSB becomes law.

Contact our expert commercial solicitors who can provide further specialist advice or contact our retail solicitors to learn about the full suite of legal services we provide to retailers.
