Info@NationalCyberSecurity

A new landscape for internet regulation


Introduction

The long-awaited Online Safety Act (Act) has finally been passed by Parliament and will become law once it has received Royal Assent.

The Government has stated that the aim of the Act is to “make the UK the safest place in the world to be online”. The Act clearly seeks a fundamental shift in how liability for online content is assessed, and it will undoubtedly have a monumental impact on affected internet service providers if its duties are enforced as currently planned.

Current position

The Act seeks to change the current two-tier liability scheme, under which publishers of online content bear primary liability for the legality or illegality of that content.

This scheme also imposes intermediary liability in certain cases on services that enable third-party content to be shared and disseminated online (such as search engine operators). Intermediaries’ liability is currently reactive: they are liable only if they are put on notice of unlawful content and fail to remove or de-index it.

Overall structure of the Act

The Act targets (1) user-to-user services (with some exempted services, such as email and messaging platform providers), (2) search services, and (3) user-to-user or search services that publish certain high-risk content.

The legislation imposes a broad range of duties of care on user-to-user services and search engines. These duties are designed to be enforced by Ofcom. The Act provides Ofcom with powers to impose penalties of up to £18m or 10 per cent of a company’s global turnover, whichever is higher. Ofcom also has powers to seek court-approved cessation orders against service providers that it considers are failing in their duties of care.

Duties imposed by the Act

The duties of care listed in the Act are focused on the systems and processes that affected service providers must have in place to ensure the safety of users. There are numerous wide-ranging and onerous duties listed in the Act; the most onerous are reserved for providers of “high-risk” services (defined as Category 1, Category 2A and Category 2B services in the Act).

The Act is concerned with three main categories of harmful content:

  1. illegal content: content relating to certain criminal offences (note that defamation is not included in this category),
  2. content that is lawful but harmful to children, and
  3. fraudulent advertising (although note that this is only an issue for providers of Category 1 or Category 2A services).

As mentioned above, these duties are very wide-ranging. However, key themes include:

  1. the need for service providers to carry out regular risk assessments,
  2. the need for service providers to create systems and processes that reduce harm systemically across the platform in a proportionate manner (the Act does not mandate that all harmful content must always be immediately removed from a platform in order for the service provider to be considered compliant),
  3. there should be robust reporting and complaints measures in place for users, and
  4. service providers will have specific duties relating to transparency and accountability, which must be included in the provider’s terms of service.

Content that is harmful to children

The Act contains specific duties in relation to content that is deemed to be harmful to children (noting that this content does not need to be illegal).

Content that is harmful to children is defined in the Act as content that “presents a material risk of significant harm to an appreciable number of children in the UK”. “Material”, “significant harm” and “appreciable” are not defined in the Act.

There are also categories of content relating to “primary priority content” that is harmful to children (such as content that encourages self-harm, suicide or eating disorders) and “priority content” (such as bullying).

Ofcom

The Act therefore relies fundamentally on Ofcom to police and enforce the duties and obligations it prescribes.

Ofcom will have four main functions:

  1. registering relevant service providers,
  2. creating codes of practice,
  3. creating guidance for relevant service providers, and
  4. enforcement.

As mentioned above, Ofcom’s powers will include extensive investigatory powers backed by criminal sanction along with further interview and inspection powers.

Freedom of expression and privacy rights

A key criticism of the Act as it progressed through Parliament has been its impact on freedom of expression and privacy rights. In an attempt to assuage such concerns, the Act provides that freedom of expression and privacy rights must be protected when online service providers implement the necessary provisions in the Act.

However, these duties are dealt with only vaguely in the Act, and it is not clear how relevant service providers will need to discharge them.

Key questions / challenges / timeline

Implementation of the Act (and therefore a sense of how its provisions will be enforced) is still some time away. Ofcom has provided initial roadmaps on when the Act will come into force once it receives Royal Assent, stating that it plans to release some codes of practice shortly after the Act receives Royal Assent (starting with codes on the duties relating to illegal content and child safety). However, many operative parts of the Act will not come into force until the Secretary of State says that they do. The Act is therefore likely not to be fully in force for about a year.

This new role for Ofcom clearly also raises questions, as Ofcom is already stretched in dealing with its existing obligations. It will be some time before it is clear whether Ofcom has the capabilities to thoroughly enforce the obligations laid down in the Act.

Further, questions remain about the duties created in the Act, and the huge amount of work that they will create for relevant compliance teams, lawyers, designers and more. Concerns remain over whether the new duties will simply become a box-ticking exercise due to the sheer amount of work and resources complying with them will require.

Several other key questions also remain:

  1. What will the thresholds be to qualify as a relevant provider in each category of service provider?
  2. Is this scheme going to be workable in practice, both for Ofcom and for service providers?
  3. Do the references to freedom of expression and privacy rights go far enough to protect civil rights and liberties?
  4. Does the Act carry the risk of unfair competition and / or of stifling innovation?
  5. Does the Act carry the risk of anti-democratic outcomes?

There are therefore many more debates to be had and questions to be asked and clarified before the Act comes into force. However, for relevant service providers, it will be important to begin familiarising themselves with the duties in the Act now, to prepare for the huge changes in the liability landscape that the Act will inevitably bring once fully in force.

This publication is a general summary of the law. It should not replace legal advice tailored to your specific circumstances.

© Farrer & Co LLP, October 2023


About the authors


Emily Costello

Associate

Emily specialises in reputation management and dispute resolution across a broad spectrum of privacy, defamation, tech and data protection issues. Emily provides bespoke legal advice to a wide range of clients, including high-profile individuals, schools, charities, corporations and executives.
