On 26 October 2023, the Online Safety Bill, the UK Government’s flagship proposal for legislation regulating online content (originally unveiled in the Online Harms White Paper in April 2019), passed into law as the Online Safety Act 2023 (Online Safety Act).
The Online Safety Act formally appoints Ofcom, the UK’s media regulator, and gives it wide-ranging investigatory and enforcement powers to regulate the following three categories of internet service:
- user-to-user services (i.e. services that allow users to interact with other users and/or to generate content);
- search services; and
- providers of pornographic content.
The duties contained in the Online Safety Act, and therefore Ofcom’s enforcement powers, apply to services which have “links to the UK” regardless of where in the world they are located. A service will have links to the UK if:
- it has a significant number of UK users;
- the UK is a target market for the service; or
- it can be used in the UK and there is a material risk of significant harm to UK users.
Ofcom has said that it will now move quickly to implement the new rules, having published a roadmap in March 2023 (before the Online Safety Act was finalised) outlining that it will publish its guidance and codes of practice in three phases of consultation:
- Phase One: illegal harms duties;
- Phase Two: child safety duties and pornography; and
- Phase Three: transparency, user empowerment, and other duties on categorised services (being Category 1, 2A or 2B services that meet certain thresholds yet to be set out in secondary legislation made by the Government – see more below).
On 9 November 2023, Ofcom began exercising its new powers by publishing its draft codes of practice in respect of Phase One: illegal harms duties (the Draft Codes of Practice).
Dame Melanie Dawes, Ofcom’s Chief Executive has said: “Regulation is here, and we’re wasting no time in setting out how we expect tech firms to protect people from illegal harm online, while upholding freedom of expression. Children have told us about the dangers they face, and we’re determined to create a safer life online for young people in particular.”
The Draft Codes of Practice contain the first detailed guidance which social media, gaming, pornography, search and sharing services can follow to meet their duties in relation to illegal harms under the Online Safety Act.
It should be noted that:
- the Draft Codes of Practice are non-binding and contain “best practice” recommendations. Regulated service providers can therefore choose a different approach to meeting their binding obligations under the Online Safety Act; and
- unlike Ofcom’s existing powers as the UK’s broadcast media regulator, under which it makes decisions about individual complaints, Ofcom will not make decisions about individual videos, posts, messages or accounts, or respond to individual complaints. Rather, its role in regulating internet services will be to require service providers to tackle the causes of online harm by: (i) assessing the risk of their users being harmed by illegal content while using their service; and (ii) having identified the scale of that risk, implementing appropriate measures to make their services safer.
The Online Safety Act has a layered approach which means that the extent of the duties imposed on a service provider depends on: (i) the size of the service and (ii) the extent of risk to UK users arising from their use of the service.
The Draft Codes of Practice reflect this and differentiate between Large Services (those with an average UK user base greater than 7 million per month, i.e. approximately 10% of the UK population) and Smaller Services (i.e. all services which are not Large Services), with enhanced recommendations applying to Large Services.
In relation to risk, there are three categories which apply to Large Services and Smaller Services:
- Low Risk: A service which has assessed (in its illegal harm risk assessment) the risk of all kinds of illegal harm as low.
- Specific Risk: A service which has assessed (in its illegal harm risk assessment) a part of its service as being medium or high risk for a specific kind of harm for which Ofcom has proposed a particular measure in its Draft Codes of Practice. Different harm-specific measures are recommended depending on which risk a service has identified. A service could have a single specific risk, or many specific risks.
- Multi-Risk: A service that faces significant risks of illegal harms. For such services, Ofcom has proposed additional measures aimed at illegal harms more generally, rather than being targeted at specific risks. In its draft guidance, Ofcom has provisionally indicated that it intends to define a service as multi-risk where the service has assessed itself (in its illegal harm risk assessment) as being medium or high risk for at least 2 of the 15 different kinds of Priority Harms covered by the illegal harm risk assessment. The Priority Harms set out in the Online Safety Act include: terrorism offences; child sexual exploitation and abuse (CSEA), including grooming and child sexual abuse material (CSAM); encouraging or assisting suicide (or attempted suicide) or serious self-harm; hate offences; harassment, stalking, threats and abuse; controlling or coercive behaviour (CCB); drugs and psychoactive substances offences; firearms and other weapons offences; unlawful immigration and human trafficking; sexual exploitation of adults; the extreme pornography offence; intimate image abuse; proceeds of crime offences; fraud and financial services offences; and the foreign interference offence (FIO). A simplified sketch of this classification rule follows below.
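For illustration only, Ofcom’s provisional rule can be expressed as a simple decision function. This is a minimal sketch assuming a completed illegal harm risk assessment expressed as a mapping from each Priority Harm to an assessed risk level; the names and structure are ours, not Ofcom’s, and it simplifies the scheme (in particular, a multi-risk service can still have specific risks to which harm-specific measures apply).

```python
from enum import Enum

class RiskLevel(Enum):
    LOW = "low"
    MEDIUM = "medium"
    HIGH = "high"

def classify_service(assessment: dict[str, RiskLevel]) -> dict:
    """Apply the provisional rule: a harm assessed medium or high is a
    specific risk; two or more such harms make the service multi-risk."""
    elevated = {harm for harm, level in assessment.items()
                if level is not RiskLevel.LOW}
    return {
        "low_risk": not elevated,          # low for all kinds of illegal harm
        "specific_risks": elevated,        # harm-specific measures apply
        "multi_risk": len(elevated) >= 2,  # general measures also apply
    }

# Example: medium risk for fraud and high risk for terrorism -> multi-risk.
result = classify_service({
    "fraud": RiskLevel.MEDIUM,
    "terrorism": RiskLevel.HIGH,
    "csea": RiskLevel.LOW,
})
assert result["multi_risk"] and result["specific_risks"] == {"fraud", "terrorism"}
```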
Specific measures to protect children
Ofcom has reiterated that protecting children will be its first priority as the UK’s online safety regulator.
In its Draft Codes of Practice, Ofcom has indicated that user-to-user services which have a high risk of the relevant harm, and Large Services with at least a medium risk of the relevant harm, should ensure by default that (one hypothetical configuration is sketched after this list):
- children are not presented with lists of suggested friends;
- children do not appear in other users’ lists of suggested friends;
- children are not visible in other users’ connection lists;
- children’s connection lists are not visible to other users;
- accounts outside a child’s connection list cannot send them direct messages; and
- children’s location information is not visible to any other users.
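To make the “by default” point concrete, the sketch below shows one hypothetical way a service might encode these defaults for accounts identified as children. All names are illustrative; the Draft Codes of Practice do not prescribe any particular implementation.

```python
from dataclasses import dataclass

# Hypothetical encoding of the "safe by default" settings listed above
# for accounts identified as belonging to children. Field names are ours.
@dataclass(frozen=True)
class PrivacyDefaults:
    show_friend_suggestions: bool      # suggested-friend lists shown to the child
    appear_in_suggestions: bool        # child appears in others' suggestions
    visible_in_connection_lists: bool  # child visible in others' connection lists
    connection_list_public: bool       # child's own connection list visible
    dms_from_non_connections: bool     # DMs allowed from outside connections
    location_visible: bool             # location information visible to others

CHILD_DEFAULTS = PrivacyDefaults(False, False, False, False, False, False)

def defaults_for(is_child: bool) -> PrivacyDefaults:
    """Return restrictive defaults for child accounts; permissive adult
    defaults otherwise (again purely illustrative)."""
    return CHILD_DEFAULTS if is_child else PrivacyDefaults(
        True, True, True, True, True, True)
```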
In relation to child sexual abuse material (CSAM), Ofcom has also proposed that user-to-user services which have a high risk of the relevant harm, and Large Services with at least a medium risk of the relevant harm, should also use:
- ‘hash matching’ technology: a way of identifying illegal images of child sexual abuse by matching them against a database of known illegal images, to help identify and remove CSAM circulating online (an illustrative sketch follows below); and
- automated tools to detect URLs that have been identified as hosting CSAM.
Importantly, Ofcom has confirmed in the Draft Codes of Practice that the hash matching measure will not apply to private communications or end-to-end encrypted communications. However, regulated end-to-end encrypted services remain subject to all the safety duties set out in the Online Safety Act and will still need to take steps to mitigate risks on their services, including in respect of CSAM.
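For illustration, the sketch below shows the basic shape of both measures, assuming a pre-supplied hash database and URL list (both placeholders here). It is a minimal sketch only: real deployments typically use perceptual hashes such as PhotoDNA, sourced from bodies like the Internet Watch Foundation, which survive resizing and re-encoding, whereas exact SHA-256 matching is used here purely as a stand-in.

```python
import hashlib

# Placeholder database of image digests; a real one would hold perceptual
# hashes of known CSAM supplied by a recognised body.
KNOWN_IMAGE_HASHES: set[str] = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

# Placeholder URL list standing in for a curated feed of URLs identified
# as hosting CSAM (the second measure: automated URL detection).
KNOWN_CSAM_URLS: set[str] = {"example.com/blocked-path"}

def matches_known_image(image_bytes: bytes) -> bool:
    """Return True if the image's digest appears in the hash database."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_IMAGE_HASHES

def is_known_csam_url(url: str) -> bool:
    """Return True if the normalised URL appears on the URL list."""
    normalised = url.lower().removeprefix("https://").removeprefix("http://")
    return normalised in KNOWN_CSAM_URLS
```

In practice, a match would normally be routed to trained human review and statutory reporting processes rather than acted on fully automatically.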
In relation to search services, the Draft Codes of Practice provide that all general search services (i.e. not vertical search services) which are Large Services should provide crisis prevention information in response to search requests which relate to suicide and seek specific, practical and/or instructive information about suicide methods.
Specific measures to tackle fraud and terrorism
The Draft Codes of Practice also set out detailed measures designed to target fraud and terrorism. These include measures for all services, such as blocking accounts which are run (or where there are reasonable grounds to infer that they are run) by or on behalf of a terrorist group or organisation proscribed by the UK Government.
There are also additional measures for Large Services with a Specific Risk, which should use:
- keyword searches: to detect content containing keywords which are strongly associated with offences concerning articles for use in frauds, such as the sale of stolen credentials (an illustrative sketch follows after this list); and
- verification measures: to give users enhanced control by publishing clear internal policies for operating notable user verification (e.g. a blue tick on X) and paid-for user verification schemes, and by improving public transparency for users about what verified status means. Ofcom explains that this is aimed at reducing users’ exposure to fake accounts and addressing the risk of fraud and of foreign interference in UK processes such as elections.
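For illustration only, the simplest form of such a keyword search might look like the sketch below. The patterns shown are hypothetical; a real deployment would use a curated, regularly reviewed keyword list and route matches to human moderation to manage false positives.

```python
import re

# Hypothetical patterns associated with offences concerning articles for
# use in frauds (e.g. the sale of stolen credentials). Illustrative only.
FRAUD_PATTERNS = re.compile(
    r"\b(stolen\s+card\s+details|bank\s+logins?\s+for\s+sale|cloned\s+cards?)\b",
    re.IGNORECASE,
)

def flag_for_review(post_text: str) -> bool:
    """Send content to a human moderation queue if a pattern matches."""
    return FRAUD_PATTERNS.search(post_text) is not None

print(flag_for_review("Bank logins for sale, message me"))  # True
```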
General measures to minimise risk
Ofcom has also published a list of measures that all services regulated by the Online Safety Act can adopt to mitigate all types of illegal harm. This includes:
- appointing a named person who is accountable for compliance with illegal content safety duties, and reporting and complaints duties;
- adopting content moderation systems or processes which are designed to swiftly take down illegal content; and
- providing granular complaints processes which are easy to identify, access and use.
What can we expect next?
Phase One: illegal harms duties
The Draft Codes of Practice for Phase One will now be subject to industry and expert consultation. The consultation closes on 23 February 2024. The codes of practice will then be submitted to the Secretary of State for Science, Innovation and Technology and, subject to the Secretary of State’s approval, laid before Parliament.
Phase Two: child safety, pornography and the protection of women and girls
Phase Two will be dealt with in three stages:
- online pornography services and other interested stakeholders will be able to read and respond to Ofcom’s draft guidance on age assurance from December 2023;
- regulated services and other interested stakeholders will be able to read and respond to draft codes of practice relating to the protection of children, in Spring 2024; and
- Ofcom expects to publish draft guidance on protecting women and girls by Spring 2025.
During the first and second stage, Ofcom will simultaneously consult on: (a) the analysis of the causes and impacts of online harm to children; and (b) draft risk assessment guidance focusing on harms to children.
Phase Three: transparency, user empowerment, and other duties on categorised services
Ofcom must produce a register of categorised services:
- Category 1 Services: the highest-reach user-to-user services with the highest-risk functionalities. These have enhanced duties, including in relation to transparency, risk posed to adults by legal but harmful content, and fraud.
- Category 2A Services: the highest-reach search services, which have transparency and fraudulent advertising requirements.
- Category 2B Services: other services with potentially risky functionalities or other risk factors, which have enhanced transparency requirements but no other additional duties.
Ofcom has indicated that it will advise the UK Government on the thresholds for these categories in early 2024. The UK Government will then make secondary legislation on the categorised services; Ofcom currently expects this to happen by summer 2024.
Assuming these timeframes are met, Ofcom has said that it will then:
- publish the register of categorised services by the end of 2024;
- publish draft proposals regarding the additional duties on these services in early 2025; and
- issue transparency notices in mid-2025.
Like to hear more on this topic?
We have previously written about the Online Safety Bill here:
- The return of the Online Safety Bill, Ally Clark, Duncan Calow (dlapiper.com)
- Further amendments to the Online Safety Bill not ruled out: Parliament battling over the scope and detail of the Online Safety Bill, Ally Clark, Duncan Calow (dlapiper.com)
- Online Safety Bill arrives at the House of Lords, Ally Clark, Duncan Calow (dlapiper.com)
- Online Safety Bill: Progress through Parliament & Ofcom Implementation, Elizabeth Bingham, Duncan Calow, Alex Lowe (dlapiper.com)
- Online Safety Bill passed by the House of Lords, Ally Clark, Duncan Calow (dlapiper.com)
- Online Safety Bill approved by Parliament and ready for Royal Assent, Ally Clark, Duncan Calow (dlapiper.com)