06 November 2023
After years of debate, the UK Parliament has signed off on the Online Safety Bill ("the Bill"), marking a significant stride towards a safer online environment for children and increased accountability for tech companies. Technology Secretary Michelle Donelan has celebrated the Bill as a means of securing the online safety of British society. This pivotal development has, however, not been without controversy, particularly concerning privacy.
The Online Safety Bill: what is it?
The Online Safety Bill places the onus on technology companies to protect children from legal but harmful material. The regulatory authority, Ofcom, has been granted additional enforcement powers to oversee and enforce compliance with the Bill's provisions. Key elements of the Bill include:
- Age verification for pornography sites: pornography sites will be required to implement age verification measures to prevent children from accessing explicit content.
- Removal of illegal content: online platforms are mandated to demonstrate their commitment to removing illegal content, such as child sexual abuse, cyberbullying, extreme violence, and more.
- New offences: the Bill introduces new offences, including "cyber-flashing" (sending unsolicited sexual imagery online) and the sharing of "deepfake" pornography, where artificial intelligence is used to create or manipulate explicit content.
- Bereaved parents' rights: The legislation includes provisions to make it easier for bereaved parents to obtain information about their children from tech companies.
Who does the Bill apply to?
The Bill introduces legal requirements for a number of key players including:
- Internet service providers that host user-generated content;
- Search engines that enable searches across various websites and databases; and
- Providers of internet services that feature adult content.
These providers will be classified into two categories:
- "Category 1" services, which are the largest platforms with the most users. These will face stricter and more demanding obligations.
- "Category 2" services, which are the rest of the providers subject to the Bill's provisions.
It is worth highlighting that these regulations are not limited to UK-based companies. As such, if a non-UK provider serves the UK market or has a substantial UK user base, it will fall under the scope of the Bill.
One of the most contentious aspects of the Bill is the potential requirement for messaging services to examine encrypted messages for child abuse material. Messaging platforms, like WhatsApp, Signal, and iMessage, argue that this would compromise user privacy and the integrity of end-to-end encryption. As a result, these companies have threatened to leave the UK rather than compromise message security. The government insists that such actions will only be taken once "feasible technology" is developed.
While the Bill is often seen as a tool to regulate Big Tech, it is worth noting that over 20,000 small businesses are expected to be affected by its provisions.
The role of Ofcom:
Ofcom, the UK's telecommunications regulator, will play a pivotal role in drawing up codes of conduct to guide companies on compliance with the Bill's provisions. The first draft codes are expected to be released soon. Ofcom's CEO, Dame Melanie Dawes, emphasises that their focus is on addressing the root causes of harm and protecting privacy and freedom of expression.
Ofcom has announced that it will give guidance and set out codes of practice on how in-scope companies can comply with their duties, in three phases, as set out in the Bill:
1. Phase One: Illegal Harms (2023-2024)
- Draft codes and guidance for online harm and risk assessment.
- Final decision expected in Autumn 2024.
2. Phase Two: Child Safety and Protection (2025)
- Guidelines for child safety and protection.
- Draft guidance on protecting women and girls.
3. Phase Three: Categorised Services (2024-2025)
- Transparency reports, user empowerment, and more for categorised services.
- Publication of categorised services register by end of 2024.
This phased approach covers key aspects of the Bill.
Regulation and penalties:
Failure to adhere to the new rules could result in significant fines for tech companies, including fines of up to 10% of global revenue or £18 million, whichever is higher. Company leaders may also face potential imprisonment as a penalty.
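To put the "whichever is higher" rule in concrete terms, the sketch below works through two illustrative scenarios. The 10% and £18 million thresholds come from the Bill as described above; the function name and the revenue figures are hypothetical examples, not real companies or a legal calculation.

```python
def maximum_fine_gbp(global_revenue_gbp: float) -> float:
    """Illustrative ceiling on a fine under the Bill:
    the greater of 10% of global revenue or a fixed GBP 18 million."""
    FIXED_MINIMUM_GBP = 18_000_000
    return max(0.10 * global_revenue_gbp, FIXED_MINIMUM_GBP)

# A platform with GBP 500m global revenue could face up to GBP 50m,
# since 10% of its revenue exceeds the GBP 18m figure.
print(maximum_fine_gbp(500_000_000))  # 50000000.0

# A smaller firm with GBP 50m revenue is still exposed to the full
# GBP 18m, because the fixed figure is the higher of the two.
print(maximum_fine_gbp(50_000_000))   # 18000000
```

As the second scenario shows, the fixed £18 million floor means smaller in-scope providers cannot assume their exposure is limited to a fraction of their turnover.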
Tech companies should take note of the recent decision concerning TikTok and the penalties that followed. The platform was fined 345 million euros (£296 million) by the Irish Data Protection Commission (DPC) for apparent mismanagement of children's data, raising serious concerns about the security and privacy of young users on the platform.
The Irish data regulator found in its investigation that TikTok had set children's accounts to public by default, inadvertently exposing them to unsolicited communications from adults. The DPC examined TikTok's adherence to the European Union's General Data Protection Regulation (GDPR) concerning privacy settings and operational features.
TikTok responded by expressing its disagreement with the fine, arguing that the issues under scrutiny related to features and settings that were in place three years ago and had already been adjusted to address these concerns. One such change was the defaulting of all accounts for users under 16 to private.
In the UK, the Information Commissioner's Office imposed a fine of £12.7 million on TikTok for its perceived failure to prevent underage children from using the platform and to ensure that their data was suitably safeguarded. This fine underscores the growing emphasis on data privacy and child safety in today's digital landscape.
The TikTok fines serve as a cautionary tale for tech companies. It will be interesting to see the practical impact of the Bill on these companies.
Reactions and criticisms
The Bill has received mixed reactions. While it has been welcomed by organisations including the Equality and Human Rights Commission and the NSPCC, critics argue that it does not go far enough in addressing harmful misinformation and disinformation. The Bill's broad scope and its potential to affect free speech have also raised concerns.
How can organisations start to prepare?
Organisations are advised to become familiar with the provisions of the Bill and take preventive steps to ensure they are equipped for its introduction. In particular, they should consider the following practical steps:
- Establishing and enhancing systems for monitoring all content, balancing freedom of expression with user protection.
- Conducting risk assessments of their operations and websites.
- Revising their complaint procedures and terms of service.
- Developing internal mechanisms for identifying and reporting potential harm to children through service provision.
- Determining if they fall under Category 1 service classification, which carries additional compliance requirements.
The Bill marks a significant step toward regulating online content and safeguarding children from harmful material. However, it does raise questions in relation to privacy, encryption, and the boundaries of online regulation. As the Bill takes effect, the balance between safety and individual freedoms will remain a subject of debate, both in the UK and worldwide.
Lauren McFarlane, Associate: firstname.lastname@example.org / 0131 222 2939
Laura Patriche, Trainee Solicitor: email@example.com / 0131 222 2939