The future of data protection law in the UK is currently unclear. In a speech delivered at the Conservative Party Conference on 4 October 2022, the Secretary of State for Digital, Culture, Media & Sport, Michelle Donelan, stated that the government would be ‘replacing GDPR with our own business and consumer-friendly, British data protection system’.
In our previous blog, we looked at the Data Protection and Digital Information Bill (DPDI), introduced by the government on 18 July 2022 and intended to amend, but not replace, the UK’s current data protection and e-privacy regime, in particular the UK General Data Protection Regulation (UK GDPR).
It has since been reported that work on the DPDI will be ‘paused’. The Bill is now at risk of being shelved or substantially amended, and the UK GDPR itself may even be repealed, on the basis that it is perceived as too burdensome. In the Secretary of State’s own words regarding her proposed new data law: ‘I can promise … that it will be simpler, it will be clearer, for businesses to navigate. No longer will our businesses be shackled by lots of unnecessary red tape.’
However, another law is currently making its way through the UK’s legislative process which will affect businesses and citizens alike, and which is both vaguer and more onerous than the DPDI, or even the UK GDPR. In striking contrast with our data laws, it is not under threat of withdrawal or repeal, despite having been criticised by tech companies such as WhatsApp and by civil society organisations such as Big Brother Watch, Open Rights Group, Liberty and Article 19 for the risks it poses both to freedom of speech and (ironically) to online safety, through its potential impact on the viability of end-to-end encryption.
That law is the Online Safety Bill.
The Online Safety Bill
The Online Safety Bill has been in gestation since April 2019, when the government published its Online Harms White Paper. Following a lengthy period of consultation, a Bill was finally presented to Parliament on 17 March 2022; it is currently at the report stage in the House of Commons.
The Bill’s passage through Parliament has unfolded in parallel with the tragic case of Molly Russell. An inquest heard that Molly had been presented with a large volume of content relating to self-harm, depression and suicide on the social media platforms Instagram and Pinterest. In September 2022, Senior Coroner Andrew Walker determined that this online content, which had been curated for her by the algorithms of those platforms, had contributed to her death in November 2017.
As a result, there is undoubtedly political pressure and public appetite to see laws which contain accountability requirements and/or liability provisions for online platforms, rather than leaving it to platform companies to self-regulate.
However, there is a live question about exactly what form those legal requirements should take. Coupled with this are genuine concerns about ‘overreach’, whether or not intentional, which could result in the over-policing of content of public or democratic importance, and about onerous requirements which may not be technically achievable if end-to-end encrypted apps protecting personal privacy are to continue to exist (or, at least, to remain available to users in the UK).
Key features of the Bill
Some of the salient features of the Online Safety Bill are as follows:
- The core focus of the Online Safety Bill is on providers of user-to-user services and internet search services, with further provisions addressing pornography.
- Service providers will be divided into different ‘categories’ (‘Category 1’ and ‘Category 2A/2B’) based on ‘threshold conditions’ which are yet to be determined in secondary legislation.
- The intention is for the provisions to have extra-territorial scope, so that they will also cover, for example, providers targeting the UK or with a significant number of users in the UK.
- The Bill contains a number of duties of care and liability provisions for service providers who fall within its scope. A few key provisions are:
  - All providers: illegal content risk assessment duty and safety duties.
  - Services likely to be accessed by children: child risk assessment duty and safety duties. Definitions of certain types of content harmful to children are to be determined in secondary legislation.
  - Providers classed as ‘Category 1’: adult risk assessment duty and safety duties. Definitions of certain types of content harmful to adults are to be determined in secondary legislation. ‘Category 1’ providers are also subject to user empowerment duties, duties to protect content of democratic importance, and duties to protect journalistic content.
  - Finally, all providers: general duties about freedom of expression and privacy, and record-keeping and review duties.
- The intention is that the new regime will be regulated by Ofcom, which will have wide-ranging powers, similar to those of the ICO in the context of investigating data breaches, as well as the power to impose fines of up to £18 million or 10% of global annual turnover (whichever is greater), far higher than the maximum fines that can be imposed under the UK’s data protection laws.
The future of the Bill
In September 2022, in response to a Parliamentary question, Prime Minister Liz Truss confirmed that the government intended to proceed with the Online Safety Bill. However, she also indicated that amendments to the Bill would be forthcoming to address concerns about the risk it poses to free speech.
Given that commitment, and the political pressure on the government to take action following the inquest into the death of Molly Russell, it appears likely that we will see a fully-fledged Online Safety Act during the life of this government.
For more information about our Data Protection and Cybersecurity team, visit our web page here.