Recent high-profile incidents involving terrorist video content, self-harm among under-18s and political manipulation have kept concerns about online material in the news. A particularly shocking example was the terrorist attack on a mosque in New Zealand on 15 March 2019, footage of which was widely circulated on social media.
The UK is planning to tackle illegal and undesirable online activity with a wide-ranging new legal duty. The duty is expected to apply to any organisation that provides services supporting the sharing of user-uploaded content, or services enabling online interaction, and non-compliance could lead to substantial fines and personal liability for executives.
The plans are set out in the Online Harms White Paper, published in April 2019. The core problem addressed by the White Paper is “the prevalence of illegal and harmful content online and the level of public concern about online harms”.
What kinds of activity are in scope?
A long list of online harms is identified. The following are already considered to have a clear definition:
- child sexual exploitation and abuse
- terrorist content and activity
- organised immigration crime
- modern slavery
- extreme pornography
- revenge pornography
- harassment and cyberstalking
- hate crime
- encouraging or assisting suicide
- incitement of violence
- sale of illegal goods / services, such as drugs and weapons (on the open internet)
- content illegally uploaded from prisons
- sexting of indecent images by under 18s.
A number of other online harms are identified that are currently less well defined:
- cyberbullying and trolling
- extremist content and activity
- coercive behaviour
- intimidation
- disinformation
- violent content
- advocacy of self-harm
- promotion of Female Genital Mutilation (FGM).
In addition, children's access to pornography and inappropriate material, and children's overuse of social media and other screen-based activity, are targeted. (It is worth noting that the Information Commissioner is consulting on a draft Code of Practice for “Age appropriate design” for online services.)
These are discussed in some detail in the White Paper, with concrete examples.
What does the White Paper propose?
The White Paper sets out proposals for a new regulatory framework. This would include the following key features:
- a new statutory duty of care to make organisations “take more responsibility for the safety of their users and to tackle harm caused by content or activity on their services”;
- compliance with the new duty of care to be overseen and enforced by an independent regulator;
- codes of practice to be developed relating to specific harms;
- a new culture of transparency, trust and accountability whereby the regulator is to have power to require annual ‘transparency reports' outlining the measures taken by those covered by the regulatory framework to tackle the various harms;
- a significant range of penalties and enforcement powers available to the new regulator;
- encouragement to use technology as part of the solution.
Is this bad for business?
The Government's vision in the White Paper is for “the UK to be the safest place in the world to go online, and the best place to start and grow a digital business”. Would the proposals mean that UK-based organisations will be at a disadvantage? The Government's intention is to create a level playing field, so that all organisations providing services to UK users will be in scope. Exactly how compliance by non-UK entities will be enforced is not clear, although measures like service-blocking are a possibility.
While this new compliance burden may be manageable for large tech companies, the range of in-scope organisations goes far beyond this, including SMEs and start-ups, charitable organisations, retailers that allow online reviews, file-sharing sites and cloud hosting providers. The intention is that a proportionate approach will be taken, so that the size and capacity of organisations are considered, and that compliance support will be offered to “less well-resourced companies”. But “all companies will be required to take reasonable and proportionate action to tackle harms on their services”. The focus is on the types of service provided, rather than on the business model or sector.
The introduction of GDPR a year ago showed just how challenging it can be to shoulder wide-ranging new compliance burdens - especially for smaller, voluntary sector or less sophisticated organisations. We query whether extending this duty to cover so many organisations and types of service is really appropriate when the major risks of harm arise in a much narrower scope of online activity.
Have your say!
Stakeholders are invited to comment on the White Paper in a consultation process open until 1 July 2019.