Compliance
The UK Information Commissioner’s Office (ICO) has recently released its guidance on content moderation and data protection, part of the commitments stemming from its 2022 joint statement with Ofcom (the UK’s communications regulator) aimed at promoting synergy between online safety and data protection. The guidance is designed to provide practical advice on complying with the UK General Data Protection Regulation (UK GDPR) and the Data Protection Act 2018 (DPA 2018), and it offers a comprehensive overview of how organisations can navigate data protection law while conducting content moderation activities.
Scope and Approach
The guidance is for organisations that use content moderation, as well as providers of content moderation products and services, encompassing both data controllers and data processors. It specifically addresses the moderation of user-generated content in user-to-user services, with a focus on assisting organisations that are carrying out content moderation to meet their obligations under the Online Safety Act 2023 (OSA).
However, it does not cover compliance with the specific OSA obligations themselves, as Ofcom is the regulator responsible for those.
The ICO has used its ‘must, should and could’ approach within the guidance. This new approach draws a clear distinction between what the law requires and what is best practice: ‘must’ denotes a legislative requirement, ‘should’ refers to what the ICO expects organisations to do to comply with the law, and ‘could’ refers to options or examples that organisations could consider to help them comply with the law.
What is content moderation?
There is currently no universally agreed definition of content moderation, but the ICO defines it as:
- the analysis of user-generated content to assess whether it meets certain standards; and
- any action a service takes as a result of this analysis.
Content moderation actions may include content removal, service bans, feature blocking and visibility reduction.
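To make the ICO’s two-stage definition concrete, the flow below is a minimal, purely illustrative sketch in Python. Every name is hypothetical, and the keyword check stands in for whatever analysis (classifiers, hash matching, human review) a real service actually performs.

```python
from dataclasses import dataclass
from enum import Enum, auto


class Action(Enum):
    """Moderation actions of the kind named in the ICO's definition."""
    NO_ACTION = auto()
    CONTENT_REMOVAL = auto()
    SERVICE_BAN = auto()
    FEATURE_BLOCKING = auto()
    VISIBILITY_REDUCTION = auto()


@dataclass
class ModerationResult:
    meets_standards: bool  # outcome of the analysis stage
    action: Action         # action taken as a result of that analysis


def moderate(content: str, banned_terms: set[str]) -> ModerationResult:
    """Toy two-stage flow: analyse content, then act on the result."""
    # Stage 1: analysis of user-generated content against the service's
    # standards (a real system would be far more sophisticated than this).
    meets_standards = not any(term in content.lower() for term in banned_terms)

    # Stage 2: action taken as a result of the analysis.
    action = Action.NO_ACTION if meets_standards else Action.CONTENT_REMOVAL
    return ModerationResult(meets_standards, action)
```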
Content moderation and personal information
Personal information means information relating to an identified or identifiable person.
User-generated content is likely to be personal information because it either directly concerns a specific individual or is linked to additional information that renders someone identifiable. Content moderation may also involve personal data linked to the content or to the user’s account, such as the user’s age, location or interests.
How to comply with data protection law
Organisations that use content moderation must be able to demonstrate that it is necessary and proportionate and that it complies with the data minimisation principle.
The key areas that you, as an organisation using content moderation, need to address include:
- Assessing personal data risks. You must identify risks to people’s personal data. A Data Protection Impact Assessment (DPIA) can help you understand these risks, and the ICO recommends undertaking one in all cases. You must carry out a DPIA if your processing is likely to result in a high risk to individuals, for example where it involves new technologies (including AI); combines, compares or matches personal data obtained from multiple sources; involves solely automated processing that has a legal or similarly significant effect on users; or uses children’s personal data. It is important to undertake the DPIA while new tools or new processing are still in development, to ensure compliance at an early stage (a simple checklist sketch of these triggers appears after this list).
- Lawful processing. You must ensure that you are processing personal data lawfully and identify your lawful basis for processing. Lawful bases such as legal obligation or legitimate interests are the most likely to be relevant. You also need to assess whether you are using special category personal data, such as health data or biometric data; if you are, you must meet one of the 10 conditions for processing this type of personal data set out in Article 9 UK GDPR. The ICO has produced comprehensive guidance on special category personal data.
- Fair processing. You must only process personal data in ways that people would reasonably expect and that do not have unjustified adverse effects on them. It is therefore crucial that content moderation systems perform accurately and produce unbiased, consistent outputs.
- Tell people what you are doing. You must inform people why you are using their personal data, on what lawful basis, what kind of information it is, what decisions you are making, whether and for how long you keep the information, whether you share it with other organisations, and how they can exercise their data protection rights.
- Purposes. You must only collect personal data for specified, explicit and legitimate purposes. If you want to process it for a new purpose, you must ensure that the new purpose is compatible with the original one, that the relevant people have specifically consented to it, or that you can point to a legal provision requiring or allowing the new processing in the public interest. The new purpose must also have its own lawful basis for processing.
- Data minimisation. Content moderation systems are capable of gathering more information than is necessary to achieve your purposes. To be compliant, you must be able to demonstrate that using the personal data is necessary to achieve your purpose and that no less intrusive option is available (see the sketch after this list).
- Automated decision-making. Some content moderation systems take decisions without any meaningful human involvement; for example, AI-based tools used to classify and take action on content are likely to involve automated decision-making.
You must therefore consider whether your content moderation system is making such decisions. If it is, Article 22 of the UK GDPR becomes relevant: it restricts solely automated decision-making that has ‘legal or similarly significant effects’, meaning decisions that affect someone’s legal rights or financial circumstances or that lead to users being discriminated against, unless an exception applies. A sketch of how such decisions might be routed to human review also appears after this list.
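Picking up the DPIA point above, the triggers mentioned in the guidance can be encoded as a simple checklist. This is a hedged illustration only: the field names are our own, the real legal test is whether processing is likely to result in a high risk to individuals, and the list of triggers is not exhaustive.

```python
from dataclasses import dataclass


@dataclass
class ProcessingProfile:
    """Characteristics of a moderation workflow (illustrative fields only)."""
    uses_new_technologies: bool         # e.g. AI-based classifiers
    matches_data_from_sources: bool     # combines/compares/matches personal data
    solely_automated_significant: bool  # solely automated, legal/similar effect
    uses_childrens_data: bool


def dpia_required(profile: ProcessingProfile) -> bool:
    """True if any of the example DPIA triggers from the guidance applies."""
    return any([
        profile.uses_new_technologies,
        profile.matches_data_from_sources,
        profile.solely_automated_significant,
        profile.uses_childrens_data,
    ])
```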
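On data minimisation, one practical pattern is to pass the moderation tool only the fields you can justify as necessary, rather than the full user record. The sketch below uses hypothetical field names; which fields are genuinely necessary depends on your purpose and must be justified case by case.

```python
FULL_RECORD = {
    "user_id": "u123",
    "content": "example post text",
    "age": 34,                          # only needed for age-restricted content
    "location": "London",               # rarely needed for content analysis
    "interests": ["cycling", "films"],  # usually not needed at all
}

# Fields you have justified as necessary for this specific purpose.
NECESSARY_FIELDS = {"user_id", "content"}


def minimise(record: dict, allowed: set[str]) -> dict:
    """Drop every field that is not on the justified allow-list."""
    return {key: value for key, value in record.items() if key in allowed}


moderation_input = minimise(FULL_RECORD, NECESSARY_FIELDS)
# -> {"user_id": "u123", "content": "example post text"}
```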
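Finally, on automated decision-making, one common way to keep a decision from being ‘solely’ automated is to route high-impact outcomes to a human reviewer. The threshold and routing rule below are illustrative assumptions, not a statement of what Article 22 requires in any particular case.

```python
def route_decision(classifier_score: float, significant_effect: bool) -> str:
    """Route moderation decisions, escalating high-impact ones to a human.

    classifier_score:    model confidence that the content breaches standards
    significant_effect:  whether the contemplated action has a legal or
                         similarly significant effect (e.g. a service ban)
    """
    if classifier_score < 0.5:  # illustrative threshold
        return "no_action"
    if significant_effect:
        # Meaningful human involvement keeps the decision from being
        # 'solely automated' within the meaning of Article 22.
        return "escalate_to_human_review"
    # Lower-impact actions may be taken automatically, subject to review.
    return "reduce_visibility"
```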
Comment
This ICO guidance is based on current UK data protection legislation. Recognising that content moderation is an evolving field lacking a clear legal framework, the guidance provides welcome assurance to organisations navigating data protection concerns in this area.
However, if the Data Protection and Digital Information (DPDI) Bill becomes law, expected around mid-2024, it may result in changes to the UK GDPR and DPA 2018 that are relevant to this guidance. Moreover, Ofcom is in the process of finalising some of its technology and online safety codes of practice, which are likely to lead to further guidance on this subject. The ICO has said the guidance will be amended to reflect these developments.
The ICO has also announced plans to produce additional guidance on content moderation in the coming year.
UPDATE: On 1 May 2024, the ICO and Ofcom issued a further joint statement on online safety and data protection, building on their 2022 joint statement. They agreed to identify and continuously monitor emerging ‘Collaboration Themes’, issues of common interest relevant to both the online safety and data protection regimes. They also agreed to identify companies or services that are subject to both regimes and are of regulatory interest to both the ICO and Ofcom. Lastly, they outlined their intention to share information with each other relating to services.
At EM Law, we are experts in data protection law. If you need help ensuring you are compliant with the UK GDPR, please contact us. One of our leading data experts, such as Colin Lambertus, would be more than happy to speak with you.