ICO Fines Transgender Charity Mermaids

The Information Commissioner’s Office (ICO) has fined charity Mermaids £25,000 for failing to keep personal data (some of which was sensitive personal data) secure. ICO fines for failing to comply with data protection laws can go up to £17.5 million or 4% of an organisation’s total worldwide annual turnover, whichever is higher.

Background

Mermaids is a charity that supports transgender and gender-diverse children and their families. It started out as a support group formed by parents whose children were experiencing gender incongruence. It registered with the Charity Commission in 1999. The Charity Commission’s website shows that most of Mermaids’ income is derived from donations and legacies, with total income for the financial year ending 31 March 2020 standing at £902,437.

In August 2016 the CEO of Mermaids set up an internet-based email group service at https://groups.io. The CEO created GeneralInfo@Groups.IO so that emails could be shared between the CEO and the 12 trustees of the charity. The email service offered various settings for security and privacy:

  • “Groups listed in directory, publicly viewable messages”
  • “Group not listed in directory, publicly viewable messages”
  • “Group listed in directory, private messages” and
  • “Group not listed in directory, private messages”.

The Mermaids group email service was set up under the default option “Groups listed in directory, publicly viewable messages”.

The Groups.IO email service was in active use by Mermaids from August 2016 until July 2017. After it became dormant it continued to hold emails. In addition to communications between the trustees, the emails included some forwarded emails from individuals who were using Mermaids’ services. Those emails included personal data, in some cases relating to children, and some of the data was special category data (i.e. data concerning health, sex life or sexual orientation).

In June 2019 a service user of the charity, the mother of a gender non-conforming child, informed the CEO that she had been contacted by a journalist from the Sunday Times who had told her that her personal data could be viewed online. The journalist had informed the parent that, by searching online, he could view confidential emails including her child’s name and date of birth, the mother’s name, her employer’s address, her mobile telephone number and details of her child’s mental and physical health.

On the same day, Mermaids received pre-publication notice from the Sunday Times that the emails were accessible online and the newspaper would be publishing an article about the incident.

Mermaids immediately took steps to block access to the email site and engaged lawyers. They began informing data subjects about the incident, contacted the ICO to report what had happened and took other measures to deal with the situation.

ICO findings

The ICO’s investigation found, amongst other things, that Mermaids had failed to implement adequate measures to ensure the appropriate security of personal data. As a result, 780 pages of confidential emails containing personal data relating to 550 individuals were searchable and viewable online by third parties for almost three years. The ICO also found that in the period May 2018 to June 2019 there was a negligent approach towards data protection at Mermaids, that data protection policies were inadequate and that there was a lack of adequate training. The ICO found that Mermaids should have applied restricted access to its email group, used pseudonymisation or encryption to add an extra layer of protection to the personal data it held, and shut down the email group correctly when it was no longer in use.
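To illustrate the kind of pseudonymisation the ICO refers to, the Python sketch below replaces identifying fields with keyed, non-reversible tokens before the data is stored or shared. It is purely illustrative: the field names, key handling and record layout are invented, and this is not a statement of what Mermaids should have deployed.

```python
import hmac
import hashlib

# Illustrative only: in practice the key would be generated securely and
# held separately from the pseudonymised data, so that anyone who only
# sees the emails cannot reverse the tokens.
SECRET_KEY = b"keep-this-key-outside-the-data-store"

def pseudonymise(value: str) -> str:
    """Replace an identifier with a keyed, non-reversible 16-character token."""
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()[:16]

record = {
    "child_name": "Alex Example",          # invented example data
    "parent_phone": "07700 900000",
    "message": "Query about support services",
}

# Only the identifying fields are tokenised; the substance of the
# communication is retained for those who need to work with it.
safe_record = {
    "child_name": pseudonymise(record["child_name"]),
    "parent_phone": pseudonymise(record["parent_phone"]),
    "message": record["message"],
}
```

Because the same input always produces the same token, pseudonymised records can still be linked together without exposing the underlying identities.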

ICO fine

On 5 July 2021 the ICO imposed a fine of £25,000 on Mermaids.

In arriving at the fine the ICO took into consideration:

  • Mermaids’ income
  • The gravity of the incident
  • The fact that special category data was made public
  • The duration of the data breach
  • The number of data subjects affected
  • The damage caused
  • The intentional or negligent character of the infringement
  • The action taken by Mermaids to mitigate the damage caused
  • The degree of responsibility of Mermaids taking into account the technical and organisational measures they implemented
  • Any relevant previous infringements
  • The degree of cooperation provided by Mermaids with the ICO in order to remedy the infringement and mitigate the damage caused
  • Other aggravating or mitigating factors

The ICO’s Monetary Penalty Notice (which gives further detail and explanation about the ICO’s findings) can be accessed here.

Comment

One never wants to see an organisation receiving an ICO fine. However, given the nature of the work that Mermaids does and the sensitivity of some of the personal data that was made public, the fine appears low. Many businesses, especially small businesses, will try to find ways to cut corners to make their budgets or resources stretch further. Some businesses, especially those which do not process special category data, may conclude from reading this ICO decision that the worst that can happen to them if they do not have proper data protection processes in place is a fine of less than £25,000.

In its decision the ICO took into account not just “the prompt remedial action taken by Mermaids” but also that “this breach was highlighted in a national newspaper and that resulted in a degree of reputational damage to the charity”. It also seems that the fact that Mermaids was a charity had some bearing on the ICO decision with the ICO balancing the fine as a deterrent against not wanting to be “taking away donations made by the public.”

The ICO took into account the financial position of Mermaids. While we do not know what Mermaids’ representations in this regard contained, the charity made a loss for its financial year ended 31 March 2020, with total expenditure of £1,041,325 against income of £902,437. Without knowing the true financial position, it appears that an ICO fine of, say, £250,000 may well have caused the charity to shut down.

It is also worth noting that, in addition to the ICO fine imposed, Mermaids’ costs of engaging lawyers and other consultants and dealing with the fallout from the incident would have been significant. Mermaids is also vulnerable to claims being brought against it by the data subjects themselves.

If you have any questions on data protection law or compliance please get in touch with one of our data protection lawyers.


Data Transfers: EU Adopts New Model Clauses

Data transfers and their legal mechanisms are changing. Standard Contractual Clauses (SCCs) have been an integral part of international data transfers. Under EU data protection law, organisations handling EU personal data cannot transfer such data to third countries without some form of protection. SCCs have become the most practical, and hence most used, means of ensuring such protection. Following the publication of new draft SCCs back in November and a subsequent consultation period, the European Commission announced on 4 June 2021 that it had adopted two new sets of SCCs, updating previous clauses which were adopted before the introduction of GDPR. The new SCCs are therefore a product of the ramped-up regulatory environment created by GDPR. Additionally, and significantly, the new SCCs respond to the ruling last summer (July 2020) in Schrems II.

What are Standard Contractual Clauses?

SCCs are essentially a set of clauses to enable lawful transfers of EU personal data. They can be copied into a contract or form an independent agreement between a data exporter (based in the EU or UK) and a data importer (based in a third country) to ensure an adequate level of protection for personal data being transferred between the two entities. The European Commission has published two sets of clauses: one for the transfer of personal data to third countries and one for use between controllers and processors based in the EU.

Schrems II

The ruling in Data Protection Commissioner v Facebook Ireland and Maximillian Schrems (Case C-311/18) was significant for SCCs for two reasons. Firstly, it invalidated the EU-US Privacy Shield, which many US organisations relied upon to transfer personal data out of the EU. In the same way that SCCs are considered to give personal data transfers out of the EU adequate protection, the Privacy Shield, if all its principles were adhered to, could also offer such protection. Now that it has been invalidated, more US organisations will be relying upon SCCs to ensure adequate protection of personal data transfers.

Secondly, the ruling reviewed the effectiveness of the SCCs in place at the time and, whilst considering them to be a valid mechanism for data transfers, introduced new obligations on both data importers and data exporters. It said that organisations should review the agreements they have in place by assessing whether to implement additional technical and organisational, as well as contractual, measures. This amounts to what has been described as a ‘data transfer impact assessment’ and is directly addressed in the new SCCs published by the European Commission.

New SCCs for data transfers

The new SCCs have therefore been introduced to align with the high standards of data privacy introduced by GDPR, to amend previous deficiencies, such as a lack of variety in the arrangements covered, and to address uncertainty about how to assess whether or not to implement organisational/technical measures after the ruling in Schrems II. These are dealt with respectively below.

New obligations/liabilities

Firstly, the new SCCs impose a ‘light-weight’ form of the GDPR on data importers. This comes in the form of third party rights for data subjects and requires data importers to comply with obligations covering: purpose limitation, transparency, accuracy, data minimisation, retention, security, onward transfers, data subject rights, a complaints mechanism and submission to jurisdiction. The final obligation means a data importer must submit to the laws of the EU country from which the personal data is being exported, including its courts and data protection regulatory authority.

Secondly, the data importer must now notify the data exporter in case of requests from public authorities or any direct access by public authorities to data transfers protected by SCCs. Data importers are also expected to try and obtain a waiver of a prohibition for a data exporter to be notified of such public authorities’ access.

And thirdly, data importers and exporters are now liable for any damage, material or non-material, caused to data subjects by a breach of the SCCs. In contrast to the GDPR, which requires both parties to be in breach before joint liability arises, in some scenarios created by the new SCCs (controller-to-processor and processor-to-processor) the data exporter in Europe is now liable for violations by its processor or even sub-processor.

Modular approach for data transfers

The new SCCs employ a modular approach, i.e. they create potential for an increased number of data transfer scenarios/modules. These include:

  • controller to controller;
  • controller to processor;
  • processor to sub-processor; and
  • processor to controller.

The processor to sub-processor module solves a long-standing problem. Up until now processors have been unsure of how to justify transfers to third countries. Now specific clauses exist to enable such data transfers. The only possible issue with the new modules is that any sub-processor wishing to engage a further sub-processor will have to get the permission of the original controller.

The new SCCs also allow the clauses to be used in a multi-party agreement without having to be replicated for each individual relationship. In practice this has been going on for a while but now it has been officially sanctioned. A related innovation in the new SCCs is also the possible introduction of a docking clause. The docking clause allows new parties to be added to the agreement over time.

Data transfer impact assessments

Clause 14 lays out the ways in which parties to an agreement can ensure compliance with the obligations introduced by Schrems II for data transfers. It says the parties must take due account of:

  • the specific circumstances of the transfer, including the length of the processing chain, the number of actors involved, and the transmission channels used; intended onward transfers; the type of recipient; the purpose of processing; the categories and format of the transferred personal data; the economic sector in which the transfer occurs, and the storage of the data transferred.
  • the laws and practices of the third country of destination – including those requiring the disclosure of data to public authorities or authorising access by such authorities.
  • any relevant contractual, technical, or organisational safeguards put in place to supplement the safeguards under the SCCs, including measures applied during transmission and to the processing of the personal data in the country of destination.

Additionally:

  • the parties agree to document the assessment described and make it available to supervisory authorities on request.
  • the data importer warrants that it has made its best efforts to provide the data exporter with relevant information and agrees that it will continue to cooperate with the data exporter in ensuring compliance with this assessment.
  • the data importer must notify the data exporter either of a public authority’s request to access data or where the public authority directly accesses personal data. If the data importer is unable to make that notification it must use best efforts to obtain a waiver.
  • the data importer must review access to personal data requests for legality and challenge them if there are reasonable grounds to do so. It must document its legal assessment and minimise the data disclosure as much as possible.
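Since the parties must document the Clause 14 assessment and produce it to a supervisory authority on request, even a simple structured record helps. The Python sketch below is purely illustrative: the field names are our own shorthand for the factors described above, not prescribed by the SCCs, and all values are invented.

```python
# Hypothetical sketch of a documented transfer impact assessment.
# The structure loosely tracks the Clause 14 factors: circumstances of
# the transfer, laws of the destination country, and supplementary
# safeguards. None of these names or values come from the SCCs themselves.
assessment = {
    "transfer": {
        "exporter": "ExampleCo Ltd (EU)",           # invented party
        "importer": "ExampleCorp Inc (US)",          # invented party
        "categories_of_data": ["customer contact details"],
        "onward_transfers": False,
    },
    "destination_law_review": {
        "public_authority_access_laws": ["(summarise relevant local laws here)"],
        "assessed_on": "2021-06-30",
    },
    "supplementary_measures": ["encryption in transit", "encryption at rest"],
    "conclusion": "transfer may proceed with supplementary measures",
}

# Kept as structured data, the assessment can be versioned, reviewed and
# exported on request rather than reconstructed after the fact.
```

A record like this is no substitute for legal analysis, but it makes the "document the assessment" obligation straightforward to evidence.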

Moving forwards with data transfers

Organisations will be relieved to know that the European Commission has allowed for an 18-month transition period in which the previous SCCs will still be legally recognised (as opposed to the 12-month transition period suggested in the drafts). This should give organisations time to review current data transfers and agreements and update clauses where needed.

We are also still waiting for the final ‘Recommendations for Supplementary Measures’ in relation to the ruling in Schrems II from the European Data Protection Board (EDPB), which were open for feedback after being published in draft form in November. The EDPB have said ‘the recommendations… were subject to a public consultation. The EDPB received over 200 contributions from various stakeholders, which it is currently analysing’.

UK perspective

As it stands the new SCCs are not recognised in the UK and it will be up to the ICO to decide whether to accept their usage. The ICO is currently preparing its own contractual clauses and will consult on them over the summer. Allowing the use of both EU- and UK-approved SCCs will no doubt support the EU’s adequacy decision for the UK (a decision on whether the EU considers the UK’s data protection regime adequate, which would allow free flows of data between the two regimes).

It is important to note however that the UK has been wanting to liberalise data transfers for some time and so the new ICO sanctioned clauses may well be less cumbersome than the new EU ones. Finally, for clarity, the old EU SCCs remain valid in the UK and should be, for the time being, the place where organisations transferring UK personal data go when putting agreements in place. You can find the clauses on the ICO’s website here.

Here to help

Data transfers have up until now often been a case of signing up to some clauses or entering into an agreement and then leaving it be. With the introduction of GDPR, the ruling in Schrems II and the old pre-GDPR SCCs now outdated, organisations need to be mindful of new obligations and, most significantly, the need for transfer impact assessments. Such assessments may need to be undertaken by a third party. If you need your current data transfer agreements reviewed, we are here to help.

EM law specialises in technology and data protection law. Get in touch if you need advice on data protection law or if you have any questions on the above.


Data Compliance – Updates You Need To Make To Your Policies

Data compliance is an essential part of every business. There have been several shifts in the UK’s data compliance regime following Brexit and the ruling in Schrems II. This means that plenty of businesses’ privacy policies are not up to date. Changes range from simply swapping in different references to legislation to considering the effect that Brexit has had on cross-border transfers of data. For those transferring data to the US, the invalidation of the Privacy Shield framework should also be considered. Here is our guide to the updates businesses should consider making to their privacy policies (and the issues we frequently spot when dealing with clients’ data protection documentation or doing due diligence on other companies’ privacy policies).

Data compliance post Brexit

With the start of 2021, and the end of the EU-UK transition period, the retained EU law version of the General Data Protection Regulation ((EU)2016/679), called the UK GDPR, applies in the UK, along with the Data Protection Act 2018 (DPA 2018). Therefore, the main body of data protection law in the UK is now made up of the UK GDPR and the DPA 2018.

So, as a simple starting point for updates to privacy policies, ensure that all references to the GDPR are changed to the UK GDPR. There may also be references to ‘Applicable Data Protection Laws’, in which case the definition of those applicable laws needs to be updated to include the UK GDPR and the DPA 2018.

It is important to note that the EU GDPR (the data protection regime in the EU) will continue to have extra-territorial effect and so may apply to UK controllers or processors who have an establishment in the EU, or who offer goods or services to data subjects in the EU, or who monitor their behaviour as far as their behaviour takes place within the EU. So, if you operate in the EU as well as the UK you should consider including references to the EU GDPR. Additionally, it is important to be aware that even though your privacy policy may now refer to the UK GDPR, if you operate in the EU, you should consider the consequences of operating in two data protection regimes. This may include a review of your mechanisms for cross-border transfers of data to the EU.

Data compliance: cross-border transfers of data

Now that the UK has left the EU, all data transfers from the UK to the EU or vice versa are cross-border transfers for the purposes of data protection law. This means that, to address data compliance, additional safeguards need to be in place, for example Standard Contractual Clauses or reliance upon an adequacy decision (a decision made by a relevant authority that data protection is adequate in a particular country and hence data can flow there freely). As it stands, in June 2021, the UK has granted the EU an adequacy decision but the EU is yet to grant one to the UK.

In relation to updating your privacy policy, it will now be important, if transferring data to and from the EU, to show which safeguards you are relying upon to do so. It should be noted, however, that on 24th December 2020, the UK and the EU reached a trade and co-operation agreement addressing the arrangements following the end of the Brexit transition period on 31st December 2020. The agreement includes an interim provision (bridging mechanism) for transmission of personal data from the EU to the UK which could last up to six months. Therefore, under the current circumstances (as in June – the time of this blog), companies do not need to have additional safeguards in place for transfers of personal data to the EU. Regardless of these developments businesses should, however, state in their privacy policies that they are relying upon these provisional agreements and adequacy decisions to transfer data to the EU. This could start by including a simple acknowledgement in a privacy policy that any personal data transfers from the UK to the EU, are transfers taking place between two separate data protection regimes.

Privacy Shield invalidated

The EU-US Privacy Shield was a framework constructed by the US Department of Commerce and the European Commission to enable transatlantic data exchanges for commercial purposes. The Privacy Shield enabled companies from the US to comply with data protection requirements and enabled free flows of personal data to and from the EU without the need for additional safeguards (such as those expected for third countries like the US, i.e. countries not deemed by the EU to have adequate levels of protection for personal data).

The Privacy Shield was invalidated in July 2020 following the ECJ’s preliminary ruling in Data Protection Commissioner v Facebook Ireland and Maximillian Schrems (Case C-311/18) and should therefore no longer be relied upon for transfers to the US. The controller-to-processor standard contractual clauses were not invalidated, so organisations can still rely upon them when transferring data to the US. This means that any reference to the Privacy Shield in a privacy policy needs to be removed and the mechanisms which an organisation is now using to transfer data to the US need to be clearly stated. In the majority of cases this will mean mentioning the use of standard contractual clauses.

Data compliance checklist

Here is a list of things to consider when editing your privacy policy to ensure data compliance:

  • All references to UK data protection laws and legislation need to refer to the UK GDPR and the DPA 2018.
  • Transfers of data to the EU need to be treated as cross-border data transfers, so the legal basis for making these transfers needs to be stated (such as an adequacy decision, the current bridging mechanism, standard contractual clauses, binding corporate rules etc.).
  • Any reference to the EU-US Privacy Shield needs to be removed for data transfers to the US and, if standard contractual clauses are now being used, this needs to be mentioned.

Data Compliance and Transparency

As part of the UK GDPR principles, businesses must comply with the transparency requirements set out in Articles 13 and 14 of the UK GDPR. The transparency principles require all controllers to notify data subjects about their personal data handling practices through a privacy policy at the time that data is collected. It follows that if your business has changed the way it processes the personal data of its customers, due to the developments discussed in this blog, and is relying on a new basis for that processing (i.e. the UK GDPR instead of the previous regime), then in order to comply with the transparency principles it should update its privacy policy to reflect this. For an online business, that will usually mean updating the website privacy policy.

Multiple jurisdictions

Organisations with entities in multiple jurisdictions face data compliance challenges when trying to implement website privacy policies as part of a global privacy compliance programme. Multinationals must choose between implementing a single, global privacy policy applicable to all their customers or jurisdiction-specific policies, taking into account that, even within the EU, member states are likely to have varying rules on data protection. This will mean paying attention to the references to legislation in jurisdiction-specific policies and being clear about how exactly cross-border data transfers are taking place between different branches of an organisation. Given the potential complexity of the structure of such data transfers, it would be worth seeking legal advice. Many privacy regulators, including the ICO, recommend a layered policy format, which pairs a short summary with a linked detailed disclosure, as the most effective way to simplify a complex privacy policy and make it clearly and conspicuously accessible.

Here to help

Hopefully this blog gives you enough scope to update your privacy policy and address data compliance. But we can also help you do so. With Brexit and the ruling in Schrems II, data compliance has become legally complex, but that doesn’t mean a practical approach for businesses isn’t possible. The next big step in UK data law is whether or not an adequacy decision will be granted. The decision is currently in the comitology procedure, which means all EU member states need to agree the drafting. If an adequacy decision is reached, then data flows will be unimpeded between the EU and the UK. Regardless of such a decision, however, references to legislation and the mechanisms relied upon for cross-border transfers will still need to be updated.

Other developments include the recent publishing of updated Standard Contractual Clauses by the European Commission. This means that agreements which export EU data to a third country and rely upon Standard Contractual Clauses should be updated. The new versions also incorporate means by which to adhere to new requirements for cross-border transfers following the decision in Schrems II. Schrems II introduced an obligation to assess local data laws before going ahead with a transfer.

EM law specialises in technology and data protection law. Get in touch if you need advice on data protection law or if you have any questions on the above.


Managing Data – Software Services And AI Legal

Managing data is an essential part of the operation of a growth business. It’s a cliché often bandied around that today data is more valuable than oil. But as with oil, it’s only how the resource is used that defines its value. Whereas oil can be relied upon to produce energy in all circumstances, data cannot be relied upon to produce useful insights at all times. Therefore, the means and purpose by which it is processed becomes all the more important. Given its potential, it comes as no surprise that initiatives, public and private, for managing data more effectively are commonplace. The legal sphere attempting to regulate this burst of energy gets more complex by the day. Here is our introduction to some general issues you may face when managing data for profit, or to simply improve the running of your business.

GDPR and Brexit

Before GDPR came into force in all EU member states on 25 May 2018, the ICO commissioner stated in the ICO’s March 2017 paper, Big data, artificial intelligence, machine learning and data protection, that ‘it’s clear that the use of big data has implications for privacy, data protection and the associated rights of individuals… In addition to being transparent, organisations… need to be more accountable for what they do with personal data’.

At the end of the Brexit transition period (1 January 2021), the GDPR and parts of the Data Protection Act 2018 became part of a new body of retained EU law, essentially replicating the old regime in the UK. Data protection legislation in the UK is now comprised of the UK GDPR and the DPA 2018. From a UK perspective, the GDPR operating in the EU is known as the EU GDPR.

As the EU GDPR will continue to have extra-territorial effect (Article 3, EU GDPR), it may continue to apply to UK organisations who act as controllers or processors and have an establishment in the EU, who offer goods or services to data subjects in the EU, or who monitor their behaviour as far as that behaviour takes place within the EU. UK businesses could therefore find themselves subject to parallel data protection regulatory regimes under both the UK GDPR and the EU GDPR.

Are you managing data as a processor or controller?

If offering a service, for example a software platform that allows companies to process personal data, it would often be prudent to ensure you are defined as a data processor, and not a data controller, for data protection purposes. This is because, as opposed to data controllers, who bear primary responsibility for the personal data involved, data processors have fewer obligations under data protection laws. Processors essentially process data under the instructions of the data controller, whilst a data controller determines ‘the purposes and means’ of processing the personal data (Article 4(7), UK GDPR). A helpful way of thinking about it is that a data controller has direct duties to data subjects whereas a data processor only has duties to the data controller.

The distinction between controller and processor in an AI context was first considered in the ICO’s July 2017 decision on an agreement between the Royal Free Hospital and Google DeepMind. Under the agreement DeepMind used the UK’s standard publicly available acute kidney injury algorithm to process personal data of 1.6 million patients. The ICO ruled that the hospital had failed to comply with data protection law and was ordered to perform an audit on the system. The hospital’s law firm, Linklaters, concluded in the hospital’s audit report, Audit of the acute kidney injury detection system known as Streams, that DeepMind had been properly characterised as a data processor. This was because Streams ‘does not use complex artificial intelligence or machine learning to determine when a patient is at risk of acute kidney injury. Instead, it uses a simple algorithm mandated by the NHS’. It was therefore the lack of complexity involved in the ‘means’ of processing the personal data which meant that DeepMind were considered to be a data processor. A complex algorithm would have constituted a level of agency on DeepMind’s part which would have rendered their processing that of a data controller. It was deemed, however, that their services were simple enough to be doing nothing more than following the hospital’s instructions. This grey area should be of concern to anyone planning to use AI to analyse data. Make an algorithm too complex and you may take on the liability of a data controller and hence liability towards data subjects.

Anonymisation

Managing data to make it anonymous itself falls under UK data protection laws, because anonymisation is a form of processing: the purpose for which the personal data was originally collected needs to be aligned with the purpose for which it is later anonymised. In certain circumstances, collecting personal data is not necessary to begin with and, if the data remains useful, this is highly desirable for businesses wishing to process the data as they wish. If the data is originally collected in an anonymous format, then the UK GDPR no longer applies. As GDPR states at recital 26, ‘the principles of data protection should… not apply to anonymous information, namely information which does not relate to an identified or identifiable natural person or to personal data rendered anonymous in such a manner that the data subject is not or no longer identifiable’.

In its report Anonymisation: managing data protection risk code of practice, the ICO lists anonymisation as one of its six key recommendations for AI. It states ‘organisations should carefully consider whether the big data analytics to be undertaken actually requires the processing of personal data. Often, this will not be the case; in such circumstances organisations should use appropriate techniques to anonymise the personal data in the data sets before analysis’.
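The difference between pseudonymised and truly anonymous data can be seen in a short sketch. The Python example below derives an aggregate summary from individual records so that no data subject is identifiable; all field names and values are invented for illustration, and real anonymisation would also need to consider re-identification risk (e.g. small counts).

```python
from collections import Counter

# Invented example records; "name" and "postcode" are direct identifiers.
raw_records = [
    {"name": "A. Smith", "postcode": "AB1 2CD", "age": 34, "service_used": "helpline"},
    {"name": "B. Jones", "postcode": "EF3 4GH", "age": 41, "service_used": "helpline"},
    {"name": "C. Brown", "postcode": "IJ5 6KL", "age": 29, "service_used": "forum"},
]

def anonymise(records):
    """Keep only aggregate counts per service; drop all identifiers."""
    return dict(Counter(r["service_used"] for r in records))

summary = anonymise(raw_records)
# summary now contains only totals per service, with no names,
# postcodes or ages from which an individual could be identified.
```

Because the output relates to no identifiable individual, analysis performed on `summary` would fall outside the UK GDPR in the way recital 26 describes.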

Profiling and automated decision making

AI’s ability to uncover hidden links in data about individuals and to predict individuals’ preferences can bring it within the GDPR’s regime for profiling and automated decision making. Article 22(1) states that ‘a data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly affects him or her’.

However, this is qualified by article 22(2) which states that this right does not apply to a decision that ‘(a) is necessary for entering into or performance of a contract between data subject and data controller; (b) is authorised by… law to which the controller is subject and which also lays down suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests; or (c) is based on the data subject’s explicit consent’.

This is further qualified: ‘in the cases referred to in points (a) and (c)…, the data controller shall implement suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests, (including) at least the right to obtain human intervention on the part of the controller, to express his or her point of view or contest the decision’. Building automated decision making into software performing data analysis can therefore introduce new obligations, which are often onerous for a data controller. These can include the need to perform a Data Protection Impact Assessment or to obtain explicit consent from data subjects.

Other suggested compliance mechanisms

The ICO makes five recommendations for using AI to analyse data:

  • Privacy notices.
  • Data protection impact assessments – embed a privacy impact assessment framework into data processing activities to help identify privacy risks and assess the necessity and proportionality of a given project.
  • Privacy by design – implementing technical and organisational measures to address matters including data security, data minimisation and data segregation.
  • Ethical principles.
  • Auditable machine learning algorithms.

Treasure trove

Finding new and innovative ways of managing data is a treasure trove many wish to unlock, but it is important to be wary of the growing regulatory landscape underpinning the sector. The world was shocked by the accusations made against Cambridge Analytica, and demonstrating compliance is a must for maintaining a good reputation and attracting clients. Brexit brings the added complexity of potentially diverging legal regimes. Being up to date on the development of the Privacy and Electronic Communications Regulations (PECR) will also be useful. Read our blog on PECR here.

EM law specialises in technology and data protection law. Get in touch if you need advice on data protection law or if you have any questions on the above.


Draft Adequacy Decisions

Draft Adequacy Decisions: Data Flows EU to UK

Draft adequacy decisions were published on 19 February 2021 by the European Commission (EC) for personal data transfers from the EU to the UK. The significance of the drafts is considerable: they are the first to be produced since the European Court of Justice’s (ECJ) ruling in Schrems II, which struck down the adequacy decision previously granted to the EU-US Privacy Shield.

The EC’s press release on the draft adequacy decisions stated that it has carefully assessed the UK’s law and practice on personal data protection, including the rules on public authorities’ access to personal data, and concluded that the UK ensures an ‘essentially equivalent’ level of protection to that guaranteed under the EU GDPR and the Law Enforcement Directive.

What does adequacy mean?

‘Adequacy’ is a term that the EU uses to describe other countries, territories, sectors or international organisations that it deems to provide an ‘essentially equivalent’ level of data protection to that which exists within the EU. An adequacy decision is a formal decision made by the EU which recognises that another country, territory, sector or international organisation provides an equivalent level of protection for personal data as the EU does. The UK is seeking adequacy decisions under both the General Data Protection Regulation (GDPR) and the Law Enforcement Directive (LED).

The effect of an adequacy decision is that personal data can be sent from an EEA state to a third country without any further safeguard being necessary. The trade deal agreed between the UK and the EU gives the UK a bridge until 30 June 2021, during which data can continue to flow from the European Economic Area (EEA) to the UK whilst the adequacy decisions process takes place. The bridge can end sooner than this if the EU adopts adequacy decisions in respect of the UK.

Transfers of data from the UK to the EEA are permitted. The UK Government has recognised EU Commission adequacy decisions made before the end of the transition period. This allows restricted transfers to continue to be made from the UK to most organisations, countries, territories or sectors covered by an EU adequacy decision.

Adequacy criteria

In order to draw conclusions on the UK’s data protection regime, the EC assessed a number of factors when producing the draft adequacy decisions:

  • UK constitution – especially in relation to the UK’s adoption of the rights in the European Convention on Human Rights in its UK Human Rights Act 1998.
  • UK data protection laws – particularly how the UK has adopted EU data laws following Brexit through the implementation of the UK GDPR and maintenance of the DPA 2018. This includes the incorporation of both the territorial and material scope of EU data law as well as definitions, principles and rights afforded to individuals. The main point is that they are all equivalent to those in the EU GDPR.
  • Restrictions on transfers outside of the UK – how, via the implementation of the UK GDPR, the rules on international transfers of data are as restrictive as under the EU GDPR, and how data subjects in the EU can therefore have confidence that onwards transfers of data will be effectively restrained.
  • Enforcement – the Information Commissioner’s Office (ICO) is the “independent supervisory authority tasked with powers to monitor and enforce compliance with the data protection rules” and is equivalent to the various data protection authorities found throughout the member states of the European Union. The EC considered the number of cases investigated and fines imposed by the ICO as a way of assessing its effectiveness.
  • Redress – here the EC highlighted the ability of data subjects to make complaints with the ICO, prosecute for damages under the UK GDPR and utilise the Human Rights Act 1998 to express their data rights, with the European Court of Human Rights as an ultimate source of authority.

Consequences of adoption

If adopted, the draft adequacy decisions will be valid for an initial term of four years, only renewable if the level of protection in the UK continues to be adequate. The drafts include strict mechanisms for monitoring and review, suspension or withdrawal, to address any problematic development of the UK system which will no longer be bound by EU privacy rules.

UK government response to the draft adequacy decisions

The UK government has welcomed the draft adequacy decisions, urging the EU to fulfil its commitment to complete the approval process swiftly. The Information Commissioner described the progress as "an important milestone in securing the continued frictionless data transfers from the EU to the UK".

The draft adequacy decisions are now with the European Data Protection Board (EDPB) for a "non-binding opinion", following which the EC will request approval from EU member states' representatives. It could then adopt final adequacy decisions. Until then, organisations can continue to receive personal data from the EU under the temporary "bridging mechanism" agreed in the EU-UK Trade and Cooperation Agreement.

Schrems II

The draft adequacy decisions also include a detailed assessment of the conditions and limitations, as well as the oversight mechanisms and remedies applicable in case of access to data by UK public authorities, in particular for law enforcement and national security purposes. These are likely included to address the ECJ's ruling in Schrems II and concerns over the UK's use of mass surveillance techniques.

In Schrems II, the ECJ ruled that free data flows from the EU to certain US organisations under the EU-US Privacy Shield did not offer an essentially equivalent level of protection to that under EU law. This was substantially based on the finding that national security laws in the US undermined citizens’ data rights. In light of that ruling, the EC was always going to pay close attention to UK national security laws when assessing the UK. Additionally, Schrems II introduced more stringent obligations on organisations carrying out cross-border data transfers, and so there has been a general concern that this newly stringent approach might reduce the UK’s chance of receiving an adequacy decision. The drafts can therefore be seen as a highly positive step.

What stands in the UK’s way?

Although the process for an adequacy decision under the EU GDPR is now underway with the draft adequacy decisions in place and, although the UK government has stated on a number of occasions that it is confident that the EU will deem the UK data protection regime ‘essentially equivalent’, it is worth noting that a number of issues may impact on the UK's ability to satisfy the EU:

  • The UK's use of mass surveillance techniques may lead to EU member states raising concerns about data protection in the UK, which might jeopardise an adequacy decision. The ruling of the ECtHR that aspects of the UK's surveillance regimes under the Regulation of Investigatory Powers Act 2000 (RIPA) did not comply with Articles 8 and 10 of the ECHR is particularly relevant (Big Brother Watch and others v United Kingdom). The human rights groups which brought the claim were not satisfied with the judgment and appealed to the Grand Chamber, the ECtHR's highest judicial bench.
  • Membership of the Five Eyes intelligence sharing community means EU citizens' data could be transferred by UK security services to third countries (including the US) which are not considered to have adequate data protection.
  • Potential for unprotected onward data transfers as the UK will be able to decide which countries it deems adequate and what arrangements to have with them.

The draft adequacy decisions - a positive step

Although nothing can be taken for granted, the draft adequacy decisions are a positive step, and the fact that the UK has committed to remaining party to the ECHR and "Convention 108" will likely carry some weight, as adherence to such international conventions is important for the stability and durability of adequacy findings.

If you have any questions on the draft adequacy decisions, data protection law more generally or on any of the issues raised in this article please get in touch with one of our data protection lawyers.


E-privacy

E-Privacy – PECR and Brexit

E-Privacy regulations complement data protection laws by setting out privacy rights for electronic communications. The idea is that whilst widespread public access to digital mobile networks and the internet has opened up new possibilities for businesses and users, it has also created new risks for privacy. E-Privacy regulations have been a point of contention within the EU and reform has been expected for some time. On 10 February 2021, four years after the European Commission’s initial legislative proposal and to the surprise of many, the European Council reached a compromise agreement on its position on the E-privacy Regulation. What this means for E-privacy rules in the UK remains to be seen. With Brexit behind us, and therefore no obligation to introduce new EU legislation in the UK, but with an adequacy decision pending, and therefore a desire for the UK to align with the EU on data protection, it is hard to say whether or not the UK will choose to implement them. For more information on data protection and a potential adequacy decision after Brexit read our blog.

E-Privacy and PECR

PECR are the Privacy and Electronic Communications Regulations, which comprise the E-privacy rules in the UK. Their full title is The Privacy and Electronic Communications (EC Directive) Regulations 2003. They are derived from European law and have been amended a number of times. The most recent changes were made in 2018, to ban cold-calling by claims management services and to introduce director liability for serious breaches of the marketing rules, and in 2019, to ban cold-calling of pension schemes in certain circumstances and to incorporate the GDPR definition of consent.

What kind of areas do PECR cover?

PECR cover several areas:

  • Marketing by electronic means, including marketing calls, texts, emails and faxes.
  • The use of cookies or similar technologies that track information about people accessing a website or other electronic service.
  • Security of public electronic communications services.
  • Privacy of customers using communications networks or services as regards traffic and location data, itemised billing, line identification services (eg caller ID and call return), and directory listings.

How does this fit with the UK GDPR?

The UK GDPR sits alongside PECR. PECR rules apply and use the UK GDPR standard of consent (which is a high threshold). This means that if you send electronic marketing or use cookies or similar technologies you must comply with both PECR and the UK GDPR. Unsurprisingly, there is some overlap, given that both aim to protect people’s privacy. Complying with PECR will help you comply with the UK GDPR, and vice versa – but there are some differences. In particular, it’s important to realise that PECR apply even if you are not processing personal data. For example, many of the rules protect companies as well as individuals, and the marketing rules apply even if you cannot identify the person you are contacting.

If you are a network or service provider, Article 95 of the UK GDPR says the UK GDPR does not apply where there are already specific PECR rules. This is to avoid duplication, and means that if you are a network or service provider, you only need to comply with PECR rules (and not the UK GDPR) on:

  • security and security breaches;
  • traffic data;
  • location data;
  • itemised billing; and
  • line identification services.

Electronic and telephone marketing

PECR restrict unsolicited marketing by phone, fax, email, text, or other electronic message. There are different rules for different types of communication. The rules are generally stricter for marketing to individuals than for marketing to companies. Companies will often need specific consent to send unsolicited direct marketing. The best way to obtain valid consent is to ask customers to tick opt-in boxes confirming they are happy to receive marketing calls, texts or emails from you.

E-Privacy: Cookies and similar technologies

Companies must tell people if they set cookies, and clearly explain what the cookies do and why. You must also get the user’s consent. Consent must be actively and clearly given. There is an exception for cookies that are essential to provide an online service at someone’s request (e.g. to remember what’s in their online basket, or to ensure security in online banking). The same rules also apply if you use any other type of technology to store or gain access to information on someone’s device.

Communications networks and services

PECR are not just concerned with marketing by electronic means. They also contain provisions that concern the security of public electronic communications services and the privacy of customers using communications networks or services. Some of these provisions only apply to service providers (e.g. the security provisions) but others apply more widely. For example, the directories provision applies to any organisation wanting to compile a telephone, fax or email directory.

EU Council position on E-Privacy rules

On 10 February 2021, EU member states agreed on a negotiating mandate for revised rules on the protection of privacy and confidentiality in the use of electronic communications services. These updated E-privacy rules will define cases in which service providers are allowed to process electronic communications data or have access to data stored on end-users’ devices. The agreement allows the Portuguese presidency to start talks with the European Parliament on the final text. The agreement included:

  • The regulation will cover electronic communications content transmitted using publicly available services and networks, and metadata related to the communication. Metadata includes, for example, information on location and the time and recipient of communication. It is considered potentially as sensitive as the content itself.
  • As a main rule, electronic communications data will be confidential. Any interference, including listening to, monitoring and processing of data by anyone other than the end-user will be prohibited, except when permitted by the E-privacy regulation.
  • Permitted processing of electronic communications data without the consent of the user includes, for example, ensuring the integrity of communications services, checking for the presence of malware or viruses, or cases where the service provider is bound by EU or member states’ law for the prosecution of criminal offences or prevention of threats to public security.
  • Metadata may be processed for instance for billing, or for detecting or stopping fraudulent use. With the user’s consent, service providers could, for example, use metadata to display traffic movements to help public authorities and transport operators to develop new infrastructure where it is most needed. Metadata may also be processed to protect users’ vital interests, including for monitoring epidemics and their spread or in humanitarian emergencies, in particular natural and man-made disasters.
  • In certain cases, providers of electronic communications networks and services may process metadata for a purpose other than that for which it was collected, even when this is not based on the user’s consent or on certain provisions on legislative measures under EU or member state law. This processing for another purpose must be compatible with the initial purpose, and strong specific safeguards apply to it.
  • As the user’s terminal equipment, including both hardware and software, may store highly personal information, such as photos and contact lists, the use of processing and storage capabilities and the collection of information from the device will only be allowed with the user’s consent or for other specific transparent purposes laid down in the regulation.
  • The end-user should have a genuine choice on whether to accept cookies or similar identifiers. Making access to a website dependent on consent to the use of cookies for additional purposes as an alternative to a paywall will be allowed if the user is able to choose between that offer and an equivalent offer by the same provider that does not involve consenting to cookies.
  • To avoid cookie consent fatigue, an end-user will be able to give consent to the use of certain types of cookies by whitelisting one or several providers in their browser settings. Software providers will be encouraged to make it easy for users to set up and amend whitelists on their browsers and withdraw consent at any moment.

Brexit

PECR continue to apply after the UK's exit from the EU on 31 January 2020. The draft ePR, described in detail above, is still in the process of being agreed; it was not finalised before 31 January 2020 and will therefore not become directly applicable in the UK. Once it is directly applicable to EU member states (likely 24 months after it comes into force), the UK will need to consider to what extent to mirror the new rules. In any case, given that UK companies will continue to process the data of EU end users, it will still be necessary to be aware of any discrepancies created by E-privacy reform in the EU.

The deadlock is over

It has long been considered that EU E-privacy regulations have lagged behind the technological progress seen in online marketing techniques and EU negotiations around reform have at times seemed never-ending. The agreement reached by the EU council will therefore be seen as a necessary improvement in legal certainty, although plenty of questions still abound.

PECR in its pre-reformed state will continue to apply in the UK. On 19 February 2021, the European Commission issued its draft adequacy decision that would allow EU-to-UK data transfers. While the E-privacy Regulation is not strictly relevant to the UK’s continued adequacy status, alignment on E-privacy rules would likely be viewed positively by the EU institutions, which could prompt the UK to update its laws in line with the new EU regime. The reforms will of course also be relevant to any UK business that operates in the EU. Even if the Regulation is finally adopted this year, it will not apply for a further two years, meaning these changes will likely not come into effect until 2023 at the earliest.

If you have any questions on E-privacy and data protection, data protection law more generally or on any of the issues raised in this article please get in touch with one of our data protection lawyers.


Robot Manufacturing

Robot Manufacturing: Product Liability Law

Robot manufacturing is a growth industry: the costs of producing robots are going down while the savings in labour costs are rising. With the rise and rise of automation in all areas of life, the word robot has come to mean a wider variety of things. Artificial intelligence (read our blog on some legal issues) has received the most attention recently, especially given its crossover with big data (read our blog). But what about the more conventional notion of a robot – the walking, talking lump of steel, more willing to do the jobs we’re not so keen on, and the sort to which product liability law more obviously applies? This blog covers some issues that a robot manufacturer may encounter when putting such a product on the market.

Robot Manufacturing - product liability and safety risk management

Product liability is the accountability faced by the manufacturer of a sub-standard, defective or dangerous product. It is owed to the members of the public who purchase such a product and to those below the manufacturer in the supply chain. Contractual protections, insurance and effective risk management can be used to protect a manufacturer from such liability.

Can liability for a dangerous or defective product be excluded or limited?

Contractual protections create a variety of scenarios. Whilst a manufacturer may wish to limit its own liability to parties beneath it in the supply chain, it may well wish to enhance the liabilities of a manufacturer above it. The terms of a contract are the vehicle by which this can be achieved. Additionally, a robot manufacturer will want to be certain that quality and safety standards are judiciously applied at every level of its supply chain, especially to key suppliers. Given the potential technological complexity of the industry, it may even be useful to seek the advice of specialist consultants who have the know-how to ensure that all the liability that should be attributed to suppliers is so attributed, and that a comprehensive set of standards is agreed in the contract.

There are various ways that robot manufacturing companies can do this. Limiting and excluding liabilities within their contracts is one way – but this is only possible for certain things. There are restrictions on limiting liability for dangerous or defective products, restrictions that override contractual clauses. Robot manufacturers should therefore also look to other, non-contractual, means to control their risk.

Different considerations need to be taken into account when dealing with the various parties within the supply chain. For example, manufacturers, distributors, importers and retailers will all have concerns specific to the role that they undertake.

Is there an effective quality and safety assurance programme in place?

Effective quality assurance is at the heart of non-contractual risk management for product manufacturers. This is particularly important in the robot manufacturing industry: the products usually involve a great deal of automation, so the blame for something going wrong is more likely to fall on the manufacturer’s shoulders rather than the customer’s. Setting up an internal committee to oversee product safety is a useful first step. The team should incorporate members from across the business to ensure nothing is missed. The committee's functions should be to: review products and their associated documents; ensure that all appropriate regulatory and internal procedures have been followed and documented before and after marketing; authorise any necessary action (for example, changing warnings or design); review after-sales monitoring reports for trends and significant incidents; and review insurance arrangements.

Additionally, keeping good records of exactly what was supplied when and by whom can simplify such a committee’s job.

Is there an effective enquiries and complaints system in place?

A robot manufacturing company must be able to respond to any complaints or enquiries with regard to a product’s safety. This is a legal requirement as well as being based on a desire to look out for your customers and improve your product. Things to think about: does the company have a system to handle customer enquiries and complaints? If so, could it be improved? Are staff adequately trained? Does the company have a policy of recovering allegedly unsafe items and investigating and recording the items and circumstances? Is there a systematic review of adverse incident information involving multi-disciplinary input from different departments? Can the company identify repeat claimants who may not be genuine?

The Services Directive (2006/123/EC), before Brexit, created an obligation to inform customers of how to make complaints and how manufacturers should deal with such complaints. The Services Directive is implemented in the UK by the Provision of Services Regulations 2009 (SI 2009/2999) (PSRs). The PSRs apply to most product manufacturers in the UK and introduce obligations around how to respond to enquiries or complaints about their products.

After Brexit, the Provision of Services (Amendment etc.) (EU Exit) Regulations 2018 (SI 2018/1329) retained this EU law in UK law. The obligations around how to deal with enquiries or complaints essentially remain the same, although some changes were made to EEA-specific provisions. These included the revocation of the requirement not to discriminate against customers based on their place of residence, meaning that manufacturers in the UK could treat customers in the EEA differently to customers in the UK now that we have left the EU. The practical implications of this change are yet to be seen.

Robot Manufacturing - Data protection and privacy

The use of robots (drones being a good example) fitted with cameras or other sensors which can collect personal data such as images of people or vehicle plate numbers, geolocation data or electromagnetic signals relating to an individual's device (for example, mobile phones, tablets, Wi-Fi routers, and so on) can have privacy implications.

At EU level, there is no data protection legislation specific to the use of robots/drones; the applicable legal framework is contained in the General Data Protection Regulation (EU) 2016/679 (GDPR). In the UK, the processing of personal data via robots/drones is subject to the GDPR, the Data Protection Act 2018 (DPA) and the legal provisions applicable to CCTV systems. After Brexit, the GDPR has been retained in UK law and amended to become the UK GDPR. For more information on data protection after Brexit read our blog.

The GDPR and the DPA set out the conditions under which personal data can be processed and provide for certain exemptions and derogations, the most relevant being:

  • Household exemption: This applies to the processing of personal data in the course of a purely personal or household activity. This exemption could potentially apply to individuals using robots/drones for their own purposes. However, the ECJ has narrowly interpreted this exemption in the context of the use of CCTV cameras. As a result, its application will depend on the specific circumstances of each case. The Information Commissioner's Office (ICO), the UK data protection regulator in charge of enforcing GDPR and DPA requirements, has issued guidance in relation to the use of drones. The ICO makes a distinction between the use of drones by "hobbyists" and their use for professional or commercial purposes. Although "hobbyists" would be likely to be exempted from the GDPR and the DPA on the basis of the household exemption, the ICO has provided tips for the responsible use of drones, inviting people to think of privacy considerations and to apply a common sense approach when recording and sharing images captured by a drone.
  • Journalistic exemption: This applies where personal data is collected through drones with a view to the publication of journalistic, academic, artistic or literary material. In this case, processing would, under certain conditions, be exempt from many data protection obligations to the extent that such obligations would be incompatible with the journalistic, academic, literary or artistic purposes sought by the processing.

Here to help

Robot manufacturing companies come up against many of the same legal issues as other product manufacturing companies. Having risk assessment procedures in place, as well as mechanisms to deal with potential faults, should reduce liability. However, robots are likely to be able to collect data and so data protection law also becomes important.

EM law specialises in technology and contract law. Get in touch if you need advice on robot manufacturing or have any questions on the above.


Legitimate Interests

Legitimate Interests – Lawful Processing of Personal Data

When processing personal data legally, organisations have six possible reasons or ‘bases’ to rely upon: consent, contract, legal obligation, vital interests, public task or legitimate interests. Most of these are unambiguous – fulfilling a contract or protecting someone’s life, for example. On the surface, ‘legitimate interests’ appears more open to interpretation. What will be considered legitimate? And whose interests will be taken into account? When all else fails, organisations often mistakenly look to legitimate interests as a basis for processing that furthers their business interests. Seeing legitimate interests as a fall-back is misguided: in many respects it is just as stringent as any of the other possible bases.

Legitimate Interests - Legislation

The UK GDPR describes legitimate interests as “processing necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data, in particular where the data subject is a child”.

Legitimate interests is different to the other lawful bases as it is not centred around a particular purpose (e.g. performing a contract with the individual, complying with a legal obligation, protecting vital interest or carrying out a public task), and it is not processing that the individual has specifically agreed to (consent). Legitimate interests is more flexible and could in principle apply to any type of processing for any reasonable purpose.

Because it could apply in a wide range of circumstances, it puts the onus on you to balance your legitimate interests and the necessity of processing the personal data against the interests, rights and freedoms of the individual taking into account the particular circumstances. This is different to the other lawful bases which presume that your interests and those of the individual are balanced.

Three-part test

The ICO (UK data protection regulatory authority) interprets the legislation with a three-part test. The wording creates three distinct obligations:

  1. “Processing is necessary for…” – the necessity test, i.e. is the processing necessary for the purpose?
  2. “… the purposes of the legitimate interests pursued by the controller or by a third party, …” – the purpose test, i.e. is there a legitimate interest behind the processing?
  3. “… except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data, in particular where the data subject is a child.” – the balancing test, i.e. are the legitimate interests overridden by the individual’s interests, rights or freedoms?
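
The three obligations above lend themselves to being recorded as a simple documented checklist. Below is a minimal sketch in Python of that idea; the function and parameter names are our own invention (not from the ICO guidance or the UK GDPR), and whether each test is actually met is always a legal judgment on the facts, not something code can decide:

```python
# Minimal sketch of documenting a three-part legitimate interests test.
# The names here are illustrative assumptions, not from the ICO guidance;
# whether each part is satisfied is a judgment on the facts of each case.

def legitimate_interests_available(purpose_test: bool,
                                   necessity_test: bool,
                                   balancing_test: bool) -> bool:
    """All three parts must be satisfied before relying on legitimate interests."""
    return purpose_test and necessity_test and balancing_test

# A legitimate purpose and necessary processing, but the individual's
# rights override the controller's interests: the basis is unavailable.
print(legitimate_interests_available(True, True, False))  # False
```

The point the sketch captures is that the three parts are cumulative: failing any one of them means legitimate interests cannot be relied upon.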

Purpose test – what counts as a ‘legitimate interest’?

A wide range of interests may be legitimate interests. These could be your own legitimate interests in the processing, or the legitimate interests of any third party. The term ‘third party’ doesn’t just refer to other organisations; it could also be a third party individual. The legitimate interests of the public in general may also play a part when deciding whether the legitimate interests in the processing override the individual’s interests and rights. If the processing has a wider public interest for society at large, then this may add weight to your interests when balancing these against those of the individual.

Examples

The UK GDPR does not have an exhaustive list of what purposes are likely to constitute legitimate interests. However, the recitals do say the following purposes constitute legitimate interests: fraud prevention; ensuring network and information security; or indicating possible criminal acts or threats to public security.

Therefore, if you are processing for one of these purposes you may have less work to do to show that the legitimate interests basis applies. The recitals also say that the following activities may indicate a legitimate interest: processing employee or client data; direct marketing; or administrative transfers within a group of companies.

However, whilst these last three activities may indicate legitimate interests, you still need to do some work to identify your precise purpose and show that it is legitimate in the specific circumstances, and in particular that any direct marketing complies with e-privacy rules on consent.

The necessity test

You need to demonstrate that the processing is necessary for the purposes of the legitimate interests you have identified. This doesn’t mean that it has to be absolutely essential, but it must be a targeted and proportionate way of achieving your purpose. You need to decide on the facts of each case whether the processing is proportionate and adequately targeted to meet its objectives, and whether there is any less intrusive alternative, i.e. can you achieve your purpose by some other reasonable means without processing the data in this way? If you could achieve your purpose in a less invasive way, then the more invasive way is not necessary.

The balancing test

Just because you have determined that your processing is necessary for your legitimate interests does not mean that you are automatically able to rely on this basis for processing. You must also perform a ‘balancing test’ to justify any impact on individuals. The balancing test is where you take into account “the interests or fundamental rights and freedoms of the data subject which require the protection of personal data” and check they don’t override your interests. In essence, this is a light-touch risk assessment to check that any risks to individuals’ interests are proportionate. If you are processing children’s data then you need to be particularly careful to ensure their interests and rights are protected.

Reasonable expectations

Recital 47 of the UK GDPR says “the existence of a legitimate interest would need careful assessment including whether a data subject can reasonably expect at the time and in the context of the collection of the personal data that processing for that purpose may take place. The interests and fundamental rights of the data subject could in particular override the interest of the data controller where personal data are processed in circumstances where data subjects do not reasonably expect further processing.”

The UK GDPR is clear that the interests of the individual could in particular override your legitimate interests if you intend to process personal data in ways the individual does not reasonably expect. This is because if processing is unexpected, individuals lose control over the use of their data, and may not be in an informed position to exercise their rights. There is a clear link here to your transparency obligations.

You need to assess whether the individual can reasonably expect the processing, taking into account particularly when and how the data was collected. This is an objective test. The question is not whether a particular individual actually expected the processing, but whether a reasonable person should expect the processing in the circumstances.

How do you apply legitimate interests in practice?

The ICO guidance states that organisations should undertake the three-part test and document the outcome; this process is referred to as a "legitimate interests assessment" (LIA). The length of a LIA will vary depending on the context and circumstances surrounding the processing. LIAs are intended to be a simple form of risk assessment, in contrast to a data protection impact assessment (DPIA), which is a "much more in-depth end-to-end process". A LIA is also a potential trigger for a DPIA. The ICO confirms that there is no specific duty in the UK GDPR to undertake a LIA; however, as a matter of best practice, organisations should undertake one in order to meet their obligations under the UK GDPR accountability principle.

Once a LIA has been undertaken and an organisation has concluded that the legitimate interests basis for processing applies, then it should continue to keep the LIA under regular review. Where a LIA identifies high risks to the rights and freedoms of the individual, then a DPIA should be undertaken to assess these risks in more detail.

What else is there to consider?

The ICO also recommends that:

  • Individuals are informed of the purpose for processing, that legitimate interest is the basis being relied on and what that legitimate interest is. Organisations' privacy notices should also be updated to reflect this.
  • Where an organisation's purposes change or where it has a new purpose, it may still be able to continue processing for that new purpose on the basis of legitimate interests as long as the new purpose is compatible with the original purpose. A compatibility assessment should be undertaken in this case.
  • Organisations should be aware of individuals’ rights, for example, where legitimate interests is relied on as a basis for processing then the right to data portability does not apply to any personal data being processed on that basis.

Here to help

The concept of ‘legitimate interests’ as a basis for processing personal data predates GDPR. Many organisations are consequently aware of the concept. It should not, however, be taken for granted when organisations wish to further a business interest. As shown above, there are a number of obligations to consider, and therefore the basis should not be considered lightly or as a last resort.

If you have any questions on legitimate interests, data protection law more generally or on any of the issues raised in this article please get in touch with one of our data protection lawyers.


International Transfers of Personal Data

International Transfers of Personal Data - What Are The Rules?

International transfers of personal data have been shaken up in recent memory. Most obviously, Brexit has placed the EU and UK in separate data protection regimes, rendering any transfer between them international and therefore subject to new conditions. Additionally, data transfers to the US have been disrupted by the judgement in Schrems II. This landmark case led to the striking down of the EU-US Privacy Shield, which had enabled the free flow of data to certain US-based organisations. For more information on the impact of Brexit read our blog.

Where does it all lead? It is easy to be overwhelmed by the complexity of the legal and political implications of these developments. However, as most organisations are realising, the simple solution continues to be Standard Contractual Clauses (SCCs). After an introduction to international transfers, this blog will focus on the use and future of SCCs, which for the majority of organisations will be the most practical data transfer mechanism.

General principle for data exports to non-UK countries

International transfers of personal data to a country outside the UK (third country) may only take place if the controller and the processor comply with certain conditions. A transfer of personal data to a third country may take place if:

  • the UK has decided that the third country ensures an adequate level of protection

or

  • the controller or processor has provided appropriate safeguards, and enforceable data subject rights and effective legal remedies for data subjects are available.

Third countries with adequate levels of protection

The UK has “adequacy regulations” in relation to the following countries and territories:

  • The European Economic Area (EEA) countries. These are the EU member states and the EFTA States. The EU member states are Austria, Belgium, Bulgaria, Croatia, Cyprus, Czech Republic, Denmark, Estonia, Finland, France, Germany, Greece, Hungary, Ireland, Italy, Latvia, Lithuania, Luxembourg, Malta, Netherlands, Poland, Portugal, Romania, Slovakia, Slovenia, Spain, Sweden. The EFTA states are Iceland, Norway and Liechtenstein.
  • EU or EEA institutions, bodies, offices or agencies.
  • Gibraltar
  • Countries, territories and sectors covered by the European Commission’s adequacy decisions (in force at 31 December 2020). These include a full finding of adequacy in respect of the following countries and territories: Andorra, Argentina, Guernsey, Isle of Man, Israel, Jersey, New Zealand, Switzerland and Uruguay. In addition, there are partial findings of adequacy in respect of: Japan – only covers private sector organisations; and Canada – only covers data that is subject to Canada's Personal Information Protection and Electronic Documents Act (PIPEDA). Not all data is subject to PIPEDA. For more details please see the EU Commission's FAQs on the adequacy finding on the Canadian PIPEDA.

International transfers of personal data - adequate safeguards

If the third country has not been granted an adequacy decision then organisations can rely upon adequate safeguards. Schrems II has added an additional burden - before you may rely on an appropriate safeguard to make a restricted transfer, you must be satisfied that the data subjects of the transferred data continue to have a level of protection essentially equivalent to that under the UK data protection regime. This can be done by undertaking a risk assessment, which takes into account the protections contained in that appropriate safeguard and the legal framework of the destination country (including laws governing public authority access to the data). This assessment is undoubtedly complex in many situations. The ICO intends to issue guidance on this topic in due course.

Controllers and processors may provide adequate safeguards by:

  • A legally binding agreement between public authorities or bodies.
  • Binding corporate rules (agreements governing transfers made between organisations within a corporate group).
  • Standard data protection clauses in the form of template transfer clauses adopted by the Commission.
  • Standard data protection clauses in the form of template transfer clauses adopted by the ICO.
  • Compliance with an approved code of conduct approved by a supervisory authority.
  • Certification under an approved certification mechanism as provided for in the GDPR.

Is the restricted transfer covered by an exception?

If you are making a restricted transfer that is not covered by UK ‘adequacy regulations’, nor an appropriate safeguard, then you can only make that transfer if it is covered by one of the ‘exceptions’ set out in Article 49 of the UK GDPR:

Exception 1: Has the individual given his or her explicit consent to the restricted transfer?

Exception 2: Do you have a contract with the individual? Is the restricted transfer necessary for you to perform that contract?

Exception 3: Do you have (or are you entering into) a contract with an individual which benefits another individual whose data is being transferred? Is that transfer necessary for you to either enter into that contract or perform that contract?

Exception 4: You need to make the restricted transfer for important reasons of public interest.

Exception 5: You need to make the restricted transfer to establish if you have a legal claim, to make a legal claim or to defend a legal claim.

Exception 6: You need to make the restricted transfer to protect the vital interests of an individual. He or she must be physically or legally incapable of giving consent.

Exception 7: You are making the restricted transfer from a public register.

Exception 8: You are making a one-off restricted transfer and it is in your compelling legitimate interests.
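
Taken together, the rules above form an ordered decision: adequacy regulations first, then appropriate safeguards, then the Article 49 exceptions as a last resort. The sketch below illustrates that ordering only; the function and return strings are our own, and each step in practice involves legal assessment (for example the post-Schrems II risk assessment) that cannot be reduced to code:

```python
# Illustrative ordering of transfer mechanisms under the UK GDPR.
# Names and strings are our own; each step requires legal judgment in practice.

def transfer_mechanism(adequacy_regulations: bool,
                       appropriate_safeguard: bool,
                       article_49_exception: bool) -> str:
    if adequacy_regulations:
        return "permitted: adequacy regulations"
    if appropriate_safeguard:
        return "permitted: appropriate safeguard (e.g. SCCs)"
    if article_49_exception:
        return "permitted: Article 49 exception"
    return "not permitted"

print(transfer_mechanism(False, True, False))
# -> permitted: appropriate safeguard (e.g. SCCs)
```

Note the fall-through at the end: if no adequacy regulations, safeguard or exception applies, the restricted transfer simply cannot be made.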

International transfers of personal data - Standard Contractual Clauses

You can make a restricted transfer if you and the receiver have entered into a contract incorporating standard data protection clauses recognised or issued in accordance with the UK data protection regime. These are known as ‘standard contractual clauses’ (‘SCCs’ or ‘model clauses’).

The SCCs contain contractual obligations on you (the data exporter) and the receiver (the data importer), and rights for the individuals whose personal data is transferred. Individuals can directly enforce those rights against the data importer and the data exporter.

ICO guidance on Standard Contractual Clauses

The ICO's webpage commentary on Standard Contractual Clauses (SCCs) after the end of the transition period provides guidance on what the ICO expects from UK controllers in relation to restricted transfers, i.e. when they seek to export personal data from the UK to entities located in countries which do not provide an adequate level of data protection. As shown above, SCCs represent one of a number of "appropriate safeguards" available to enable such transfers to take place, and are often the most practical method for organisations when it comes to data transfers.

The ICO guidance states that UK controllers can continue to use the existing EU SCCs. The guidance goes on to state:

"You are able to make changes to those EU SCCs so they make sense in a UK context provided you do not change the legal meaning of the SCCs. For example, changing references from the old EU Data Protection Directive to the UK GDPR, changing references to the EU or Member States, to the UK, and changing references to a supervisory authority to the ICO.

Otherwise you must not make any changes to the SCCs, unless it is to add protections or more clauses on business related issues. You can add parties (i.e. additional data importers or exporters) provided they are also bound by the SCCs."

ICO versions of the SCCs

The versions of the SCCs the ICO has created contain suggested changes. These are only suggestions, but if you wish to deviate from them your changes should be consistent with the principles set out in the guidance extract above and the guidance generally, i.e. they need to make sense in a UK context and not change the legal meaning of the SCCs. The ICO versions therefore act as a starting point, with changes made only where strictly necessary for the clauses to make sense.

Schedule 21 of the Data Protection Act 2018 details the types of changes that can be made to the EU version for use by a UK controller, but it also seems to allow for use of the EU version as it stands, without amendment, unless disapplied by the Secretary of State or the Information Commissioner (see paragraphs 7 and 8 of Schedule 21).

Exporting from both the UK and the EU

Ideally, if personal data is to be exported from both the UK and the EU to a jurisdiction not deemed adequate by both the UK government and the European Commission, the exports from the UK and from the EU should be treated separately as, while virtually identical, the EU GDPR and UK GDPR are completely separate regulatory regimes. If SCCs are chosen as the appropriate safeguard, the safest option would be to have the data exports from the UK and the EU covered by different sets of clauses (or potentially, depending on risk, to use the EU SCCs with an additional set of amendments for the UK version).

This point is underlined in the original European Commission decision of 2004, which states that each set of SCCs as a whole forms a model, so data exporters should not be allowed to amend these sets or totally or partially merge them in any manner. To meet the data transfer requirements under both the UK GDPR and the EU GDPR, a controller wanting to use SCCs cannot adapt them beyond what has been recommended by the ICO and the EC guidance on their use.

Retrospective?

It is important to point out that if the EU SCCs were entered into prior to the end of the transition period, they will continue to be valid for restricted transfers under the UK GDPR. There is no need to replace EU SCCs contracted before 1 January 2021 with updated UK SCCs.

New Standard Contractual Clauses

On 12 November 2020 the EU Commission published draft standard contractual clauses for international transfers of personal data to third countries under the General Data Protection Regulation ((EU) 2016/679) (GDPR), in the form of a draft implementing decision and annex. The Commission had previously indicated that these clauses would be finalised before the end of 2020 although, as they require the opinion of the EDPB and EDPS, and consultation with member states under the comitology procedure, they will now come into force in 2021.

The Commission notes that the clauses are a modernisation of the previous clauses, designed to better reflect the use of new and complex processing operations involving multiple parties, complex processing chains and evolving relationships. They are designed to be flexible and allow for a number of parties, including for parties to accede to the clauses later ("docking clause"). They are drafted in a modular approach with general clauses followed by options for different processing circumstances.

Key points of interest include that the clauses:

  • Can be used by controllers and processors, including those not established in the EU but that are caught by the GDPR and cover both controller to controller and controller to processor options. They can also be used for EU processor to non-EU controller transfers and processor to sub-processor transfers, both of which are new options.
  • Can be included in a wider contract and additional clauses and safeguards can be added provided these are not contradictory or prejudice the rights of data subjects.
  • Should include rules for liability and indemnification between the parties and are enforceable by data subjects as third-party beneficiaries against the data exporter or importer.

What does this mean for the UK?

Under the UK-EU trade and co-operation agreement, the UK is obliged not to exercise certain powers under its own data protection legislation, including producing its own SCCs, during the four to six month extension period (starting on 1 January 2021 – for more info see our blog). The ICO intends to consult on and publish new UK SCCs during 2021. With Brexit, the ICO and Secretary of State must keep the transitional arrangements for SCCs under review, and both are now able to issue new SCCs. It may be that at some point the EU SCCs will cease to be valid for new and/or existing restricted transfers from the UK.

The extent to which the ICO, which is reviewing the new EU SCCs, is influenced by the new EU model clauses will be another example of how far the two regimes wish to either split or merge. Given that the UK has already granted the countries of the EU an adequacy decision (and seems to hope to receive one in return), it is not overly speculative to suggest that the new EU SCCs will, in some form or another, be incorporated into UK data protection law. However, as noted above, this will not be possible until after the four to six month extension period in which the UK currently finds itself.

Here to help

The international transfer of personal data is a complex area of law and one in a state of transition. As suggested above, the most practical solution for many organisations will be the use of SCCs, but that is not to say your transfers cannot be enabled in any other way (see above). The extent to which organisations will have to review their positions will depend on whether or not the EU grants the UK an adequacy decision, and on the extent to which the ICO incorporates the soon-to-be-published new EU standard contractual clauses into its own. In any event, organisations need to be on the lookout for when these new clauses come into force in both the EU and the UK.

If you have any questions on Brexit and data protection, data protection law more generally or on any of the issues raised in this article please get in touch with one of our data protection lawyers.


Adtech

Adtech - ICO report into adtech and real time bidding

On 22 January 2021, the Information Commissioner’s Office (ICO) announced the resumption of its investigation into real time bidding (RTB) and adtech. The investigation had been put on hold by COVID-19. Simon McDougall, ICO Deputy Commissioner, commented in a statement that “the complex system of RTB uses people’s sensitive personal data to serve adverts and should require people’s explicit consent, which is not happening right now.”

The ICO will continue its investigation with a series of audits focusing on digital market platforms. It will issue assessment notices to specific companies over the coming months, so that it can gauge the state of the industry.

What is adtech?

Adtech (short for advertising technology) is the umbrella term for the software and tools that help agencies and brands target, deliver, and analyse their digital advertising efforts. If you have come across the terms "programmatic" or "omnichannel", then you may already know a little about what adtech does.

Programmatic advertising, for instance, buys target audiences instead of time slots: think about buying ad space that reaches a particular demographic wherever it is, instead of buying a prime time TV spot and hoping the right people are watching.

Omnichannel marketing reaches target consumers across all channels – mobile, video, desktop, and more – within the context of how they've interacted with a brand (those first seeing an ad will receive a different message from those who have engaged with that brand a number of times). Adtech methodologies seek to deliver the right content at the right time to the right consumers, so there's less wasteful spending.

What is real-time bidding?

Real-time bidding (RTB) is an automated digital auction process that allows advertisers to bid on ad space from publishers on a cost-per-thousand-impressions, or CPM, basis. CPM is what you pay for one thousand people to see your ad. Like an auction, the highest bid from relevant ads will typically win the ad placement.
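
The CPM arithmetic is straightforward: total spend is the CPM rate multiplied by the number of impressions divided by one thousand. A short worked example (figures invented for illustration):

```python
# CPM (cost per mille): cost = impressions / 1000 * cpm_rate.
# Figures below are invented for illustration only.

def campaign_cost(impressions: int, cpm: float) -> float:
    """Total spend for a given number of impressions at a given CPM rate."""
    return impressions / 1000 * cpm

# 250,000 impressions bought at a £2.50 CPM cost £625.
print(campaign_cost(250_000, 2.50))  # 625.0
```
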

ICO report

On 20 June 2019 the ICO issued an update report into adtech and real time bidding. The report was unofficial and highlighted ways in which advertising technology was in many cases systemically breaking data protection and e-privacy laws. Particular concerns included:

  • Adtech companies are not collecting personal data lawfully when dealing with customer cookies because, instead of relying on consent (as required by the Privacy and Electronic Communications Regulations (PECR)), they are using legitimate interests as their lawful basis. Even if legitimate interests were the correct basis, an organisation relying on it would need to carry out a variety of tests and impose safeguards on the data collected. Many adtech companies are therefore also failing to apply the legitimate interests basis correctly, quite apart from the fact that it is the wrong basis to begin with.
  • Explicit consent is not being obtained when processing special categories of data.
  • Data Protection Impact Assessments are either not being carried out when necessary or are being carried out incorrectly.
  • Privacy notices are not giving data subjects the information they need to be informed under the GDPR and PECR. They are also undermining the transparency principle expected of all data processors.
  • Detailed profiles are being created on potential targets for advertising and are then shared among hundreds of bidders. The data minimisation and storage limitation principles are being undermined for this reason.
  • Security measures to ensure personal data is protected and appropriate safeguards when transferring data internationally are often being undermined or ignored.

Progress so far

In January 2020, the ICO's Executive Director for Tech Policy and Innovation published a blog about progress so far (Adtech - the reform of real time bidding has started and will continue). He noted the ICO's continued concern about the issues already raised but added that the Internet Advertising Bureau (IAB UK) and Google are starting to make the changes needed.

The IAB UK has responded to pressure from the ICO and aims to make the sector more aware of data protection principles. It recognises the need to address issues related to cookies and special categories of data and will publish UK-focused guidance (IAB UK sets out actions to address ICO’s real-time bidding concerns, 9 January 2020). The ICO has supported Google's decision to phase out third-party cookies over the next two years, which is indicative of the changing landscape of online marketing.

Moving forward

Due to the sensitivity of the work, the ICO will publish its final findings only once it has concluded its investigation. In the meantime, Mr McDougall advises organisations operating in the adtech space to urgently assess how they use personal data, in particular their compliance with obtaining individuals’ consent, reliance on legitimate interests, deployment of data protection by design and default, and use of data protection impact assessments.

Using legitimate interests as a legal basis in adtech

Using legitimate interests as the lawful basis for adtech processing has become commonplace. This is unsurprising, given that it means no mechanism to obtain or record consents is needed.

The ICO online guidance "When is consent appropriate?" says that ‘If you need consent under e-privacy laws to send a marketing message, then in practice consent is also the appropriate lawful basis under the GDPR’. The ICO Adtech Update expands on this:

  • Trying to apply legitimate interests when GDPR-compliant consent has been obtained would be unnecessary and could confuse individuals.
  • Where an individual has given consent they would expect processing to cease when they withdrew consent. However, an entity relying on legitimate interests might seek to continue processing in this scenario, which would be unfair.

The ICO Adtech Update also makes the point that reliance on legitimate interests for marketing activities is only possible if organisations are able to show that their use of personal data is proportionate, has a minimal privacy impact, and individuals would not be surprised or likely to object. The ICO considers that the processing involved in real time bidding (RTB) cannot meet these criteria and legitimate interests cannot be used for the main bid request processing. The ICO does not rule out use of legitimate interests for other purposes, such as a demand-side platform supplementing a bid request with additional information.

Data protection impact assessments (DPIAs)

Controllers should carry out a Data Protection Impact Assessment (DPIA) before beginning processing that is likely to result in a high risk to the rights and freedoms of individuals (Article 35, GDPR). The ICO has published a list of processing operations likely to result in such a high risk, for which DPIAs are mandatory. The ICO Adtech Update confirms that Real Time Bidding, as used in adtech, involves several such processing operations. The ICO draft Direct Marketing Code states that the type and volume of processing that you can undertake in the online world, and the risks associated with that processing, mean it is highly likely that a DPIA will be required before processing begins.

Data minimisation

The GDPR requires that personal data collected must be limited to what is necessary in relation to the purposes for which it is processed. The ICO Adtech Update states that the creation of detailed profiles, repeatedly updated with information about individuals' online activities, is disproportionate for the purposes of targeted advertising. It is also intrusive and unfair, in particular as individuals are often unaware that the processing takes place and the privacy information provided does not clearly inform them what is happening.

Data integrity and confidentiality

Under the GDPR personal data must be stored securely. The ICO Adtech Update noted that real time bidding often involves sharing personal data with adtech companies in non-EU jurisdictions, resulting in international transfers. Further, participants have no real control over the other adtech companies with whom data is shared. Contractual controls are insufficient; appropriate monitoring and technical and organisational controls are also required.

Accountability

Data controllers must be able to demonstrate their compliance with the GDPR. The ICO Adtech Update notes that the complexities of the adtech ecosystem mean that many adtech companies will find it difficult to understand, document and be able to demonstrate how their processing operations work, what they do, who they share any data with and how any processors are vetted and controlled; and how they can enable individuals to exercise their rights.

Accuracy and storage limitation

Other GDPR requirements include that data must be accurate and kept up to date and that personal data must be kept for no longer than is necessary. The ICO Adtech Update highlights the fact that because of the vast number of adtech companies involved in real time bidding it is difficult to ensure compliance with these principles. The ICO Cookie Guidance states that it is necessary to check that the duration of any cookies is appropriate; any default durations should be reviewed.

Here to help

Adtech has revolutionised the marketing industry and was firmly in place before the introduction of GDPR in 2018. It is now the ICO’s aim to bring this boom industry in line with UK data protection law. If you have any questions on adtech and data protection, data protection law more generally or on any of the issues raised in this article please get in touch with one of our data protection lawyers.