UK Data Protection Law: The ICO Asks Some Uncomfortable Questions

UK data protection law now has the opportunity to assert its independence following Brexit. That said, we don't expect the UK to stray too far from the EU position, because to do so would threaten the adequacy decision adopted by the European Commission on 28 June 2021. A recent consultation issued by the Information Commissioner’s Office (ICO), the regulatory body for data protection in the UK, has raised some difficult and long-unanswered questions. Alongside a draft international data transfer agreement and a draft risk assessment for international transfers, the ICO has posed questions which many data lawyers have been scratching their heads over since GDPR was introduced in 2016. Organisations can submit their opinions up until 5pm on Thursday 7th October 2021. We look at the questions asked here and take a view on their proposed answers.

UK Data Protection Law: Article 3 UK GDPR

The ICO raised UK data protection law questions around how Article 3 UK GDPR applies to certain relationships. The crux of this investigation is who, in an international context, the UK GDPR applies to. ‘The Regulation’ in the article refers to the UK GDPR. The article reads as follows:

  1. “This Regulation applies to the processing of personal data in the context of the activities of an establishment of a controller or a processor in the United Kingdom, regardless of whether the processing takes place in the UK or not.
  2. This Regulation applies to the relevant processing of personal data of data subjects who are in the United Kingdom by a controller or processor not established in the United Kingdom where the processing activities are related to:
  (a) the offering of goods or services, irrespective of whether a payment of the data subject is required, to such data subjects in the United Kingdom; or
  (b) the monitoring of their behaviour as far as their behaviour takes place within the United Kingdom.”

The UK data protection law issues raised by the ICO here were whether Article 3 applies to (a) an overseas processor of a UK controller; and (b) an overseas joint controller with a UK controller.

This needs simplifying. Firstly, data controllers are the main decision-makers – they exercise overall control over the purposes and means of processing personal data. Processors act on behalf of, and only on the instructions of, the relevant controller. A good example would be where a business uses a software platform to process its data for analysis, human resources or any other function. The business would usually be the controller in this case and the software platform provider the processor.

The ICO is therefore asking whether the Regulation (UK GDPR), as referred to in Article 3, directly applies to overseas processors or overseas joint controllers of a UK controller. This question sits alongside the fact that data controllers are obliged to enter into data processing agreements with their processors, which will contain most of the UK GDPR obligations. If the ICO were to say that such overseas processors and controllers were directly subject to the UK GDPR, then such contractual provisions might become obsolete. It is also notable that an overseas joint controller will almost always process ‘personal data in the context of the activities of an establishment of a controller… in the United Kingdom’ (Article 3(1)) and will therefore usually be seen as directly subject to the UK GDPR.

Google Spain Judgement

The ‘spanner in the works’ of all this theorising around the obligations of overseas processors and joint controllers was the controversial judgment in Google Spain SL v. Agencia Española de Protección de Datos. Although a Spanish case, it went to the European Court of Justice (ECJ), and with all pre-Brexit ECJ case law being retained by the UK, the ruling still holds sway. Essentially the ECJ held that Google was the data controller of EU personal data published by third-party websites and so EU data protection law applied, even though Google was based in the US. Many commentators would nevertheless distinguish Google in this instance from your average overseas processor or controller because of the unusual amount of influence the search engine has over how personal data is presented to users.

Article 3 UK GDPR – ICO views

So the ICO is consulting on the following three proposals, with organisations able to submit their opinions by 7th October 2021:

  1. For overseas processors of a UK GDPR controller under Article 3(1) – the ICO suggests that whether UK GDPR directly applies will depend on the circumstances.
  2. For overseas processors of a UK GDPR controller under Article 3(2) – the ICO suggests that the UK GDPR should always be directly applicable to processors.
  3. For overseas joint controllers with a UK-based joint controller – the ICO suggests that whether UK GDPR directly applies will depend on the circumstances.

As previously stated, this may all be fairly irrelevant given that UK controllers are obliged to enter into data processing agreements with their processors and most overseas controllers will intuitively be deemed to be subject to the UK GDPR. If the ICO were to decide that the UK GDPR applied to all overseas processors, this would potentially mean less contractual protection for data subjects. We take the suggestion at 2) therefore to be counterproductive, although we support the suggestions in 1) and 3) even though they introduce increased ambiguity.

UK Data Protection Law - Chapter V UK GDPR

The next set of UK data protection law questions posed by the ICO concerned Chapter V UK GDPR and, in particular, Article 44. Questions here were raised around when a ‘restricted transfer’ is taking place and therefore whether measures need to be taken to protect such a transfer. The article is as follows:

“Any transfer of personal data which are undergoing processing or are intended for processing after transfer to a third country or to an international organisation shall take place only if, subject to the other provisions of this Regulation, the conditions laid down in this Chapter are complied with by the controller and processor, including for onward transfers of personal data from the third country or an international organisation to another third country or to another international organisation. All provisions in this Chapter shall be applied in order to ensure that the level of protection of natural persons guaranteed by this Regulation is not undermined”.

What this is essentially saying is that a transfer falling within Article 44 is a “restricted transfer” and can therefore only take place when the conditions in Chapter V are complied with. This means contractual protections will need to be in place, security measures will need to be assessed and a risk assessment should also take place.

Chapter V UK GDPR – ICO views

The ICO asked UK data protection law questions around when such Article 44 restricted transfers take place. It suggested the following clarifications:

  1. In order for a restricted transfer to take place, there must be a transfer from one legal entity to another – this means that it is not a restricted transfer where the data flows within a legal entity. For example, it is not a restricted transfer where an employee takes a laptop outside the UK, or a UK company shares data with its overseas branch. However, where the data flow stays within a single legal entity, the entity would still have to ensure those data flows comply with general UK GDPR obligations (e.g. security requirements), just not the transfer requirements in Chapter V.
  2. A UK GDPR processor with a non-UK GDPR controller will only make a restricted transfer to its own overseas sub-processors – this interpretation means that it is a restricted transfer when a UK GDPR processor (with a non-UK GDPR controller) appoints an overseas sub-processor and transfers personal data to it. But it is not a restricted transfer when a UK GDPR processor (with a non-UK GDPR controller) returns data to its non-UK GDPR controller or sends it to a separate overseas controller or processor (one which is not its own sub-processor).
  3. Whether processing by an importer subject to UK GDPR is considered a restricted transfer – currently, if the importer is already required to process data in accordance with UK GDPR, no additional Chapter V protection is needed. For example, the exporter will not need to carry out a Schrems II risk assessment nor put in place an Article 46 transfer tool. What the ICO is now suggesting is that it could update its guidance to reflect that a restricted transfer takes place whenever the exporter is subject to UK GDPR (and may be located in the UK or overseas) and the importer is located outside of the UK.

We support points 1) and 2) because they bring greater clarity around when restricted transfers are not taking place. Point 3), however, seems slightly unnecessary, although it is not wholly without merit. If an organisation has gone to the effort of determining that the importer it is working with is subject to the UK GDPR, why should it then have to treat the transfer as restricted? On the flip side, point 3) offers further protection to data subjects because more contractual protection is introduced. You could also argue that although some importers are subject to UK GDPR, the difficulty of enforcing it against them means contractual obligations would be a good thing. Overall, we support the proposal in point 3), but with some reservations.

UK Data Protection Law - The Finer Details

This is pretty high-level UK data protection law analysis. The main takeaway is that the ICO isn’t afraid to ask difficult questions, although in some cases it seems to be on a rather unnecessary tack. We support most of the suggestions and are pleased to see the ICO acting independently. Some simple questions, however, such as whether consultants are deemed to be within an organisation, are still left unanswered. Our hope is that the ICO doesn’t stop here in taking its independent perspective on retained EU law, and that it continues to keep in mind the practical issues raised by organisations.

EM law specialises in technology and UK data protection law. Get in touch if you need advice on data protection law or if you have any questions on the above.


IDTA: One Small Step for Data Protection, One Giant Leap for the UK

The IDTA, or international data transfer agreement, is likely, following the current ICO consultation, to become the new means by which organisations transfer data out of the UK. On 11th August 2021, the UK’s data protection regulatory body, the Information Commissioner’s Office (ICO), announced that it had launched a consultation on how organisations can continue to protect people’s personal data when transferring it outside the UK. This includes a draft ICO international data transfer agreement (IDTA), a draft international transfer risk assessment and tool, and updated guidance. You have until 5pm on 7 October 2021 to make your opinions on the drafts heard.

IDTA - Background

Many organisations will be scratching their heads after reading the above. Another round of guidance on international transfers? Another consultation on draft clauses? You could be mixing it up with the EU’s adoption of new Standard Contractual Clauses (SCCs) on 4th June 2021 (read our blog: Data Transfer: EU Adopts New Model Clauses). The new EU SCCs brought long-called-for clarity on issues stemming from the fact that the old SCCs had been drafted before the implementation of GDPR. There are two key elements to the background of this new ICO consultation: Brexit and the European Court of Justice’s ruling in Schrems II.


With Brexit came the possibility for the UK to diverge from EU data protection laws. This is unlikely, however, especially following the adoption of an adequacy decision by the European Commission on 28 June 2021. The adequacy decision allows data to flow freely from the EU to the UK. But it relies on the UK’s continued commitment to the obligations created by GDPR, as well as alignment on future developments in EU data protection rules (the adequacy decision contains a sunset clause and will expire after four years unless renewed).

Regardless of this willingness and incentive to remain aligned with the EU, the new EU SCCs will not apply in the UK, as is the case with all new EU law post-Brexit, and so it was a natural progression for the UK to adopt new clauses of its own. Instead of SCCs, the ICO calls its version the international data transfer agreement (IDTA). Although the draft IDTA is in many ways similar to the new EU SCCs, there are also some differences. These mainly concern the possible construction and use of the agreement rather than any of the actual obligations. An important way in which the ICO has followed the EU’s lead is by introducing measures reflecting the decision in Schrems II.

IDTA - Schrems II

The 2020 ruling in Data Protection Commissioner v Facebook Ireland and Maximillian Schrems (Case C-311/18) EU:C:2020:559, otherwise known as Schrems II, invalidated the EU-US Privacy Shield, essentially ending the free flow of data from the EU to certified organisations in the US. The ruling also introduced a new obligation on organisations transferring data out of the EU. It would now be necessary to undertake a risk assessment for such transfers.

Risk assessment

In response to these developments, the ICO has published a lengthy draft international transfer risk assessment and tool. At the start of the guidance it states:

“The Schrems II judgment embedded risk assessments into the rules on international data transfers. The Court held that before you may rely on an Article 46 UK GDPR transfer tool to make an international data transfer, you must carry out a risk assessment, and this is therefore a requirement under UK data protection laws.”

The ICO guidance focusses on two key aspects of the laws and practices of the destination country. First, whether the IDTA will be enforceable in that country, as this goes to the heart of what it means to put in place contractual protections. Secondly, it considers whether the destination country’s regime might require the importer to give third parties access to the data being transferred. The most likely such third party is a government surveillance department. It is important to identify which third parties may have access because this could conflict with the terms of an IDTA, which seeks to control and confine data to the parties of the agreement. Knowing exactly which third parties may have access, and for what reason, is essential for assessing the safety of international data transfers, as reflected in Schrems II, which cited overly invasive US government surveillance as the main reason for invalidating the EU-US Privacy Shield.

The draft risk assessment produced by the ICO is supposed to be used alongside an IDTA. It states that it is only really intended to be used for routine international data transfers and that more complex transactions will need a more complex bespoke assessment. Complexity may be introduced by the riskiness of the data being processed or the human rights record of the country to which data is being transferred. The assessment and guidance includes:

  • There is no need for the country to which data is being transferred to have an identical data protection legal system but rather to reflect certain values.
  • It could be good for a country to have surveillance laws because this implies that such surveillance is being regulated.
  • There are examples to illustrate low, medium or high risk.
  • It highlights the importance of being able to enforce the data transfer agreement. A jurisdiction in which obtaining judgment is seriously impeded will increase risk.
  • The ICO states that if a risk assessment is undertaken, but such assessment turns out to be wrong, it will take into account the difficulty of carrying out such an assessment. Therefore, if the ICO deems an assessment to have been undertaken diligently, this will be reflected in any potential regulatory action.

Draft IDTA

The draft international data transfer agreement is bespoke to the UK. It is to be used when transferring UK personal data out of the UK. The improvements it makes include:

  • The tables at the start of the agreement allow the parties to input all the information peculiar to their agreement. This is a simple way of making sure all of the information needed is available from the outset. Once organisations are used to the tables, they should be very easy to fill out and so new relationships should be easy to facilitate.
  • Unlike the new EU SCCs which provide a number of different modules, i.e. a number of different documents for different relationships such as a controller-to-processor agreement, processor-to-processor agreement etc., the new IDTA is a document which captures all such potential relationships. It does this by specifying that certain clauses do or do not apply to either processors or controllers or other third parties.
  • The agreements can also easily be made multi-party or can be drafted to give one party the power to make decisions for everyone.
  • The IDTA encompasses a wider range of relationships including transfers from a processor to a third party who is not a sub-processor, but some other third party, and transfers between joint controllers.

Using the EU SCCs

An important question which the new ICO drafts and guidance address is whether organisations can use the new EU SCCs to transfer UK personal data out of the UK. The answer is yes, but on the condition that a “UK addendum” is added to any such agreement. The addendum modifies the parts of the EU SCCs referring to EU law and is open to further modification, within limits.

However, it is clear that the EU SCCs without modification will be insufficient, and this is particularly concerning given that it will take a long time for the ICO to approve the addendum. Until the addendum is approved, data transfers between the UK and the EU will have to be facilitated by different agreements. What is also clear is that the old SCCs will cease to be valid, as they do not account for the provisions of GDPR or the ruling in Schrems II. The consultation proposes that the old SCCs should stop being used three months and forty days after the IDTA is laid before Parliament. For existing data transfers that period is extended to 21 months after the IDTA is officially approved.

It should also be noted that the draft addendum can be used to alter other data transfer agreements such as the New Zealand or ASEAN agreements.

IDTA - Make your voice heard

On the whole we think the proposals made by the ICO are sensible and should be supported, especially concerning the draft addendum to the new EU SCCs. The addendum will make UK-EU business much easier to manage, allowing organisations to use one set of clauses for all of their operations, regardless of jurisdiction, with only minimal extra documentation. For the moment however, organisations need to be wary of divergence in legal systems and the additional burdens created by the ruling in Schrems II. The risk assessment and guidance introduces welcome clarifications for international transfers, giving hope to the idea that if organisations use best efforts within the scope of the guidance given, this will be reflected positively in any ICO regulatory action.

EM law specialises in technology and data protection law. Get in touch if you need advice on data protection law or if you have any questions on the above.

ICO Fines Transgender Charity Mermaids

The Information Commissioner’s Office (ICO) has fined charity Mermaids £25,000 for failing to keep personal data (some of which was sensitive personal data) secure. ICO fines for failing to comply with data protection laws can go up to £17.5 million or 4% of an organisation’s total worldwide annual turnover, whichever is higher.


Mermaids is a charity that supports transgender and gender-diverse children and their families. It started out as a support group formed by parents whose children were experiencing gender incongruence. It registered with the Charity Commission in 1999. The Charity Commission’s website shows that most of Mermaids’ income is derived from donations and legacies, with total income for the financial year ending 31 March 2020 standing at £902,437.

In August 2016 the CEO of Mermaids set up an internet-based email group service, creating GeneralInfo@Groups.IO so that emails could be shared between the CEO and the 12 trustees of the charity. The email service offered various settings for security and privacy:

  • “Groups listed in directory, publicly viewable messages”
  • “Group not listed in directory, publicly viewable messages”
  • “Group listed in directory, private messages” and
  • “Group not listed in directory, private messages”.

The Mermaids group email service was set up under the default option “Groups listed in directory, publicly viewable messages”.

The Groups.IO email service was in active use by Mermaids from August 2016 until July 2017. After it became dormant it continued to hold emails. In addition to communications between the trustees, the emails included some forwarded emails from individuals who were using Mermaids’ services. Those emails included personal data, in some cases relating to children, and some of the data was special category data (i.e. data concerning health, sex life or sexual orientation).

In June 2019 a service user of the charity who was the mother of a gender non-conforming child, informed the CEO that she had been contacted by a journalist from the Sunday Times who had told her that her personal data could be viewed online. The journalist had informed the parent that by searching online he could view confidential emails including her child’s name, date of birth, mother’s name, her employer’s address, her mobile telephone number and details of her child’s mental and physical health.

On the same day, Mermaids received pre-publication notice from the Sunday Times that the emails were accessible online and the newspaper would be publishing an article about the incident.

Mermaids immediately took steps to block access to the email site and engaged lawyers. They began informing data subjects about the incident, contacted the ICO to report what had happened and took other measures to deal with the situation.

ICO findings

The ICO’s investigation found, amongst other things, that Mermaids had failed to ensure that adequate measures were in place to provide appropriate security for personal data. As a result, 780 pages of confidential emails containing personal data relating to 550 individuals were searchable and viewable online by third parties for almost three years. The ICO also found that in the period May 2018 to June 2019 there was a negligent approach towards data protection at Mermaids: data protection policies were inadequate and there was a lack of adequate training. The ICO found that Mermaids should have applied restricted access to its email group, used pseudonymisation or encryption to add an extra layer of protection to the personal data it held, and shut down the email group correctly when it was no longer in use.

ICO fine

On 5 July 2021 an ICO fine was imposed on Mermaids of £25,000.

In arriving at the fine the ICO took into consideration:

  • Mermaids’ income
  • The gravity of the incident
  • The fact that special category data was made public
  • The duration of the data breach
  • The number of data subjects affected
  • The damage caused
  • The intentional or negligent character of the infringement
  • The action taken by Mermaids to mitigate the damage caused
  • The degree of responsibility of Mermaids taking into account the technical and organisational measures they implemented
  • Any relevant previous infringements
  • The degree of cooperation provided by Mermaids with the ICO in order to remedy the infringement and mitigate the damage caused
  • Other aggravating or mitigating factors

The ICO’s Monetary Penalty Notice (which gives further detail and explanation about the ICO’s findings) can be accessed here.


One never wants to see an organisation receiving an ICO fine. However, given the nature of the work that Mermaids does and the sensitivity of some of the personal data that was made public, the fine appears low. Many businesses, especially small businesses, will try and find ways to cut corners to make their budgets or resources stretch further. Some businesses, especially those who do not process special category data, may feel from reading this ICO decision that the worst that can happen to them if they do not have proper data protection processes in place is that they are going to be fined less than £25,000.

In its decision the ICO took into account not just “the prompt remedial action taken by Mermaids” but also that “this breach was highlighted in a national newspaper and that resulted in a degree of reputational damage to the charity”. It also seems that the fact that Mermaids was a charity had some bearing on the ICO decision with the ICO balancing the fine as a deterrent against not wanting to be “taking away donations made by the public.”

The ICO took into account the financial position of Mermaids. While we do not know the content of Mermaids’ representations in this regard, the charity made a loss for its financial year ended 31 March 2020, with total expenditure of £1,041,325 against income of £902,437. Although we do not know the true financial position, it appears that an ICO fine of, say, £250,000 may well have caused the charity to shut down.

It is worth noting as well that, in addition to the ICO fine imposed, Mermaids’ costs of engaging lawyers and other consultants and dealing with the fallout from the incident would have been significant. Mermaids is also vulnerable to claims being brought against it by the data subjects themselves.

If you have any questions on data protection law or compliance please get in touch with one of our data protection lawyers.

Data Transfers: EU Adopts New Model Clauses

Data transfers and their legal mechanisms are changing. Standard Contractual Clauses (SCCs) have been an integral part of international data transfers. Under EU data protection law, organisations handling EU personal data cannot transfer such data to third countries without some form of protection. SCCs have become the most practical, and hence most used, means of ensuring such protection. Following the publication of new draft SCCs back in November and a subsequent consultation period, the European Commission announced on 4th June 2021 that it had adopted two new sets of SCCs, updating previous clauses which were adopted before the introduction of GDPR. The new SCCs are thus a product of the ramped-up regulatory environment created by GDPR. Additionally, and significantly, the new SCCs respond to the ruling last summer (July 2020) in Schrems II.

What are Standard Contractual Clauses?

SCCs are essentially a set of clauses to enable lawful transfers of EU personal data. They can be copied into a contract or form an independent agreement between a data exporter (based in the EU or UK) and a data importer (based in a third country) to ensure an adequate level of protection for personal data being transferred between the two entities. Two sets of clauses have been published by the European Commission: one for the transfer of personal data to third countries and one for use between controllers and processors based in the EU.

Schrems II

The ruling in Data Protection Commissioner v Facebook Ireland and Maximillian Schrems (Case C-311/18) was significant for SCCs for two reasons. Firstly, it invalidated the EU-US Privacy Shield, which many US organisations relied upon to transfer personal data out of the EU. In the same way that SCCs are considered to give personal data transfers out of the EU adequate protection, the Privacy Shield, if all its principles were adhered to, could also offer such protection. Now that it has been invalidated, more US organisations will be relying upon SCCs to ensure adequate protection of personal data transfers.

Secondly the ruling reviewed the effectiveness of the SCCs in place at the time and, whilst considering them to be a valid mechanism for data transfers, introduced new obligations on both data importers and data exporters. It was said that organisations should review the agreements they have in place by assessing whether to implement additional technical and organisational as well as contractual measures. This amounts to what has been described as a ‘data transfer impact assessment’ and is directly addressed in the new SCCs published by the EU commission.

New SCCs for data transfers

The new SCCs have therefore been introduced to align with the high standards of data privacy introduced by GDPR, to remedy previous deficiencies such as the lack of variety of potential arrangements, and to address uncertainties about how to assess whether to implement organisational and technical measures following the ruling in Schrems II. These are dealt with in turn below.

New obligations/liabilities

Firstly, the new SCCs impose a ‘light-weight’ form of the GDPR on data importers. This comes in the form of third-party rights for data subjects. Data importers take on obligations covering: purpose limitation, transparency, accuracy, data minimisation, retention, security, onward transfers, data subject rights, a complaints mechanism and submission to jurisdiction. The final obligation means a data importer must submit to the laws of the EU country from which the personal data is being exported, including its courts and data protection regulatory authority.

Secondly, the data importer must now notify the data exporter in case of requests from public authorities or any direct access by public authorities to data transfers protected by SCCs. Data importers are also expected to try and obtain a waiver of a prohibition for a data exporter to be notified of such public authorities’ access.

And thirdly, data importers and exporters are now liable for any damage to data subjects caused by a breach of the SCCs – material or non-material. In contrast to the GDPR, which imposes joint liability only where both parties are in breach, in some of the scenarios created by the new SCCs (controller-to-processor and processor-to-processor) the data exporter in Europe is now liable for violations by its processor or even sub-processor.

Modular approach for data transfers

The new SCCs employ a modular approach, i.e. they provide separate sets of clauses (modules) for different data transfer scenarios:

  • controller to controller;
  • controller to processor;
  • processor to sub-processor; and
  • processor to controller.

The processor to sub-processor module solves a long-standing problem. Up until now processors have been unsure of how to justify transfers to third countries. Now specific clauses exist to enable such data transfers. The only possible issue with the new modules is that any sub-processor wishing to engage a further sub-processor will have to get the permission of the original controller.

The new SCCs also allow the clauses to be used in a multi-party agreement without having to be replicated for each individual relationship. In practice this has been going on for a while but now it has been officially sanctioned. A related innovation in the new SCCs is also the possible introduction of a docking clause. The docking clause allows new parties to be added to the agreement over time.

Data transfer impact assessments

Clause 14 lays out the ways in which parties to an agreement can ensure compliance with the obligations introduced by Schrems II for data transfers. It says the parties must take due account of:

  • the specific circumstances of the transfer, including the length of the processing chain, the number of actors involved, and the transmission channels used; intended onward transfers; the type of recipient; the purpose of processing; the categories and format of the transferred personal data; the economic sector in which the transfer occurs, and the storage of the data transferred.
  • the laws and practices of the third country of destination – including those requiring the disclosure of data to public authorities or authorising access by such authorities.
  • any relevant contractual, technical, or organisational safeguards put in place to supplement the safeguards under the SCCs, including measures applied during transmission and to the processing of the personal data in the country of destination.

Clause 14 also provides that:

  • the parties agree to document the assessment described and make it available to supervisory authorities on request.
  • the data importer warrants that it has used its best efforts to provide the data exporter with relevant information and agrees that it will continue to cooperate with the data exporter in ensuring compliance with this assessment.
  • the data importer must notify the data exporter either of a public authority’s request to access data or where a public authority directly accesses personal data. If the data importer is unable to make that notification it must use best efforts to obtain a waiver.
  • the data importer must review requests for access to personal data for legality and challenge them if there are reasonable grounds to do so. It must document its legal assessment and minimise the disclosure of data as much as possible.

Moving forwards with data transfers

Organisations will be relieved to know that the European Commission has allowed an 18-month transition period during which the previous SCCs will still be legally recognised (as opposed to the 12-month transition period suggested in the drafts). This should give organisations time to review current data transfers and agreements and to update clauses where needed.

We are also still waiting for the final ‘Recommendations for Supplementary Measures’ in relation to the ruling in Schrems II from the European Data Protection Board (EDPB), which were open for feedback after being published in draft form in November. The EDPB have said ‘the recommendations… were subject to a public consultation. The EDPB received over 200 contributions from various stakeholders, which it is currently analysing’.

UK perspective

As it stands, the new SCCs are not recognised in the UK and it will be up to the ICO to decide whether to accept their usage. The ICO is currently preparing its own contractual clauses and will consult on them over the summer. Allowing the use of both EU- and UK-approved SCCs will no doubt benefit the EU’s adequacy decision for the UK (i.e. whether the EU considers the UK adequate for data protection purposes and hence allows free flows of data between the two regimes).

It is important to note, however, that the UK has wanted to liberalise data transfers for some time, so the new ICO-sanctioned clauses may well be less cumbersome than the new EU ones. Finally, for clarity, the old EU SCCs remain valid in the UK and, for the time being, should be used by organisations transferring UK personal data when putting agreements in place. You can find the clauses on the ICO’s website here.

Here to help

Data transfers have, up until now, often been a case of signing up to some clauses or entering into an agreement and then leaving it be. With the introduction of GDPR, the ruling in Schrems II and the old pre-GDPR SCCs now outdated, organisations need to be mindful of new obligations and, most significantly, the need for transfer impact assessments. Such assessments may need to be undertaken by a third party. If you need your current data transfer agreements reviewed, we are here to help.

EM law specialises in technology and data protection law. Get in touch if you need advice on data protection law or if you have any questions on the above.

Data Compliance

Data Compliance – Updates You Need To Make To Your Policies

Data compliance is an essential part of every business. There have been several shifts in the UK’s data compliance regime following Brexit and the ruling in Schrems II, which means that plenty of businesses’ privacy policies are out of date. Changes range from simply swapping in different references to legislation, to considering the effect that Brexit has had on cross-border transfers of data. For those transferring data to the US, the invalidation of the Privacy Shield framework should also be considered. Here is our guide on the updates businesses should consider making to their privacy policies (and the issues we frequently spot when dealing with clients’ data protection documentation or doing due diligence on other companies’ privacy policies).

Data compliance post Brexit

With the start of 2021, and the end of the EU-UK transition period, the retained EU law version of the General Data Protection Regulation ((EU)2016/679), called the UK GDPR, applies in the UK, along with the Data Protection Act 2018 (DPA 2018). Therefore, the main body of data protection law in the UK is now made up of the UK GDPR and the DPA 2018.

So, as a simple starting point, make sure that all references to the GDPR in your privacy policy are changed to the UK GDPR. There may also be references to ‘Applicable Data Protection Laws’, in which case the definition of these applicable laws needs to be changed to include the UK GDPR and the DPA 2018.

It is important to note that the EU GDPR (the data protection regime in the EU) will continue to have extra-territorial effect and so may apply to UK controllers or processors who have an establishment in the EU, who offer goods or services to data subjects in the EU, or who monitor their behaviour as far as that behaviour takes place within the EU. So, if you operate in the EU as well as the UK you should consider including references to the EU GDPR. Additionally, even though your privacy policy may now refer to the UK GDPR, if you operate in the EU you should consider the consequences of operating in two data protection regimes. This may include a review of your mechanisms for cross-border transfers of data to the EU.

Data compliance: cross-border transfers of data

Now that the UK has left the EU, all data transfers from the UK to the EU or vice-versa are defined as cross-border transfers for the purposes of data protection law. This means that to address data compliance additional safeguards need to be in place, for example Standard Contractual Clauses or reliance upon an adequacy decision (a decision made by a relevant authority that data protection is adequate in a particular country and hence data can flow there freely). As it stands (June 2021), the UK has granted the EU an adequacy decision but the EU is yet to grant one to the UK.

In relation to updating your privacy policy, if you are transferring data to and from the EU it will now be important to show which safeguards you are relying upon to do so. It should be noted, however, that on 24th December 2020 the UK and the EU reached a trade and co-operation agreement addressing the arrangements following the end of the Brexit transition period on 31st December 2020. The agreement includes an interim provision (the bridging mechanism) for transmission of personal data from the EU to the UK which could last up to six months. Therefore, under the current circumstances (as at June 2021, the time of writing), companies do not need additional safeguards in place for transfers of personal data to the EU. Businesses should, however, state in their privacy policies that they are relying upon these provisional agreements and adequacy decisions to transfer data to the EU. This could start with a simple acknowledgement in a privacy policy that any personal data transfers from the UK to the EU are transfers taking place between two separate data protection regimes.

Privacy Shield invalidated

The EU-US Privacy Shield was a framework constructed by the US Department of Commerce and the European Commission to enable transatlantic data protection exchanges for commercial purposes. The Privacy Shield enabled companies from the US to comply with data protection requirements and enabled free flows of personal data to and from the EU without the need for additional safeguards (such as those normally required for third countries, i.e. countries, like the US, not deemed by the EU to provide adequate levels of protection for personal data).

The Privacy Shield was invalidated in July 2020 following the ECJ’s preliminary ruling in Data Protection Commissioner v Facebook Ireland and Maximillian Schrems (Case C-311/18) and should therefore no longer be relied upon for transfers to the US. The controller-to-processor standard contractual clauses were not invalidated, so organisations can still rely upon them when transferring data to the US. This means that any reference to the Privacy Shield in a privacy policy needs to be removed and the mechanisms an organisation now uses to transfer data to the US need to be clearly stated. In the majority of cases this will mean mentioning the use of standard contractual clauses.

Data compliance checklist

Here is a list of things to consider when editing your privacy policy to ensure data compliance:

  • All references to UK data protection laws and legislation need to be references to the UK GDPR and the DPA 2018.
  • Transfers of data to the EU need to be treated as cross-border data transfers and so the legal basis for making these transfers needs to be stated (such as an adequacy decision, the current bridging mechanism, standard contractual clauses, binding corporate rules etc.).
  • Any references to the EU-US Privacy Shield need to be removed for data transfers to the US and, if standard contractual clauses are now being used, this needs to be mentioned.

Data Compliance and Transparency

As part of the UK GDPR principles, businesses must comply with the transparency requirements set out in Articles 13 and 14 of the UK GDPR. These require all controllers to notify data subjects about their personal data handling practices through a privacy policy, at the time that data is collected. It follows that if your business has changed the way it processes the personal data of its customers, due to the developments discussed in this blog, and is relying on a new basis for that processing (i.e. the UK GDPR instead of the previous regime), it should update its privacy policy to reflect this in order to comply with the transparency principles. For an online business, that will usually mean updating the website privacy policy.

Multiple jurisdictions

Organisations with entities in multiple jurisdictions face data compliance challenges when trying to implement website privacy policies as part of a global privacy compliance programme. Multinationals must choose between implementing a single global privacy policy applicable to all customers worldwide, or jurisdiction-specific policies, bearing in mind that even within the EU member states are likely to have varying rules on data protection. This will mean paying attention to the references to legislation in jurisdiction-specific policies and being clear about how exactly cross-border data transfers take place between different branches of an organisation. Given the potential complexity of such data transfers, it would be worth seeking legal advice. Many privacy regulators, including the ICO, recommend a layered policy format, which pairs a short summary with a linked detailed disclosure, as the most effective way to simplify a complex privacy policy and make it clearly and conspicuously accessible.

Here to help

Hopefully this blog gives you enough scope to update your privacy policy and address data compliance, but we can also help you do so. With Brexit and the ruling in Schrems II, data compliance has become legally complex, but that doesn’t mean a practical approach for businesses isn’t possible. The next big step in UK data law is whether or not an adequacy decision will be granted. The decision is currently in the comitology procedure, which means all EU member states need to agree the drafting. If an adequacy decision is reached, data will flow unimpeded between the EU and UK. Regardless of such a decision, however, references to legislation and the mechanisms relied upon for cross-border transfers will still need to be updated.

Other developments include the recent publishing of updated Standard Contractual Clauses by the European Commission. This means that agreements which export EU data to a third country and rely upon Standard Contractual Clauses should be updated. The new versions also incorporate means by which to adhere to new requirements for cross-border transfers following the decision in Schrems II. Schrems II introduced an obligation to assess local data laws before going ahead with a transfer.

EM law specialises in technology and data protection law. Get in touch if you need advice on data protection law or if you have any questions on the above.

Managing Data

Managing Data – Software Services And AI Legal

Managing data is an essential part of the operation of a growth business. It’s a cliché often bandied around that today data is more valuable than oil. But as with oil, it’s only how the resource is used that defines its value. Whereas oil can be relied upon to produce energy in all circumstances, data cannot be relied upon to produce useful insights at all times. Therefore, the means and purpose by which it is processed become all the more important. Given its potential, it comes as no surprise that initiatives, public and private, for managing data more effectively are commonplace. The legal sphere attempting to regulate this burst of energy grows more complex by the day. Here is our introduction to some general issues you may face when managing data for profit, or simply to improve the running of your business.

GDPR and Brexit

Before GDPR came into force in all EU member states on 25 May 2018, the ICO commissioner stated in the ICO’s March 2017 paper, Big data, artificial intelligence, machine learning and data protection, that ‘it’s clear that the use of big data has implications for privacy, data protection and the associated rights of individuals… In addition to being transparent, organisations… need to be more accountable for what they do with personal data’.

At the end of the Brexit transition period (1st January 2021), the GDPR and parts of the Data Protection Act 2018 became part of a new body of retained EU law, essentially replicating the old regime in the UK. Data protection legislation in the UK is now comprised of the UK GDPR and the DPA 2018. From a UK perspective, the GDPR operating in the EU is known as the EU GDPR.

As the EU GDPR will continue to have extra-territorial effect (Article 3, EU GDPR), it may continue to apply to UK organisations who act as controllers or processors and have an establishment in the EU, who offer goods or services to data subjects in the EU, or who monitor their behaviour as far as that behaviour takes place within the EU. UK businesses could therefore find themselves subject to parallel data protection regulatory regimes under both the UK GDPR and the EU GDPR.

Are you managing data as a processor or controller?

If offering a service, for example a software platform that allows companies to process personal data, it would often be prudent to ensure you are defined as a data processor, and not a data controller, for data protection purposes. This is because data processors have fewer obligations under data protection laws than data controllers, who bear primary responsibility for the personal data involved. Processors essentially process data under the instructions of the data controller, whilst a data controller determines ‘the purposes and means’ of processing the personal data (Article 4(7), UK GDPR). A helpful way of thinking about it is that a data controller has direct duties to data subjects, whereas a data processor only has duties to the data controller.

The distinction between controller and processor in an AI context was first considered in the ICO’s July 2017 decision on an agreement between the Royal Free Hospital and Google DeepMind. Under the agreement DeepMind used the UK’s standard publicly available acute kidney injury algorithm to process personal data of 1.6 million patients. The ICO ruled that the hospital had failed to comply with data protection law and was ordered to perform an audit on the system. The hospital’s law firm, Linklaters, concluded in the hospital’s audit report, Audit of the acute kidney injury detection system known as Streams, that DeepMind had been properly characterised as a data processor. This was because Streams ‘does not use complex artificial intelligence or machine learning to determine when a patient is at risk of acute kidney injury. Instead, it uses a simple algorithm mandated by the NHS’. It was therefore the lack of complexity involved in the ‘means’ of processing the personal data which meant that DeepMind were considered to be a data processor. A complex algorithm would have constituted a level of agency on DeepMind’s part which would have rendered their processing that of a data controller. It was deemed, however, that their services were simple enough to be doing nothing more than following the hospital’s instructions. This grey area should be of concern to anyone planning to use AI to analyse data. Make an algorithm too complex and you may take on the liability of a data controller and hence liability towards data subjects.

Anonymisation

Anonymising personal data is itself a form of processing and so falls under UK data protection laws. This is because the purpose for which the personal data was originally collected needs to align with the purpose for which it is later anonymised. In certain circumstances, collecting personal data is not necessary in the first place and, where the data remains useful, this is highly desirable for businesses wishing to process the data as they wish. If the data is originally collected in an anonymous format, then the UK GDPR does not apply. As the GDPR states at recital 26, ‘the principles of data protection should… not apply to anonymous information, namely information which does not relate to an identified or identifiable natural person or to personal data rendered anonymous in such a manner that the data subject is not or no longer identifiable’.

In an ICO report, Anonymisation: managing data protection risk code of practice, the ICO lists anonymisation as one of its six key recommendations for AI. It states ‘organisations should carefully consider whether the big data analytics to be undertaken actually requires the processing of personal data. Often, this will not be the case; in such circumstances organisations should use appropriate techniques to anonymise the personal data in the data sets before analysis’.

Profiling and automated decision making

AI’s ability to uncover hidden links in data about individuals and to predict individuals’ preferences can bring it within the GDPR’s regime for profiling and automated decision making. Article 22(1) states that ‘a data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly affects him or her’.

However, this is qualified by article 22(2) which states that this right does not apply to a decision that ‘(a) is necessary for entering into or performance of a contract between data subject and data controller; (b) is authorised by… law to which the controller is subject and which also lays down suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests; or (c) is based on the data subject’s explicit consent’.

This is further qualified: ‘in the cases referred to in points (a) and (c)…, the data controller shall implement suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests, (including) at least the right to obtain human intervention on the part of the controller, to express his or her point of view or contest the decision’. Having automated decision making within software performing data analysis can therefore introduce new obligations, which are often onerous for a data controller. These can include the need to perform a Data Protection Impact Assessment or to obtain explicit consent from data subjects.

Other suggested compliance mechanisms

The ICO makes five recommendations for using AI to analyse data:

  • Privacy notices.
  • Data protection impact assessments – embed a privacy impact assessment framework into data processing activities to help identify privacy risks and assess the necessity and proportionality of a given project.
  • Privacy by design – implementing technical and organisational measures to address matters including data security, data minimisation and data segregation.
  • Ethical principles.
  • Auditable machine learning algorithms.

Treasure trove

Finding new and innovative ways of managing data is a treasure trove many wish to unlock, but it is important to be wary of the growing regulatory landscape underpinning the sector. The world was shocked by the accusations made against Cambridge Analytica, and demonstrating compliance is a must for maintaining a good reputation and attracting clients. With Brexit comes the complexity inherent in potentially diverging legal regimes. Being up to date on the development of the Privacy and Electronic Communications Regulations (PECR) will also be useful. Read our blog on PECR here.

EM law specialises in technology and data protection law. Get in touch if you need advice on data protection law or if you have any questions on the above.

Draft Adequacy Decisions

Draft Adequacy Decisions: Data Flows EU to UK

Draft adequacy decisions were published on 19 February 2021 by the European Commission (EC) for personal data transfers from the EU to the UK. The significance of the drafts is considerable given they are the first to be produced since the European Court of Justice’s (ECJ) ruling in Schrems II, which struck down the adequacy decision previously granted to the EU-US Privacy Shield.

The EC’s press release on the draft adequacy decisions stated that it has carefully assessed the UK’s law and practice on personal data protection, including the rules on public authorities’ access to personal data, and concluded that the UK ensures an ‘essentially equivalent’ level of protection to that guaranteed under the EU GDPR and the Law Enforcement Directive.

What does adequacy mean?

‘Adequacy’ is a term that the EU uses to describe other countries, territories, sectors or international organisations that it deems to provide an ‘essentially equivalent’ level of data protection to that which exists within the EU. An adequacy decision is a formal decision made by the EU which recognises that another country, territory, sector or international organisation provides an equivalent level of protection for personal data as the EU does. The UK is seeking adequacy decisions under both the General Data Protection Regulation (GDPR) and the Law Enforcement Directive (LED).

The effect of an adequacy decision is that personal data can be sent from an EEA state to a third country without any further safeguard being necessary. The trade deal agreed between the UK and the EU means that the UK has a bridge until 30 June 2021 where data can continue to flow from the European Economic Area (EEA) to the UK whilst the adequacy decisions process takes place. The bridge can finish sooner than this if the EU adopts adequacy decisions in respect of the UK.

Transfers of data from the UK to the EEA are permitted. The UK Government has recognised EU Commission adequacy decisions made before the end of the transition period. This allows restricted transfers to continue to be made from the UK to most organisations, countries, territories or sectors covered by an EU adequacy decision.

Adequacy criteria

In order to draw conclusions on the UK’s data protection regime, the EC assessed a number of factors when producing the draft adequacy decisions:

  • UK constitution – especially in relation to the UK’s adoption of the rights in the European Convention on Human Rights through the Human Rights Act 1998.
  • UK data protection laws – particularly how the UK has adopted EU data laws following Brexit through the implementation of the UK GDPR and maintenance of the DPA 2018. This includes the incorporation of both the territorial and material scope of EU data law as well as definitions, principles and rights afforded to individuals. The main point being that they are all equivalent to those in the EU GDPR.
  • Restrictions on transfers outside of the UK – how, via the implementation of the UK GDPR, the rules on international transfers of data are as restrictive as under the EU GDPR, and how data subjects in the EU can therefore have confidence that onwards transfers of data will be effectively restrained.
  • Enforcement – the Information Commissioner’s Office (ICO) is the “independent supervisory authority tasked with powers to monitor and enforce compliance with the data protection rules” and is equivalent to the data protection authorities found throughout the EU member states. The EC considered the number of cases investigated and fines imposed by the ICO as a measure of its effectiveness.
  • Redress – here the EC highlighted the ability of data subjects to make complaints with the ICO, prosecute for damages under the UK GDPR and utilise the Human Rights Act 1998 to express their data rights, with the European Court of Human Rights as an ultimate source of authority.

Consequences of adoption

If adopted, the draft adequacy decisions will be valid for an initial term of four years, renewable only if the level of protection in the UK continues to be adequate. The drafts include strict mechanisms for monitoring and review, suspension or withdrawal, to address any problematic development of the UK system, which will no longer be bound by EU privacy rules.

UK government response to the draft adequacy decisions

The UK government has welcomed the draft adequacy decisions, urging the EU to fulfil its commitment to complete the approval process swiftly. The Information Commissioner described the progress as "an important milestone in securing the continued frictionless data transfers from the EU to the UK".

The draft adequacy decisions are now with the EDPB for a "non-binding opinion", following which the EC will request approval from EU member states' representatives. It could then adopt final adequacy decisions. Until then, organisations continue to be able to receive personal data from the EU under the temporary "bridging mechanism", agreed in the EU-UK Trade and Cooperation Agreement.

Schrems II

The draft adequacy decisions also include a detailed assessment of the conditions and limitations, as well as the oversight mechanisms and remedies applicable in case of access to data by UK public authorities, in particular for law enforcement and national security purposes. These are likely included to address the ECJ's ruling in Schrems II and concerns over the UK's use of mass surveillance techniques.

In Schrems II, the ECJ ruled that free data flows moving from the EU to certain US organisations under the EU-US Privacy Shield did not offer an essentially equivalent level of protection as under EU law. This was substantially based on the fact that national security laws in the US were deemed to undermine citizens’ data rights. When assessing the UK, the EC, in light of the ruling in Schrems II, was always going to pay close attention to UK national security laws. Additionally, Schrems II introduced more stringent obligations on organisations carrying out cross-border data transfers, and so there has been a general concern that this newly stringent approach might reduce the UK’s chance of receiving an adequacy decision. The drafts can therefore be seen as a highly positive step.

What stands in the UK’s way?

Although the process for an adequacy decision under the EU GDPR is now underway with the draft adequacy decisions in place, and although the UK government has stated on a number of occasions that it is confident the EU will deem the UK data protection regime ‘essentially equivalent’, it is worth noting that a number of issues may impact the UK's ability to satisfy the EU:

  • The UK's use of mass surveillance techniques may lead to EU member states raising concerns about data protection in the UK, which might jeopardise an adequacy decision. The ECtHR's ruling that aspects of the UK's surveillance regimes under the Regulation of Investigatory Powers Act 2000 (RIPA) did not comply with Articles 8 and 10 of the ECHR is particularly relevant (Big Brother Watch and others v United Kingdom). The human rights groups which brought the claim were not satisfied with the judgment and appealed to the Grand Chamber, the ECtHR's highest judicial bench.
  • Membership of the Five Eyes intelligence sharing community means EU citizens' data could be transferred by UK security services to third countries (including the US) which are not considered to have adequate data protection.
  • Potential for unprotected onward data transfers as the UK will be able to decide which countries it deems adequate and what arrangements to have with them.

The draft adequacy decisions - a positive step

Although nothing can be taken for granted, the draft adequacy decisions are a positive step, and the fact that the UK has committed to remaining party to the ECHR and "Convention 108" will likely carry some weight, as adherence to such international conventions is important for the stability and durability of adequacy findings.

If you have any questions on the draft adequacy decisions, data protection law more generally or on any of the issues raised in this article please get in touch with one of our data protection lawyers.

E-Privacy

E-Privacy – PECR and Brexit

E-Privacy regulations complement data protection laws by setting out privacy rights for electronic communications. The idea is that, whilst widespread public access to digital mobile networks and the internet has opened up new possibilities for businesses and users, it has also created new risks for privacy. E-Privacy regulations have been a point of contention within the EU and reform has been expected for some time. On 10 February 2021, four years after the European Commission’s initial legislative proposal and to the surprise of many, the European Council reached a compromise agreement on its position on the E-Privacy Regulation. What this means for E-Privacy rules in the UK remains to be seen. With Brexit behind us, and therefore no obligation to introduce new EU legislation in the UK, but with an adequacy decision pending, and therefore a desire for the UK to align with the EU on data protection, it is hard to say whether or not the UK will choose to implement it. For more information on data protection and a potential adequacy decision after Brexit read our blog.

E-Privacy and PECR

PECR are the Privacy and Electronic Communications Regulations, which implement the E-privacy rules in the UK. Their full title is The Privacy and Electronic Communications (EC Directive) Regulations 2003, and they are derived from European law. PECR have been amended a number of times. The most recent changes were made in 2018, to ban cold-calling in relation to claims management services and to introduce director liability for serious breaches of the marketing rules, and in 2019, to ban cold-calling in relation to pension schemes in certain circumstances and to incorporate the GDPR definition of consent.

What kind of areas do PECR cover?

PECR cover several areas:

  • Marketing by electronic means, including marketing calls, texts, emails and faxes.
  • The use of cookies or similar technologies that track information about people accessing a website or other electronic service.
  • Security of public electronic communications services.
  • Privacy of customers using communications networks or services as regards traffic and location data, itemised billing, line identification services (eg caller ID and call return), and directory listings.

How does this fit with the UK GDPR?

The UK GDPR sits alongside PECR. PECR rules apply and use the UK GDPR standard of consent (which is a high threshold). This means that if you send electronic marketing or use cookies or similar technologies you must comply with both PECR and the UK GDPR. Unsurprisingly, there is some overlap, given that both aim to protect people’s privacy. Complying with PECR will help you comply with the UK GDPR, and vice versa, but there are some differences. In particular, it is important to realise that PECR apply even if you are not processing personal data: many of the rules protect companies as well as individuals, and the marketing rules apply even if you cannot identify the person you are contacting.

If you are a network or service provider, Article 95 of the UK GDPR says the UK GDPR does not apply where there are already specific PECR rules. This is to avoid duplication, and means that if you are a network or service provider, you only need to comply with PECR rules (and not the UK GDPR) on:

  • security and security breaches;
  • traffic data;
  • location data;
  • itemised billing; and
  • line identification services.

Electronic and telephone marketing

PECR restrict unsolicited marketing by phone, fax, email, text, or other electronic message. There are different rules for different types of communication. The rules are generally stricter for marketing to individuals than for marketing to companies. Companies will often need specific consent to send unsolicited direct marketing. The best way to obtain valid consent is to ask customers to tick opt-in boxes confirming they are happy to receive marketing calls, texts or emails from you.

E-Privacy: Cookies and similar technologies

Companies must tell people if they set cookies, clearly explain what the cookies do and why, and obtain the user’s consent. Consent must be actively and clearly given. There is an exception for cookies that are essential to provide an online service at someone’s request (e.g. to remember what’s in their online basket, or to ensure security in online banking). The same rules apply if you use any other type of technology to store or gain access to information on someone’s device.
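The rule above amounts to a simple gate: strictly necessary cookies may be set without consent, anything else needs an opt-in for its stated purpose. The sketch below illustrates that logic; the cookie names and purpose categories are hypothetical, and this is an illustration of the principle, not legal advice on what PECR requires in any specific case.

```python
# Cookie categories a hypothetical site uses. Under PECR, only
# "strictly necessary" cookies (e.g. a shopping basket or a session
# token) may be set without the user's consent.
STRICTLY_NECESSARY = {"session_id", "csrf_token", "basket"}

def may_set_cookie(cookie_name: str,
                   consented_purposes: set[str],
                   cookie_purposes: dict[str, str]) -> bool:
    """Return True if the cookie can be set under the consent rule.

    Essential cookies are exempt from the consent requirement; all
    other cookies need an actively given opt-in for their purpose.
    """
    if cookie_name in STRICTLY_NECESSARY:
        return True
    purpose = cookie_purposes.get(cookie_name)
    if purpose is None:
        # No declared purpose: refuse rather than guess.
        return False
    return purpose in consented_purposes

# Example: the user has consented to analytics but not advertising.
purposes = {"_ga": "analytics", "ad_id": "advertising"}
consent = {"analytics"}
print(may_set_cookie("basket", consent, purposes))  # True (essential)
print(may_set_cookie("_ga", consent, purposes))     # True
print(may_set_cookie("ad_id", consent, purposes))   # False
```

The key design point is the default: an unknown or undeclared cookie is refused, mirroring the legal position that consent cannot be inferred.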

Communications networks and services

PECR are not just concerned with marketing by electronic means. They also contain provisions that concern the security of public electronic communications services and the privacy of customers using communications networks or services. Some of these provisions only apply to service providers (e.g. the security provisions) but others apply more widely. For example, the directories provision applies to any organisation wanting to compile a telephone, fax or email directory.

EU Council position on E-Privacy rules

On 10 February 2021, EU member states agreed on a negotiating mandate for revised rules on the protection of privacy and confidentiality in the use of electronic communications services. These updated E-privacy rules will define cases in which service providers are allowed to process electronic communications data or have access to data stored on end-users’ devices. The agreement allows the Portuguese presidency to start talks with the European Parliament on the final text. The agreement included:

  • The regulation will cover electronic communications content transmitted using publicly available services and networks, and metadata related to the communication. Metadata includes, for example, information on location and the time and recipient of communication. It is considered potentially as sensitive as the content itself.
  • As a main rule, electronic communications data will be confidential. Any interference, including listening to, monitoring and processing of data by anyone other than the end-user will be prohibited, except when permitted by the E-privacy regulation.
  • Permitted processing of electronic communications data without the consent of the user includes, for example, ensuring the integrity of communications services, checking for the presence of malware or viruses, or cases where the service provider is bound by EU or member states’ law for the prosecution of criminal offences or prevention of threats to public security.
  • Metadata may be processed for instance for billing, or for detecting or stopping fraudulent use. With the user’s consent, service providers could, for example, use metadata to display traffic movements to help public authorities and transport operators to develop new infrastructure where it is most needed. Metadata may also be processed to protect users’ vital interests, including for monitoring epidemics and their spread or in humanitarian emergencies, in particular natural and man-made disasters.
  • In certain cases, providers of electronic communications networks and services may process metadata for a purpose other than that for which it was collected, even when this is not based on the user’s consent or on specific legislative measures under EU or member state law. This processing for another purpose must be compatible with the initial purpose, and strong specific safeguards apply to it.
  • As the user’s terminal equipment, including both hardware and software, may store highly personal information, such as photos and contact lists, the use of processing and storage capabilities and the collection of information from the device will only be allowed with the user’s consent or for other specific transparent purposes laid down in the regulation.
  • The end-user should have a genuine choice on whether to accept cookies or similar identifiers. Making access to a website dependent on consent to the use of cookies for additional purposes as an alternative to a paywall will be allowed if the user is able to choose between that offer and an equivalent offer by the same provider that does not involve consenting to cookies.
  • To avoid cookie consent fatigue, an end-user will be able to give consent to the use of certain types of cookies by whitelisting one or several providers in their browser settings. Software providers will be encouraged to make it easy for users to set up and amend whitelists on their browsers and withdraw consent at any moment.


PECR continue to apply after the UK's exit from the EU on 31 January 2020. The draft E-privacy Regulation (ePR), described in detail above, is still being negotiated; it was not finalised before 31 January 2020 and will therefore not become directly applicable in the UK. Once it is directly applicable in EU member states (likely 24 months after it comes into force), the UK will need to consider to what extent to mirror the new rules. In any case, given that UK companies will continue to process the data of EU end users, it will still be necessary to be aware of any discrepancies created by E-privacy reform in the EU.

The deadlock is over

It has long been considered that EU E-privacy regulations have lagged behind the technological progress seen in online marketing techniques, and EU negotiations around reform have at times seemed never-ending. The agreement reached by the EU Council will therefore be seen as a necessary improvement in legal certainty, although questions still abound.

PECR in its pre-reformed state will continue to apply in the UK. On 19th February 2021, the European Commission issued its draft adequacy decision that would allow EU-to-UK data transfers. While the E-privacy Regulation is not strictly relevant to the UK’s continued adequacy status, alignment on E-privacy rules would likely be viewed positively by the EU institutions, which could prompt the UK to update its laws in line with the new EU regime. The reforms will of course also be relevant to any UK business that operates in the EU. Even if the Regulation is finally adopted this year, it will not apply for a further two years, meaning these changes will likely not come into effect until 2023 at the earliest.

If you have any questions on E-privacy and data protection, data protection law more generally or on any of the issues raised in this article please get in touch with one of our data protection lawyers.

Robot Manufacturing

Robot Manufacturing: Product Liability Law

Robot manufacturing is a growth industry: the cost of producing robots is falling while the savings in labour costs are rising. With the rise and rise of automation in all areas of life, the word robot has come to mean a wider variety of things. Artificial intelligence (read our blog on some legal issues) has received the most attention recently, especially given its crossover with big data (read our blog). But what about the more conventional notion of a robot: the walking, talking lump of steel, more willing to do the jobs we’re not so keen on; the sort that product liability law applies to more obviously. This blog covers some issues that a robot manufacturer may encounter when putting such a product on the market.

Robot Manufacturing - product liability and safety risk management

Product liability is the accountability a manufacturer faces for a sub-standard, defective or dangerous product. That accountability is owed to the members of the public who purchase the product and to those below the manufacturer in the supply chain. Contractual protections, insurance and effective risk management can be used to protect a manufacturer from such liability.

Can liability for a dangerous or defective product be excluded or limited?

Contractual protections create a variety of scenarios. While a manufacturer may wish to limit its own liability to parties beneath it in the supply chain, it may well wish to enhance the liabilities of suppliers above it. The terms of a contract are the vehicle by which this can be achieved. Additionally, a robot manufacturer will want to be certain that quality and safety standards are judiciously applied at every level of its supply chain, especially to key suppliers. Given the potential technological complexity of the industry, it may even be useful to seek the advice of specialist consultants who have the know-how to ensure that all the liability that should be attributed to suppliers is so attributed, and that a comprehensive set of standards is agreed in the contract.

There are various ways that robot manufacturing companies can do this. Limiting and excluding liabilities within their contracts is one, but this is only possible for certain things: there are restrictions on limiting liability for dangerous or defective products which override contractual clauses. Robot manufacturers should therefore look to other, non-contractual means to control their risk.

Different considerations need to be taken into account when dealing with the various parties within the supply chain. For example, manufacturers, distributors, importers and retailers will all have concerns specific to the role that they undertake.

Is there an effective quality and safety assurance programme in place?

Effective quality assurance is at the heart of non-contractual risk management for product manufacturers. This is particularly important in the robot manufacturing industry: the products usually involve a lot of automation, so the blame for something going wrong is more likely to fall on the manufacturer’s shoulders rather than the customer’s. Setting up an internal committee to oversee product safety is a useful first step. The team should incorporate members from across the business to ensure nothing is missed. The committee's functions should be to: review products and their associated documents; ensure that all appropriate regulatory and internal procedures have been followed and documented before and after marketing; authorise any necessary action (for example, changing warnings or design); review after-sales monitoring reports for trends and significant incidents; and review insurance arrangements.

Additionally, keeping good records of exactly what was supplied, when and by whom can simplify such a committee’s job.

Is there an effective enquiries and complaints system in place?

A robot manufacturing company must be able to respond to any complaints or enquiries with regard to a product’s safety. This is a legal requirement as well as being based on a desire to look out for your customers and improve your product. Things to think about: does the company have a system to handle customer enquiries and complaints? If so, could it be improved? Are staff adequately trained? Does the company have a policy of recovering allegedly unsafe items and investigating and recording the items and circumstances? Is there a systematic review of adverse incident information involving multi-disciplinary input from different departments? Can the company identify repeat claimants who may not be genuine?

Before Brexit, the Services Directive (2006/123/EC) created an obligation to inform customers of how to make complaints and set out how manufacturers should deal with such complaints. The Services Directive is implemented in the UK by the Provision of Services Regulations 2009 (SI 2009/2999) (PSRs). The PSRs apply to most product manufacturers in the UK and introduce obligations around how to respond to enquiries or complaints about their products.

After Brexit, the Provision of Services (Amendment etc.) (EU Exit) Regulations 2018 (SI 2018/1329) retained this EU law in UK law. The obligations around how to deal with enquiries or complaints essentially remain the same, although some changes were made to EEA-specific provisions. These included the revocation of a requirement not to discriminate against customers based on their place of residence, meaning that manufacturers in the UK could treat customers in the EEA differently to customers in the UK now that we have left the EU. The practical implications of this change are yet to be seen.

Robot Manufacturing - Data protection and privacy

The use of robots (drones being a good example) fitted with cameras or other sensors which can collect personal data such as images of people or vehicle plate numbers, geolocation data or electromagnetic signals relating to an individual's device (for example, mobile phones, tablets, Wi-Fi routers, and so on) can have privacy implications.

At EU level, there is no data protection legislation specific to the use of robots or drones; the applicable legal framework is contained in the General Data Protection Regulation (EU) 2016/679 (GDPR). In the UK, the processing of personal data via robots/drones is subject to the GDPR and the Data Protection Act 2018 (DPA), as well as the legal provisions applicable to CCTV systems. After Brexit, the GDPR will be retained in UK law and amended to become the UK GDPR. For more information on data protection after Brexit read our blog.

The GDPR and the DPA set out the conditions under which personal data can be processed and provide for certain exemptions and derogations, the most relevant being:

  • Household exemption: This applies to the processing of personal data in the course of a purely personal or household activity. This exemption could potentially apply to individuals using robots/drones for their own purposes. However, the ECJ has narrowly interpreted this exemption in the context of the use of CCTV cameras. As a result, its application will depend on the specific circumstances of each case. The Information Commissioner's Office (ICO), the UK data protection regulator in charge of enforcing GDPR and DPA requirements, has issued guidance in relation to the use of drones. The ICO makes a distinction between the use of drones by "hobbyists" and their use for professional or commercial purposes. Although "hobbyists" would be likely to be exempted from the GDPR and the DPA on the basis of the household exemption, the ICO has provided tips for the responsible use of drones, inviting people to think of privacy considerations and to apply a common sense approach when recording and sharing images captured by a drone.
  • Journalistic exemption: This applies where personal data is collected through drones with a view to the publication of journalistic, academic, artistic or literary material. In this case, processing would, under certain conditions, be exempt from many data protection obligations to the extent that those obligations would be incompatible with the journalistic, academic, literary or artistic purposes sought by the processing.

Here to help

Robot manufacturing companies come up against many of the same legal issues as other product manufacturing companies. Having risk assessment procedures in place, as well as mechanisms to deal with potential faults, should reduce liability. However, robots are likely to be able to collect data and so data protection law also becomes important.

EM law specialises in technology and contract law. Get in touch if you need advice on Robot Manufacturing or have any questions on the above.

Legitimate Interests

Legitimate Interests – Lawful Processing of Personal Data

When processing personal data legally, organisations have six possible reasons or ‘bases’ to rely upon: consent, contract, legal obligation, vital interests, public task or legitimate interests. Most of these are unambiguous: fulfilling a contract or protecting someone’s life, for example. On the surface, ‘legitimate interests’ appears more open to interpretation. What will be considered legitimate? And whose interests will be taken into account? When all else fails, organisations often mistakenly look to legitimate interests as a basis for processing that furthers their business interests. Seeing legitimate interests as a fall-back is misguided; in many respects it is just as stringent as any of the other possible bases.

Legitimate Interests - Legislation

The UK GDPR describes legitimate interests as “processing necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data, in particular where the data subject is a child”.

Legitimate interests is different to the other lawful bases as it is not centred around a particular purpose (e.g. performing a contract with the individual, complying with a legal obligation, protecting vital interests or carrying out a public task), and it is not processing that the individual has specifically agreed to (consent). Legitimate interests is more flexible and could in principle apply to any type of processing for any reasonable purpose.

Because it could apply in a wide range of circumstances, it puts the onus on you to balance your legitimate interests and the necessity of processing the personal data against the interests, rights and freedoms of the individual taking into account the particular circumstances. This is different to the other lawful bases which presume that your interests and those of the individual are balanced.

Three-part test

The ICO (UK data protection regulatory authority) interprets the legislation with a three-part test. The wording creates three distinct obligations:

  1. “Processing is necessary for…” – the necessity test, i.e. is the processing necessary for the purpose?
  2. “… the purposes of the legitimate interests pursued by the controller or by a third party, …” – the purpose test, i.e. is there a legitimate interest behind the processing?
  3. “… except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data, in particular where the data subject is a child.” – the balancing test, i.e. are the legitimate interests overridden by the individual’s interests, rights or freedoms?
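The three-part test above is conjunctive: all three parts must be satisfied, and a failure at any stage means the legitimate interests basis is unavailable. A minimal sketch of that logic, with hypothetical field names rather than any ICO-prescribed format, might look like this:

```python
from dataclasses import dataclass

@dataclass
class LIA:
    """Hypothetical record of a legitimate interests assessment (LIA)."""
    purpose_is_legitimate: bool           # purpose test
    processing_is_necessary: bool         # necessity test (no less intrusive means)
    individuals_interests_override: bool  # outcome of the balancing test

    def basis_available(self) -> bool:
        """Legitimate interests applies only if all three parts pass."""
        return (self.purpose_is_legitimate
                and self.processing_is_necessary
                and not self.individuals_interests_override)

# Example: fraud prevention, judged necessary, with no overriding
# individual rights identified in the balancing test.
lia = LIA(purpose_is_legitimate=True,
          processing_is_necessary=True,
          individuals_interests_override=False)
print(lia.basis_available())  # True
```

Note that the third flag is inverted: the balancing test asks whether the individual's interests *override* yours, so a True there defeats the basis even when the first two tests pass.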

Purpose test – what counts as a ‘legitimate interest’?

A wide range of interests may be legitimate interests. It could be your legitimate interests in the processing or it could include the legitimate interests of any third party. The term ‘third party’ doesn’t just refer to other organisations, it could also be a third party individual. The legitimate interests of the public in general may also play a part when deciding whether the legitimate interests in the processing override the individual’s interests and rights. If the processing has a wider public interest for society at large, then this may add weight to your interests when balancing these against those of the individual.


The UK GDPR does not have an exhaustive list of what purposes are likely to constitute legitimate interests. However, the recitals do say the following purposes constitute legitimate interests: fraud prevention; ensuring network and information security; or indicating possible criminal acts or threats to public security.

Therefore, if you are processing for one of these purposes you may have less work to do to show that the legitimate interests basis applies. The recitals also say that the following activities may indicate a legitimate interest: processing employee or client data; direct marketing; or administrative transfers within a group of companies.

However, whilst these last three activities may indicate legitimate interests, you still need to do some work to identify your precise purpose and show that it is legitimate in the specific circumstances, and in particular that any direct marketing complies with e-privacy rules on consent.

The necessity test

You need to demonstrate that the processing is necessary for the purposes of the legitimate interests you have identified. This doesn’t mean that it has to be absolutely essential, but it must be a targeted and proportionate way of achieving your purpose. You need to decide on the facts of each case whether the processing is proportionate and adequately targeted to meet its objectives, and whether there is any less intrusive alternative, i.e. can you achieve your purpose by some other reasonable means without processing the data in this way? If you could achieve your purpose in a less invasive way, then the more invasive way is not necessary.

The balancing test

Just because you have determined that your processing is necessary for your legitimate interests does not mean that you are automatically able to rely on this basis for processing. You must also perform a ‘balancing test’ to justify any impact on individuals. The balancing test is where you take into account “the interests or fundamental rights and freedoms of the data subject which require the protection of personal data” and check they don’t override your interests. In essence, this is a light-touch risk assessment to check that any risks to individuals’ interests are proportionate. If the data belongs to children then you need to be particularly careful to ensure their interests and rights are protected.

Reasonable expectations

Recital 47 of the UK GDPR says “the existence of a legitimate interest would need careful assessment including whether a data subject can reasonably expect at the time and in the context of the collection of the personal data that processing for that purpose may take place. The interests and fundamental rights of the data subject could in particular override the interest of the data controller where personal data are processed in circumstances where data subjects do not reasonably expect further processing.”

The UK GDPR is clear that the interests of the individual could in particular override your legitimate interests if you intend to process personal data in ways the individual does not reasonably expect. This is because if processing is unexpected, individuals lose control over the use of their data, and may not be in an informed position to exercise their rights. There is a clear link here to your transparency obligations.

You need to assess whether the individual can reasonably expect the processing, taking into account particularly when and how the data was collected. This is an objective test. The question is not whether a particular individual actually expected the processing, but whether a reasonable person should expect the processing in the circumstances.

How do you apply legitimate interests in practice?

The ICO guidance states that organisations should undertake the three-part test and document the outcome; this process is referred to as a "legitimate interests assessment" (LIA). The length of a LIA will vary depending on the context and circumstances surrounding the processing. LIAs are intended to be a simple form of risk assessment, in contrast to a data protection impact assessment (DPIA), which is a "much more in-depth end-to-end process". A LIA is also a potential trigger for a DPIA. The ICO confirms that there is no specific duty in the UK GDPR to undertake a LIA; however, as a matter of best practice, organisations should undertake one in order to meet their obligations under the UK GDPR accountability principle.

Once a LIA has been undertaken and an organisation has concluded that the legitimate interests basis for processing applies, then it should continue to keep the LIA under regular review. Where a LIA identifies high risks to the rights and freedoms of the individual, then a DPIA should be undertaken to assess these risks in more detail.

What else is there to consider?

The ICO also recommends that:

  • Individuals are informed of the purpose for processing, that legitimate interest is the basis being relied on and what that legitimate interest is. Organisations' privacy notices should also be updated to reflect this.
  • Where an organisation's purposes change or where it has a new purpose, it may still be able to continue processing for that new purpose on the basis of legitimate interests as long as the new purpose is compatible with the original purpose. A compatibility assessment should be undertaken in this case.
  • Organisations should be aware of individuals’ rights, for example, where legitimate interests is relied on as a basis for processing then the right to data portability does not apply to any personal data being processed on that basis.

Here to help

The concept of ‘legitimate interests’ as a basis for processing personal data predates GDPR. Many organisations are consequently aware of the concept. It should not, however, be taken for granted when organisations wish to further a business interest. As shown above, there are a number of obligations to consider, and therefore the basis should not be considered lightly or as a last resort.

If you have any questions on legitimate interests, data protection law more generally or on any of the issues raised in this article please get in touch with one of our data protection lawyers.