Draft Adequacy Decisions

Draft Adequacy Decisions: Data Flows EU to UK

Draft adequacy decisions were published on 19 February 2021 by the European Commission (EC) for personal data transfers from the EU to the UK. The significance of the drafts is considerable: they are the first to be produced since the European Court of Justice’s (ECJ) ruling in Schrems II, which struck down the adequacy decision previously granted to the EU-US Privacy Shield.

The EC’s press release on the draft adequacy decisions stated that it has carefully assessed the UK’s law and practice on personal data protection, including the rules on public authorities’ access to personal data, and concluded that the UK ensures an ‘essentially equivalent’ level of protection to that guaranteed under the EU GDPR and the Law Enforcement Directive.

What does adequacy mean?

‘Adequacy’ is a term that the EU uses to describe other countries, territories, sectors or international organisations that it deems to provide an ‘essentially equivalent’ level of data protection to that which exists within the EU. An adequacy decision is a formal decision by the EU recognising that another country, territory, sector or international organisation provides a level of protection for personal data equivalent to that in the EU. The UK is seeking adequacy decisions under both the General Data Protection Regulation (GDPR) and the Law Enforcement Directive (LED).

The effect of an adequacy decision is that personal data can be sent from an EEA state to a third country without any further safeguard being necessary. Under the trade deal agreed between the UK and the EU, the UK has a ‘bridge’ until 30 June 2021 during which data can continue to flow from the European Economic Area (EEA) to the UK whilst the adequacy decisions process takes place. The bridge can end sooner than this if the EU adopts adequacy decisions in respect of the UK.

Transfers of data from the UK to the EEA are permitted. The UK Government has recognised EU Commission adequacy decisions made before the end of the transition period. This allows restricted transfers to continue to be made from the UK to most organisations, countries, territories or sectors covered by an EU adequacy decision.

Adequacy criteria

For the purposes of the draft adequacy decisions, the EC assessed (amongst other things) the following aspects of the rules applying to the processing of personal data:

  • The UK’s constitutional framework – including the existence of the UK Human Rights Act 1998, which incorporates the rights contained in the European Convention on Human Rights.
  • The UK’s data protection framework – in particular, the fact that the EU GDPR has been incorporated into UK law (UK GDPR) and as such, the UK’s legislative framework for data protection is closely aligned to that in the EU. This includes both the territorial and material scope of the UK GDPR, the definitions for key concepts under the UK GDPR (e.g., personal data), the data protection principles of the UK GDPR (e.g., fair and lawful processing), and the data protection rights afforded to individuals (for which a particularly detailed analysis of the exemptions to these rights is provided) – all of which are equivalent to those provided in the EU GDPR.
  • Onward transfers of personal data from the UK – in particular, the fact that the same restrictions on international transfers of personal data under the EU GDPR are provided in the UK GDPR, in turn safeguarding the onward transfer of EU personal data from the UK to another third country (e.g., the US).
  • Oversight and enforcement – the existence of the UK’s Information Commissioner’s Office (ICO) as an “independent supervisory authority tasked with powers to monitor and enforce compliance with the data protection rules” and the powers of enforcement granted to the ICO, which are equivalent to those granted to EU data protection authorities under the EU GDPR. Interestingly, the number of cases investigated by the ICO (approximately 40,000 complaints from data subjects per year and 2,000 investigations) and the fines issued by the ICO under the EU GDPR are both factors considered in the EC’s assessment.
  • Redress – the requirement that individuals are provided with effective administrative and judicial redress, including compensation for damages. The EC here references the ability for a data subject to: (a) complain to (and about) the ICO, (b) bring a claim against controllers and processors for material and non-material damages under the UK GDPR, and (c) bring a claim in UK courts under the UK’s Human Rights Act 1998 and ultimately in the European Court of Human Rights.

Consequences of adoption

If adopted, the draft adequacy decisions will be valid for an initial term of four years, renewable only if the level of protection in the UK continues to be adequate. The drafts include strict mechanisms for monitoring and review, suspension or withdrawal, to address any problematic development of the UK system now that it is no longer bound by EU privacy rules.

UK government response to the draft adequacy decisions

The UK government has welcomed the draft adequacy decisions, urging the EU to fulfil its commitment to complete the approval process swiftly. The Information Commissioner described the progress as "an important milestone in securing the continued frictionless data transfers from the EU to the UK".

The draft adequacy decisions are now with the European Data Protection Board (EDPB) for a "non-binding opinion", following which the EC will request approval from EU member states' representatives. It could then adopt final adequacy decisions. Until then, organisations can continue to receive personal data from the EU under the temporary "bridging mechanism" agreed in the EU-UK Trade and Cooperation Agreement.

Schrems II

The draft adequacy decisions also include a detailed assessment of the conditions and limitations, as well as the oversight mechanisms and remedies applicable in case of access to data by UK public authorities, in particular for law enforcement and national security purposes. These are likely included to address the ECJ's ruling in Schrems II and concerns over the UK's use of mass surveillance techniques.

In Schrems II, the ECJ ruled that free data flows from the EU to certain US organisations under the EU-US Privacy Shield did not offer an essentially equivalent level of protection to that under EU law. This was substantially based on the fact that national security laws in the US were deemed to undermine citizens’ data rights. In light of that ruling, the EC was always going to pay close attention to UK national security laws when assessing the UK. Additionally, Schrems II introduced more stringent obligations on organisations carrying out cross-border data transfers, prompting a general concern that this newly stringent approach might reduce the UK’s chances of receiving an adequacy decision. The drafts can therefore be seen as a highly positive step.

What stands in the UK’s way?

Although the process for an adequacy decision under the EU GDPR is now underway with the draft adequacy decisions in place, and although the UK government has stated on a number of occasions that it is confident the EU will deem the UK data protection regime ‘essentially equivalent’, a number of issues may affect the UK's ability to satisfy the EU:

  • The UK's use of mass surveillance techniques may lead EU member states to raise concerns about data protection in the UK, which might jeopardise an adequacy decision. Particularly relevant is the ruling of the European Court of Human Rights (ECtHR) that aspects of the UK's surveillance regime under the Regulation of Investigatory Powers Act 2000 (RIPA) did not comply with Articles 8 and 10 of the ECHR (Big Brother Watch and others v United Kingdom). The human rights groups which brought the claim were not satisfied with the judgment and appealed to the Grand Chamber, the ECtHR's highest judicial bench.
  • Membership of the Five Eyes intelligence sharing community means EU citizens' data could be transferred by UK security services to third countries (including the US) which are not considered to have adequate data protection.
  • Potential for unprotected onward data transfers as the UK will be able to decide which countries it deems adequate and what arrangements to have with them.

The draft adequacy decisions - a positive step

Although nothing can be taken for granted, the draft adequacy decisions are a positive step, and the fact that the UK has committed to remaining party to the ECHR and "Convention 108" will likely carry some weight, as adherence to such international conventions is important for the stability and durability of adequacy findings.

If you have any questions on the draft adequacy decisions, data protection law more generally or on any of the issues raised in this article please get in touch with one of our data protection lawyers.


E-privacy

E-Privacy – PECR and Brexit

E-Privacy regulations complement data protection laws by setting out privacy rights for electronic communications. The idea is that whilst widespread public access to digital mobile networks and the internet has opened up new possibilities for businesses and users, it has also created new risks for privacy. E-privacy regulation has been a point of contention within the EU and reform has been expected for some time. On 10 February 2021, four years after the European Commission’s initial legislative proposal and to the surprise of many, the European Council reached a compromise agreement on its position on the E-privacy Regulation. What this means for E-privacy rules in the UK remains to be seen. With Brexit behind us, and therefore no obligation to introduce new EU legislation in the UK, but with an adequacy decision pending, and therefore a desire for the UK to align with the EU on data protection, it is hard to say whether or not the UK will choose to implement the new rules. For more information on data protection and a potential adequacy decision after Brexit read our blog.

E-Privacy and PECR

PECR are the Privacy and Electronic Communications Regulations, which comprise the E-privacy regulations in the UK. Their full title is the Privacy and Electronic Communications (EC Directive) Regulations 2003, and they are derived from European law. PECR have been amended a number of times. The most recent changes were made in 2018, to ban cold calls offering claims management services and to introduce director liability for serious breaches of the marketing rules, and in 2019, to ban cold calls about pension schemes in certain circumstances and to incorporate the GDPR definition of consent.

What kind of areas do PECR cover?

PECR cover several areas:

  • Marketing by electronic means, including marketing calls, texts, emails and faxes.
  • The use of cookies or similar technologies that track information about people accessing a website or other electronic service.
  • Security of public electronic communications services.
  • Privacy of customers using communications networks or services as regards traffic and location data, itemised billing, line identification services (eg caller ID and call return), and directory listings.

How does this fit with the UK GDPR?

The UK GDPR sits alongside PECR, and PECR use the UK GDPR standard of consent (a high threshold). This means that if you send electronic marketing or use cookies or similar technologies you must comply with both PECR and the UK GDPR. Unsurprisingly, there is some overlap, given that both aim to protect people’s privacy. Complying with PECR will help you comply with the UK GDPR, and vice versa – but there are some differences. In particular, it’s important to realise that PECR apply even if you are not processing personal data. For example, many of the rules protect companies as well as individuals, and the marketing rules apply even if you cannot identify the person you are contacting.

If you are a network or service provider, Article 95 of the UK GDPR says the UK GDPR does not apply where there are already specific PECR rules. This is to avoid duplication, and means that if you are a network or service provider, you only need to comply with PECR rules (and not the UK GDPR) on:

  • security and security breaches;
  • traffic data;
  • location data;
  • itemised billing; and
  • line identification services.

Electronic and telephone marketing

PECR restrict unsolicited marketing by phone, fax, email, text, or other electronic message. There are different rules for different types of communication. The rules are generally stricter for marketing to individuals than for marketing to companies. Companies will often need specific consent to send unsolicited direct marketing. The best way to obtain valid consent is to ask customers to tick opt-in boxes confirming they are happy to receive marketing calls, texts or emails from you.

E-Privacy: Cookies and similar technologies

Companies must tell people if they set cookies, and clearly explain what the cookies do and why. They must also get the user’s consent, and that consent must be actively and clearly given. There is an exception for cookies that are essential to provide an online service at someone’s request (e.g. to remember what’s in their online basket, or to ensure security in online banking). The same rules apply if you use any other type of technology to store or gain access to information on someone’s device.
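By way of illustration only, here is a minimal TypeScript sketch of how a website might gate non-essential cookies on recorded consent while allowing strictly necessary ones through. The cookie names, categories and consent store are all hypothetical assumptions, not a prescribed implementation:

```typescript
// Minimal sketch: only set non-essential cookies once consent is recorded.
// Categories, cookie names and the consent store are hypothetical.

type CookieCategory = "essential" | "analytics" | "marketing";

// Hypothetical consent store, populated when the user actively opts in
// (e.g. by ticking an unticked checkbox in a consent banner).
const consentGiven = new Set<CookieCategory>();

function recordConsent(category: CookieCategory): void {
  consentGiven.add(category);
}

function setCookie(name: string, value: string, category: CookieCategory): void {
  // Strictly necessary cookies (e.g. a session basket or a security token)
  // fall within the exemption described above and need no consent.
  if (category !== "essential" && !consentGiven.has(category)) {
    return; // no valid consent recorded: do not set the cookie
  }
  document.cookie = `${name}=${encodeURIComponent(value)}; path=/; SameSite=Lax`;
}

// Usage: the basket cookie is set regardless; the analytics cookie
// is only set after the user has actively consented.
setCookie("basket_id", "abc123", "essential");
recordConsent("analytics"); // user ticks the analytics opt-in
setCookie("_analytics_id", "u-42", "analytics");
```

The point of the design is simply that nothing other than a strictly necessary cookie is ever written before an active opt-in has been recorded.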

Communications networks and services

PECR are not just concerned with marketing by electronic means. They also contain provisions that concern the security of public electronic communications services and the privacy of customers using communications networks or services. Some of these provisions only apply to service providers (e.g. the security provisions) but others apply more widely. For example, the directories provision applies to any organisation wanting to compile a telephone, fax or email directory.

EU Council position on E-Privacy rules

On 10 February 2021, EU member states agreed on a negotiating mandate for revised rules on the protection of privacy and confidentiality in the use of electronic communications services. These updated E-privacy rules will define cases in which service providers are allowed to process electronic communications data or have access to data stored on end-users’ devices. The agreement allows the Portuguese presidency to start talks with the European Parliament on the final text. The agreement included:

  • The regulation will cover electronic communications content transmitted using publicly available services and networks, and metadata related to the communication. Metadata includes, for example, information on location and the time and recipient of communication. It is considered potentially as sensitive as the content itself.
  • As a main rule, electronic communications data will be confidential. Any interference, including listening to, monitoring and processing of data by anyone other than the end-user will be prohibited, except when permitted by the E-privacy regulation.
  • Permitted processing of electronic communications data without the consent of the user includes, for example, ensuring the integrity of communications services, checking for the presence of malware or viruses, or cases where the service provider is bound by EU or member states’ law for the prosecution of criminal offences or prevention of threats to public security.
  • Metadata may be processed for instance for billing, or for detecting or stopping fraudulent use. With the user’s consent, service providers could, for example, use metadata to display traffic movements to help public authorities and transport operators to develop new infrastructure where it is most needed. Metadata may also be processed to protect users’ vital interests, including for monitoring epidemics and their spread or in humanitarian emergencies, in particular natural and man-made disasters.
  • In certain cases, providers of electronic communications networks and services may process metadata for a purpose other than that for which it was collected, even when this is not based on the user’s consent or certain provisions on legislative measures under EU or member state law. This processing for another purpose must be compatible with the initial purpose, and strong specific safeguards apply to it.
  • As the user’s terminal equipment, including both hardware and software, may store highly personal information, such as photos and contact lists, the use of processing and storage capabilities and the collection of information from the device will only be allowed with the user’s consent or for other specific transparent purposes laid down in the regulation.
  • The end-user should have a genuine choice on whether to accept cookies or similar identifiers. Making access to a website dependent on consent to the use of cookies for additional purposes as an alternative to a paywall will be allowed if the user is able to choose between that offer and an equivalent offer by the same provider that does not involve consenting to cookies.
  • To avoid cookie consent fatigue, an end-user will be able to give consent to the use of certain types of cookies by whitelisting one or several providers in their browser settings. Software providers will be encouraged to make it easy for users to set up and amend whitelists on their browsers and withdraw consent at any moment.

Brexit

PECR continue to apply after the UK's exit from the EU on 31 January 2020. The draft E-privacy Regulation (ePR), described in detail above, is still in the process of being agreed; it was not finalised before 31 January 2020 and will therefore not become directly applicable in the UK. Once it becomes directly applicable in EU member states (likely 24 months after it comes into force), the UK will need to consider to what extent to mirror the new rules. In any case, given that UK companies will continue to process the data of EU end users, it will still be necessary to be aware of any discrepancies created by E-privacy reform in the EU.

The deadlock is over

It has long been considered that EU E-privacy regulations have lagged behind the technological progress seen in online marketing techniques, and EU negotiations around reform have at times seemed never-ending. The agreement reached by the EU Council will therefore be seen as a necessary improvement in legal certainty, although plenty of questions remain.

PECR in their pre-reform state will continue to apply in the UK. On 19 February 2021, the European Commission issued its draft adequacy decision that would allow EU-to-UK data transfers. While the E-privacy Regulation is not strictly relevant to the UK’s continued adequacy status, alignment on E-privacy rules would likely be viewed positively by the EU institutions, which could prompt the UK to update its laws in line with the new EU regime. The reforms will of course also be relevant to any UK business that operates in the EU. Even if the Regulation is finally adopted this year, it will not apply for a further two years, meaning these changes will likely not come into effect until 2023 at the earliest.

If you have any questions on E-privacy and data protection, data protection law more generally or on any of the issues raised in this article please get in touch with one of our data protection lawyers.


Robot Manufacturing

Robot Manufacturing: Product Liability Law

Robot manufacturing is a growth industry: the costs of producing robots are falling while the savings in labour costs are rising. With the rise and rise of automation in all areas of life, the word robot has come to mean a wide variety of things. Artificial intelligence (read our blog on some legal issues) has received the most attention recently, especially given its crossover with big data (read our blog). But what about the more conventional notion of a robot – the walking, talking lump of steel, more willing to do the jobs we’re not so keen on, and the sort to which product liability law more obviously applies? This blog covers some issues that a robot manufacturer may encounter when putting such a product on the market.

Robot Manufacturing - product liability and safety risk management

Product liability law makes clear that those involved in the supply chain of a product, such as in robot manufacturing, face risks in respect of their liability to the public, and to others in the chain, for sub-standard, defective or dangerous products. There are various ways in which these risks can be managed, including contractual protections, insurance and risk management.

Can liability for a dangerous or defective product be excluded or limited?

It is important to ensure that the terms of the contracts to which a robot manufacturing company is party protect it, as far as possible, from the risk of product liability claims. A company will want the right to reclaim from the company above it in the supply chain any loss caused by that company's product or actions. Conversely, a company may wish to limit its liability to parties below it in the supply chain. A company may also want to impose contractual obligations on key suppliers to ensure that they comply with quality and safety standards.

There are various ways that robot manufacturing companies can do this. In general, parties to a contract have the option of limiting or excluding their liability for certain things. However, when it comes to product liability, there are restrictions on limiting liability for dangerous or defective products. It is therefore important to also consider non-contractual provisions that can protect a company from this risk.

Different considerations need to be taken into account when dealing with the various parties within the supply chain. For example, manufacturers, distributors, importers and retailers will all have concerns specific to the role that they undertake.

Is there an effective quality and safety assurance programme in place?

It is important to have an effective quality assurance programme in place, especially when dealing with a highly automated product such as a robot. Appropriate internal controls, checks and processes are necessary to monitor product safety. One option is to put in place an internal product safety committee, which should include a range of personnel from across the company, including the legal, marketing, production and design teams. The committee's function should be to:

  • Review products and their associated documents.
  • Ensure that all appropriate regulatory and internal procedures have been followed and documented before and after marketing.
  • Authorise any necessary action (for example, changing warnings or design).
  • Review after-sales monitoring reports for trends and significant incidents.
  • Review insurance arrangements.

Other steps that a company can take are to ensure quality assurance records are maintained and readily accessible, and to keep good records of who supplied each part of the product.

Is there an effective enquiries and complaints system in place?

The ability of a robot manufacturing company to effectively capture and respond to safety information provided by customers, or other individuals, is important for ensuring both product safety and limiting product liability exposure. There are several points to be considered:

  • Does the company have a system to handle customer enquiries and complaints? If so, could it be improved?
  • Are staff adequately trained?
  • Does the company have a policy of recovering allegedly unsafe items and investigating and recording the items and circumstances?
  • Is there a systematic review of adverse incident information involving multi-disciplinary input from different departments?
  • Can the company identify repeat claimants who may not be genuine?

When a robot manufacturing company provides services along with products (for example, repairs or advice), duties existed before Brexit, under the Services Directive (2006/123/EC), to inform customers of how to make complaints and to deal with complaints promptly and fairly. The Services Directive is implemented in the UK by the Provision of Services Regulations 2009 (SI 2009/2999) (PSRs). The PSRs apply to the majority of private sector businesses in the UK providing services to consumers. Under the PSRs, a trader must provide consumers with certain information about itself and deal with customer complaints promptly.

The Services Directive was revoked immediately after the Brexit transition period ended on 31 December 2020. The PSRs, however, remain in force as retained EU law, as amended by the Provision of Services (Amendment etc.) (EU Exit) Regulations (SI 2018/1329). The amendments revoked EEA-specific provisions in the PSRs that implemented or reflected cross-border, intra-EU provisions of the Services Directive. Most importantly for service providers, the provisions prohibiting certain service providers from discriminating against customers on the basis of their place of residence have been removed. In practice, this means that a business that supplies services to customers in the UK can treat non-UK customers differently. The provisions in the PSRs requiring businesses to provide certain information to customers and to respond to complaints remain in place.

Is there an "early warning" system in place?

Robot manufacturing companies need to be alert to the early warning signs of situations that may lead to safety issues, regulatory action or develop into product liability claims. The key question to consider in these circumstances is whether an incident concerns product safety rather than quality alone, for example:

  • Can the product be returned for examination or testing?
  • Was the adverse event foreseen and covered by warnings?
  • Was the product being used contrary to instructions?
  • Could the adverse event occur again?
  • How serious is the risk and what categories of users might be at risk?

It is often prudent to require robot manufacturing company suppliers to notify the company immediately about defect reports, claims made against them or significant safety information relating to goods they supply to the company. A company's distributors should also be required to report similar information relating to the company's products.

Robot Manufacturing - Data protection and privacy

The use of robots (drones being a good example) fitted with cameras or other sensors which can collect personal data such as images of people or vehicle plate numbers, geolocation data or electromagnetic signals relating to an individual's device (for example, mobile phones, tablets, Wi-Fi routers, and so on) can have privacy implications.

At EU level, there is no data protection legislation specific to the use of robots or drones; the applicable legal framework is contained in the General Data Protection Regulation (EU) 2016/679 (GDPR). In the UK, the processing of personal data via robots/drones is subject to the GDPR and the Data Protection Act 2018 (DPA), as well as the legal provisions applicable to CCTV systems. Following Brexit, the GDPR has been retained in UK law and amended to become the UK GDPR. For more information on data protection after Brexit read our blog.

The GDPR and the DPA set out the conditions under which personal data can be processed and provide for certain exemptions and derogations, the most relevant being:

  • Household exemption: This applies to the processing of personal data in the course of a purely personal or household activity. This exemption could potentially apply to individuals using robots/drones for their own purposes. However, the ECJ has interpreted this exemption narrowly in the context of the use of CCTV cameras, so its application will depend on the specific circumstances of each case. The Information Commissioner's Office (ICO), the UK data protection regulator in charge of enforcing GDPR and DPA requirements, has issued guidance on the use of drones. The ICO distinguishes between the use of drones by "hobbyists" and their use for professional or commercial purposes. Although hobbyists are likely to be exempt from the GDPR and the DPA on the basis of the household exemption, the ICO has provided tips for the responsible use of drones, inviting people to think about privacy considerations and to apply a common sense approach when recording and sharing images captured by a drone.
  • Journalistic exemption: This applies where personal data is collected through drones with a view to the publication of journalistic, academic, artistic or literary material. In such cases, processing would, under certain conditions, be exempt from many data protection obligations to the extent that those obligations would be incompatible with the journalistic, academic, literary or artistic purposes sought by the processing.

Here to help

Robot manufacturing companies come up against many of the same legal issues as other product manufacturing companies. Having risk assessment procedures in place, as well as mechanisms to deal with potential faults, should reduce liability. However, robots are likely to be able to collect data and so data protection law also becomes important.

EM law specialises in technology and contract law. Get in touch if you need advice on robot manufacturing or have any questions on the above.


Legitimate Interests

Legitimate Interests – Lawful Processing of Personal Data

To process personal data lawfully, organisations have six possible reasons or ‘bases’ to rely upon: consent, contract, legal obligation, vital interests, public task or legitimate interests. Most of these are unambiguous – fulfilling a contract or protecting someone’s life, for example. On the surface, ‘legitimate interests’ appears more open to interpretation. What will be considered legitimate? And whose interests will be taken into account? When all else fails, organisations often mistakenly look to legitimate interests as a basis for processing that furthers their business interests. Seeing legitimate interests as a fall-back is misguided: in many respects it is just as stringent as any of the other possible bases.

Legitimate Interests - Legislation

The UK GDPR describes legitimate interests as “processing necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data, in particular where the data subject is a child”.

Legitimate interests is different to the other lawful bases as it is not centred on a particular purpose (e.g. performing a contract with the individual, complying with a legal obligation, protecting vital interests or carrying out a public task), and it is not processing that the individual has specifically agreed to (consent). Legitimate interests is more flexible and could in principle apply to any type of processing for any reasonable purpose.

Because it could apply in a wide range of circumstances, it puts the onus on you to balance your legitimate interests and the necessity of processing the personal data against the interests, rights and freedoms of the individual taking into account the particular circumstances. This is different to the other lawful bases which presume that your interests and those of the individual are balanced.

Three-part test

The ICO (UK data protection regulatory authority) interprets the legislation with a three-part test. The wording creates three distinct obligations:

  1. “Processing is necessary for…” – the necessity test, i.e. is the processing necessary for the purpose?
  2. “… the purposes of the legitimate interests pursued by the controller or by a third party, …” – the purpose test, i.e. is there a legitimate interest behind the processing?
  3. “… except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data, in particular where the data subject is a child.” – the balancing test, i.e. are the legitimate interests overridden by the individual’s interests, rights or freedoms?

Purpose test – what counts as a ‘legitimate interest’?

A wide range of interests may be legitimate interests. It could be your own legitimate interest in the processing, or it could include the legitimate interests of any third party. The term ‘third party’ doesn’t just refer to other organisations; it could also be a third-party individual. The legitimate interests of the public in general may also play a part when deciding whether the legitimate interests in the processing override the individual’s interests and rights. If the processing has a wider public interest for society at large, then this may add weight to your interests when balancing them against those of the individual.

Examples

The UK GDPR does not have an exhaustive list of what purposes are likely to constitute legitimate interests. However, the recitals do say the following purposes constitute legitimate interests: fraud prevention; ensuring network and information security; or indicating possible criminal acts or threats to public security.

Therefore, if you are processing for one of these purposes you may have less work to do to show that the legitimate interests basis applies. The recitals also say that the following activities may indicate a legitimate interest: processing employee or client data; direct marketing; or administrative transfers within a group of companies.

However, whilst these last three activities may indicate legitimate interests, you still need to do some work to identify your precise purpose and show that it is legitimate in the specific circumstances, and in particular that any direct marketing complies with e-privacy rules on consent.

The necessity test

You need to demonstrate that the processing is necessary for the purposes of the legitimate interests you have identified. This doesn’t mean that it has to be absolutely essential, but it must be a targeted and proportionate way of achieving your purpose. You need to decide on the facts of each case whether the processing is proportionate and adequately targeted to meet its objectives, and whether there is any less intrusive alternative, i.e. can you achieve your purpose by some other reasonable means without processing the data in this way? If you could achieve your purpose in a less invasive way, then the more invasive way is not necessary.

The balancing test

Just because you have determined that your processing is necessary for your legitimate interests does not mean that you are automatically able to rely on this basis for processing. You must also perform a ‘balancing test’ to justify any impact on individuals. The balancing test is where you take into account “the interests or fundamental rights and freedoms of the data subject which require the protection of personal data” and check they don’t override your interests. In essence, this is a light-touch risk assessment to check that any risks to individuals’ interests are proportionate. If the data belongs to children then you need to be particularly careful to ensure their interests and rights are protected.

Reasonable expectations

Recital 47 of the UK GDPR says “the existence of a legitimate interest would need careful assessment including whether a data subject can reasonably expect at the time and in the context of the collection of the personal data that processing for that purpose may take place. The interests and fundamental rights of the data subject could in particular override the interest of the data controller where personal data are processed in circumstances where data subjects do not reasonably expect further processing.”

The UK GDPR is clear that the interests of the individual could in particular override your legitimate interests if you intend to process personal data in ways the individual does not reasonably expect. This is because if processing is unexpected, individuals lose control over the use of their data, and may not be in an informed position to exercise their rights. There is a clear link here to your transparency obligations.

You need to assess whether the individual can reasonably expect the processing, taking into account particularly when and how the data was collected. This is an objective test. The question is not whether a particular individual actually expected the processing, but whether a reasonable person should expect the processing in the circumstances.

How do you apply legitimate interests in practice?

The ICO guidance states that organisations should undertake the three-part test and document the outcome; this process is referred to as a "legitimate interests assessment" (LIA). The length of a LIA will vary depending on the context and circumstances surrounding the processing. LIAs are intended to be a simple form of risk assessment, in contrast to a data protection impact assessment (DPIA), which is a "much more in-depth end-to-end process". A LIA is also a potential trigger for a DPIA. The ICO confirms that there is no specific duty in the UK GDPR to undertake a LIA; however, as a matter of best practice, one should be undertaken by organisations in order to meet their obligations under the UK GDPR accountability principle.

Once a LIA has been undertaken and an organisation has concluded that the legitimate interests basis for processing applies, then it should continue to keep the LIA under regular review. Where a LIA identifies high risks to the rights and freedoms of the individual, then a DPIA should be undertaken to assess these risks in more detail.
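As an illustration of how the outcome of the three-part test might be documented, here is a hypothetical TypeScript sketch of a simple LIA record. The field names are our own invention for illustration, not an ICO template:

```typescript
// Illustrative sketch only: one way to document the three-part test
// as part of an LIA record. Field names are hypothetical.

interface LegitimateInterestsAssessment {
  purpose: string;               // purpose test: the interest being pursued
  whoBenefits: string;           // the controller, a third party, or the public
  necessity: {
    isNecessary: boolean;        // targeted and proportionate for the purpose?
    lessIntrusiveAlternative: string | null; // if one exists, necessity fails
  };
  balancing: {
    reasonablyExpected: boolean; // would a reasonable person expect this processing?
    involvesChildrensData: boolean;
    risksToIndividuals: string[];
    interestsOverridden: boolean; // do individuals' rights override yours?
  };
  reviewDate: string;            // LIAs should be kept under regular review
}

// The basis is only available if all three parts of the test are satisfied.
function canRelyOnLegitimateInterests(lia: LegitimateInterestsAssessment): boolean {
  return (
    lia.necessity.isNecessary &&
    lia.necessity.lessIntrusiveAlternative === null &&
    !lia.balancing.interestsOverridden
  );
}
```

A record like this also gives you something concrete to keep under review, and a list of risks to individuals is a natural place to spot when a DPIA is needed.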

What else is there to consider?

The ICO also recommends that:

  • Individuals are informed of the purpose for processing, that legitimate interest is the basis being relied on and what that legitimate interest is. Organisations' privacy notices should also be updated to reflect this.
  • Where an organisation's purposes change or where it has a new purpose, it may still be able to continue processing for that new purpose on the basis of legitimate interests as long as the new purpose is compatible with the original purpose. A compatibility assessment should be undertaken in this case.
  • Organisations should be aware of individuals’ rights, for example, where legitimate interests is relied on as a basis for processing then the right to data portability does not apply to any personal data being processed on that basis.

Here to help

The concept of ‘legitimate interests’ as a basis for processing personal data predates the GDPR, and many organisations are consequently aware of it. It should not, however, be taken for granted when organisations wish to further a business interest. As shown above, there are a number of obligations to consider, and the basis should not be relied on lightly or as a last resort.

If you have any questions on legitimate interests, data protection law more generally or on any of the issues raised in this article please get in touch with one of our data protection lawyers.


International Transfers of Personal Data

International Transfers of Personal Data - What Are The Rules?

International transfers of personal data have been shaken up recently. Most obviously, Brexit has placed the EU and UK in separate data protection regimes, rendering any transfer between them international and therefore subject to new conditions. Additionally, data transfers to the US have been disrupted by the judgment in Schrems II. This landmark case led to the striking down of the EU-US Privacy Shield, which enabled the free flow of data to certain US-based organisations. For more information on the impact of Brexit read our blog.

Where does it all lead? It is easy to be overwhelmed by the complexity of the legal and political implications of these developments. However, as most organisations are realising, the simple solution continues to be Standard Contractual Clauses (SCCs). After an introduction to international transfers, this blog will focus on the use and future of SCCs, which for the majority of organisations will be the most practical data transfer mechanism.

General principle for data exports to non-UK countries

International transfers of personal data to a country outside the UK (third country) may only take place if the controller and the processor comply with certain conditions. A transfer of personal data to a third country may take place if:

  • the UK has decided that the third country ensures an adequate level of protection

or

  • the controller or processor has provided appropriate safeguards, and enforceable data subject rights and effective legal remedies for data subjects are available.

Third countries with adequate levels of protection

The UK has “adequacy regulations” in relation to the following countries and territories:

  • The European Economic Area (EEA) countries. These are the EU member states and the EFTA States. The EU member states are Austria, Belgium, Bulgaria, Croatia, Cyprus, Czech Republic, Denmark, Estonia, Finland, France, Germany, Greece, Hungary, Ireland, Italy, Latvia, Lithuania, Luxembourg, Malta, Netherlands, Poland, Portugal, Romania, Slovakia, Slovenia, Spain, Sweden. The EFTA states are Iceland, Norway and Liechtenstein.
  • EU or EEA institutions, bodies, offices or agencies.
  • Gibraltar
  • Countries, territories and sectors covered by the European Commission’s adequacy decisions (in force at 31 December 2020). These include a full finding of adequacy for the following countries and territories: Andorra, Argentina, Guernsey, the Isle of Man, Israel, Jersey, New Zealand, Switzerland and Uruguay. There are also partial findings of adequacy for: Japan, covering only private sector organisations; and Canada, covering only data subject to Canada's Personal Information Protection and Electronic Documents Act (PIPEDA) – not all data is subject to PIPEDA. For more details please see the EU Commission's FAQs on the adequacy finding on the Canadian PIPEDA.

International transfers of personal data - adequate safeguards

If the third country has not been granted an adequacy decision, organisations can rely on adequate safeguards. Schrems II has added an additional burden: before you may rely on an appropriate safeguard to make a restricted transfer, you must be satisfied that the data subjects of the transferred data continue to have a level of protection essentially equivalent to that under the UK data protection regime. This can be done by undertaking a risk assessment which takes into account the protections contained in that appropriate safeguard and the legal framework of the destination country (including laws governing public authority access to the data). This assessment is undoubtedly complex in many situations. The ICO intends to issue guidance on this topic in due course.

Controllers and processors may provide adequate safeguards through:

  • A legally binding agreement between public authorities or bodies.
  • Binding corporate rules (agreements governing transfers made between organisations within a corporate group).
  • Standard data protection clauses in the form of template transfer clauses adopted by the Commission.
  • Standard data protection clauses in the form of template transfer clauses adopted by the ICO.
  • Compliance with an approved code of conduct approved by a supervisory authority.
  • Certification under an approved certification mechanism as provided for in the GDPR.

Is the restricted transfer covered by an exception?

If you are making a restricted transfer that is not covered by UK ‘adequacy regulations’, nor an appropriate safeguard, then you can only make that transfer if it is covered by one of the ‘exceptions’ set out in Article 49 of the UK GDPR (a simplified sketch of the overall decision sequence follows this list):

Exception 1: Has the individual given his or her explicit consent to the restricted transfer?

Exception 2: Do you have a contract with the individual? Is the restricted transfer necessary for you to perform that contract?

Exception 3: Do you have (or are you entering into) a contract with an individual which benefits another individual whose data is being transferred? Is that transfer necessary for you to either enter into that contract or perform it?

Exception 4: You need to make the restricted transfer for important reasons of public interest.

Exception 5: You need to make the restricted transfer to establish whether you have a legal claim, to make a legal claim or to defend a legal claim.

Exception 6: You need to make the restricted transfer to protect the vital interests of an individual who is physically or legally incapable of giving consent.

Exception 7: You are making the restricted transfer from a public register.

Exception 8: You are making a one-off restricted transfer and it is in your compelling legitimate interests.
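Taking the rules above together, the decision sequence is: adequacy regulations first, then appropriate safeguards (with the post-Schrems II risk assessment), then the Article 49 exceptions. The following TypeScript sketch is a simplified illustration of that sequence only; the names are hypothetical and each step involves far more legal analysis in practice:

```typescript
// Simplified, illustrative sketch of the decision sequence described above.
// Each boolean stands in for a substantial legal assessment.

interface RestrictedTransfer {
  destinationCoveredByAdequacyRegulations: boolean;
  appropriateSafeguardInPlace: boolean;        // e.g. SCCs or binding corporate rules
  safeguardRiskAssessmentPassed: boolean;      // essentially equivalent protection?
  applicableArticle49Exception: string | null; // e.g. "explicit consent", or null
}

function transferPermitted(t: RestrictedTransfer): boolean {
  // 1. Adequacy regulations cover the destination: no further safeguard needed.
  if (t.destinationCoveredByAdequacyRegulations) return true;
  // 2. An appropriate safeguard, plus the post-Schrems II risk assessment.
  if (t.appropriateSafeguardInPlace && t.safeguardRiskAssessmentPassed) return true;
  // 3. Otherwise, only an Article 49 exception can permit the transfer.
  return t.applicableArticle49Exception !== null;
}
```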

International transfers of personal data - Standard Contractual Clauses

You can make a restricted transfer if you and the receiver have entered into a contract incorporating standard data protection clauses recognised or issued in accordance with the UK data protection regime. These are known as ‘standard contractual clauses’ (‘SCCs’ or ‘model clauses’).

The SCCs contain contractual obligations on you (the data exporter) and the receiver (the data importer), and rights for the individuals whose personal data is transferred. Individuals can directly enforce those rights against the data importer and the data exporter.

ICO guidance on Standard Contractual Clauses

The ICO's webpage commentary on Standard Contractual Clauses (SCCs) after the end of the transition period provides guidance on what the ICO expects from UK controllers in relation to restricted transfers, i.e. when they are seeking to export personal data from the UK to entities located in countries which do not provide an adequate level of data protection. As shown above, SCCs are one of a number of "appropriate safeguards" available to enable such transfers to take place, and are often the most practical method for organisations making data transfers.

The ICO guidance states that UK controllers can continue to use the existing EU SCCs. The guidance goes on to state:

"You are able to make changes to those EU SCCs so they make sense in a UK context provided you do not change the legal meaning of the SCCs. For example, changing references from the old EU Data Protection to the UK GDPR, changing references to the EU or Member States, to the UK, and changing references to a supervisory authority to the ICO.

Otherwise you must not make any changes to the SCCs, unless it is to add protections or more clauses on business related issues. You can add parties (i.e. additional data importers or exporters) provided they are also bound by the SCCs."

ICO versions of the SCCs

The versions of the SCCs the ICO has created contain suggested changes. These are only suggestions, but any deviation from them should be consistent with the principles set out in the guidance extract above and the guidance generally, i.e. the clauses need to make sense in a UK context and their legal meaning must not change. The ICO versions therefore act as a starting point, with changes made only where strictly necessary for the clauses to make sense.

Schedule 21 of the Data Protection Act 2018 details the types of changes that can be made to the EU version for use by a UK controller but it does also seem to allow for use of the EU version as they are, without amendment, unless disapplied by the Secretary of State or the Information Commissioner (see paragraphs 7 and 8 of Schedule 21).

Exporting from both the UK and the EU

Ideally, if personal data is to be exported from both the UK and the EU to a jurisdiction not deemed adequate by both the UK government and the European Commission, the exports from the UK and from the EU should be treated separately because, while virtually identical, the EU GDPR and UK GDPR are completely separate regulatory regimes. If SCCs are chosen as the appropriate safeguard, the safest option would be to have the data exports from the UK and the EU covered by different sets of clauses (or potentially, depending on risk, to use the EU SCCs with an additional set of amendments for the UK version).

This point is underlined in the original European Commission decision of 2004, which states that each set of SCCs as a whole forms a model, so data exporters should not be allowed to amend these sets or totally or partially merge them in any manner. To meet the data transfer requirements under the UK GDPR and the EU GDPR, a controller wanting to use SCCs cannot adapt them beyond what has been recommended by the ICO and by the EC's guidance on their use.

Retrospective?

It is important to point out that, looking retrospectively, EU SCCs entered into prior to the end of the transition period will continue to be valid for restricted transfers under the UK GDPR. There is no need to replace EU SCCs contracted before 1 January 2021 with updated UK SCCs.

New Standard Contractual Clauses

On 12 November 2020 the EU Commission published draft standard contractual clauses for international transfers of personal data to third countries under the General Data Protection Regulation ((EU) 2016/679) (GDPR), in the form of a draft implementing decision and Annex. The Commission had previously indicated that these clauses would be finalised before the end of 2020 but, as they require the opinion of the EDPB and EDPS, and consultation with member states under the comitology procedure, they will now come into force in 2021.

The Commission notes that the clauses are a modernisation of the previous clauses, designed to better reflect the use of new and complex processing operations involving multiple parties, complex processing chains and evolving relationships. They are designed to be flexible and allow for a number of parties, including for parties to accede to the clauses later ("docking clause"). They are drafted in a modular approach with general clauses followed by options for different processing circumstances.

Key points of interest include that the clauses:

  • Can be used by controllers and processors, including those not established in the EU but that are caught by the GDPR and cover both controller to controller and controller to processor options. They can also be used for EU processor to non-EU controller transfers and processor to sub-processor transfers, both of which are new options.
  • Can be included in a wider contract and additional clauses and safeguards can be added provided these are not contradictory or prejudice the rights of data subjects.
  • Should include rules for liability and indemnification between the parties and are enforceable by data subjects as third-party beneficiaries against the data exporter or importer.

What does this mean for the UK?

Under the UK-EU Trade and Cooperation Agreement, the UK is obliged not to exercise certain powers under its own data protection legislation, including producing its own SCCs, during the four to six month extension period (starting on 1 January 2021 – for more info see our blog). The ICO intends to consult on and publish new UK SCCs during 2021. With Brexit, the ICO and the Secretary of State must keep the transitional arrangements for SCCs under review, and both are now able to issue new SCCs. It may be that at some point the EU SCCs will cease to be valid for new and/or existing restricted transfers from the UK.

The extent to which the ICO, which is reviewing the new EU SCCs, is influenced by the new EU model clauses will be another example of how far the two regimes wish to split or converge. Given that the UK has already granted countries in the EU an adequacy decision (and seems to hope to get one in return), it is not overly speculative to suggest that the new EU SCCs will, in some form or another, be incorporated into UK data protection law. However, as noted above, this will not be possible until after the four to six month extension period the UK currently finds itself in.

Here to help

International transfer of personal data is a complex area of law, and one in a state of transition. As suggested above, the most practical solution for many organisations will be the use of SCCs, but that is not to say your transfers cannot be enabled another way (see above). The extent to which organisations will have to review their positions will depend on whether the EU grants the UK an adequacy decision and the extent to which the ICO incorporates the soon-to-be-published new EU standard contractual clauses into its own. In any event, organisations need to be on the lookout for when these new clauses come into force in both the EU and the UK.

If you have any questions on Brexit and data protection, data protection law more generally or on any of the issues raised in this article please get in touch with one of our data protection lawyers.


Adtech

Adtech - ICO report into adtech and real time bidding

On 22 January 2021, the Information Commissioner’s Office (ICO) announced the resumption of its investigation into real time bidding (RTB) and adtech, which had been put on hold by COVID-19. Simon McDougall, ICO Deputy Commissioner, commented in a statement that “the complex system of RTB uses people’s sensitive personal data to serve adverts and should require people’s explicit consent, which is not happening right now.”

The ICO will continue its investigation with a series of audits focusing on digital market platforms. It will issue assessment notices to specific companies over the coming months, so that it can gauge the state of the industry.

What is adtech?

Adtech (short for advertising technology) is the umbrella term for the software and tools that help agencies and brands target, deliver and analyse their digital advertising efforts. If you have come across the terms "programmatic" or "omnichannel", then you may already know a little about what adtech does.

Programmatic advertising, for instance, buys target audiences instead of time slots: think buying ad space that reaches a particular demographic wherever it is, instead of buying a prime time TV spot and hoping the right people are watching.

Omnichannel marketing reaches target consumers across all channels (mobile, video, desktop and more) within the context of how they have interacted with a brand: those seeing an ad for the first time will receive a different message from those who have engaged with that brand a number of times. Adtech methodologies seek to deliver the right content at the right time to the right consumers, so there is less wasteful spending.

What is real-time bidding?

Real-time bidding (RTB) is an automated digital auction process that allows advertisers to bid on ad space from publishers on a cost-per-thousand-impressions (CPM) basis. CPM is the price paid for one thousand people to see your ad. As in any auction, the highest bid from relevant ads will typically win the ad placement.
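To make the arithmetic concrete, here is a short illustrative TypeScript sketch using hypothetical figures: the cost of a campaign at a given CPM, and a highest-bid-wins selection for a single placement:

```typescript
// Worked illustration with hypothetical figures only.

// Cost of a campaign: impressions are priced per thousand.
function campaignCost(impressions: number, cpm: number): number {
  return (impressions / 1000) * cpm;
}

campaignCost(400_000, 2.5); // 400,000 impressions at a £2.50 CPM = £1,000

interface Bid { advertiser: string; cpm: number; }

// Like an auction, the highest relevant bid typically wins the placement.
function winningBid(bids: Bid[]): Bid | undefined {
  return bids.reduce<Bid | undefined>(
    (best, bid) => (best === undefined || bid.cpm > best.cpm ? bid : best),
    undefined
  );
}

winningBid([
  { advertiser: "A", cpm: 2.5 },
  { advertiser: "B", cpm: 3.1 }, // wins the placement at a £3.10 CPM
]);
```

So 400,000 impressions bought at a £2.50 CPM cost £1,000, and in the auction the £3.10 bid takes the placement.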

ICO report

On 20 June 2019 the ICO issued an update report into adtech and real time bidding. Whilst not official ICO guidance, the report identified areas in which the current real time bidding system for programmatic advertising breaches data protection and e-privacy law. In particular the report highlighted:

  • Processing of personal data is taking place unlawfully at the point of collection with adtech companies relying on legitimate interests for placing and/or reading cookies (rather than obtaining the consent the Privacy and Electronic Communications Regulations (PECR) require). Also, adtech companies are unable to demonstrate that they have properly carried out the necessary legitimate interests tests and implemented appropriate safeguards.
  • Processing of special category data is taking place unlawfully as explicit consent is not being collected.
  • Adtech companies may not be carrying out the data protection impact assessments (DPIAs) required.
  • Privacy information provided to individuals lacks clarity whilst also being overly complex. The consent frameworks examined by the ICO (including the IAB Europe Transparency & Consent Framework) ensure neither transparency and fair processing for GDPR purposes generally, nor free and informed consent for GDPR and PECR purposes.
  • The profiles created about individuals are extremely detailed and are repeatedly shared among hundreds of organisations for any one bid request, all without the individuals’ knowledge. These practices risk breaching the requirements for data minimisation and the storage limitation principles.
  • Adtech companies are inconsistent in their use of technical and organisational measures to secure personal data and do not sufficiently consider how the law applies to international transfers which take place during real time bidding.

Progress so far

In January 2020, the ICO's Executive Director for Tech Policy and Innovation published a blog about progress so far (Adtech - the reform of real time bidding has started and will continue). He noted the ICO's continued concern about the issues already raised but added that the Internet Advertising Bureau (IAB UK) and Google are starting to make the changes needed.

The IAB UK has agreed a range of principles that align with the ICO's concerns, and is developing its own guidance for organisations on security, data minimisation, and data retention, as well as UK-focused guidance on the content taxonomy. It will also educate the industry on special category data and cookie requirements, and continue work on some specific areas of detail (IAB UK sets out actions to address ICO’s real-time bidding concerns, 9 January 2020). Google will remove content categories, and improve its process for auditing counterparties. The ICO also endorses Google's proposals to phase out support for third party cookies within the next two years. Other UK advertising trade bodies will also produce guidance for their members.

Moving forward

Due to the sensitivity of the work, the ICO will publish its final findings once it has concluded its investigation. In the meantime, Mr McDougall advises organisations operating in the adtech space to urgently assess how they use personal data, in particular their compliance with obtaining individuals’ consent, reliance on legitimate interests, deployment of data protection by design and default and use of data protection impact assessments.

Using legitimate interests as a legal basis in adtech

Relying on legitimate interests may be more workable than obtaining consent for the large number of behind-the-scenes adtech companies involved in buying, selling and serving advertising. Using legitimate interests rather than consent means that there is no obligation to keep consent records and, perhaps less importantly, that data portability rights (a user’s right to move data between suppliers) are not triggered.

However, the ICO states in its online guidance "When is consent appropriate?" that "If you need consent under e-privacy laws to send a marketing message, then in practice consent is also the appropriate lawful basis under the GDPR". The ICO Adtech Update expands on this:

  • Trying to apply legitimate interests when GDPR-compliant consent has been obtained would be unnecessary and could confuse individuals.
  • Where an individual has given consent they would expect processing to cease when they withdrew consent. However, an entity relying on legitimate interests might seek to continue processing in this scenario, which would be unfair.

The ICO Adtech Update also makes the point that reliance on legitimate interests for marketing activities is only possible if organisations are able to show that their use of personal data is proportionate, has a minimal privacy impact, and individuals would not be surprised or likely to object. The ICO considers that the processing involved in real time bidding (RTB) cannot meet these criteria and legitimate interests cannot be used for the main bid request processing. The ICO does not rule out use of legitimate interests for other purposes, such as a demand-side platform supplementing a bid request with additional information.

Data protection impact assessments (DPIAs)

Controllers should carry out a Data Protection Impact Assessment (DPIA) before beginning processing that is likely to result in a high risk to the rights and freedoms of individuals (Article 35, GDPR). The ICO has published a list of processing operations likely to result in such a high risk, for which DPIAs are mandatory. The ICO Adtech Update confirms that Real Time Bidding, as used in adtech, involves several such processing operations. The ICO draft Direct Marketing Code states that the type and volume of processing that you can undertake in the online world, and the risks associated with that processing, mean it is highly likely that a DPIA will be required before processing begins.

Data minimisation

The GDPR requires that personal data collected must be limited to what is necessary in relation to the purposes for which it is processed. The ICO Adtech Update states that the creation of detailed profiles, repeatedly updated with information about individuals' online activities, is disproportionate for the purposes of targeted advertising. It is also intrusive and unfair, in particular as individuals are often unaware that the processing takes place and the privacy information provided does not clearly inform them what is happening.

Data integrity and confidentiality

Under the GDPR, personal data must be stored securely. The ICO Adtech Update noted that real time bidding often involves sharing personal data with adtech companies in non-EU jurisdictions, resulting in international transfers. Further, participants have no real control over the other adtech companies with whom data is shared. Contractual controls alone are insufficient; appropriate monitoring and technical and organisational controls are also required.

Accountability

Data controllers must be able to demonstrate their compliance with the GDPR. The ICO Adtech Update notes that the complexities of the adtech ecosystem mean that many adtech companies will find it difficult to understand, document and demonstrate how their processing operations work; what they do; who they share data with; how any processors are vetted and controlled; and how they enable individuals to exercise their rights.

Accuracy and storage limitation

Other GDPR requirements include that data must be accurate and kept up to date, and that personal data must be kept for no longer than is necessary. The ICO Adtech Update highlights that, because of the vast number of adtech companies involved in real time bidding, it is difficult to ensure compliance with these principles. The ICO Cookie Guidance states that it is necessary to check that the duration of any cookies is appropriate; any default durations should be reviewed.
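
By way of illustration only, the Python sketch below (using the Flask web framework) shows what setting an explicit, reviewed cookie lifetime might look like in practice; the cookie name, the 30-day figure and the endpoint are hypothetical choices for the example, not values mandated by the ICO.

  from flask import Flask, make_response

  app = Flask(__name__)

  THIRTY_DAYS = 30 * 24 * 60 * 60  # lifetime in seconds: a deliberate, documented choice

  @app.route("/")
  def index():
      response = make_response("Hello")
      # Set an explicit expiry rather than accepting a library default,
      # and apply basic security attributes to the cookie.
      response.set_cookie(
          "analytics_id",       # hypothetical cookie name
          "abc123",
          max_age=THIRTY_DAYS,  # reviewed duration, not an open-ended default
          secure=True,          # only sent over HTTPS
          httponly=True,        # not readable by client-side scripts
          samesite="Lax",
      )
      return response

The point is less the framework than the habit: every cookie’s duration becomes an explicit, recorded decision that can be justified and revisited.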

Here to help

Adtech has revolutionised the marketing industry and was firmly in place before the introduction of GDPR in 2018. It is now the ICO’s aim to bring this boom industry in line with UK data protection law. If you have any questions on adtech and data protection, data protection law more generally or on any of the issues raised in this article please get in touch with one of our data protection lawyers.


UK GDPR

UK GDPR - Data Protection After The Transition Period

UK GDPR, EU GDPR, DPA 2018, DP Regulations. Confused? Hopefully this blog will help you understand what is happening with data protection laws in the UK now that the Brexit transition period has ended.

The UK data protection authority, the Information Commissioner’s Office (ICO), is telling us that at the end of the Brexit transition period, data protection compliance should continue as usual. The key principles, rights and obligations remain the same. What then is the consequence of the Brexit transition period ending on data protection in the UK?

Most importantly, following the end of the transition period, the EU and the UK operate under different, albeit very similar, data protection regimes. This means that any transfer of data between the two is treated as a transfer between two independent data protection legal systems.

The Legislation – UK GDPR and the DP Brexit Regulations

A confusing aspect of the UK’s new data protection regime is its terminology: there is mention of the ‘UK GDPR’ and the ‘DP Brexit Regulations’. In order to clear up any misunderstanding, it is useful to consider how data protection legislation operated before Brexit.

Before Brexit, data protection was mainly governed by two pieces of legislation: the General Data Protection Regulation ((EU) 2016/679) and the Data Protection Act 2018. The first is EU law; the second is the mechanism by which it was implemented into UK law.

Brexit raised the question of what the UK government should do with EU law already in force in the UK. The European Union (Withdrawal) Act 2018 sought to retain EU law already implemented in the UK, including the GDPR. Simply put, retained EU law is copied and amended before becoming UK law. The GDPR, in its retained form, is now known as the UK GDPR, in contrast to the data protection law that continues to apply in the EU, known (in the UK) as the EU GDPR.

The Data Protection Act 2018 (DPA), although already UK law, was also defined as retained EU law for the purposes of the European Union (Withdrawal) Act 2018, and it therefore continues to be a main source of data protection law in the UK after the end of the transition period.

In order for the retained EU data protection law to work in the UK after the transition period, it needs to be amended. The Data Protection, Privacy and Electronic Communications (Amendments etc) (EU Exit) Regulations 2019 (referred to here as the DP Brexit Regulations) is the legislation by which this is achieved. The UK GDPR and the DPA, as amended by the DP Brexit Regulations, therefore form the core of the UK’s data protection law. Organisations will need to consider two legal texts after the transition period: the UK GDPR and the DPA.

Changes made by the DP Brexit Regulations

The purpose of the DP Brexit Regulations is first and foremost to integrate EU data protection law, as it stands, into UK law after the transition period. Most of the changes are therefore relatively predictable. Here are a few:

  • The Information Commissioner (the UK data protection authority) is no longer a party to the EU GDPR co-operation and consistency mechanisms and no longer has a seat on the European Data Protection Board (EDPB).
  • Amendments are made throughout the UK GDPR to change references to EU institutions, member states and decisions.
  • European Commission powers are transferred to the Information Commissioner or the Secretary of State. For example, the Information Commissioner now has the power to issue Standard Contractual Clauses (a mechanism by which data is transferred internationally).
  • Section 4 of the DPA is amended to make clear that it applies to personal data to which the UK GDPR applies and that it supplements and must be read with the UK GDPR.
  • A new section 17A in the DPA covers transfers based on adequacy regulations (a mechanism by which data is transferred internationally). The Secretary of State has the power to make “adequacy regulations” to specify that a third country, territory, sector or international organisation ensures an adequate level of data protection.

International transfers

At the end of the transition period, the UK became a third country under the EU GDPR, meaning that EU controllers and processors would ordinarily need to ensure that an appropriate safeguard was in place to protect transfers, e.g. Standard Contractual Clauses or Binding Corporate Rules.

However, on 24 December 2020, the UK and EU reached a trade and co-operation agreement addressing the arrangements following the end of the Brexit transition period on 31 December 2020 (as implemented by the European Union (Future Relationship) Act 2020).

Most significantly, the agreement has introduced a period of at least four months (extendable by a further two months unless one of the parties objects) in which data can flow between the regimes without additional safeguards. The aim of the agreement is to give organisations breathing space while the Commission continues its assessment of adequacy for the UK. If the UK is granted an adequacy decision, data will continue to flow freely between the regimes after this period.

Data processed or obtained before the end of the transition period

From the end of the transition period, the UK is required to continue applying “EU law on the protection of personal data” to the processing of EU personal data where that data was processed before the end of the transition period. It will therefore be helpful for organisations to know which EU personal data was processed before the end of the transition period so that, should the regimes diverge, that data continues to have EU law applied to it. By contrast, personal data about UK data subjects processed in the UK before the end of the transition period will fall under the UK GDPR and DPA.

More to come - UK GDPR and EU GDPR to diverge?

The next big development in data protection and Brexit will be whether or not the Commission grants the UK an adequacy decision. Organisations should have a clear idea of how they will confront the possibility that no adequacy decision is reached. This will mean reviewing data flows and the contracts that enable them.

The ICO is right to say that the data protection principles that applied before Brexit will largely remain the same in the UK. The UK GDPR and DPA, as a new legislative framework, are more than anything else a replica of what has come before. But with an adequacy decision pending, and the EU’s draft E-Privacy Regulation still being finalised (and therefore unlikely ever to apply in the UK), the two data protection regimes could split in significant ways in a relatively short amount of time.

If you have any questions on Brexit and data protection, data protection law more generally or on any of the issues raised in this article please get in touch with one of our data protection lawyers.


Statement Of Objections To Amazon

EC Sends Statement Of Objections To Amazon - Big Data Law

On 10 November 2020, the European Commission announced that it has sent a statement of objections to Amazon as part of its investigation into whether Amazon's use of sensitive data from independent retailers who sell on its marketplace is in breach of Article 102 of the Treaty on the Functioning of the European Union (TFEU). The Commission has also opened a formal investigation into Amazon's allegedly discriminatory business practices.

What data is Amazon collecting?

Amazon has a dual role as a platform:

  • It provides a marketplace where independent sellers can sell products directly to consumers. Amazon is the most important or dominant marketplace in many European countries.
  • It sells products as a retailer on the same marketplace, in competition with those sellers.

As a marketplace service provider, Amazon has access to non-public business data of third party sellers. This data relates to matters such as the number of ordered and shipped units of products, the sellers' revenues on the marketplace, the number of visits to sellers' offers, data relating to shipping, sellers' past performance, and other consumer claims on products, including activated guarantees.

Investigation into use of independent sellers’ data

In July 2019, the Commission announced that it had opened a formal investigation to examine whether Amazon's use of competitively sensitive information about marketplace sellers, their products and transactions on the Amazon marketplace constitutes anti-competitive agreements or practices in breach of Article 101 of the Treaty on the Functioning of the European Union (TFEU) and/or an abuse of a dominant position in breach of Article 102 of the TFEU.

Statement of objections to Amazon

The Commission has now sent a statement of objections to Amazon alleging that Amazon has breached Article 102 of the TFEU by abusing its dominant position as a marketplace service provider in Germany and France. Having analysed a data sample covering over 80 million transactions and around 100 million product listings on Amazon's European marketplaces, the Commission is alleging in its statement of objections to Amazon that:

  • Very large quantities of non-public seller data are available to employees of Amazon's retail business and feed into automated systems. Granular, real-time business data relating to third party sellers' listings and transactions on the Amazon platform is systematically fed into the algorithms of Amazon's retail business, which aggregate the data and use it to calibrate Amazon's retail offers and strategic business decisions (such as which new products to launch, the price of each individual offer, the management of inventories, and the choice of the best supplier for a product).
  • This acts to the detriment of other marketplace sellers as, for example, Amazon can use this data to focus its offers on the best-selling products across product categories and to adjust its offers in light of the non-public data of competing sellers.
  • The use of non-public marketplace seller data, therefore, allows Amazon to avoid the normal risks of retail competition and to leverage its dominance in the market for the provision of marketplace services in France and Germany, which are the biggest markets for Amazon in the EU.

The Commission's concerns are not only about the insights Amazon Retail has into the sensitive business data of one particular seller, but rather about the insights that Amazon Retail has about the accumulated business data of more than 800,000 active sellers in the EU, covering more than a billion different products. Amazon is able to aggregate and combine individual seller data in real time, and to draw precise, targeted conclusions from these data.

The Commission has, therefore, come to the preliminary conclusion that the use of these data allows Amazon to focus on the sale of the best-selling products. This marginalises third party sellers and limits their ability to grow. Amazon now has the opportunity to examine the documents in the Commission's investigation file, reply in writing to the allegations in the statement of objections and request an oral hearing to present its comments on the case.

Investigation into Amazon practices regarding the “Buy Box” and Prime label

The Commission states that, as a result of looking into Amazon's use of data, it identified concerns that Amazon's business practices might artificially favour its own retail offers and offers of marketplace sellers that use Amazon's logistics and delivery services. It has, therefore, now formally initiated proceedings in a separate investigation to examine whether these business practices breach Article 102 of the TFEU.

Problems with digital platforms

In announcing these developments, EU Commission Vice-President Vestager commented:

“We must ensure that dual role platforms with market power, such as Amazon, do not distort competition. Data on the activity of third party sellers should not be used to the benefit of Amazon when it acts as a competitor to these sellers. The conditions of competition on the Amazon platform must also be fair. Its rules should not artificially favour Amazon's own retail offers or advantage the offers of retailers using Amazon's logistics and delivery services. With e-commerce booming, and Amazon being the leading e-commerce platform, a fair and undistorted access to consumers online is important for all sellers.”

The report prepared for the Commission by three special advisers on "Competition Policy for the digital era" highlighted possible competition issues in relation to digital platforms. As part of the Digital Services Act package, the Commission is now considering the introduction of ex ante regulation for "gatekeeper" platforms, and consulted on issues related to this in June 2020.

Big data regulation

It remains to be seen how these EC investigations will play out and whether the same principles can be applied to smaller online platforms. UK regulators also appear to be ramping up their interest in the overlap between competition law and digital business. Chief Executive of the UK Competition and Markets Authority (CMA), Andrea Coscelli, noted last month that the CMA is increasingly focused on “scrutinising how digital businesses use algorithms and how this could negatively impact competition and consumers” and “will be considering how requirements for auditability and explainability of algorithms might work in practice”.

If you have any questions on the EC’s statement of objections to Amazon, data protection law or on any of the issues raised in this article please get in touch with one of our data protection lawyers.


ICO guidance on AI

ICO Guidance On AI Published - AI And Data Protection

On 30 July 2020, the Information Commissioner’s Office (ICO) published its long-awaited guidance on artificial intelligence (AI) and data protection (the ICO guidance on AI), which forms part of its AI auditing framework. However, recognising that AI is still in its early stages and is developing rapidly, the ICO describes the guidance as foundational. The ICO acknowledges that it will need to continue to offer new tools to promote privacy by design in AI and to continue to update the guidance to ensure that it remains relevant.

The need for ICO guidance on AI

Whether it is helping to tackle the coronavirus disease (COVID-19) or managing loan applications, the potential benefits of AI are clear. However, it has long been recognised that it can be difficult to balance the tensions that exist between some of the key characteristics of AI and data protection compliance, particularly under the General Data Protection Regulation ((EU) 2016/679) (GDPR).

The Information Commissioner Elizabeth Denham’s foreword to the ICO guidance on AI confirms that the underlying data protection questions for even the most complex AI project are much the same as with any new project: is data being used fairly, lawfully and transparently? Do people understand how their data is being used, and is it being kept secure?

That said, there is a recognition that AI presents particular challenges when answering these questions and that some aspects of the law require greater thought. Compliance with the data protection principles around data minimisation, for example, can seem particularly challenging given that many AI systems allow machine learning to decide what information is necessary to extract from large data sets.

Scope of the ICO guidance on AI

The guidance forms part of the ICO’s wider AI auditing framework, which also includes auditing tools and procedures for the ICO to use in its audits and investigations, and a soon-to-be-released toolkit that is designed to provide further practical support for organisations auditing their own AI use.

It contains recommendations on good practice for organisational and technical measures to mitigate AI risks, whether an organisation is designing its own AI system or procuring one from a third party. It is aimed at those within an organisation who have a compliance focus, such as data protection officers, the legal department, risk managers and senior management, as well as technology specialists, developers and IT risk managers. The ICO’s own auditors will also use it to inform their statutory audit functions.

It is not, however, a statutory code and there is no penalty for failing to adopt the good practice recommendations if an alternative route can be found to comply with the law. It also does not provide ethical or design principles; rather, it corresponds to the data protection principles set out in the GDPR.

Structure of the guidance

The ICO guidance on AI is set out in four parts:

Part 1. This focuses on the AI-specific implications of accountability; namely, responsibility for complying with data protection laws and demonstrating that compliance. The guidance confirms that senior management cannot simply delegate issues to data scientists or engineers, and are responsible for understanding and addressing AI risks. It considers data protection impact assessments (which will be required in the majority of AI use cases involving personal data), setting a meaningful risk appetite, the controller and processor responsibilities, and striking the required balance between the right to data protection and other fundamental rights.

Part 2. This covers lawfulness, fairness and transparency in AI systems, although transparency is addressed in more detail in the ICO’s recent guidance on explaining decisions made with AI (2020 guidance). This section looks at selecting a lawful basis for the different types of processing (for example, consent or performance of a contract), automated decision making, statistical accuracy and how to mitigate potential discrimination to ensure fair processing.

Part 3. This section covers security and data minimisation, and examines the new risks and challenges raised by AI in these areas. For example, AI can increase the potential for loss or misuse of large amounts of personal data that are often required to train AI systems or can introduce software vulnerabilities through new AI-related code. The key message is that organisations should review their risk management practices to ensure that personal data are secure in an AI context.

Part 4. This covers compliance with individual rights, including how individual rights apply to different stages of the AI lifecycle. It also looks at rights relating to solely automated decisions and how to ensure meaningful input or, in the case of solely automated decisions, meaningful review, by humans.

ICO guidance on AI - headline takeaway

According to the Information Commissioner, the headline takeaway from the ICO guidance on AI is that data protection must be considered at an early stage. Mitigation of risk must come at the AI design stage as retrofitting compliance rarely leads to comfortable compliance or practical products.

The guidance also acknowledges that, while it is designed to be integrated into an organisation’s existing risk management processes, AI adoption may require organisations to reassess their governance and risk management practices.

A landscape of guidance

AI is one of the ICO’s top three strategic priorities, and over the last few years it has been working hard both to increase its knowledge and auditing capabilities in this area and to produce practical guidance for organisations.

To develop the guidance, the ICO enlisted technical expertise in the form of Doctor (now Professor) Reuben Binns, who joined the ICO as part of a fellowship scheme. It produced a series of informal consultation blogs in 2019 that were focused on eight AI-specific risk areas. This was followed by a formal consultation draft published in February 2020, the structure of which the guidance largely follows. Despite all this preparatory work, the guidance is still described as foundational.

From a user perspective, practical guidance is good news and the guidance is clear and easy to follow. Multiple layers of guidance can, however, become more difficult to manage. The ICO has already stated that the guidance has been developed to complement its existing resources, including its original Big Data, AI and Machine Learning report (last updated in 2017), and its more recent 2020 guidance.

In addition, there are publications and guidelines from bodies such as the Centre for Data Ethics and Innovation and the European Commission, and sector-specific regulators such as the Financial Conduct Authority are also working on AI projects. As a result, organisations will need to start considering how to consolidate the different guidance, checklists and principles into their compliance processes.

Opportunities and risks

“The innovation, opportunities and potential value to society of AI will not need emphasising to anyone reading this guidance. Nor is there a need to underline the range of risks involved in the use of technologies that shift processing of personal data to complex computer systems with often opaque approaches and algorithms.” (Opening statement of ICO guidance on AI and data protection.)

If you have any questions on data protection law or on any of the issues raised in the ICO guidance on AI please get in touch with one of our data protection lawyers.


Schrems II

Schrems II - EDPB publishes FAQs on judgment

Following the Schrems II judgment (Data Protection Commissioner v Facebook Ireland and Maximillian Schrems), the European Data Protection Board (EDPB) has adopted a set of frequently asked questions and responses (FAQs) concerning the judgment. For more information about that decision, read our blog.

The Schrems II judgment

The European Court of Justice (ECJ) has invalidated the EU Commission’s decision approving the EU-U.S. Privacy Shield because U.S. intelligence agencies can access personal data relating to EU residents in ways that are incompatible with EU personal data protection laws, and because EU residents lack proper enforcement rights.

In addition, the ECJ ruled that the controller-processor Standard Contractual Clauses (SCCs), another widely used mechanism for international data transfers, remain valid. However, data exporters and importers must assess, prior to any transfer, the laws of the third country to which data is transferred to determine if those laws ensure an adequate level of protection of personal data.

Moving forward

The judgment was welcomed by the EDPB because it highlights the fundamental right to privacy in the context of the transfer of personal data to third countries. In response to the ECJ's ruling that the adequacy decision for the EU-US Privacy Shield is invalid, the EDPB invited the EU and US to work together and establish a complete and effective framework that guarantees the level of protection granted to personal data in the US is essentially equivalent to that guaranteed within the EU.

Schrems II: EDPB FAQs

Although the ECJ also determined in Schrems II that controller to processor standard contractual clauses (SCCs) remain valid as an adequate safeguard for data transfers, the EDPB commented that:

  • No grace period - the ECJ ruling applies with immediate effect. There will be no grace period during which organisations can remedy their Privacy Shield-based data transfers. In contrast, when the US-EU Safe Harbor framework was invalidated in 2015, the Article 29 Working Party granted a grace period until an appropriate solution was found with the U.S. authorities. It did so via a statement dated 16 October 2015, stating no enforcement action would be taken until the end of January 2016. However, while there will be no EU-wide grace period, national supervisory authorities will still have discretion over when to take enforcement actions in their territory.
  • The exporter and importer of the data being transferred must look beyond the protection provided by the terms of the SCCs and assess whether the country to which the data is being transferred offers adequate protection, in the context of the non-exhaustive elements set out in Article 45(2) of the GDPR. If it is determined that the country of destination does not provide an essentially equivalent level of protection to the GDPR, the exporter may have to consider adopting further protective measures in addition to using SCCs. The EDPB is considering what those additional measures could include and will report in due course.
  • The judgment highlights the importance of complying with the obligations included in the terms of the SCCs. If those contractual obligations are not or cannot be complied with, the exporter is bound by the SCCs to suspend the transfer or terminate the SCCs, or to notify its competent supervisory authority if it intends to continue transferring data.
  • Supervisory authorities (SAs) have a responsibility to suspend or prohibit a transfer of data to a third country pursuant to SCCs if those clauses are not or cannot be complied with in that third country, and the protection of the data transferred cannot be ensured by other means.
  • Implications for other transfer mechanisms, including BCRs. The threshold set by the ECJ applies to all appropriate transfer mechanisms under Article 46 GDPR. U.S. law referred to by the ECJ (i.e., the Foreign Intelligence Surveillance Act and Executive Order 12333) applies to any transfer to the U.S. via electronic means, regardless of the transfer mechanism used for such transfer. In particular, the ECJ’s judgment applies in the context of binding corporate rules (BCRs), since U.S. law will also prevail over this cross-border data transfer mechanism. Similar to the SCCs, transfers taking place based on BCRs should be assessed and appropriate supplementary measures should be taken. The EDPB states that it will further assess the consequences of the judgment on transfer mechanisms other than SCCs and BCRs (e.g., approved codes of conduct or certification mechanisms).
  • Companies can rely on the derogations set forth under Article 49 of the GDPR, provided that the conditions as interpreted by the EDPB in its guidance on Article 49 of the GDPR are met. When transferring personal data based on individuals’ consent, such consent should be explicit, specific to the particular data transfer(s) and informed, particularly regarding the risks of the transfer(s). In addition, transfers of personal data that are necessary for the performance of a contract should only take place occasionally. Further, in relation to transfers necessary for important reasons of public interest, the EDPB emphasises the need for an important public interest, as opposed to only focusing on the nature of the transferring organisation. According to the EDPB, transfers based on the public interest derogation cannot become the rule and must be limited to specific situations and to a strict necessity test.

Schrems II: Further clarification expected

The EDPB is still assessing the judgment and will provide further clarification for stakeholders and guidance on transfers of personal data to third countries pursuant to the Schrems II judgment. Data exporters and importers should closely monitor upcoming developments and guidance of the EDPB and national supervisory authorities, assess their existing cross-border transfers and consider implementing supplementary legal, technical or organisational measures in order to ensure that they can continue to transfer personal data to third countries lawfully. Whilst the judgment most obviously applies to data transfers with the US, it also has wider implications for transfers to any country outside the EU (third countries).

If you have any questions on Schrems II or data protection law more generally please get in touch with one of our data protection lawyers.