December 15, 2021
AI Law
Data Protection Law

Facial recognition technology, to be truly effective at scale, relies upon a substantial database of images. In 2020, Clearview AI Inc, a company marketing facial recognition technology to police forces around the world, told an Australian agency that it hoped to have 30 billion images indexed by the end of that year. Clearview is now subject to a potential £17 million fine from the UK’s Information Commissioner’s Office (ICO) for serious breaches of data protection rules (the ICO went on to fine Clearview AI Inc in 2022). The company also faces regulatory action in Australia and Canada, and the European Commission took a clear stance in its April 2021 draft EU AI Act, which proposes to prohibit the use of ‘real-time’ remote biometric identification systems in publicly accessible spaces for law enforcement purposes, subject to narrow exceptions.

With Amazon, Microsoft and IBM having ruled out the sale of facial recognition technology to law enforcement back in June 2020, the future of such technology has been put into question. The investigation serves as an important reminder of the legal issues that arise when monetising vast databases of sensitive data, and when scraping that data in the first place.

Background

Clearview AI is incorporated in Delaware in the United States. The company provides a facial recognition search tool to registered users around the world through a mobile and web application. The tool allows users to upload an image of an individual’s face and run a search against Clearview’s database of more than 3 billion images. The tool then displays likely matches, with links to the web pages where the photos appeared.

The Office of the Australian Information Commissioner (OAIC) described Clearview’s facial recognition technology as functioning in five steps (a simplified code sketch follows the list):

  • Automated image scraper – this tool operates as a web crawler, collecting images of individuals’ faces from publicly available sources across the internet (including social media). The web crawler also collects the source webpage URL, and any associated metadata (including the webpage title). The images and collected information are then stored on Clearview’s server.
  • Creation of vectors – the tool uses a machine learning algorithm to generate a mathematical representation (a vector) of each scraped image.
  • Image uploaded – a registered user uploads an individual’s image through the app or website. The tool analyses this image and generates a mathematical representation.
  • Matching process – the tool compares the uploaded image against all scraped images.
  • Matched images – if the tool identifies sufficiently similar scraped images, the matched images are displayed alongside the uploaded image on the user’s screen. The user can then click the associated URL to be re-directed to the web page where the image was originally collected.
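
To make these five steps concrete, here is a minimal, hypothetical sketch of such a pipeline in Python. Clearview’s actual models and infrastructure are proprietary and not public, so the embed function below is a stand-in for a trained face-embedding model, and the database structure, similarity measure and threshold are illustrative assumptions only.

```python
import numpy as np

def embed(face_image: np.ndarray) -> np.ndarray:
    """Stand-in for a trained face-embedding model: maps a face image to a
    fixed-length unit vector (the 'mathematical representation')."""
    vec = face_image.astype(float).ravel()[:128]  # placeholder projection
    return vec / np.linalg.norm(vec)

# Steps 1-2: scraped images are vectorised and stored with their source URL.
database: list[tuple[np.ndarray, str]] = []

def index_scraped_image(face_image: np.ndarray, source_url: str) -> None:
    database.append((embed(face_image), source_url))

# Steps 3-5: a user-uploaded image is vectorised and compared against every
# stored vector; sufficiently similar matches are returned with the page
# from which each image was originally scraped.
def search(probe_image: np.ndarray, threshold: float = 0.9) -> list[tuple[str, float]]:
    probe = embed(probe_image)
    return [(url, float(probe @ vec))  # cosine similarity of unit vectors
            for vec, url in database
            if probe @ vec >= threshold]
```

At the scale of billions of images, a real system would replace the linear scan with an approximate nearest-neighbour index; the legal analysis below does not turn on that detail.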

Although Clearview stated that it currently offers its facial recognition technology services only to government customers for law enforcement purposes, its patent applications suggest proposed private sector uses, including:

  • to learn more about a person the user has just met, such as through a business, dating or other relationship;
  • to verify personal identification for the purpose of granting or denying a person access to a facility, a venue or a device; and
  • to accurately dispense social benefits and reduce fraud (by a public agency).

Facial recognition technology – ICO and OAIC investigation

The ICO and the OAIC opened a joint investigation into the personal information handling practices of Clearview AI Inc in July 2020. The investigation focused on the company’s use of data scraped from the internet and its use of biometrics in facial recognition technology.

OAIC finding

The OAIC commissioner, Angelene Falk, found that Clearview had breached Australians’ privacy by scraping their biometric data from the web and disclosing it through a facial recognition tool. The determination ordered Clearview to cease collecting this data from individuals in Australia and to destroy the data it had already collected. It highlighted the lack of transparency around Clearview’s collection practices, the monetisation of individuals’ data for a purpose entirely outside reasonable expectations, and the risk of harm to people whose images are included in the database.

As the commissioner stated: “when Australians use social media or professional networking sites, they don’t expect their facial images to be collected without their consent by a commercial entity to create biometric templates for completely unrelated identification purposes”.

Facial recognition technology – ICO finding

The ICO’s preliminary view is that Clearview appears to have failed to comply with UK data protection laws in several ways, including by:

  • failing to process the information of people in the UK in a way they are likely to expect or that is fair;
  • failing to have a process in place to stop the data being retained indefinitely;
  • failing to have a lawful reason for collecting the information;
  • failing to meet the higher data protection standards required for biometric data (classed as ‘special category data’ under the GDPR and UK GDPR);
  • failing to inform people in the UK about what is happening to their data; and
  • asking for additional personal information, including photos, which may have acted as a disincentive to individuals who wish to object to their data being processed.

Clearview response

Clearview now has the opportunity to make representations in respect of these alleged breaches. Any representations will be considered by the ICO before a final decision is made, which is expected in mid-2022. The ICO’s previous headline GDPR fines have been reduced following this process: British Airways’ proposed fine was £183 million, but it ultimately paid £20 million.

Clearview will probably make similar arguments to those it made in response to the OAIC’s findings: namely, that Clearview, as an American company, is not subject to Australian or UK data protection rules, and that it does not collect data falling within those regimes. However, the OAIC found that Clearview had been handling Australian personal information, “sensitive biometric data”, and the ICO has already stated that Clearview failed to “meet the higher data protection standards required for biometric data”. The UK GDPR applies extraterritorially to organisations outside the UK that process the personal data of individuals in the UK (for example, by monitoring their behaviour), so Clearview, although an American company, will still be subject to UK data protection laws.

Facial recognition technology – biometric data

The significance of the investigation and findings was heightened by the involvement of biometric data in the facial recognition technology. UK data protection rules treat biometric data processed for the purpose of uniquely identifying an individual as ‘special category data’, and processing special category data attracts additional data protection obligations. The UK GDPR defines biometric data in Article 4(14) as:

“personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic (fingerprint) data”

Examples include:

  • facial recognition;
  • fingerprint verification;
  • iris scanning;
  • retinal analysis;
  • voice recognition; and
  • ear shape recognition.

The ICO guidance goes into further detail about facial recognition technology. It states that “although a digital image may allow for identification using physical characteristics, it only becomes biometric data if you carry out ‘specific technical processing’… this involves using the image data to create an individual digital template or profile, which in turn is used for automated image matching and identification.” This is exactly what Clearview’s technology does. Clearview is therefore highly unlikely to succeed in arguing that the data it processed was not biometric data subject to the UK GDPR.
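
Put in code, the ICO’s distinction might look like the following hypothetical sketch: a stored photograph is personal data, but it is the derivation of a template used for automated matching (the ‘specific technical processing’) that makes it biometric, and hence special category, data.

```python
import numpy as np

# A digital photograph (random pixels here for illustration): personal
# data, but not in itself biometric data under Article 4(14).
raw_image = np.random.rand(112, 112)

def make_template(image: np.ndarray) -> np.ndarray:
    """Stand-in for the 'specific technical processing' the ICO describes:
    deriving a fixed-length identification template from the image. A real
    system would use a trained face-recognition model."""
    t = image.ravel()[:64]
    return t / np.linalg.norm(t)

# Once this template is created and used for automated image matching and
# identification, the higher special-category-data standards apply.
template = make_template(raw_image)
```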

In R (Bridges) v Chief Constable of South Wales Police and others [2020] EWCA Civ 1058, the Court of Appeal specifically stated that facial recognition technology was “one of the primary forms of biometric data, alongside fingerprints and DNA” and was therefore subject to the UK GDPR. Although the court ultimately held that the particular deployment was unlawful on other grounds, it accepted the weighing exercise undertaken by the police of the benefits and intrusiveness of such technology, noting that the technology was “deployed in an open and transparent way, with significant public engagement… use for a limited time, and covered a limited footprint… deployed for the specific purpose of seeking to identify particular individuals who may have been in the area and whose presence was of justifiable interest to the police”. Clearview’s sweeping data scraping is, by comparison, far less controlled, and no comparable engagement with the public has taken place.

Data scraping

We explored the issues involved in data/web scraping in our blog: ‘Web Scraping – Legal Issues’. The issues include intellectual property rights as well as data protection. If a data scraper does not have the necessary licence to use the data in the way it intends, or the scraped data contains personal data subject to differing national data protection rules, legal challenges can arise. It does not matter that the data is publicly available, and many media websites’ terms and conditions restrict the commercial use of data obtained from their sites. Indeed, Facebook, Google and Twitter sent cease and desist letters to Clearview in January/February 2020 to stop it scraping data from their sites.

Here to help

This investigation illustrates the pitfalls of processing vast amounts of sensitive biometric data for commercial purposes. Clearview describes itself as the ‘World’s Largest Facial Network’, and it is this processing of data at such scale that is now backfiring. Businesses looking to use or provide similar technology need to understand the importance of undertaking a proper data protection impact assessment (DPIA) in order to ensure they are acting lawfully.

EM law specialises in technology and UK data protection law. Get in touch if you need advice on data protection law or if you have any questions on the above.