Clearview AI Inc wins appeal against £7.5m Information Commissioner’s Office fine
James Castro-Edwards dissects the recent First-tier Tribunal’s ruling in Clearview AI Inc v The Information Commissioner [2023] UKFTT 819 (GRC) (17 October 2023)
On 17 October 2023, the First-tier Tribunal published its decision allowing an appeal by the facial recognition database company Clearview AI Inc against the Information Commissioner’s Office (ICO), which had fined the company over £7.5m in 2022. The fine followed a joint investigation by the ICO and the Office of the Australian Information Commissioner (OAIC), which found that Clearview AI had committed multiple breaches of UK data protection law.
On appeal, the Tribunal found that the ICO did not have jurisdiction to fine Clearview AI, which was established in the US. The Tribunal accepted that the activities of a foreign company could fall within the territorial scope of the General Data Protection Regulation (GDPR) and UK GDPR. However, it found that Clearview AI’s activities were outside the material scope of both regulations.
The appeal was allowed on narrow grounds, which are unlikely to be available to many commercial companies. For non-UK commercial organisations engaged in the collection of publicly available personal data relating to UK citizens, the decision confirms that the ICO can and will take enforcement action based on the extra-territorial scope of the UK GDPR.
In a world where personal information is regarded as a digital asset that can be monetised in ever more complex ways, the Clearview AI judgments serve as a reminder to global businesses that personal data which UK residents have posted online must still be handled in accordance with the law.
Clearview AI
Clearview AI is a Delaware-incorporated company that has no establishment in the UK or the EU, and had none at the time of the alleged infringements. It has clients in the US and around the world, in countries that include Panama, Brazil, Mexico and the Dominican Republic. It does not have clients in the UK or the EU and has not offered services to commercial clients since 2020, following a settlement with the American Civil Liberties Union. All of its clients carry out criminal law enforcement and/or national security functions, and Clearview AI does not provide its service to any clients outside of this context.
Clearview AI provides a service which enables its clients to compare a facial image against billions of images in its databases, in order to find a match. The database was created by copying photographic images which were publicly available on the internet, along with additional information, such as the URL, the link to the social media profile and the name of the profile if the image was sourced from social media. Each facial image is then assigned a set of vectors using Clearview AI’s machine learning facial recognition algorithm. The facial vectors are then uploaded to a database, so that the vectors of faces that are similar to each other are digitally stored closer together than faces that are very different from each other. The clustering process facilitates the efficient provision of results to clients.
Where a client seeks to identify an individual from an image, it provides Clearview AI with a ‘probe image’, which is compared against the database to identify close matches using automated facial recognition technology. The system then provides the client with a list of stored images which bear a close likeness to the probe image, as well as additional information including the URL related to the original image. This information helps the client to determine the identity of the individual in the probe image. The system does not determine whether the probe image and a stored image show the same person; that question is for the client to decide.
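For readers unfamiliar with how such systems operate, the matching step described above is, in essence, a nearest-neighbour search over stored facial vectors. The sketch below is a minimal illustration of that general technique only, not Clearview AI’s actual implementation; the vectors, URLs and choice of cosine similarity are all hypothetical.

```python
from math import sqrt

def cosine_similarity(a, b):
    """Similarity between two vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical database: each entry pairs a facial vector (in practice
# produced by a machine learning model) with its source metadata.
database = [
    {"vector": [0.90, 0.10, 0.30], "url": "https://example.com/profile-a"},
    {"vector": [0.20, 0.80, 0.50], "url": "https://example.com/profile-b"},
    {"vector": [0.88, 0.12, 0.31], "url": "https://example.com/profile-c"},
]

def match_probe(probe_vector, db, top_k=2):
    """Return the top_k stored entries closest to the probe vector.

    Note: the system returns candidate matches and their metadata;
    it does not decide whether two images show the same person.
    """
    scored = sorted(
        db,
        key=lambda entry: cosine_similarity(probe_vector, entry["vector"]),
        reverse=True,
    )
    return [entry["url"] for entry in scored[:top_k]]

# A probe vector close to profile-a's vector returns it as the best match.
results = match_probe([0.90, 0.10, 0.30], database)
```

Production systems index the vectors so that similar faces are stored close together (the clustering the Tribunal’s decision describes), allowing matches to be found without comparing the probe against every stored vector.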
Clearview stopped providing its services to commercial customers in 2020. Currently, its customers all carry out criminal law enforcement and/or national security functions, and only use the service for those purposes. Clearview’s customers are exclusively government agencies or government agency contractors, none of which are located in the UK or the EU.
ICO enforcement
The ICO announced the opening of a joint investigation with the OAIC on 3 November 2021. According to the ICO, the investigation revealed that Clearview AI had collected more than 20 billion images of people’s faces from publicly available sources around the world, to create an online database. The ICO took the view that, given the large number of UK social media and internet users, Clearview AI’s database was highly likely to include the personal data of people in the UK. In addition, Clearview AI’s service had previously been used in the UK on a trial basis, involving at least five UK law enforcement organisations. During this test phase, a number of probe image searches were carried out, which returned numerous matches. The ICO concluded that this indicated that a substantial number of UK individuals were present in the Clearview AI database. According to the ICO, Clearview AI gave no indication of any intention to reduce the number of UK individuals in the database, or to exclude the collection of further UK individuals’ personal data.
The data collected included images, metadata and URLs, which constitute personal data of the individuals to whom they relate. The facial vectors derived from the images constitute special categories of personal data. By collecting images from the internet, storing them in a database, generating vectors from the images and matching these against probe images, Clearview AI was processing personal data. The images collected from the internet and the probe images would inevitably show individuals engaged in certain activities, so both would disclose information about individuals’ behaviour.
The ICO found that Clearview’s collection, storage and derivation of facial vectors amounted to processing, which was at Clearview’s instigation. As such, Clearview AI carried out these processing activities as a controller. Clearview’s collection of probe images, its comparison to the Clearview database and providing a list of matches to the client also constituted processing, for which both Clearview AI and the client were responsible, so both were controllers.
Clearview AI’s processing took place both before and after the end of the Brexit implementation period, i.e., 11pm on 31 December 2020, which meant that, according to the ICO, it fell within the scope of both the GDPR and the UK GDPR, by virtue of Article 3(2)(b), since it concerned ‘monitoring’ the behaviour of individuals in the UK. The ICO took the view that, by seeking to match probe images, clients are monitoring the behaviour of the individuals in the probe images, who are likely to be of interest to law enforcement because of their behaviour or suspected behaviour. The probe images may also show suspects engaged in apparent criminal activity. Clients are also likely to be monitoring the behaviour of individuals that are identified as a potential match in the Clearview AI database, since they are likely to be able to ascertain individuals’ behaviour from the images. Clearview AI’s processing ‘related to’ its clients’ monitoring of behaviour, and a substantial number of the individuals concerned were located in the UK, thereby triggering Article 3(2)(b). The ICO noted that the French data protection authority (the CNIL) had taken a similar position regarding the question of whether it had jurisdiction over Clearview AI’s processing, and whether Clearview’s processing of data subjects in the EU fell within Article 3(2)(b) GDPR.
The ICO found that Clearview AI’s processing breached a number of the principles of the GDPR and the UK GDPR (noting that there is no material difference between the relevant provisions in the two regulations), which were as follows:
- The fairness, lawfulness and transparency principle (Article 5(1)(a)): The processing was neither fair, lawful nor transparent. It was unfair, since affected data subjects were not made aware that the processing was taking place, and would not have expected it. It was unlawful, since Clearview AI was unable to establish a lawful basis for processing personal data under Article 6, or special category personal data as required by Article 9. It was not transparent as it was invisible to the affected data subjects and Clearview AI had failed to provide the transparency information prescribed by Article 14.
- The storage limitation principle (Article 5(1)(e)): Clearview AI did not have a data retention policy, so could not ensure that personal data was not held for ‘longer than is necessary’. There was no evidence that stored images were ever removed from the database; instead, Clearview AI continued to grow it.
- No lawful basis for processing personal data (Article 6): Clearview was unable to establish any of the lawful bases for processing personal data.
- No lawful basis for processing special categories of personal data (Article 9): Clearview AI was unable to establish a lawful basis for the facial vector data, which constituted special category data.
- No transparency information (Article 14): Clearview did not provide the necessary ‘fair processing information’ to the affected data subjects. This could only be obtained if the data subjects contacted Clearview AI.
- Failure to uphold data subjects’ rights (Articles 15, 16, 17, 21 and 22): Clearview impeded data subjects’ rights of access, rectification, erasure, their right to object and their rights in relation to automated decision making. In order to exercise their rights, the data subjects were required to provide additional personal data, in the form of a photograph, which had a disincentive effect.
- Failure to carry out a data protection impact assessment (DPIA) (Article 35): Clearview did not carry out a DPIA in respect of the processing of UK residents’ personal data.
In the light of its findings, on 18 May 2022, the ICO issued Clearview AI with a monetary penalty notice, imposing a fine of £7,552,800. The penalty took into account the invisible nature of the processing, the fact that it involved special category data (and may have involved children’s data) and involved novel or invasive technology that caused a high degree of intrusion into the affected individuals’ privacy.
In addition to issuing a fine, on 18 May 2022, the ICO issued an enforcement notice, ordering Clearview AI to stop collecting the personal data of UK residents from the internet, to delete any such personal data that it had already collected from its systems within six months, and to refrain from offering its services to any customer in the UK. The ICO enforcement notice referred to the decision by the US District Court for the Northern District of Illinois, to demonstrate that it was practicable for Clearview AI to comply with the requirements of the enforcement notice.
In response to the Illinois court decision, Clearview AI had stated that it had blocked all images in the database that were geolocated in Illinois from being searched; had constructed a ‘geofence’ around Illinois; had not collected facial vectors from images that contained metadata associated with Illinois; and had not collected facial vectors from images stored on servers that displayed Illinois IP addresses or websites with URLs containing keywords such as ‘Chicago’ or ‘Illinois’.
Appeal to the First-tier Tribunal
Following the monetary penalty and enforcement notices, Clearview AI challenged the alleged breaches of the UK GDPR and disputed the ICO’s jurisdiction in an appeal to the First-tier Tribunal. The Tribunal is an independent body which is responsible for hearing appeals against decisions made by the Information Commissioner. Clearview’s appeal to the Tribunal was made on the basis that it is a foreign company, providing services to ‘foreign clients, using foreign IP addresses, and in support of the public interest activities of foreign governments and government agencies, in particular in relation to their national security and criminal law enforcement functions’.
In its decision on 17 October, the Tribunal found that the UK GDPR did not apply, so the ICO did not have jurisdiction. The Tribunal agreed that Clearview AI’s activities fell within the territorial scope of the UK GDPR, albeit that the monitoring was carried out by Clearview’s clients, with Clearview facilitating it. However, Clearview AI only provided services to non-UK/EU law enforcement or national security bodies and their contractors, which fell outside the material scope of the GDPR and UK GDPR.
Article 2 of the GDPR and the UK GDPR defines the material scope of each regulation, though the respective provisions are constructed differently. The effect in both cases is to exclude specified types of processing from the scope of the respective regulation. In this instance, the excluded processing is that which falls outside the scope of EU law, namely the acts of foreign governments. The acts of foreign governments fall outside the material scope of both the GDPR and the UK GDPR, since it is not for one government to seek to bind or control the activities of another sovereign state.
The Tribunal concluded that Clearview AI’s processing was outside the material scope of the GDPR as provided by Article 2, and was not ‘relevant processing’ for the purposes of Article 3 UK GDPR. Accordingly, the Tribunal found that the Information Commissioner did not have jurisdiction and allowed Clearview’s appeal.
Implications for non-UK commercial organisations
The decision affirms the extra-territorial reach of the UK GDPR, in particular the extent to which it applies to a company established outside the UK but involved in monitoring the behaviour of individuals in the UK. While in this instance the Tribunal found that the UK GDPR did not apply, it did so on narrow grounds, which may not be available to organisations that provide services to commercial rather than government clients. Accordingly, non-UK organisations carrying out similar activities for commercial purposes would need to consider their obligations under applicable data protection law.
James Castro-Edwards is counsel at Arnold & Porter
arnoldporter.com