Public Law Project announces legal challenge against Home Office algorithm for detecting sham marriages
Pre-action letter claims that the tool is discriminatory and breaches data protection rules
The Public Law Project announced on 17 February that it has initiated a legal challenge against the Home Office’s use of an algorithm to detect ‘sham marriages’, following an earlier, separate ruling on the algorithm by the Information Tribunal, which found evidence of “indirect discrimination” and “potential bias”.
The Home Office uses an automated ‘triage’ tool, which relies on a machine learning algorithm, to decide whether couples planning to get married should be subject to a ‘sham marriage’ investigation. The tool’s output categorises couples as either ‘pass’ or ‘fail’. However, the Public Law Project claims that Home Office data show the tool fails certain nationalities at disproportionate rates.
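For illustration, the kind of disparity alleged here can be checked with a simple rate comparison across groups. The sketch below is purely hypothetical: the nationality labels, the counts, and the use of the ‘four-fifths’ yardstick (a rule of thumb from US employment law, not the UK legal test) are assumptions for illustration, not Home Office data or the tool’s actual logic.

```python
# Hypothetical sketch: testing whether a pass/fail triage tool 'fails'
# some nationality groups at disproportionate rates.
# All counts below are invented for illustration; they are NOT Home Office data.

from collections import namedtuple

TriageCounts = namedtuple("TriageCounts", ["passed", "failed"])

# Invented referral counts per (anonymised) nationality group.
counts = {
    "A": TriageCounts(passed=900, failed=100),  # most-favoured group
    "B": TriageCounts(passed=600, failed=400),
    "C": TriageCounts(passed=850, failed=150),
}

def pass_rate(c: TriageCounts) -> float:
    """Share of couples the tool marks 'pass'."""
    return c.passed / (c.passed + c.failed)

rates = {nat: pass_rate(c) for nat, c in counts.items()}
best = max(rates.values())

# Four-fifths rule of thumb: flag a group whose pass rate falls below
# 80% of the most-favoured group's pass rate.
for nat, rate in rates.items():
    ratio = rate / best
    flag = "DISPARATE" if ratio < 0.8 else "ok"
    print(f"group {nat}: pass rate {rate:.0%}, ratio {ratio:.2f} -> {flag}")
```

Note that the UK test for indirect discrimination under the Equality Act 2010 asks whether a provision puts a group at a ‘particular disadvantage’ rather than applying a fixed numeric threshold; the ratio check above is only a rough statistical screen.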
The Public Law Project, a national legal charity, has issued a pre-action letter to the Home Office setting out the grounds for the challenge. The letter alleges that: (1) the outputs of the triage tool appear to indirectly discriminate on the basis of nationality; (2) the Home Office does not appear to have discharged its Public Sector Equality Duty to eliminate unlawful discrimination and advance equality of opportunity, a duty the courts have held is more demanding when novel digital systems are used; (3) the secrecy around the system breaches transparency obligations under the General Data Protection Regulation (GDPR); and (4) if ‘fail’ cases are not always manually reviewed by a human, this would contravene government policy and place the Home Secretary in breach of the Immigration Act 2014 by delegating decisions to a machine learning algorithm.
Commenting on the legal challenge, Ariane Adam, Legal Director at the Public Law Project, said: “Couples who fail face invasive and unpleasant investigations and can have their permission to marry delayed without even being told that a machine was involved in the decision-making process. Home Office data show that the triage tool fails certain nationalities at disproportionate rates that are inconsistent with their contribution to migration in the UK. The information available demonstrates prima facie indirect nationality discrimination, with some nationalities, including Greeks, Bulgarians, Romanians and Albanians, disproportionately failing triage. It also suggests that there is no manual review in every ‘fail’ case. If that is in fact the case, the operation of the tool would be unlawful and would not conform to the Home Office’s own policy. The Home Office’s refusal to be transparent about the triage tool may also violate data protection obligations.”