Immigrant rights campaigners have begun a ground-breaking legal case to establish how a Home Office algorithm that filters UK visa applications actually works.

The challenge is the first court bid to expose how an artificial intelligence program affects immigration decisions over who is allowed to enter the country.

Foxglove, a newly formed advocacy group that campaigns for justice in the technology sector, is supporting the case brought by the Joint Council for the Welfare of Immigrants (JCWI), which seeks to compel the Home Office to explain on what basis the algorithm “streams” visa applicants.

Both groups said they feared the AI “streaming tool” created three channels for applicants, including a “fast lane” that would lead to “speedy boarding for white people”.

The Home Office has insisted that the algorithm is used only to allocate applications and does not ultimately rule on them. The final decision remains in the hands of human caseworkers and not machines, it said.

A spokesperson for the Home Office said: “We have always used processes that enable UK Visas and Immigration to allocate cases in an efficient way.

“The streaming tool is only used to allocate applications, not to decide them. It uses data to indicate whether an application might require more or less scrutiny and it complies fully with the relevant legislation under the Equality Act 2010.”
Cori Crider, a director at Foxglove, rejected the Home Office’s defence of the AI system.

“The Home Office insists its ‘visa streaming’ algorithm has no racial bias, but that claim is pretty threadbare. We’re told the system uses nationality to ‘stream’ applicants green, yellow and red – and it’s easy to guess who ends up in the green queue and who gets pushed to the back of the bus in red. If your algorithm singles out people for a digital pat-down and offers speedy boarding to white people, well, that’s unlawful.”
In its pre-action legal letter sent this month to the home secretary, Priti Patel, the JCWI argues that the streaming process will nonetheless affect the final decision on visas.

The case is being backed by a gofundme.com page titled “Deported by algorithm”.

Its letter to the home secretary states: “An individual visa applicant allocated by the streaming tool to the ‘Red’ category because of their nationality might still be granted a visa. However, their prospects of a successful application are much lower than the prospect of an otherwise equivalent individual with a different nationality allocated to the ‘Green’ category. To similar effect, the same ‘Red’ application is likely to take much longer than the ‘Green’ one, again involving less favourable treatment of the applicant because of their nationality.”
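
Purely as an illustration of the kind of rule the JCWI describes, and not the Home Office’s actual tool, whose inner workings remain undisclosed, a nationality-based streaming step might look something like the sketch below; the nationality lists and category logic are entirely hypothetical.

```python
# Purely illustrative sketch of a nationality-based "streaming" rule of the
# kind the JCWI alleges. The Home Office has not disclosed how its tool works;
# the nationality lists and logic below are hypothetical, not the real system.

HIGH_SCRUTINY_NATIONALITIES = {"Country A", "Country B"}  # invented examples
LOW_SCRUTINY_NATIONALITIES = {"Country C", "Country D"}   # invented examples


def stream_application(nationality: str) -> str:
    """Assign an application to a Red, Yellow or Green queue by nationality."""
    if nationality in HIGH_SCRUTINY_NATIONALITIES:
        return "Red"     # routed for the most intensive scrutiny
    if nationality in LOW_SCRUTINY_NATIONALITIES:
        return "Green"   # routed for the lightest-touch handling
    return "Yellow"      # all other nationalities get an intermediate level


# Two otherwise identical applications land in different queues solely
# because of nationality.
print(stream_application("Country A"))  # -> Red
print(stream_application("Country C"))  # -> Green
```

Even with a human caseworker making the final call, the queue a rule like this assigns would shape how much scrutiny an application receives and how long it takes, which is the less favourable treatment the JCWI’s letter describes.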
Among the information the JCWI is seeking from the Home Office are all “policy and guidance documents that deal with the process of streaming visa applications and the use of the streaming tool”.

It is also demanding that the technical details driving the streaming tool be disclosed, along with further information such as case-working targets for each of the three categories and whether there have been any complaints about the algorithm’s deployment.

The JCWI argues that the use of the streaming tool is a more modern version of a visa entry system ruled unlawful by the House of Lords in 2005. It concerned Roma applicants who were said to have been treated with more suspicion and subjected to more intense and intrusive questioning than non-Roma applicants. The Lords concluded that the “stereotyping of Roma as being less likely to be genuine visitors” to the UK was unlawful.

The Home Office has emphasised that the new system is fully compliant with the Equality Act 2010.

It added that of more than 3.3 million visa applications made to the UK by the end of June this year, 2.9 million applicants were given entry into Britain.

The Guardian
