‘PLEASE NOTE’, says the Foreign Affairs computer about tens of thousands of visa applications. Is that discrimination?

--

In February of this year, a message appeared on the networking site LinkedIn, designed as a tile, with blue letters and a red-blue logo. It’s from SigmaRed Technologies, a small Canadian software analytics company. In the message, the company congratulates itself on praise from a client. “We are overjoyed to say that we have received a nice compliment from one leading EU government”, writes SigmaRed. If you want a quote, you can email the contact address.

That government is the Dutch government, more specifically the Ministry of Foreign Affairs. A few months earlier, the ministry commissioned a report from the company, which is virtually unknown in Europe and manages IT professionals in India from Toronto. The assignment to SigmaRed: investigate whether the algorithm used to assess visa applications unintentionally discriminates against certain applicants – a risk that NRC wrote about a year ago.

Also read
‘Be careful with this visa application,’ warns the algorithm that encourages discrimination. The ministry ignores the criticism

After a few months, SigmaRed’s conclusion is clear: there is nothing wrong, the Ministry of Foreign Affairs is doing its job properly, no one is being discriminated against. A conclusion that, as the LinkedIn post shows, has been gratefully received in The Hague.

The SigmaRed report, which was made public on Monday, offers the ministry a way out of a difficult situation. For over a year now, another report has been sitting in a drawer at the department, from the National ICT Guild (Rijks ICT Gilde). This authoritative government IT service reaches the completely opposite conclusion: the Foreign Affairs visa algorithm discriminates against travelers from countries in Africa and the Middle East. Well-intentioned travelers from that region are often labeled by the computer as “risky applicants”, according to that report. That is undesirable.

Only when the second investigation was completed did the first become public

Last year, NRC became aware of the critical investigation by the National ICT Guild. So did various members of the House of Representatives. NRC requested the report through an appeal to the Open Government Act (Woo); the parliamentarians requested it directly from the minister. But instead of releasing the research, the minister stalled for time and in the meantime had a second opinion written by the Canadian SigmaRed. Only when that report – which concluded that there is “no disproportionate discrimination on the basis of nationality, marital status or age” – was completed did the ministry make both investigations public.

“Unfortunately, the ministry has not been able to accept the final report of the National ICT Guild for a number of reasons,” Minister of Foreign Affairs Hanke Bruins Slot (CDA) wrote to the House of Representatives on Monday, the first day of the May recess. “Because of the inadequate report, the ministry has requested another external party to conduct an investigation.” And that party, according to the minister, found “no bias” in the assessment of the applications.

‘Just a little bit different’

The National ICT Guild, part of the Ministry of the Interior, describes itself on its website as “an ambitious tech organization that does everything just a little bit differently.” More than seventy programmers, data scientists and other specialists at the Guild work on the government’s “digital expertise”. Particularly since the Benefits Affair, that work has increasingly focused on the way the government deals with responsible AI, fair algorithms, ethics and privacy.

Leif de Kloet, a former derivatives trader and the leader of the Guild, said in 2022 that he “became enthusiastic about complex IT and data issues at the government”. He hoped that as many internal clients as possible would find their way to him “as an equal partner.”

One of the first major algorithm studies by the National ICT Guild takes place at the Ministry of Foreign Affairs. The department asks De Kloet to examine the visa algorithm. The reason: there is internal criticism, and travelers are complaining too. Digital services are faltering, and applicants grumble that they are not told why their visa application has been rejected.

The algorithm was supposed to improve the service. ‘Decision officers’ at the visa department use the system to assess the hundreds of thousands of applications per year for short-stay visas in the Netherlands. It works with risk scores. By comparing an applicant with other travelers on the basis of, among other things, nationality, age, gender and place of application, the algorithm calculates a score. A low score means: this application can go into the ‘fast track’, and the official only has to look at it briefly. A high score means a greater risk of problems. That application goes to the decision official in the ‘intensive track’, who may choose to interview the applicant or ask for additional documents.
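The triage logic described above can be summarized in a few lines of code. The sketch below is purely illustrative: the profile table, feature names, rates and threshold are hypothetical assumptions, not the ministry’s actual model, which has never been published.

```python
# Illustrative sketch of score-based triage as described in the article.
# All names, features and numbers are hypothetical; the ministry's actual
# model and its parameters have not been made public.

# Hypothetical historical "problem rate" per profile group (made-up numbers).
PROFILE_RISK = {
    ("nationality_A", "male", "18-30"): 0.40,
    ("nationality_A", "female", "18-30"): 0.25,
    ("nationality_B", "male", "18-30"): 0.02,
}

THRESHOLD = 0.20  # hypothetical cut-off between the two tracks


def risk_score(nationality: str, gender: str, age_band: str) -> float:
    """Look up the historical rate for the applicant's profile group."""
    return PROFILE_RISK.get((nationality, gender, age_band), 0.10)


def triage(nationality: str, gender: str, age_band: str) -> str:
    """Route an application to the 'fast' or 'intensive' track."""
    if risk_score(nationality, gender, age_band) >= THRESHOLD:
        return "intensive"  # official sees a "PLEASE NOTE: increased risk" message
    return "fast"           # official sees a "little to no risk" message


print(triage("nationality_A", "male", "18-30"))  # -> intensive
print(triage("nationality_B", "male", "18-30"))  # -> fast
```

The core objection in the article is visible even in this toy version: nationality feeds directly into the score, so entire groups of applicants inherit the track assigned to their profile.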

Thanks to the risk score, civil servants can get through the enormous pile of applications more quickly. That also makes it a cost-cutting measure: without the algorithm, the ministry told NRC last year, “several dozen” additional decision officials would be needed.

Also read
‘It’s like talking to a robot,’ say Surinamese people applying for a visa for the Netherlands

‘No human dimension’

In 2022, the internal privacy supervisor of the Ministry of Foreign Affairs objected to the algorithm at the visa service. The resulting scores have a “possibly discriminatory character,” he wrote in an internal document. How can you calculate the chance that someone will make a fraudulent application on the basis of their nationality or gender? And wasn’t one of the lessons of the Benefits Affair that these types of systems unintentionally encourage discrimination?

The consequences can be enormous. In the event of rejection, people cannot travel to family or loved ones. They miss business opportunities, births, weddings, funerals. That a rejection is accompanied by great pain was evident from dozens of letters that NRC received after publication of the article about this problem. “This is exactly what happens when everything is taken over by computers and algorithms. There is no human touch anymore,” one reader wrote. “This is a hugely under-exposed issue that affects many thousands of people, including myself,” wrote another.

The ministry firmly rejected the criticism. The risk scores are nothing more than a helping hand for the decision officials, according to the department; it is those officials who make the decision. And they are not guided by the label ‘risky’, not even unconsciously. “The profiles do not lead to the ‘seek and you will find’ mechanism,” the ministry wrote to NRC.

The ministry hoped to remove all doubts with the research by the National ICT Guild. A research design was devised in consultation with the Guild, but the enthusiasm at the Ministry of Foreign Affairs evaporated as soon as the IT service’s conclusions became clear. At the beginning of 2023, the Guild confirmed all previous objections from the privacy supervisor: the algorithm showed a very strong bias. Applicants from specific countries are often incorrectly placed by the computer in the ‘strict control’ category. People with one nationality are up to seventy times more likely to be subjected to extra checks than applicants with another nationality, the service found.

Applications from Morocco and Egypt, among others, are checked extra strictly

In the reports that the Ministry of Foreign Affairs sent to the House of Representatives on Monday, the names of those countries have been redacted. Research by NRC shows that applications from, among others, Morocco, Tunisia, Egypt, Ghana, Ethiopia, Iran and Iraq are often wrongly subjected to extra-strict checks. This concerns tens of thousands of applications per year.

The critical report from the ICT Guild was coincidentally ready in the same month that NRC, in collaboration with the journalistic collective Lighthouse Reports, published its reporting about the algorithm, in the spring of 2023. In response to the articles, several MPs called for an investigation, without knowing that one had already been carried out. The ministry left them in the dark. When the party Denk explicitly asked for a bias test, the minister failed to mention that such a report already existed, referring instead to previously answered parliamentary questions and an old “progress report”.

Forcing disclosure

In the same period, NRC filed a Woo request, but was told that “personnel problems” and “the situation in the world” were causing delays. There were “many files”, and new “research results” were being awaited. The ministry repeatedly promised to “complete the request quickly.” Ultimately, NRC went to court in February of this year to force disclosure.

In the meantime, the ministry had a second report drawn up by SigmaRed. The justification: the National ICT Guild “deviated from the research question posed by the Ministry of Foreign Affairs” and “made assumptions that are not consistent with the actual working method” of the computer system. The investigation was also said to overlook the fact that Foreign Affairs decision officials make the final decision on the visa; any errors the computer makes with its risk scores are corrected that way, the minister reasoned.

At the end of April, the ministry emailed NRC that the Woo documents were now actually coming. But before they were sent to NRC, they had already been published on the website of the House of Representatives.

Also read
WRR chairman Corien Prins: ‘Politicians are not yet paying sufficient attention to the consequences of AI and the dangers of digital disruption’

‘Undesirable bias’

NRC submitted the two reports to Cynthia Liem, associate professor of intelligent systems at TU Delft. Liem frequently raises issues surrounding problematic algorithms, including in the Benefits Affair and in Rotterdam, where computer models were used to hunt for welfare fraudsters. “SigmaRed does not explain its research method well, and I do not understand some of the things they write down,” she says. “The National ICT Guild does that better. It investigated how often people from certain countries ended up in the intensive track, only to subsequently receive a visa anyway. Those people were therefore wrongly exposed to additional checks. This appears to happen much more often with certain nationalities than with others. You can see that as an undesirable bias.” This should not be judged lightly, says Liem. “It is similar to what happened in Rotterdam: many innocent single mothers on social assistance were subjected to extra checks.”
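The kind of check Liem describes can be made concrete with a small calculation. The sketch below is a hypothetical reconstruction, not the Guild’s actual code: it assumes a simple per-application log with nationality, assigned track and outcome, and computes how often applicants of each nationality were sent to the intensive track yet still received a visa.

```python
from collections import defaultdict

# Hypothetical application log: (nationality, track, visa_granted).
# Made-up data for illustration; the real figures are redacted.
applications = [
    ("A", "intensive", True), ("A", "intensive", True),
    ("A", "intensive", False), ("A", "fast", True),
    ("B", "fast", True), ("B", "fast", True),
    ("B", "intensive", True), ("B", "fast", True),
]

total = defaultdict(int)
flagged_but_granted = defaultdict(int)  # flagged, yet visa granted anyway

for nationality, track, granted in applications:
    total[nationality] += 1
    if track == "intensive" and granted:
        flagged_but_granted[nationality] += 1

for nat in sorted(total):
    # Share of all applicants wrongly exposed to extra checks:
    # routed to the intensive track, but granted a visa after all.
    rate = flagged_but_granted[nat] / total[nat]
    print(f"nationality {nat}: {rate:.0%} flagged yet granted")
```

Comparing such rates across nationalities produces ratios like the “up to seventy times” gap the Guild reports; large, systematic gaps of that kind are what Liem calls an undesirable bias.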

The ministry’s defense that decision-making staff are not influenced by the risk score does not convince her. “We know people are sensitive to these kinds of scores.”

The report from the Rijks ICT Gilde states that with the fast track, officials are shown the message: “Application falls into a group with little to no risk of abuse of the visa procedure.” With the intensive track, the message reads: “PLEASE NOTE: Application falls into a group with an increased risk of abuse of the visa procedure.” Liem: “It seems logical to me that an employee would respond to ‘PLEASE NOTE’. It would even be problematic if they didn’t. If these types of messages have no effect, why is this system being used at all?”

Leif de Kloet of the National ICT Guild says in response that his organization’s research shows “that there are indications that the algorithm may discriminate.” He calls for additional research, “both qualitative and quantitative.” De Kloet says he offered to have the Guild’s research validated by a third party. The Ministry of Foreign Affairs, however, chose “to approach another party, unknown to us, to conduct another bias test.”

SigmaRed did not answer questions. The ministry says that “Foreign Affairs had a clear research question” and that “that question has been answered by the SigmaRed report.”
