The Dutch Tax Scandal

The successors to Omtzigt and Leijten

lawyer and algorithm

by Marco de Vries

Algorithms receive attention in the outline agreement of the coalition-forming parties. Some MPs have personal experience with risk profiling.

The Education Executive Agency (DUO) discriminated even more seriously than previously thought. This is evident from independent research presented at the end of May. The algorithm profiled students who lived close to their parents; as a result, students with a migration background were unintentionally checked far more often than others. A typical case of a proxy, explained Arthur van der Linden, lecturer in Taxation and Technology at Tilburg University. On the day the DUO investigation was published, he spoke at a round table of the Standing Parliamentary Committee on the Interior.

A proxy is a group characteristic that allows an algorithm to profile ethnically in a roundabout way, even when ethnicity itself is not recorded. Machine-learning algorithms quickly go wrong with this. A classic algorithm is like a speed camera that only records speeding cars. But as soon as the color or type of car is also stored, the AI can find correlations and increase the chance that red sports cars get a ticket sooner, even if they are not speeding at all. "Machine learning means that the training data creates profiles of people who do or do not conform to the norm." The 'smarter' the software, the greater the risk of unintended effects.

The outline agreement of the coalition-forming parties pays considerable attention to this topic. Omtzigt has secured not only his constitutional court but also, for example, a "right to make mistakes": making an error is allowed, and the government's reminder and collection charges must come down. The agreement also promises more safeguards and transparency around the use of algorithms and AI. At the same time, the number of civil servants must be cut significantly and the penalty rate increased. Software vendors will come knocking with the latest automation. Who can see what is built into that software? The coalition agreement calls for a "scientific standard for the use of models and algorithms", but the market is moving much faster than science.

Now that Omtzigt is busy forming a government and Renske Leijten has written her book about lawyer Eva González Pérez and the start of the Benefits scandal, new representatives of the people must take over. During the committee meeting, two names stood out. The first was Sandra Palmen, who as a lawyer at the tax authorities concluded as early as 2017 that the fraud policy had derailed and was racist; management later lied that they had never seen her memo. Now Palmen is an MP for NSC, and at the round table she wanted to be brought up to date on ethnic profiling in 2024.
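The proxy effect described above can be illustrated with a small simulation. This is a hypothetical sketch, not the DUO system: the feature names, probabilities, and selection rule are all invented for illustration. A "neutral" check rule that only looks at the proxy feature (living close to one's parents) still ends up checking one group far more often, because the proxy correlates with a protected attribute the rule never sees.

```python
import random

random.seed(0)

def make_student(migration_background: bool) -> dict:
    # Hypothetical assumption: students with a migration background more
    # often live close to their parents (70% vs 20%). These numbers are
    # invented for illustration only.
    p_close = 0.7 if migration_background else 0.2
    return {
        "migration_background": migration_background,  # never shown to the rule
        "lives_close_to_parents": random.random() < p_close,
    }

# Synthetic population: 25% of students have a migration background.
students = [make_student(i % 4 == 0) for i in range(10_000)]

def flag_for_check(s: dict) -> bool:
    # The "neutral" risk rule: it only uses the proxy feature.
    return s["lives_close_to_parents"]

def check_rate(group) -> float:
    group = list(group)
    return sum(flag_for_check(s) for s in group) / len(group)

rate_with = check_rate(s for s in students if s["migration_background"])
rate_without = check_rate(s for s in students if not s["migration_background"])

print(f"checked, with migration background:    {rate_with:.0%}")
print(f"checked, without migration background: {rate_without:.0%}")
```

Even though the rule never reads the protected attribute, the group with a migration background is checked at roughly the proxy's base rate for that group, several times more often than the other group, which is exactly the roundabout profiling the article describes.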
Next to her sat Mpanzu Bamenga, who in 2018 was pulled out of line by the military police along with four other people of color. He filed a complaint, went to court, and was vindicated on appeal last year. Now, as an MP for D66, he questioned IT officials from the police and the military police. They came off better, incidentally, than the tax authorities and the VNG, both still stuck in the denial phase. Perhaps that is because of the gap between leadership and the shop floor. Everyone talks about ethics committees, periodic algorithm inspections (the "algorithm APK"), duties of care, and bias assessments. But the actions of the police and military police are visible; wrong decisions about benefits or allowances are not. The VNG and the tax authorities swear that everything is going well now, but victims, the ombudsman, and Amnesty International were far from reassured during the meeting. Amnesty director Dagmar Oudshoorn also spoke from personal experience. "The absence of recognition increases the risk of recurrence," she warned.

Date
09 July 2024
Author(s)
research