Amersfoort prohibits the use of non-traceable algorithms
Posted on October 12, 2022 in Information society

The Amersfoort City Council is concerned about algorithms, I read in the city magazine. Indeed, the council recently found that it is impossible for citizens to see, and therefore to control, which actions the outcomes of algorithms lead to. That is why it has been decided not to use algorithms whose outcome cannot be traced, for the time being. Which is a nice step, but one that raises many questions, because which algorithms are actually involved in Amersfoort? It looks like the municipality has no entry in the Algorithm Register, but creating one is among the intentions of another motion that was recently passed. There we also get a little more clarity about Amersfoort's concerns:

- Not to use nationality and ethnicity as a data variable in any risk models, profiles, systems, selection lists and blacklists within the municipality, unless positive advice is received via the ethical assessment framework;
- Also to ensure that self-learning algorithms in risk classification models do not use these indicators;
- A publicly accessible algorithm register, based on the Helsinki or Amsterdam model, on the website of the Municipality of Amersfoort;
- In that register, to disclose the datasets of the algorithms used by the municipality of Amersfoort.

In a 2019 study, the NOS found extensive use of predictive algorithms by municipalities, resulting in a high risk of discrimination. That was partly due to not properly understanding, or not being able to explain, the behavior of those systems. Amersfoort also appeared in that study:

In other cases, it also proved difficult for governments themselves to find out what is actually happening. For example, the municipality of Amersfoort predicts the chance that a particular student will become a school dropout. But the municipality of Amersfoort was unable to find out exactly how that works, and what data it is based on. (...)
After publishing this story, the municipality of Amersfoort announced that no data such as ethnicity is used. However, the algorithm does take into account whether a student has, for example, changed schools or repeated a year.

This is in line with the national picture: municipalities use predictive algorithms to detect fraud and crime, to estimate what kind of care someone needs, and to monitor students' educational progress. And when using AI in the social domain like this, the risk of bias is huge: datasets may contain implicit past bias or be skewed by selective composition. I know of a municipality that had a good dataset from the lily-white gymnasium but not from the very diverse comprehensive school; guess whose students were flagged as high-risk for the intensive dropout-prevention programme?

The tricky part is that it is not as simple as omitting the nationality or ethnicity column from your Excel file. This factor often correlates strongly with real problems, such as a language delay dating from primary school, silent poverty, or unfamiliarity with the system for requesting additional assistance. A little statistics will still flawlessly pick out the cases with those problems, and anyone who compares that outcome against ethnicity will conclude discrimination. Which is incorrect, but the data says it is.

That said, it is a very good step by Amersfoort and I think that many municipalities can learn from it. I would especially welcome that fourth point, particularly if the register includes where the data comes from and shows why the data says something about the problem.

Arnoud
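The proxy problem described above can be illustrated with a small simulation. This is a minimal sketch with invented numbers, not any municipality's actual model: a rule that never sees the protected attribute, but does see a correlated proxy feature (here labelled "language delay"), still produces sharply different flag rates per group.

```python
import random

random.seed(42)

# Hypothetical synthetic population. The protected attribute is never
# given to the "model", but the proxy feature correlates strongly with
# it -- mirroring the point that dropping the ethnicity column from the
# spreadsheet is not enough. All probabilities below are invented.
population = []
for _ in range(10_000):
    protected = random.random() < 0.3            # 30% belong to the minority group
    p_delay = 0.6 if protected else 0.1          # proxy correlates with group
    language_delay = random.random() < p_delay
    population.append((protected, language_delay))

def flag(language_delay: bool) -> bool:
    """A 'risk model' that only looks at the proxy feature."""
    return language_delay

def flag_rate(group: bool) -> float:
    """Fraction of a group that the model flags as high-risk."""
    flagged = [flag(delay) for (prot, delay) in population if prot == group]
    return sum(flagged) / len(flagged)

print(f"flag rate, minority group: {flag_rate(True):.2f}")   # near 0.60
print(f"flag rate, majority group: {flag_rate(False):.2f}")  # near 0.10
```

Even though ethnicity is nowhere in the input, comparing the model's output against it afterwards shows a large disparity, which is exactly why the motion also targets self-learning models that might reconstruct these indicators from proxies.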