MOTION Supervision of the use of algorithms
Program: 4.1 Governance and Services 2022/123M
ADOPTED
October 4, 2022
Amersfoort, October 4, 2022
The council of the municipality of Amersfoort
whereas:
a. Government agencies collect more and more data about citizens, and this data is increasingly analysed using Artificial Intelligence (AI). However, the use of AI carries major risks, such as bias: when the data with which an AI system is trained is biased, there is a risk of discrimination, because the system will incorporate those biases into its decisions (machine learning algorithms);
b. If insufficient consideration is given to the possible adverse consequences of using (machine learning) algorithms, this can lead to exclusion, for example by promoting institutional racism or discrimination;
c. The Council of State has warned the government about the consequences for people of the automation of government processes and, in particular, the use of machine learning algorithms;
d. Among other things, the Dutch childcare benefits scandal (toeslagenaffaire) has taught us to be very cautious about the use of (self-learning) algorithms;
e. There is currently a lack of clear laws and regulations on the use of (self-learning) algorithms by government;
f. The risk of discriminatory algorithms is not eliminated by removing the "nationality" indicator, because a profile on which discrimination can be based can also be built from other data variables, such as place of birth, postal code, or even an IP address;
g. It should be transparent and verifiable for citizens which actions the municipality carries out using algorithms;
h. Vendors of algorithms generally do not agree to publication of the source code of the algorithms they have developed;
i. As a result, council members and citizens cannot gain insight into the functioning of the algorithms used by the municipality, which creates a democratic deficit.
requests the municipal executive (college) to:
1. Exclude the use of nationality and ethnicity as data variables in all risk models, profiles, systems, selection lists, and blacklists used within the municipality, unless their use receives a positive recommendation through the ethical assessment framework;
2. Also ensure that machine learning algorithms in risk classification models do not use these indicators;
3. To set up a publicly accessible algorithm register based on the Helsinki or Amsterdam model on the website of the municipality of Amersfoort;
4. To disclose in that register the datasets of the algorithms that the municipality of Amersfoort uses;
Youssef el-Messaoudi (GroenLinks), Tom van Lamon (Amersfoort for Freedom), Lisa Overmars (D66)
Rob Smulders (PvdA), Natanja Vreugdenhil (Party for the Animals), Arie den Ouden (CDA)