Tax authorities

'The bias we see is a reflection of a much larger social problem'

Alexander Laufer, AI ethicist and data scientist

June 27, 2023

When Alexander Laufer joined the tax authorities as a data scientist four years ago, he started an ethics study into non-discrimination within the organization as a side project. That side project kept growing and quickly became a full-time job, one that suits Alexander's experience, insights and ambitions.

“We are really at the beginning of a major change in how we apply algorithms”

What exactly do you do in your role?

“A large part of my work focuses on analyzing data sets for bias, with the aim of ensuring that the algorithms we use do not unintentionally discriminate. For example, we have a product that can flag requests from entrepreneurs as non-compliant and issue decisions based on that. By looking at how often people are unjustly disadvantaged, I can see whether certain groups are unintentionally disadvantaged more often by such an algorithm. This is broken down by various personal characteristics, such as gender: are women, for example, selected for an audit more often without good reason? That work is partly technical, but it is also about the tax authorities as an organization. That's why I regularly give workshops on awareness, both inside and outside our organization.”
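The kind of check Alexander describes, comparing how often different groups are selected, can be sketched in a few lines. This is a minimal illustration with hypothetical data and hypothetical function names, not the tax authorities' actual tooling; the "disparate impact ratio" is one common fairness metric.

```python
# Minimal sketch (hypothetical data): compare an algorithm's selection
# rates across groups, one of the bias checks described above.
from collections import defaultdict

def selection_rates(records):
    """Per-group selection rate from (group, selected) pairs."""
    counts = defaultdict(lambda: [0, 0])  # group -> [selected, total]
    for group, selected in records:
        counts[group][0] += int(selected)
        counts[group][1] += 1
    return {g: sel / tot for g, (sel, tot) in counts.items()}

def disparate_impact_ratio(rates):
    """Lowest selection rate divided by the highest; values well below
    1.0 suggest one group is selected for audits far more often."""
    return min(rates.values()) / max(rates.values())

# Hypothetical audit log: (gender, selected_for_audit)
records = [("F", True), ("F", True), ("F", False), ("F", True),
           ("M", True), ("M", False), ("M", False), ("M", False)]
rates = selection_rates(records)
print(rates)                          # selection rate per group
print(disparate_impact_ratio(rates))  # ratio between the groups
```

In practice such a ratio would be computed over far larger data sets and combined with domain knowledge about whether a rate difference has a legitimate explanation.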

Can you tell us more about that?

“In addition to giving workshops, I am regularly asked for interviews, presentations and guest lectures at universities. In my role, I also work a lot with universities and other government bodies. For example, we have a partnership with the University of Amsterdam, where I help shape the research itself; I also come up with research questions that are relevant to the tax authorities, which the UvA can then help answer. I am also chairman of the ethics and analytics sounding board group and part of an expert group at the tax authorities. Every part of the organization can submit questions to our advisory committee, but questions about analytics come to us as the expert group first.”

What makes this work so interesting for you?

“What makes it particularly interesting for me is that you can see some of society's inequalities reflected in the results of our algorithms. The bias we see is a reflection of a much larger social problem, one that we cannot simply solve on our own. We map all this data and, based on that, we can take targeted measures to improve things. Everyone in the Netherlands deals with the tax authorities, so my work can really make an impact at scale. The theme is high on the agenda and has now become one of the strategic goals of the tax authorities, included in the annual plan. This also means we set an example for other departments and government agencies. Think of policy makers: how do you know whether a model is discriminatory? How do you test something like that?”
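One possible answer to that closing question, offered here as an illustration rather than the tax authorities' actual method, is a standard statistical test: given selection counts for two groups, a two-proportion z-test asks whether the difference in selection rates is larger than chance alone would explain. The numbers below are hypothetical.

```python
# Minimal sketch (hypothetical counts): two-proportion z-test for a
# difference in selection rates between two groups.
import math

def two_proportion_z(sel_a, n_a, sel_b, n_b):
    """z statistic and two-sided p-value for the difference between
    selection rates sel_a/n_a and sel_b/n_b, using a pooled estimate."""
    p_a, p_b = sel_a / n_a, sel_b / n_b
    p_pool = (sel_a + sel_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail
    return z, p_value

# Hypothetical audit counts: 90 of 1000 women vs 60 of 1000 men selected.
z, p = two_proportion_z(90, 1000, 60, 1000)
print(round(z, 2), round(p, 4))
```

A small p-value only shows the rate difference is unlikely to be chance; whether it constitutes unjustified discrimination still requires the organizational judgment the interview describes.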

How do you envision your future?

“The field has really only just emerged, so there is still plenty to do. We are really at the beginning of a major change in how we apply algorithms. There are many developments in the field, so there is still a lot for me to learn. That's why I read widely and keep a close eye on new publications in the scientific literature. The nice thing is that the tax authorities have really given me the opportunity to specialize in this way. A lot of room is made for personal development, and that has certainly helped me get to where I am today. In the coming period, I would like to keep developing in this field.”

Date
25 September 2024