The Slimme Check (smart livelihood check) project developed and piloted the use of an algorithm to help determine which social assistance applications require further investigation. The Participatieraad, a citizen council focused on Work, Participation, and Income, provided valuable input on the design and use of this algorithm.

Project overview: 

The municipality of Amsterdam provides social assistance benefits to Amsterdam residents who are entitled to them. As not everyone who applies for these benefits is eligible, applications that may be inaccurate or unlawful are further investigated by an employee of Enforcement Work and Income. Following three years of research and development, the municipality piloted the “Slimme Check” algorithm from April to August of 2023. The algorithm was meant to support employees in determining whether or not a social assistance application requires further investigation, and it was designed to make transparent and explainable which data leads to the decision to investigate an application further. It also aimed to reduce bias and increase effectiveness compared to the preexisting way of working.
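This overview does not describe the model's internals, but to make the idea concrete, the sketch below shows one generic way such a setup can work: a risk classifier scores an application and per-feature contributions show which data drove a flag for further investigation. The feature names, synthetic data, logistic-regression model, and threshold are all assumptions made for illustration; they are not the actual Slimme Check design.

```python
# Illustrative sketch only: a generic, explainable risk model that flags
# applications for manual review. Feature names, data, model choice, and
# threshold are hypothetical, not taken from the actual Slimme Check.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
feature_names = ["months_since_last_application", "household_size",
                 "reported_income_deviation", "missing_documents"]

# Synthetic data standing in for historical, labelled applications.
X = rng.normal(size=(500, len(feature_names)))
y = (X @ np.array([0.2, -0.1, 0.8, 0.6]) + rng.normal(scale=0.5, size=500)) > 0.5

model = LogisticRegression().fit(X, y)

def explain_flag(application, threshold=0.5):
    """Return the risk score, per-feature contributions, and the flag decision."""
    score = model.predict_proba(application.reshape(1, -1))[0, 1]
    contributions = dict(zip(feature_names, model.coef_[0] * application))
    return score, contributions, score >= threshold

score, contributions, flagged = explain_flag(X[0])
print(f"risk score: {score:.2f}, flag for further investigation: {flagged}")
for name, value in sorted(contributions.items(), key=lambda kv: -abs(kv[1])):
    print(f"  {name}: {value:+.2f}")
```

A linear model is used here only because its per-feature contributions are trivially readable; the point is the explanation step, not the specific model the municipality chose.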

Citizen participation approach: 

During the research and development process of the Slimme Check algorithm, Work, Participation and Income (WPI) sought the advice of the Participatieraad, or participation council. The Participatieraad, which was discontinued in 2024, was a body of up to 15 citizens and representatives of interest groups who provided solicited and unsolicited advice to the municipality on topics related to Work, Participation and Income.

At the end of 2021, the Slimme Check team made an in-depth presentation to the Participatieraad about the purpose of the algorithm, the data it uses, the different parameters it takes into consideration, the bias analysis conducted, and how the algorithm impacts ways of working for social assistance allocation. The team then answered questions and held an in-depth discussion with the council. Afterwards, the Participatieraad took time to deliberate before issuing its formal advice on the use of the Slimme Check algorithm.

Participation outcome: 

In March of 2022, the Participatieraad concluded that the use of the Slimme Check algorithm should be stopped because it would negatively affect the fundamental rights of citizens without adding significant value.  More specifically, they expressed concern about the following points: 

  • Bias built into the algorithm through variables such as year of birth.  They argued that considering age, among other variables, as a risk factor for fraudulent behavior is discriminatory. 
  • Use of personal data.  They found processing personal data in this way to detect a small percentage of irregularities in social assistance applications to be disproportionate.  Essentially, they felt that the municipality was infringing on the privacy of benefit recipients more than was necessary. 
  • Transparency and democratic control.  They expressed overarching concern about the municipality’s use of algorithms on citizen data without transparency or democratic control. 

The feedback of the Participatieraad was used by the Research and Development team to remove potentially discriminatory variables from the algorithm itself.  Additional care was also taken to demonstrate responsible use of personal data.   

Pilot: 

The municipality piloted the Slimme Check algorithm from April to August of 2023. The pilot had the following goals: increase the effectiveness of detecting unlawful applications, decrease the proportion of false positives, decrease bias, and make the decision-making process more transparent and explainable.

The final evaluation of the pilot, which took place at the end of 2023, revealed that the Slimme Check model appeared to perform better than the old method and that the number of false positives decreased. However, the bias analysis was inconclusive, as it could not be determined whether the risk model had a discriminatory effect.
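As a rough illustration of the kind of metrics such an evaluation tracks, the sketch below computes a false-positive rate overall and per group; comparing groups in this way is one simple means of probing for a discriminatory effect. The groups, records, and numbers are invented for the example and do not reproduce the pilot's figures or the municipality's actual bias-analysis method.

```python
# Illustrative sketch: an overall false-positive rate plus a per-group
# comparison. The records below are invented and do not reflect the pilot's
# actual data or the municipality's bias-analysis methodology.
from collections import defaultdict

# Each record: (group, was_flagged_for_investigation, was_actually_unlawful)
records = [
    ("A", True, False), ("A", True, True), ("A", False, False),
    ("B", True, False), ("B", False, False), ("B", False, False),
]

def false_positive_rate(rows):
    """Share of lawful applications that were nevertheless flagged."""
    flags_on_lawful = [flagged for _, flagged, unlawful in rows if not unlawful]
    return sum(flags_on_lawful) / len(flags_on_lawful) if flags_on_lawful else 0.0

print(f"overall false-positive rate: {false_positive_rate(records):.2f}")

by_group = defaultdict(list)
for record in records:
    by_group[record[0]].append(record)

# A large gap between groups would be one crude signal of unequal treatment.
for group, rows in sorted(by_group.items()):
    print(f"group {group} false-positive rate: {false_positive_rate(rows):.2f}")
```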

Expert evaluation:

In 2023, the Slimme Check team sought expert advice from Hans de Zwart, a researcher at the Responsible IT Lectorate of the Amsterdam University of Applied Sciences. He monitored the development of the Slimme Check algorithm and advised on its use in the social assistance application process from a scientific and ethical perspective.

In early November 2023, he presented a policy recommendation to the municipality and city council calling for an end to the further use of the Slimme Check algorithm. He cited two primary reasons: 

  • Using the algorithm would carry an inevitable risk of automating unequal treatment. 
  • The necessity and usefulness of using the algorithm were not sufficiently demonstrated to justify that risk. 

Furthermore, he questioned the desirability of using AI among a group of particularly vulnerable people, such as residents applying for social assistance. 

Political process: 

  • In November of 2023, Alderman Groot Wassink decided to stop further development of the Slimme Check, because the condition that unintentional and unwanted discrimination be prevented could not be met with certainty.
  • In February of 2024, the final evaluation of Slimme Check was presented to the city council committee on Social, Economic Affairs and Democratization (SED). As a result, GroenLinks and D66 requested an additional technical session in March to go into depth on how Slimme Check works and on the results of the evaluation.  
  • The technical session took place in March and included a presentation on Slimme Check, a Q&A, and discussions about the use of AI models and the determination of bias. The final decision on the continuation of Slimme Check was set for the following city council meeting on April 17.
  • On April 17, the council members agreed with the Alderman's decision to stop Slimme Check, thereby refraining from extending the pilot or developing the algorithm further. 

Conclusion: 

Citizen participation played a crucial role in the development phase of the Slimme Check algorithm. The development team faced an important challenge in presenting the project in an understandable way and clearly explaining the variables used to identify the risk of unlawful social assistance applications. This information allowed the Participatieraad, which represented the stakeholders most impacted by social assistance allocation, to participate meaningfully. The advice of the Participatieraad highlighted key citizen concerns, namely around bias and the use of personal data, and identified specific variables that could contribute to discrimination. The development team was then able to make changes before launching the pilot in 2023. The pilot yielded positive results on the chosen evaluation metrics, but the final decision rested with the city council, whose concerns echoed those of the Participatieraad and ultimately led to the discontinuation of the Slimme Check project. 

You can learn more about the (now archived) Slimme Check algorithm on Amsterdam’s municipal algorithm register.