Article

Citizen conversation on the use of algorithms by the municipality of Amsterdam

An overview of the approach and findings from a citizen conversation held in April 2023 on the topic of citizen participation and the municipality’s use of algorithms

In the context of the collaborative research conducted by Digital Rights House and Gemeente Amsterdam, citizen conversations are small-group discussions intended to facilitate dialogue and knowledge-sharing with and between citizens on a chosen topic. In April 2023, a citizen conversation was held at Digital Rights House on the topic of citizen participation and the municipality’s use of algorithms.

Goals

Citizen Conversations are meant to allow for a dialogue with citizens in which information about algorithms can be communicated in an accessible way, and in which citizens can share their thoughts and ideas in a low-pressure setting. Sessions consist of a relatively small group of citizens and are structured in a way that encourages them to participate freely but purposefully. For this study, the idea was to test a citizen conversation model that could be iterated upon and replicated by different stakeholders and on different topics.

The Citizen Conversation aimed to address broader questions, such as how participants perceive algorithms in general and how they perceive the algorithms used by the municipality. The algorithm register was incorporated into the conversation as a source of examples of algorithms in use by the municipality. This in turn prompted further discussion about citizens’ desire to be informed about, and to participate in, the municipality’s use of algorithms.

Description

The first citizen conversation was held on April 12, 2023, and was framed as a “trial run” to help Digital Rights House and the Municipality of Amsterdam determine how to organize such sessions moving forward, as well as to gather initial data on how to engage citizens on the topic of algorithms. A 90-minute agenda was created with ice-breakers, an informative presentation on algorithms using two examples from Amsterdam’s algorithm register, guided group discussion, and collaborative sense-making.

Overview of session

The session started with the question: “What do you think of when you think of algorithms?”

Most participants immediately thought of social media, reflecting the fact that it is one of the most widely known uses of algorithms affecting the average person on a daily basis. None of the responses specifically mentioned the use of algorithms in public space or for public services. Many answers focused on the technical nature of algorithms, such as “coding”, “computers”, and “data analysis”, with some highlighting their innovative qualities, e.g. “new technology”, “smart tech”, and “automated”. At the same time, others focused on their complexity (“hard to explain what it is”, “difficult technology”). This exercise also brought to light participant concerns about the risks and impacts of algorithms, such as “influencing our thoughts”, “manipulation”, and “data breach”. One participant brought up the citizen “right to know” as a key element of the governance of algorithms. In the discussion that followed, participants reflected on how the first things that came to mind regarding algorithms were shaped by their own exposure and experience.

The moderator then provided the following definitions of algorithms and algorithmic systems:

  • Algorithm: A process which generates an output from an input (UN Habitat)
  • Algorithmic System: A system that uses one or more algorithms, usually as part of computer software, to produce outputs that can be used for making decisions.
    • Functional definition: a system that uses automated reasoning to aid or replace a decision-making process that would otherwise be performed by humans (Ada Lovelace Institute)
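
To make these definitions more concrete, the short Python sketch below illustrates the distinction: one function plays the role of an algorithm (it generates an output from an input), and a second function plays the role of an algorithmic system that uses it to aid a decision a human would otherwise make alone. The parking scenario, function names, and “flag for human review” outcome are purely illustrative assumptions and do not describe any system actually used by the municipality.

    from datetime import datetime

    def parking_expired(paid_until: datetime, observed_at: datetime) -> bool:
        # Algorithm: generates an output (True/False) from an input (two timestamps).
        return observed_at > paid_until

    def suggest_follow_up(paid_until: datetime, observed_at: datetime) -> str:
        # Algorithmic system: uses the algorithm above to aid a decision that
        # would otherwise be made by a human alone. The output here is only a
        # suggestion; a person still reviews it before any action is taken.
        if parking_expired(paid_until, observed_at):
            return "flag for human review"
        return "no action"

    # Example: a car observed half an hour after its paid parking time ended.
    print(suggest_follow_up(datetime(2023, 4, 12, 14, 0), datetime(2023, 4, 12, 14, 30)))
    # -> flag for human review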

After establishing a common definition of algorithms, the moderator shifted the focus to the local context by asking, “What do you know about the use of algorithms in Amsterdam? What comes to mind?”

In general, participants did not know much about the use of algorithms by the municipality, and did not realize that many public services are powered by algorithms. However, the examples brought up by some participants sparked a very engaged discussion and a number of follow-up questions, showing curiosity and an interest in learning more about this topic. The topic that generated the most debate was the use of surveillance cameras in public space. Participants wondered whether these cameras use algorithms to identify and track people. They felt that what the cameras film, how they collect data, and what they are for is not transparent. Other examples that came up were the automated traffic light system, which was seen as a potentially positive use of algorithms for increased efficiency, and parking enforcement. Participants shared their negative experiences with automation, arguing that it is not always as “smart” as you think, and is often incapable of detecting nuance or dealing with exceptions. As such, they highlighted the value of speaking to a human from the municipality, as opposed to a robot or machine.

At this point, participants were introduced to Amsterdam’s Municipal Algorithm Register as “a tool that citizens can use to learn more about how the city uses algorithms”. The register was then displayed on a large screen, and participants were quickly walked through two algorithms in use by the municipality: Automated Parking Control and Top 400/600. This served as a jumping-off point for the main discussion activity of the session. All participants were given time to answer a set of questions, detailed below. Afterwards, participants split into groups of two. Each group was responsible for interpreting and synthesizing the responses to one of the questions, then presenting their conclusions to the rest of the participants. This approach was inspired by the concept of collaborative sense-making described by Seebohm & Smith[1].

  1. If an algorithm is being used in your city to make decisions that could affect you or people you know, how would you want to learn about it, and what would you want to know?

Participants emphasized the need to know the purpose and goal of the municipality’s use of an algorithm, as well as what data is collected and how it is used. One brought up the desire to be informed about their (digital) rights and how to protect themselves from the potential harms of algorithm use. Regarding how they would like to learn about the use of algorithms in their city, participants underlined the importance of honesty and transparency, and mostly favored a multi-channel approach: “a website alone is not enough”. Ideas included campaigns on different channels, such as social media, television, flyers, print ads, newspapers, and in public space (billboards and trams). They also suggested direct communication, such as a letter or text message. Although media, and social media in particular, were seen by participants as powerful communication tools, some noted that they can also lack credibility, especially if news does not come directly from the municipality.

  2. Do you care that the decision is being made using an algorithm?

Almost all of the answers to this question were nuanced and conditional, showing that participants are willing to accept the use of algorithms if certain criteria are met. Some brought to light the limitations of algorithmic decision-making, such as the lack of sensitivity and nuance, and called for human checks and controls. Others emphasized that the use case of the algorithm matters: while the automation of simple decisions for increased efficiency may be acceptable, using algorithms for more sensitive personal information and decision-making is less so. In addition to human oversight, other key conditions included transparency and ethical use.

  3. If the municipality is thinking about using a new algorithm in Amsterdam:
     a) What should they take into consideration?

The importance of centering citizens, and humans more broadly, was the salient theme in the responses to this question; “[the citizens] are their city so they and their wellbeing should be taken into account”. The group tasked with commenting on this question separated the answers into the risks and rewards that should be identified before an algorithm is rolled out, along with who may be affected. “Risks can be misuse of privacy, violation of human rights, and long-term safety in public space. Benefits like citizen well-being and the obvious efficiency/accuracy should also be considered,” they wrote.

     b) Who should they talk to?

Along the same lines, participants felt that the municipality should initiate open discussion with citizens and affected communities when considering using a new algorithm in the city.  They suggested involving other stakeholders such as civil society, activists, and people representing citizens and working for human rights.  Furthermore, the municipality’s strategy could involve going to community centers and housing developments at the neighborhood level, as well as reaching out to people by mail.

  4. Do you want to give your input on the use of technology in and by the municipality? Do you feel that you are able to?

Although the responses varied, the participants summarized them well: “In general, everyone wants to have the right and ability to be included. While not everyone feels like they can, they would like to have the option to engage and give their opinions.” Even those who did not necessarily feel the need to participate wanted to have the option to do so. Some specifically mentioned that everyone should be able to participate through channels such as the algorithm register or surveys. While one respondent stated that they do not currently feel able to participate, most felt that they could, although they were not sure how.

The session concluded by asking participants what meaningful citizen participation means to them. One described it as a process “whereby a representing group of citizens is actively engaged, made aware/educated and involved in the decision-making towards a particular outcome (and that their impact actually carries weight)”. This definition involves three phases: the awareness-raising that is required before participation is possible, participation itself in the form of involvement in decision-making, and follow-up on that participation. These themes were recurrent in many of the answers, particularly the desire for citizens to have impact or influence. One response also emphasized that complex topics such as algorithms, and how they are used, need to be explained to citizens in an accessible way in order to enable meaningful participation. Others described the nature of the participation more specifically: for example, conversation should be open, it should take place at the neighborhood level, and it should occur both in person and online. Furthermore, citizens should be ‘representative’, meaning that “the people involved in participation efforts represent/reflect the people affected as well as possible.” Finally, one respondent added that citizens should be able to speak directly to, and ask questions of, someone from the municipality as part of the participation process.

Participant Feedback

Participants were also invited to provide feedback on the session’s content and structure. To begin with, it is helpful to ask questions that gauge people’s baseline level of understanding before diving into examples. Defining key terms such as algorithms and algorithmic systems is also critical. Participants found the walk-through of the algorithm register helpful, albeit information-dense; the approach to presenting the register within the time constraints at hand needs to be developed further. For this type of in-depth discussion, participants favor small groups of 6 to 8 people. Larger sessions should be broken up into smaller breakout groups in order to better facilitate conversation. Approaches where people are given time to write down their ideas (e.g. using post-its) and then share them with the group can make it easier for people who are shy or hesitant to participate. After the session, resources should be provided for participants who want to learn more about the topic.

They also provided the following input about participation on the topic of algorithms in general. Citizens may be more likely to engage with this topic via surveys or other virtual channels than through in-person sessions such as this one. They would want to take part in more involved, in-person discussions on issues that immediately affect them, their family, and their immediate community or neighborhood (for example, the local school attended by a family member). With technology and algorithms that are not necessarily visible and present, it is harder to feel that there is an immediate impact, and therefore harder to feel the need to participate in a more engaged way. Citizens might be just as concerned about the use of algorithms by the private sector as by the municipality, if not more so, and may want to know the municipality’s response to those concerns.

Conclusion

When participants think about algorithms, public services do not immediately come to mind. As such, most did not realize that many public services in Amsterdam are powered by algorithms. However, once examples were brought to the table, participants were very interested and curious about the topic, opening up a space for questions and discussion. The main concerns that emerged around the use of algorithms were the risk of surveillance and the lack of human oversight.

When an algorithm is being used to make a decision that could affect them, citizens want to know what data is being collected, whether there are privacy measures in place, and whether the use of the algorithm is properly justified. Their acceptance of algorithmic decision-making in general is conditional upon factors such as human oversight and ethical use. They value transparency and communication through multiple channels about when an algorithm is being used.

If the municipality is considering implementing a new algorithm, participants feel that it should carefully weigh the risks and benefits to affected citizens. It should also initiate an open dialogue with these citizens, along with other interested stakeholders such as civil society and advocacy groups.

Participants believe that everyone should have the opportunity to give input on the municipality’s use of algorithms, but not all feel that they are currently able to do so. To the group, meaningful participation is accessible, allows for open dialogue, and has a clear impact and follow-up. It is also as inclusive as possible.

Overall, the Citizen Conversation fostered in-depth dialogue as well as information-sharing. This participation format allows citizens to learn from each other and reflect on their experiences. However, it is quite involved and time-consuming, which means that it would probably only attract people who have the time and an interest in the topic. In this case, the session was hosted by Digital Rights House; working with partner organizations in the future could make this approach more scalable, but only if there is a clear way for citizen input to reach the relevant stakeholders in the municipality. Furthermore, these sessions could benefit from having a representative of the municipality on hand to answer questions and address concerns about the use of algorithms.

 

[1] Seebohm, L., & Smith, N. Learning to Listen Again: How people experiencing complex challenges feel about engagement and participation through the Covid-19 pandemic. Centre for Public Impact UK.

 

All rights reserved