
Outreach interviews on the use of algorithms by the municipality of Amsterdam

An overview of the approach and findings from interviews conducted with citizens about the municipality's use of algorithms

As part of a collaborative research project with Gemeente Amsterdam, Digital Rights House went to parks and community centers in May 2023 to interview a handful of Amsterdam residents about the municipal use of algorithms and citizen participation. The defining characteristic of an “outreach interview” is that citizens are met where they are, whether in a park, a community center, or another public space. The “interview” takes the form of an informal conversation led by guiding questions, varying in length depending on the interest of the participant.

Goals

The guiding principle behind this approach is meeting citizens where they are, instead of asking them to go out of their way to participate. As such, it provides access to a different, and potentially more diverse, audience than other participation channels. It also prioritizes the use of existing neighborhood spaces, from parks to community centers, in order to build a trusted, localized network and thereby reach citizens more effectively. This presents an opportunity to work with partner organizations such as Digital Rights House, which focuses on citizen listening and is well positioned to quickly roll out this type of participation and research approach.

Description

Groups of two or three representatives from Digital Rights House were sent to a local park and a community center with the goal of speaking to citizens about their views on algorithms and whether they are interested in participating on the topic. Three of the conversations were held with pairs of citizens, for a total of seven participants over four conversations. Potential participants were approached and asked if they would be willing to speak to us as part of a research project on digital human rights. The following served as the guiding questions for this research method, though the conversations were adapted to the context and participants.

  1. Have you ever heard about the use of algorithms by the municipality of Amsterdam? What comes to mind?
  2. Do you care if an algorithm is used to make a decision that would affect you? (Be prepared to give examples, such as those from the algorithm register, if people need clarification.)
  3. Would you want to give your input on the municipality’s use of algorithms? Do you feel that you’re able to? (If no and there is time, ask why not.)

Typically, a broader question, such as “What comes to mind when you think of algorithms?”, was used as an icebreaker to ease into the conversation.

Findings

Interview 1

In general, this participant seemed to have a good understanding of algorithms. Their attitude towards algorithms was nuanced, depending on the purpose of the algorithm, who is using it, and what data is collected. They expressed some distrust of the government and of the idea that it would try to influence citizens using algorithms. Another key concern was the risk of surveillance. Still, they needed to hear examples of algorithms used by the municipality before giving their opinion.

They viewed the use of algorithms to identify and help people at risk of falling into debt, for example, as positive. Although it was not personally relevant to them, they saw this as an application that helps citizens, especially because people who are struggling financially or are marginalized are less likely to ask for help from the government.

To them, the party responsible for collecting and processing data was a key factor in determining their attitude towards an algorithm. In the case of the parking control algorithm, they would prefer that an external firm own the data rather than the government. The motive behind collecting the data was also an important factor.

Although they thought the algorithm register was a good thing, they did not think that they would try to use it. They felt that they were too old to keep up with new technological changes. They also felt that participation should be the choice of every citizen: people should have opportunities to participate, and they should be informed of these opportunities, but taking part should be up to each person.

Interview 2

In contrast, the two participants in the second interview did not know much about algorithms. They expressed a general uneasiness around technological change, as well as a distrust of government and banks. They felt stuck in a state of frustration and lack of understanding as everything becomes digital and automated. They were afraid of being controlled by algorithms and were not aware that algorithms could also be used to make services more efficient. While they were worried about privacy broadly speaking, they also felt that they had nothing to hide from the government, and they did not take any precautions with regard to their data.

Once they were given examples of algorithms from the register, they were more concerned about the outcome of an algorithmic decision, such as receiving a parking fine, than about the use of an algorithm itself. They were not particularly interested in the algorithm register because they felt that they would not use it.

Interview 3

In general, the two participants in this conversation were very skeptical about the use of algorithms, although the topic remained somewhat mysterious to them. However, they were more worried about the commercial use of algorithms, for example in advertising, than about their municipal use. This was partly because they did not expect the municipality to use algorithms.

After hearing about the examples from the register, they maintained a negative view of algorithms and preferred that the same decisions or tasks be carried out by humans. They felt that decisions on issues such as parking are quicker and harsher when they are automated. Regarding welfare-related algorithms such as Vroeg Eropaf, they would rather the government stay out of their business than try to intervene.

They were primarily concerned about privacy and what would be done with their information, such as parking data. It was important for them to know whether their data is handled by the municipality or by a private company. Regarding the use of crowd sensing algorithms, they were concerned that cameras would be used for surveillance and facial recognition even if that was not their stated purpose.

In general, they liked the idea of the register and said that they would use it, because they do not currently feel that they know what the government is up to. They were also curious whether the register was populated by the government or by an external party, and therefore whether it would be accurate. They felt that they would want to know up front whether a decision affecting them is made using an algorithm.

These two participants immediately expressed that they had an opinion to share on the topics of digital rights and algorithms. Although they would be interested in participating on this topic, they felt that they did not know enough. They were, however, open to different channels of participation, from a survey to a face-to-face conversation.

Interview 4

The only example of an algorithm used by the government that these participants knew about was the welfare fraud detection algorithm at the center of the benefits scandal. Because their interactions with the municipality were essentially limited to administrative paperwork, they were surprised that it would have any need for algorithms. Their primary worries regarding the use of algorithms concerned the collection of sensitive data by the government, along with privacy and the impact of technology on children.

When discussing examples of algorithms from the register, their reactions varied significantly depending on the purpose of the algorithm and the data used. Concerning the parking control algorithm, one mentioned that their spouse had received multiple tickets in a row from a scan car. Nevertheless, they found the use of an algorithm for this purpose honest and straightforward, because the rules for parking are standard and should be followed in any case.

However, in the case of Vroeg Eropaf and other algorithms that deal with sensitive information and outcomes, this participant expressed fear that discriminatory decisions would be made based on, for example, their financial situation. They felt that the use of algorithms to make social decisions could reinforce the same cycle of disadvantage, and were concerned about which data is being collected and how it is being used: what happens if you are identified by this algorithm? They were afraid that they would have a “record” that would stay with them, and that decisions would be made about their life and their children’s lives without their say.

When asked about sharing their opinion with the municipality on the topic of algorithms, this participant responded that “nobody asked”. Although they would be interested in sharing their thoughts and opinions, they did not know that their point of view could be valuable to the municipality. Furthermore, they felt that they do not currently have the opportunity to participate.

Conclusion

Many participants did not realize that the municipality uses algorithms, partly due to a very limited view of the municipality’s functions. While few said that they would use the register, its existence was seen as a positive step towards transparency. Participants did not feel that they currently have a say in the municipality’s use of algorithms, but they believe that everyone should have the opportunity to participate. At the same time, most of them felt that they did not know enough about the topic, or did not feel qualified, to share their opinion.

Regarding their attitude towards algorithms, participants were concerned about the use of sensitive data and about being profiled by the government. However, some felt that the social benefit of using algorithms for purposes such as welfare allocation might outweigh the costs. Overall, privacy and control were key concerns about both public sector and commercial use of algorithms. In fact, participants were often unclear about who owns the algorithms and data in question, indicating a need for further transparency on the issue.

This participation approach proved valuable in reaching people who have less knowledge about algorithms and who would not otherwise think to participate. It also ended up including more older people, as well as busy mothers who shared their insights while watching their children. This shows the importance of meeting citizens where they are in order to achieve more equitable and inclusive participation.

Conversations were most insightful when they were shaped around people’s concerns. Furthermore, providing examples of algorithms used by the municipality was crucial in establishing the baseline level of understanding needed for the conversation. As such, if this approach is to be scaled, interviewers in the field would have to be trained to provide examples, and even to answer questions or refer people to resources.

Based on the conversations, it seems that this approach may be best implemented through partner organizations rather than directly by the municipality. Firstly, multiple participants expressed distrust of the government and asked if the interviewers were representatives of the municipality. Secondly, working with partner organizations would allow access to different neighborhoods and local community networks, and could also be integrated into existing citizen listening efforts, as was the case with Digital Rights House.
