
Elizabeth M. Adams on civic tech as advocacy work


Civic tech aims to enhance the relationship between people, their community, and government by centering and amplifying the public’s voice in the design and implementation processes of AI-enabled technology. Without public oversight, communities face over-policing, loss of data privacy protections, and the consequences of human bias directing technology used to govern society. It is therefore essential to include diverse perspectives in civic tech solutions to ensure proper representation and consideration for communities of color and other vulnerable populations that are most negatively impacted.

In this episode, our hosts Colleen Ammerman and David Homa speak with Elizabeth M. Adams about the roles and responsibilities of government in tech, the ethical implications of technology, and the long game of advocacy work. Elizabeth is a technology integrator working at the intersection of cybersecurity, AI ethics, and AI governance with a focus on ethical tech design. Currently, Elizabeth is a fellow at Stanford University’s Digital Civil Society Lab in partnership with the Center for Comparative Studies in Race and Ethnicity.

Watch the episode with Elizabeth M. Adams

Read the transcript, which is lightly edited for clarity.

Colleen Ammerman (Gender Initiative director): Today, we are joined by Elizabeth Adams, a technology integrator working at the intersection of cybersecurity, AI ethics, and AI governance with a focus on ethical tech design. Currently, Elizabeth is a fellow at Stanford University’s Digital Civil Society Lab in partnership with the Center for Comparative Studies in Race and Ethnicity. Welcome, Elizabeth. We are very excited to talk with you today.

Elizabeth M. Adams (race and technology fellow at Stanford University’s Center for Comparative Studies in Race and Ethnicity): Thank you. I’m super excited to be here.

David Homa (Digital Initiative director): Elizabeth, thanks so much for joining us. Let’s start with the big picture here. Share with us your perspective on what constitutes “civic tech,” and what are some of the ways it intersects with efforts to foster racial justice?

EA: So, that’s actually a very good question. In my mind, “civic tech” is really the process of bringing government, people, and community together to share in the decision-making process around services or technology that communities could be impacted by. And so, when we talk about a racial equity framework, I feel like I’m in the best place in Minneapolis because the city of Minneapolis has adopted a racial equity framework in all of the work that it does and all of the decisions that it makes. So, obviously, when we have conversations about technology, transparency, [and] the things that are going on in the city of Minneapolis, racial equity is top of mind. It makes my job a little bit easier, so I don’t have to work so hard to educate those at the city around racial equity. What I’ve spent most of my time doing from a civic tech perspective is educating people on why technology transparency is so important and why we need to break down the entire lifecycle — from how technology is designed by a company, to how it’s procured by the city, to how users are trained to use that technology. Because if users bring their human bias and they’re using this technology to govern society, we need to make sure that technology works for all.

DH: That’s really super interesting. Do you think it’s a greater challenge for some people to understand how bias seeps into technology than into other sectors, and what may be driving some of that difference?

EA: I do think so, and I’ll tell you why. When I’m seated at the table with elected officials, and appointees, and commissions, not many of them are technologists. So, I can speak [about] this to data scientists and engineers and they get it. But it takes a while. And out of all the elected officials and Minneapolis city council members that I’ve spoken with, there might be one or two who actually get how bias can creep into technology. Most of the time, when you’re talking about vulnerable populations and communities of color, and you’re talking about equity or equality, you’re talking about it from a housing perspective, or an education perspective, or for jobs, or health. And, what I think people don’t realize is that technology runs underneath all of that. It’s all about data and what happens with that data and how that data is harvested and archived and used and, in some cases, profiled. So, I think that because many of the people making decisions around data policy and other policy concerns are not technologists by nature, that’s part of the challenge for what I see in this space.

CA: I guess, to me, it’s sort of the next step to our initial question — what is civic tech and how does this relate to racial inequality? And, you just talked a bit about the fact that often people who are making policy decisions or in those discussions don’t really have a solid grasp on how bias shapes technology. So, [what] I’m curious to hear you talk about — and I’m sure this is kind of what your work is really about in a lot of ways — is how you bring them along. How do you educate people? What are the effective ways, right, to get everybody to a point where they can understand the implications?

EA: So, that’s an excellent, excellent question. I will teach people across all spectrums. But unless I’m talking with a data scientist, I don’t get super technical. And, even when I’m talking with a data scientist, or an engineer, or an architect, I still don’t get super, super technical because when you start talking ones and zeros, then everyone is the same. So, what I started off doing a couple years ago was just really creating learning events. And, I created avatars and I created personalities for these avatars. But I did not show their faces. I did not show that the male avatar was a Black man or the female was a Black woman. I would just use examples: this person runs the soccer league, or this person is a champion in their community for food security or cleanup.

“when you’re talking about… equity or equality, you’re talking about it from a housing perspective, or education, or jobs, or health. And, what I think people don’t realize is that technology runs underneath all of that.”

Then at the end of the experience, I wanted people to understand that these are your neighbors, right? If technology is impacting them and these are people that you like, shouldn’t we have some conversations about this? So, I found those ways to be really effective. Having these very, kind of, educational experiences really helps to bring people along when you’re not talking over them, and you’re talking with them, and you’re allowing them to participate in that process.

CA: That’s great. It sounds like part of what you’re doing — and especially hearing you talk about creating these personas and profiles — is kind of helping people move from the purely abstract to something that feels a little bit more tangible, or [that] they can connect to more and understand. Then it sounds like that motivates them to realize how important this is. Like, you’re kind of bringing them along to get them sort of incentivized and to prioritize these issues.

EA: Yeah. And, you know what else is interesting? When you start having an initial conversation with someone about racism, people get defensive immediately. So, you have to kind of break down those barriers and talk about issues that are affecting all of us. That’s part of how I’m able to kind of navigate some of these really sticky conversations that really, at the heart of it, are about racism, about inequality, about human bias. They’re about biases from the folks who are developing the code, because maybe they don’t have enough lived experience with people of diverse backgrounds. But you have to just kind of… for me, that’s what I’ve done. I’ve just used the experiences to bring people along by helping them understand that this really needs to work for all. Technology needs to work for every single human. And we have to really make sure that the conversation is human-centered.

DH: Facial recognition is obviously a big topic in the world today. Are there specific examples or situations you’ve come across where people at first thought like, “oh, well, this is a perfectly good use,” and then you help them realize what some of the stumbling blocks might be?

EA: Yeah, and I still talk about that today. So, I actually don’t think all technology is bad. Let’s talk about facial recognition technology from that perspective. If a child is lost in the mall, right, and they can use facial recognition to see where that child might have gone, what store, or where they have navigated around the mall, obviously that would be a good use of facial recognition. If someone is coming into your building and they shouldn’t be coming into your building, and maybe you might need to identify them because they harmed someone in the building, that would be, to me, an acceptable use of facial recognition technology. Or, if someone’s grandparent was lost on the street, right? You’d want to be able to find them and bring them back safely.

But when you start using technology to profile people and overreach into communities and start, as I mentioned, profiling and taking that data and aggregating it with, let’s say, license plate readers or an Amazon Ring camera, that’s when it becomes harmful, and there are organizations that use [technology] for that purpose. That’s where my work begins — kind of helping people understand why these facial recognition systems don’t typically work for Black women. And, a lot of it has to do with the training data. There’s not enough diversity in the data by the time the technology is brought together and sold. Also, the people who are designing and developing it don’t necessarily understand the second- and third-order consequences of their work. They are selling a product and then they are trusting that those who are using the product are equipped enough to understand if there is bias or artificial intelligence nudging happening within their technology.
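The pattern Elizabeth describes, facial analysis systems failing disproportionately for Black women, is what audits like Joy Buolamwini and Timnit Gebru’s Gender Shades study surfaced by reporting error rates per demographic subgroup rather than as one aggregate number. Below is a minimal sketch of that kind of disaggregated evaluation; the group labels and records are hypothetical placeholders for illustration, not data from the interview.

```python
# Minimal sketch of a disaggregated evaluation: report error rates per
# demographic subgroup instead of a single aggregate accuracy figure.
# The group labels and records below are hypothetical placeholders.
from collections import defaultdict

def error_rate_by_group(records):
    """records: iterable of (group, y_true, y_pred) tuples."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for group, y_true, y_pred in records:
        totals[group] += 1
        if y_true != y_pred:
            errors[group] += 1
    return {group: errors[group] / totals[group] for group in totals}

records = [
    ("lighter-skinned men", "male", "male"),
    ("lighter-skinned men", "male", "male"),
    ("darker-skinned women", "female", "female"),
    ("darker-skinned women", "female", "male"),  # misclassification
]
print(error_rate_by_group(records))
# {'lighter-skinned men': 0.0, 'darker-skinned women': 0.5}
```

On these four records the aggregate accuracy is 75 percent, which looks acceptable in isolation; breaking the results out by subgroup shows that every error falls on one group, which is exactly what an aggregate metric hides.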

DH: What advice would you give to people who are working on technologies, like you said, who may not be thinking about the second- or third-order ramifications? Someone, maybe a data scientist, is working on a project. They’re building models. How should they be thinking about that? Maybe they work for a big company. What would your advice be?

EA: Well, I think it’s an interesting question to talk about from an individual perspective, because I think it’s a little harder to reach an individual than it is maybe an organization or an academic institution. Because individually, when I’ve talked to data scientists, they actually think that they are doing the right thing. They have no idea. One of the suggestions I do talk to them about is to just do a search, an internet search, on some of the biases in technology and see if that can inform your work. When I’ve talked to academic institutions, I’m like, maybe you can bring in a historian so you can see how some of what has happened in our country, or maybe across the world, might be impacting how technology is designed and what people think. Or, just bring in a guest, a guest lecturer. At the city, I spent a number of meetings with the coalition that I’m a part of, and we’ve just made our way around, and started having these learning events with the city attorney and the city clerk, and the division of race and equity, and again, those [people] are not necessarily technologists, to just help them understand that these are some tools. Start with the internet. So, I would say that, to me, is the easiest thing, because that’s exactly what I did almost two and a half years ago when I saw a video called “AI, Ain’t I a Woman?” by Joy Buolamwini. I knew instantly that my experience with racism, prejudice, discrimination, and then my love for technology — that this is how it would merge. And, I used the internet to figure out what was going on in the space and I followed my curiosity.

DH: That’s great. And when you bring in those experts, make sure you pay them.

EA: Pay them. [Laughter] Pay them well.

CA: That is a great point, right? Because people who have been doing this work have been doing it for a long time. This is a whole body of research and knowledge that people have been building, and that is important, right? And [it] is something that really can help make progress. I just watched that video that you referenced. Ethi, our creative director, found it for us and shared it with Dave and [me] before this interview. It was great. And, it was such a cool thing to see visualized something that we already know from years of scholarly research, which is that gender is racialized. So, you just can’t — gender is not separate from race, right? The way that we perceive and understand gender is highly racialized, right? Which you see then in this video with all of these faces of Black women being interpreted as male or masculine. It’s such a vivid illustration of that. I just found it very powerful, because you can say that to somebody, right? You can say, “well, gender is racialized, you know, let me tell you why.” But to actually see it, I think, was very, very powerful. It really drives that home.

EA: I agree. When I first started doing my events, I would share Joy’s video and people would be amazed, and it created a great opportunity for conversation at the end of every session about why this work is so important and why we need to unpack the entire design lifecycle. But, in addition to how individuals can learn more about it, there are lots of companies who are now standing up responsible AI teams, where they are working through the process of understanding what this means so that before their tool hits the streets, they’ve at least gone through some gates and some checks and balances. But without legislation, we are really at the hands of these organizations and these companies deciding for themselves and policing themselves to make sure that their products are the best for all of us.

DH: And that brings us to an interesting point: when they’re building these products, what are the best ways to get people whose lives are impacted by these technologies into the process? What can organizations be doing?

EA: So, this is such a good question, because I tell people this all the time, and we just kind of have this conversation, which is: you can find someone doing the work and bring them in on a consultant basis, like to consult with you. You don’t have to create this massive diversity and inclusion team and start asking your employees to come in and help you solve these problems. There are organizations that have been doing this work for a very, very long time. I just honestly believe that it is around communication. There’s no pipeline issue. There’s no lack of organizations. I don’t care what city it is; someone there is doing racial equity work. And, in the age of Zoom now, you can certainly, certainly find someone across the world if you need to, to pull them into the conversation. So yeah, there are many, many different ways. And so, for me, it is just so, so important to continue to have these kinds of conversations to educate people that it’s not as hard, I think, as we make it. It’s certainly not hard for me to find a group to have the conversation with. It wasn’t hard for me to find a data scientist to talk to and ask them some very, very basic questions. And, I think people have to want to, once they are aware that there are possibly issues in their technology.

CA: It sounds like part of what you’re saying, too, is: get the motivation, identify the people with the knowledge and expertise who can then help you go from awareness and motivation to, “Okay, what do I need to know and understand, and get a more sophisticated view on, so I can then go in the right direction?”

EA: Well, I would agree. So, let’s just think about this. You want to design something in your house — you want a porch, or you want a deck. What do you do? You do your research, and you can kind of go find out and make sure that it’s appropriate. And, in this day and age, I just cannot believe that there are companies out here who are developing facial recognition technology or some other technology that is AI-enabled that don’t know that it could possibly harm portions of our communities. So, that to me is just… Here I am, though, still trying to live this double life of finding joy and happiness while leading in this space of digital justice and making sure that people are still aware, and it’s a struggle. But, if I can do it, I think others can, too. And I think we owe it to our world to offer those skills so that we can all live in communities that thrive.

“maybe they don’t have enough lived experience with people of diverse backgrounds… Technology needs to work for every single human.”

So, if I could just take a step back. My family has been in Minnesota since the late 1800s. My great-great-grandfather was the first Black firefighter in St. Paul, Minnesota, and he served and eventually retired as a captain in 1926. So, think about what was going on in our country then. And, of course, I’ve had several family members since [then] that have been involved in racial equity work, and my mom was instrumental, before her untimely death, in getting the first urban playground established here in the city. And so, I come from a history and a legacy of people who’ve shown up for this work. So, when I show up to a conversation like this, I’m not showing up because of some recent tragic event.

I’m showing up because I have a legacy. I’m showing up because I have a lived experience. And so, it was very personal for me with what happened with George Floyd, the murder of George Floyd. Not only are we dealing with the pandemic of COVID, but now we have another racial injustice pandemic. And it was very, very difficult for me and my family.

I withdrew, because I needed to figure out how to center myself. What was my space going to be like if I was going to continue to do this work? Because, like I said, I spent a whole year and a half working really, really hard with the committee and helping them understand. And when I joined the Racial Equity Community Advisory Committee, they weren’t talking about technology. They weren’t talking about video cameras and video surveillance. And so, I spent a lot of time doing that legwork. In order for me to continue in this space, I cannot dip into trauma-filled conversations. I won’t dip into trauma-filled conversations, because I have to selfishly take care of myself. So, yeah, and as a practitioner, it’s extremely hard. You don’t just wake up one day and say, “Oh, I’m going to be a practitioner, and I’m going to help a city of half a million people move towards a more tech-transparent city where racial equity is at the top of the conversation.” And, I’m thankful that our city had done that work in 2017, before I got involved in this work. But, it has to come from within.

CA: So, [we’d] love to hear you talk a little bit about how you create change, not just through describing the problem, but through figuring out solutions — kind of doing that with people who are coming from lots of different perspectives and may not be well versed in the problem, different stakeholders, sort of the complexity of trying to do that day-to-day. [We’d] love to hear you reflect on that.

EA: Well, there’s no short-term solution. So, before I actually got really into the data policy stuff, I spent a year on the Racial Equity Committee learning about civic tech, learning who were the players in the city of Minneapolis, learning what their concerns were, and showing up to these conversations — really sometimes not saying anything, even though I knew that I had some advocacy work that I wanted to discuss with them.

And so, it’s really about relationship management and respect, and understanding, for a particular city councilperson or person who runs a division at the city of Minneapolis, what their major challenges are, and how you can help them while still advocating for what you believe could help improve the city. There’s not like a blueprint. You just show up, and you mess up sometimes, and you say things that maybe aren’t appropriate in the city meeting, and you don’t know. But it’s about having the courage to learn out loud, as I say, and kind of learn forward. I don’t consider it falling forward or failing forward, but learning forward and just taking those chances.

“without legislation, we are really at the hands of these organizations and companies policing themselves to make sure their products are the best for all of us.”

And, there’s a lot of people that I work with who do the same things. We’re trying to figure it out together and sometimes are stumbling over each other, especially in the coalition. So, we started forming, and then we started storming. You typically form first and then storm. So, we formed, then we stormed, and now we’re norming, and now we’re performing.

CA: I love that “learning forward.” That’s great. And, I think part of what I hear you saying is that to do this kind of work on the ground, like in city government and in the community, you have to have a learning mindset. Right? If you don’t have that learning mindset, then you’re going to get stuck, it sounds like. Is that kind of what happens?

EA: Yes, to that point. But the other thing I want to make sure [I say] is that if it wasn’t for the folks out there protesting, the folks who are out on the streets really raising the awareness of why these issues are so important for us to address, my work would be a lot harder. So, it takes so many people in the community for things to turn. It’s not just the folks behind the scenes, you know, working in the meetings. And it is a lot of work.

CA: Technology is so powerful, and these tools, like facial recognition and different kinds of surveillance, are only becoming more powerful. So, it seems very important to be doing this work, to try to make sure there is a human-centered approach to their development.

EA: Technology is impacting all of our lives. I have been working on pay and gender equity for just about 20 years as a technologist in D.C. So, I ran a systems integration lab that was around $53 million, with 200 employees. It was in D.C., so there wasn’t really an issue around diversity and inclusion in the technology, right? It was more around pay equity and gender equity, making sure that the right opportunities were given to everyone. But coming back to Minneapolis, it has been… it just seems like the topic. And that’s why I say it almost feels like a tour. You know, you can only do this for so long, because it really can become a part of the fabric of who you are if you don’t help other leaders, you know, give them an opportunity in this space, as well as understand what your personal limits are.

And, I just want to say this. While this conversation is enjoyable, it still takes something out of me, right? Because we’re talking about a subject, we’re talking about technology, that could possibly harm people that look like me. And I continue to show up every day telling people [that] I don’t want technology to harm people that look like me. So, that’s why I do this work. But it’s good, I think, that we’re recording it so that, again, people can kind of hear that others need to step into the space. We need more people to kind of show up and help with this work.

CA: So, we do have a wrap-up question that we ask everyone — is there anything that we haven’t asked you that you want to talk about or anything that you haven’t had a chance to speak to? Any resources that you want to share? What’s a takeaway you’d like to leave people with?

EA: So, if you would have asked me this question maybe earlier in the year, I would have told people to read as much as they can about biases in AI. I would have told people to go write articles, go host their own learning events, go do whatever they can, write a short e-book. But, here I am on the other side of George Floyd, the murder of George Floyd, and I think it’s been a common theme that we’ve talked about throughout this conversation, [which] is reaching for the highest point of happiness that you can. Because guess what? There’s going to be another murder, right? We’ve seen that. There’s going to be more protests. There’s going to be another company coming out with biased data, and they will wait until the community says it’s harming vulnerable populations or communities of color, and then they may go try and fix it. There will constantly be folks working on data policy. There’ll be new elected officials. There’ll be, you know, new divisions that are created within city and state and federal governments. But reach for the highest point of happiness and work in that joy space, because that’s the only way you can keep showing up.

“I just cannot believe that there are companies out here who are developing facial recognition technology or some other technology that is AI-enabled that don’t know that it could possibly harm portions of our communities.”

And, I say that because, as I mentioned, in 1885, diversity and inclusion started for my family, when my great-great-grandfather, William Gaudette, became the first Black firefighter and retired as a captain. So, for over 100 years, this is a conversation that’s been happening. Maybe it wasn’t directly around technology, but it’s still a lived experience for Black people in this particular country. That’s why I say if you want to do this work, you’re going to have to find a way to make sure that you can survive doing this work. And so, that would be my message to others. And, surviving doesn’t necessarily mean you can’t be happy and you can’t find joy. Conversations like this give me a lot of joy, because I can be myself. I can be a proud Black woman. I can stand here and say, “I love being a proud Black woman,” and still go off and have a difficult conversation.

Again, if you were to ask me six months ago, it would have been study, study, study, study, study, become an expert, and then that’s how you’ll make it. Now it’s, you know what, the stuff is going to be here. All these problems — it’ll be a new problem tomorrow. So, find your center and find that happiness and find that joy.

CA: It’s a long game.

EA: It’s a lifetime game for Black people. It’s… we get no generations off, we get no generations off.

DH: And with that powerful note, that’s a wrap on our interview, but the conversation continues.

CA: We want to hear from you. Please send us your questions, ideas, comments, suggestions. Reach out to us at justdigital@hbs.edu.
