Michael Running Wolf is the founder of Indigenous in AI and has worked on language revitalization through speech recognition, as well as on augmented and virtual reality for the protection of Indigenous land and culture. As a software engineer, Michael has worked with Amazon and is now a clinical instructor at Northeastern University in Canada.
This is an extended cut of the interview from AI from Above that has been edited for ease of reading.
What inspired you to get involved with AI research?
I was still living in Bozeman, Montana, where I met my wife Caroline. We found commonality in our love of our tribal heritage. She’s Apsáalooke, Crow, and I’m Northern Cheyenne, Blackfeet and Lakota. We got involved with a nonprofit called Friends of the Madison Buffalo Jump, who are protecting an important archeological site from private development. It’s a sacred site for at least 90 tribes and it’s critical to our shared intertribal history. There is ancient evidence of teepees being set up there. But what people see now is just a circle of rocks that often gets dismantled by tourists. We thought, ‘Okay, what if we use augmented reality to place a teepee in the middle that people can see on their phones?’ We wanted to make sure the sites are sustained, so we went out and collected data.
We wanted to show Montana’s government that this could be a way to enhance their park systems with technology, but our ultimate goal was to show the connection between the language, land, and culture. Part of the idea was to have voice interaction. As a user, you’d walk around and be able to hear the local Indigenous names of sites, and you would be able to practice speaking the languages too. But at the time, the technology to do this didn’t really exist. So, I said, “Okay, I want to be an AI researcher, because who else is going to do this?”
How did your experience as a software engineer for Amazon inform your views?
I loved my experience at Amazon, and I loved the teams I worked with. I worked in the privacy organization within Alexa, and it gave me this perspective on how to protect customer data at scale, while also making sure customers have access to it. How do we do that for Indigenous people? Is there a middle ground where communities can build datasets and systems that protect their knowledge, without being exploited? I’ve become a big advocate of Indigenous data sovereignty, where we fundamentally reconsider how AI research and data governance are conducted. Say a community has a dataset of sacred plants and they want to collaborate with researchers, but only with specific researchers. How do we organize the legal and technical infrastructure for that?
What are your thoughts on access to geospatial data?
One way to protect the integrity of important archeological sites across North America is to keep them ‘off the radar’. It’s sort of in conflict with this new movement of open data, where people want to assemble large datasets. Indigenous communities have resisted publicizing these important archaeological sites, because once they become known it’s really hard to protect them. And this affects everything: economy, ecology, people’s relationship to their religion, their spirituality, and of course their language and heritage. For instance, there are a few important medicinal plants, and some of them are increasingly rare due to climate change. In the 70s and 80s, pharmaceutical companies would just go into Indigenous lands and take these plants, researching them using Indigenous knowledge. And that was very harmful. There’s this one plant we call Big Medicine that only grows in very specific biomes. It is so rare and difficult to obtain that my dad would say, “Leave your cell phones in the car. We’re not even going to risk it being geotagged.”
But there need to be different tiers of relationship with technology, because there are other resources that are important to digitize and track, like climate change or pollution levels. More of us are joining technical fields. I myself am a computer scientist, and I’m happy to know that there are others out there starting to build land resource tooling. One of my friends worked for the tribe to build a digitized map of important resources on our land, for tracking and maintenance. It just wouldn’t exist without us Indigenous engineers, simply because no one else is interested in doing this tooling.
Indigenous peoples have always been on the frontlines of protecting the ecology of sacred sites from the conflicting interests of governments and corporate groups. That’s also why I joined the pipeline protests in North Dakota in 2016, because the Dakota Access Pipeline would destroy sacred sites. And it did, when it was constructed. My collaborators and I went with press credentials and a VR camera that we secured from Google, and a grant from Facebook.
How do you feel about collaborations with big tech companies?
It’s hard to build partnerships with large corporations, but some do show interest. For example, Google did a VR experiment on my wife’s reservation. School kids in Kentucky could put on a headset and experience my wife’s tribe. So there is value in how corporations can create understanding between Indigenous and non-Indigenous communities.
But there’s also a tendency for companies to see data as a resource. People say, “Data is the new oil,” and I think that’s apt. Like oil, the extraction of data can be harmful to the local environment and the world. Companies take data into their databases and it becomes proprietary information, and then communities lose access to it. They lose the relationship to their own information—be it geographic or voice data. We need to change the paradigm to say that data is human.
If you go into a community with a mindset that data is just a resource or a monetarily valuable thing, you’re fundamentally harming the community, and you’re also diminishing the value of this data. I would argue that it’s not just Indigenous communities facing these risks, it’s everyone, wherever our data is being exploited.
Portrait photo of Michael Running Wolf is by Hannah Yoon (CC-BY) 2022
Mozilla has taken reasonable steps to ensure the accuracy of the statements made during the interview, but the words and opinions presented here are ascribed entirely to the interviewee.