Digital Infrastructures of Race and Gender | Safiya Umoja Noble | Tuesday, 30.01.2018

Robots, Race, and Gender

Last week, I attended a meeting organized by Gendered Innovations at Stanford University in Northern California. While there, I was thinking about the algorithmically driven software that will be embedded in anthropomorphized computers – or robots – that will soon be entering the market. In this post, I want to offer a provocation and suggest that we continue to gather interdisciplinary scholars to engage in research that asks questions about the re-inscribing of gender in both software and hardware. It was a new experience to sit at the table with some of the leading women in the field of robotics in the United States, and to learn about the ways that the field not only marginalizes women scholars but also rarely engages with social science and humanities research on gender and race.

At the meeting, I foregrounded the ways that race and gender are used as hierarchical models of power in the design of digital technologies that keep oppression intact, or that exacerbate and obfuscate oppression. I am mostly concerned with who benefits, who loses, and what the consequences and affordances are of emergent technologies. Notably, the digital is one powerful place where gendered and racialized power systems are obscured, because when artificial intelligence or automated decision-making systems discriminate against us, it is difficult to take AI to court or to exert our rights over computer code and hardware, as I argue in my forthcoming book Algorithms of Oppression: How Search Engines Reinforce Racism:

“Let us take the case of redlining in access to housing or financial institutions. Historically, the practice of redlining has been most often used in real estate and banking circles that create and deepen inequalities by race, where, for example, people of color are more likely to pay higher interest rates or premiums just because they are Black or Latino, especially if they live in low income neighborhoods. On the internet, and in our everyday uses of technology, discrimination is also embedded in computer code and, increasingly, in artificial intelligence technologies that we are reliant upon, by choice or not. As it stands today, more of these decisions about access are gleaned from complex and opaque algorithms that use disparate data sets to arrive at a decision about our worthiness, such as access to credit or capital.”
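
The mechanism described in that passage can be made concrete with a minimal, entirely hypothetical sketch. Everything below – the zip codes, the weights, the historical rates, the function name – is invented for illustration; no real scoring system is being reproduced here. The point is simply that a model can exclude race as an input and still reproduce redlining, because a "neutral" feature like zip code carries the history of segregation inside it:

```python
# A hypothetical sketch of digital redlining. The model never sees race
# directly, but a zip-code feature correlated with race reproduces the
# historical disparity anyway. All data and weights below are invented.

# Historical approval rates by zip code, shaped by decades of redlining.
# The algorithm inherits that history as a seemingly "neutral" input.
HISTORICAL_APPROVAL_RATE = {
    "60601": 0.82,  # historically favored neighborhood
    "60621": 0.34,  # historically redlined neighborhood
}

def credit_score(income: float, debt: float, zip_code: str) -> float:
    """Toy creditworthiness score in [0, 1]. Race is never an input,
    yet the zip-code term quietly encodes it."""
    debt_to_income = debt / income if income else 1.0
    base = max(0.0, 1.0 - debt_to_income)           # "objective" component
    neighborhood = HISTORICAL_APPROVAL_RATE.get(zip_code, 0.5)
    return 0.6 * base + 0.4 * neighborhood          # opaque blend of the two

# Two applicants with identical finances, different neighborhoods:
a = credit_score(income=55_000, debt=11_000, zip_code="60601")
b = credit_score(income=55_000, debt=11_000, zip_code="60621")
print(f"applicant in 60601: {a:.2f}")  # ~0.81
print(f"applicant in 60621: {b:.2f}")  # ~0.62, same finances
```

Notice that no single line of this sketch says "discriminate," which is precisely why such systems are so hard to contest: the disparity emerges from the blend of inputs, not from any inspectable rule.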

Predictive technologies are already with us, particularly in decisions about policing, insurance risk, whether we are judged likely to commit future crimes, or whether we should be deported. They are also part of the logic that drives some types of robots, particularly those trained on vast data sets to make decisions or carry out various behaviors.

Robots, for example, also run on algorithmically driven platforms that are embedded with social relations. Digital media companies are increasingly selling robotic devices for the home with pretty names like Amazon’s “Alexa” or Apple’s “Siri,” which reminds me of the important work of Professor Miriam Sweeney at the University of Alabama, who studied the way in which Microsoft’s Ms. Dewey was a highly sexualized, racialized, and of course gendered avatar for the company. Her interview at Bitch magazine on Fembots is a perfect entry point for those who don’t understand the politics of gendered and anthropomorphized robots.

Robots, in many ways, are data gatherers as much as they are expressions of the data dreams of their designers. It’s no surprise that we see a host of emergent robotic designs pointed toward women’s labor: from robots that clean to robots that have sex. But what does it mean to create hardware in the image of women, and what are the narratives and realities about women’s labor that industry, and its hardware designers, seek to create, replace, and market?

The gender and robotics meeting left me considering the ways that robots and various forms of hardware will stand in as human proxies, running on scripts and code largely under-examined for their adverse effects on women and people of color. My concern is that we go beyond a liberal feminist understanding of “women’s roles” and work, and instead think about how robots fit into structural inequality and oppression: to what degree capital will benefit from the displacement of women through automation, and to what degree stereotypical notions of gender will be re-encoded in gender-assigned tasks, stripped of other dimensions of women’s intellectual and creative contributions. Indeed, we will continue to see robots function as expressions of power.

Now, more than ever, we need experts in the social sciences and digital humanities to engage in dialogue with activists and organizers, engineers, designers, information technologists, and public policy makers before blunt artificial intelligence decision-making eclipses nuanced human decision-making, particularly as it is deployed in robotics. This means we must look at the privatized developments happening in industry robotics research and development labs, developments that are then deployed on various publics. We have very little regulation governing human-robot interaction, and our legal and political regimes are woefully out of touch with the ways in which social relations will be transformed by the mass introduction of robots into the home and the workplace.

We have to ask what is lost, who is harmed, and what should not be forgotten with the embrace of artificial intelligence and robotics in decision-making. We have a significant opportunity to transform the consciousness embedded in artificial intelligence and robotics, since it is, in fact, in part a product of our own collective creation.