Technology is not unbiased, according to a scholar who studies the phenomenon of technological racism. As people recognize the biases embedded in technology, she added, a growing and multifaceted tech justice movement is working to counter them.
Ruha Benjamin, a sociologist and professor of African American studies at Princeton University whose work explores the social dimensions of science, technology and medicine, spoke during the “Race to the Future? Values and Vision in the Design of Technology and Society” webinar hosted on Aug. 13 by the AAAS Dialogue on Science, Ethics and Religion (DoSER) program. DoSER facilitates dialogue between scientific and religious communities by hosting symposia and lectures on topics at the interface of science, ethics and religion; training and supporting scientists on engagement with faith communities; and helping seminaries integrate science into their core curricula.
Webinar viewers were invited to consider which prejudices and values are incorporated into technologies such as search engines and AI algorithms, and to identify methods to dismantle technological racism.
Technology is often spoken about as if it were a force separate from human influence, Benjamin said. Yet “human beings are behind the screen: our values, our ideologies, our biases and assumptions.”
Benjamin also pointed out that the biases extend beyond individuals to systems as a whole and to the historical data fed into the machines. Much in the way that racism exists in legal, educational and health systems, it also becomes codified in computer systems, she said. For instance, searching for images of “professional hairstyles” and “unprofessional hairstyles” on Google brings up results that equate natural Black hair with a lack of professionalism – search results that echo real-life biases, she said.
“There is a pattern that emerges that is not out of thin air,” Benjamin said. “Technology is not creating this pattern. It’s reflecting back at us a pattern that we often take for granted and fail to look at.”
A study examining health care algorithms that guide patient care, which Benjamin explored in Science in 2019, provides another example. The study found that bias in the algorithm favored treatments for white patients over Black patients, even when the Black patients were sicker.
“Human beings are designing these systems,” said Benjamin. “The training data, the way that the systems are learning to make quote-unquote ‘intelligent decisions,’ is mirroring the so-called intelligence or thinking of human beings.”
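The dynamic Benjamin describes can be sketched in a few lines of code. The 2019 Science study she discussed found that the algorithm used past health care spending as a proxy for health need; because less money had historically been spent on Black patients who were equally sick, ranking by cost systematically under-prioritized them. The sketch below uses entirely hypothetical patients and numbers (this is an illustration of proxy bias, not the study's actual model or data):

```python
# Hypothetical patient records: (id, group, chronic_conditions, past_cost)
# The numbers are invented to illustrate the pattern the study found:
# less was historically spent on Black patients at the same level of need.
patients = [
    ("A", "white", 3, 9000),
    ("B", "Black", 5, 7000),  # sickest patient, but lower past spending
    ("C", "white", 2, 6000),
    ("D", "Black", 4, 5000),
]

def rank_by_cost(pool):
    """Proxy ranking: treats higher past spending as greater need."""
    return sorted(pool, key=lambda p: p[3], reverse=True)

def rank_by_need(pool):
    """Direct ranking by number of chronic conditions."""
    return sorted(pool, key=lambda p: p[2], reverse=True)

# The cost proxy puts patient A first; ranking by actual need puts the
# sicker patient B first. Nothing in the code mentions race, yet the
# proxy reproduces the historical disparity baked into the data.
print(rank_by_cost(patients)[0][0])  # → A
print(rank_by_need(patients)[0][0])  # → B
```

The point of the sketch is that the discriminatory outcome requires no explicit reference to race: a seemingly neutral proxy variable carries the bias of the historical data into the algorithm's decisions.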
Such examples of technological discrimination are part of what Benjamin has labeled “the new Jim Code,” a concept she explores further in her 2019 book, “Race After Technology: Abolitionist Tools for the New Jim Code.” The term echoes the legitimization of anti-Black racism under Jim Crow, but unlike those overt laws, technological discrimination is often hidden, where it can flourish out of sight while society assumes it has progressed beyond such biases, she said.
Accordingly, dismantling technological discrimination requires concerted work rather than simply allowing technology to replicate or amplify bias, Benjamin said. “We are not going to naturally progress out of these discriminatory systems,” she said.
Diversifying the tech workforce to broaden the views of the people shaping our digital infrastructure is one step, but it is not enough, Benjamin said.
Benjamin shared with webinar viewers how they can learn more about the growing tech justice movement that is seeking to ensure that the values of anti-racism are incorporated and reflected in technology. The Principles for Workers’ Data Rights offers a way to imagine “a new ecosystem” for how technology is created and implemented, and community organizations such as Data for Black Lives and Detroit Community Technology Project are growing coalitions in support of tech justice.
Benjamin urged viewers not to be discouraged by the challenging work that lies ahead; small steps, she said, are what will dismantle technological discrimination and implement data justice.
Said Benjamin, “If inequity is woven into the very fabric of our society – it's everywhere – then each twist, coil and code is a chance for us to weave new practices, politics, patterns.”