In Plenary Address, Sociologist Ruha Benjamin Discusses How to Dismantle the ‘New Jim Code’

Ruha Benjamin is hopeful that the degree of equity built into new technologies will continue to improve. | AAAS

Egregious acts of racial violence have rightfully been a leading cause of public uproar over the past year, but there is a more insidious form of racism that also requires urgent attention, sociologist Ruha Benjamin said Tuesday.

Benjamin, a professor of African American studies at Princeton University, studies the relationship between innovation and inequity and has written three books on the topic. During a plenary address at the 2021 Annual Meeting of the American Association for the Advancement of Science, she argued that the same racial prejudice that may lead a police officer to kill an innocent citizen is coded into emergent technologies — even those ostensibly designed to promote justice.

“I really love the theme: Understanding Dynamic Ecosystems,” Benjamin said, referring to the theme of the 187th AAAS Annual Meeting. “One thing I hope to contribute to the conversation is a set of conceptual tools or interpretive lenses that we can use to better understand our ecosystem.”

“I want to offer some social scientific insights that I hope can make our vision of this world a bit clearer, so that we can diagnose our reality with greater precision,” she added. “So that ultimately, we can transform it.”

Benjamin began her talk by noting that instances of racial violence are not isolated events and belong to a wider context in which anti-Blackness pervades policing, education and society at large. In 2016, for instance, an eye-tracking study by the Yale University Child Study Center showed that preschool teachers disproportionately focus on African American children when told to look for challenging behavior.

Such distortions in our ability to see the world accurately are rooted in the history of science, Benjamin noted. Renowned scientists, including early 19th-century French naturalist Georges Cuvier, were responsible for promoting pseudoscientific racial hierarchies and other racist concepts.

“Those ideas continue to infect and distort our vision today,” Benjamin said. “Technology is one other arena in which we have to be watchful and reckon with the role of anti-Blackness.”

In the two dominant stories about the eventual societal impacts of artificial intelligence and machine learning, robots either grow to dominate humanity or save us, making society more efficient and equitable. While these narratives — one dystopian and the other utopian — seem like opposites, Benjamin argues that they share an underlying fallacy: both ignore the fact that existing societal power dynamics inevitably make their way into new technologies.

“How do we reimagine the default settings of technology and society?” she asked. “As a first step, I want to suggest that we have to move beyond a techno-deterministic understanding of this relationship. By that I mean the assumption that technology is in the driver’s seat, and we are either harmed or helped. The human agency is missing from the script.”

A 2019 study published in Science, AAAS’s flagship journal, found evidence of racial bias in a commercial algorithm used widely in the U.S. health care system. Because the algorithm uses health costs as a proxy for health needs and providers spend less on Black patients than on equally sick white patients, the algorithm reduces the number of Black patients identified for extra care by more than half. In a commentary on this research, Benjamin wrote that the automation of racial discrimination is a growing concern.
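The proxy problem the study describes can be sketched in a toy simulation. All numbers, group labels, and distributions below are illustrative assumptions, not the study's actual data: two groups have identical health needs, but less is historically spent on one of them, so ranking patients by predicted cost rather than by need under-selects that group for extra care.

```python
import random

random.seed(0)

# Hypothetical population: groups A and B have identical distributions
# of true health need, but spending on group B is lower at the same
# level of need (the disparity the study documents).
patients = []
for i in range(1000):
    group = "A" if i < 500 else "B"
    need = random.gauss(50, 10)                # true health need
    spend_rate = 1.0 if group == "A" else 0.6  # assumed spending disparity
    cost = need * spend_rate + random.gauss(0, 2)
    patients.append((group, need, cost))

def top_share(key_idx, group, k=100):
    """Share of `group` among the k patients ranked highest by the given field."""
    top = sorted(patients, key=lambda p: p[key_idx], reverse=True)[:k]
    return sum(1 for p in top if p[0] == group) / k

# Ranking by true need selects both groups roughly equally;
# ranking by the cost proxy sharply under-selects group B.
print("Group B share, ranked by true need:", top_share(1, "B"))
print("Group B share, ranked by cost proxy:", top_share(2, "B"))
```

The point of the sketch is that no variable labeled "race" appears anywhere in the model: the discrimination enters entirely through the choice of cost as the optimization target, which is exactly why Benjamin warns that "race neutrality" in design is no guarantee of neutral outcomes.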

“Indifference to social reality on the part of tech designers and adopters can be even more harmful than malicious intent,” Benjamin said during her plenary address. “Race neutrality, it turns out, can be a deadly force. This combination of coded bias and imagined objectivity is what I’ve termed the new Jim Code.”

“In my grandma’s generation, she may have walked up to the hospital and seen a huge ‘whites only’ sign,” she added. “Now I can go through the front door, yet there may be an automated system making decisions about resource allocation that has a similar pattern of discrimination.”

Other examples of the “new Jim Code” abound. In 2018, for instance, Amazon decided to stop using an artificial intelligence hiring tool after machine-learning specialists found it to be biased against women.

Recent developments, however, give Benjamin hope that the degree of equity built into new technologies will continue to improve. The Algorithmic Accountability Act, introduced in the U.S. Senate in 2019, aims to create protections around automated decisions in everyday life. Tech workers — from Microsoft employees to the Alphabet Workers Union — are increasingly speaking out against their companies’ complicity in creating harmful products. Meanwhile, Data 4 Black Lives, the Detroit Community Technology Project, and other tech justice organizations are galvanizing communities to take proactive approaches to designing better tech.

“If an ahistoric, asocial approach to science and technology captures and contains, then a historically and socially grounded approach can open up possibilities and pathways,” Benjamin said. “It can create new settings and code new values and build on critical intellectual traditions that have continuously developed insights and strategies grounded in justice. My hope is that we all find ways to build on this tradition.”