Eight months ago, when Luis Rivera arrived on Capitol Hill as a Congressional Science & Technology Policy Fellow, he faced a disheartening reality. Though still excited about his fellowship, he was troubled to learn that of the 92 congressional opportunities available to the 33 STPF fellows, 85 were in Democratic offices and only 7 were in Republican ones.
“It is concerning because it speaks to the anti-science perspective that too many Republicans have,” he says. “…It is not even implicit, it’s very clear that Democrats are more interested in having scientists in their office than Republicans.”
For Rivera, an Associate Professor of Psychology and the Principal Investigator of the Rutgers Implicit Social Cognition Lab at Rutgers University, perspective is critical. It is also key to his STPF fellowship with AAAS, sponsored by the American Psychological Association, in the office of U.S. Senator Ron Wyden of Oregon. Among other projects, Rivera is examining the process and technology used to decide which federal inmates are released or placed in home confinement during COVID-19.
“My portfolio in Senator Wyden’s office focuses on technology and bias, such as algorithmic bias,” he says, referring to an algorithm developed under the Formerly Incarcerated Reenter Society Transformed Safely Transitioning Every Person Act (known as the First Step Act). The algorithm Rivera is examining, PATTERN (the Prisoner Assessment Tool Targeting Estimated Risk and Needs), assesses the risk of reoffending among people in the federal prison system. According to Rivera, its flaws trace back to the biased data that has been collected about minorities over the years.
“The federal prison system is trying to make decisions that are fair to everybody,” he says. “The problem is that when you rely on algorithms like PATTERN then you rely on a history of bias in the criminal justice system.”
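To illustrate the concern Rivera describes, the minimal Python sketch below (a hypothetical example, not PATTERN or any actual federal tool; all variable names and data are invented) shows how a risk model trained on historically skewed records can assign different average risk scores to two groups whose underlying behavior is identical.

```python
# Hypothetical sketch: biased historical records produce biased risk scores.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Two groups with the same underlying behavior distribution.
group = rng.integers(0, 2, n)        # 0 = group A, 1 = group B (hypothetical labels)
behavior = rng.normal(0, 1, n)       # identical distribution for both groups

# Historical enforcement recorded group B more often, so a "prior contacts"
# feature in the training data is a biased proxy.
prior_contacts = rng.poisson(lam=np.where(group == 1, 2.0, 1.0))

# The recorded outcome also partly reflects uneven enforcement,
# not just underlying behavior.
p_recorded = 1 / (1 + np.exp(-(0.5 * behavior + 0.4 * (group == 1))))
recorded_rearrest = rng.binomial(1, p_recorded)

# Fit a model on the biased records without using group membership directly.
X = np.column_stack([behavior, prior_contacts])
model = LogisticRegression().fit(X, recorded_rearrest)

# Average predicted "risk" still differs by group.
risk = model.predict_proba(X)[:, 1]
print("mean predicted risk, group A:", risk[group == 0].mean())
print("mean predicted risk, group B:", risk[group == 1].mean())
```

Because the model never sees group membership directly, the disparity comes entirely from the biased proxy features and recorded outcomes in the training data, which is the dynamic Rivera points to.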
And that’s just one of his projects. As an experimental social psychologist, Rivera spends most of his time studying implicit social cognition. In this work, he often uses the Implicit Association Test, or IAT.
“Whether it is the kind of bias around (thinking someone is associated with) weapons, whether it is homophobia…because there are thousands of papers and millions have completed the IAT, we now have a sense of implicit bias patterns at the group level,” he says. The IAT helps Rivera and other psychologists understand how thoughts are likely to translate into implicit stereotypes and prejudice, for example the negative, stereotypical view that black men are criminal or that people of color are welfare recipients.
Rivera’s work examines the conditions under which implicit stereotypes and prejudice are expressed, for example in criminal justice settings, as well as their relation to discrimination, which has implications for promoting and maintaining social inequalities. These biases, he argues, can be structural, baked into existing statutes, laws, and even acts of Congress by people who might not be aware of their own implicit biases. Rivera’s emphasis is on implicit biases and how they affect individuals who identify with marginalized and historically underrepresented groups such as black people, Latinxs, gay people, and women.
“There are a lot of African Americans who are working hard toward developing professional careers, to have what they define as success,” Rivera says. “The problem is that our success relies on opportunities that are unfairly unavailable to African Americans.”
These inequalities were what motivated Rivera to explicitly link research from his lab to federal legislation and policy. The pandemic and the subsequent bills that were introduced by Congress amplified some of his earlier concerns.
“Some elected officials on the Hill feel very strongly about meritocratic policies related to COVID-19,” Rivera says. “They feel that people need to work hard individually, and the government should not be giving a bailout to individuals because they have to work to fight COVID-19.”
While Rivera is quick to note that, from the perspective of marginalized groups, the system is full of inequalities, he believes structural inequality can be reversed through policy and reforms aimed at the top, designed to give fair opportunities to all.
In his STPF fellowship, Rivera considers himself very fortunate to have been matched with Senator Ron Wyden’s office, which had a specific interest in hosting a fellow who has worked on bias and stereotypes. The Senator was looking at the intersection of artificial intelligence and bias, for example issues related to biased algorithms and facial recognition technologies. Rivera is also expected to investigate these technologies for signs of bias.
“I don’t think that many Americans realize how much technology is perpetuating and maintaining social inequalities,” Rivera says. “When people think about algorithms, for example, they are thinking these are computers, so they have to be objective, they have to be right. But it turns out that people who are working on this technology are people who have biases and that the data driving these algorithms are biased as well.”
While at Wyden’s office, Rivera learned about an algorithm used by health insurance companies to steer resources away from black patients even when they were most in need of health care. “I am talking about a subset of algorithms that can have data that are biased and therefore perpetuate inequities,” he says. “There are no federal policies that address accountability and transparency as it relates to what goes into those algorithms.”
Rivera has also studied how bias can impact voters. He researched the elections of Barack Obama and the 2016 election of Donald Trump, publishing “The Psychological Legacy of Barack Obama: The Impact of the First African-American President of the United States on Individuals’ Social Cognition.”
“Whether it is prejudice against black people, whether it’s prejudice against different minorities, whether it is favoritism towards white people, biases appear to have played a role in how individuals voted in 2016,” he says of the paper’s findings.
In Wyden’s office, Rivera and other staff are homing in on the role technology companies play in how people obtain accurate information. This work followed an investigation by Wyden’s office into the micro-targeting of political ads on the Facebook and Google platforms, around the time the Senate Intelligence Committee released its report confirming Russia’s interference in the 2016 campaign and its effective use of social media. The project is especially important, Rivera says, because it shows how bad actors can use social media platforms to maintain their power and influence voters.
In his work with other staff, Rivera sees one possible solution to the problem.
“In the absence of legislation, we have reached out to Facebook and Google, asking that they incorporate policies and guidelines that are able to identify misinformation and disinformation that can potentially affect the election in November,” he says.