Algorithmic Bias
“Although the impulse is to believe in the objectivity of the machine, we need to remember that algorithms were built by people” (Head et al. 13)
Overview & Examples
Because we often assume that algorithms are neutral and objective, they can project an authority greater than human expertise, even when that authority is unearned. Thus, the pervasiveness of algorithms—and their incredible potential to influence our society, politics, institutions, and behavior—has been a source of growing concern.
Algorithmic bias is one of those key concerns. This occurs when algorithms reflect the implicit values of the humans involved in their creation or use, systematically “replicating or even amplifying human biases, particularly those affecting protected groups” (Lee et al.). In search engines, for example, algorithmic bias can create search results that reflect racist, sexist, or other social biases, despite the presumed neutrality of the data. Here are just a few examples of algorithmic bias (Lee et al.):
- An algorithm used by judges to predict whether defendants should be imprisoned or released on bail was found to be biased against African-Americans.
- Amazon discontinued a recruiting algorithm after discovering gender bias: the algorithm penalized any resume containing the word “women’s,” because it was trained on resumes historically submitted to Amazon, which came predominantly from white men.
- Princeton University researchers analyzed algorithms and found that they picked up on existing racial and gender biases: European names were perceived as more pleasant than African-American names, and the words “woman” and “girl” were more likely to be associated with the arts than with science and math.
- Numerous articles have examined the role that YouTube’s recommendation algorithm might play in radicalizing viewers.
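The Amazon example above shows the basic mechanism: a model trained on biased historical outcomes learns the bias itself. The following toy sketch (invented data and a deliberately simple word-ratio score, not any real system’s code) illustrates how a word like “women’s” can end up penalized simply because it appeared mostly in resumes that past, biased decisions rejected.

```python
from collections import Counter

# Toy "historical" resumes with past hiring outcomes (1 = advanced,
# 0 = rejected). The data is invented for illustration: past decisions
# skewed against resumes mentioning women's organizations.
historical = [
    ("chess club captain software engineering", 1),
    ("systems design software engineering", 1),
    ("women's chess club captain software engineering", 0),
    ("women's coding society software engineering", 0),
]

def word_weights(data):
    """Score each word by how often it appears in advanced vs. rejected
    resumes (with +1 smoothing). Scores below 1.0 penalize a resume."""
    advanced, rejected = Counter(), Counter()
    for text, label in data:
        (advanced if label else rejected).update(text.split())
    words = set(advanced) | set(rejected)
    return {w: (advanced[w] + 1) / (rejected[w] + 1) for w in words}

weights = word_weights(historical)
# "women's" occurs only in rejected resumes, so its weight falls below
# 1.0 -- the model has encoded the historical bias, not applicant merit.
print(weights["women's"], weights["software"])
```

The point of the sketch is that nothing in the scoring rule mentions gender; the bias enters entirely through the historical labels, which is why auditing training data matters as much as auditing the algorithm itself.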
Challenging the Algorithms of Oppression
Dr. Safiya U. Noble, Associate Professor at UCLA in the Departments of Information Studies and African American Studies, is the author of Algorithms of Oppression: How Search Engines Reinforce Racism. She is also Co-Director of the UCLA Center for Critical Internet Inquiry and co-founder of the Information Ethics & Equity Institute. In the video below [3:43], Dr. Noble discusses her findings about algorithmic bias in Google search results, particularly for women of color.
Fighting Bias in Algorithms
Joy Buolamwini, MIT researcher, Rhodes Scholar, Fulbright Fellow, poet of code, and founder of the Algorithmic Justice League, found that the algorithms powering facial recognition software systems were failing to recognize darker-skinned complexions, because they were based on data sets that were largely white and male. Now she’s committed to fighting bias in machine learning, which she calls the “coded gaze.” In the following video [8:44], she explains her work with facial recognition and also asks important questions about how algorithms influence critical decisions, like: Who gets hired or fired? Do you get that loan? Do you get insurance? Are you admitted into the college you wanted to get into? Do you and I pay the same price for the same product purchased on the same platform?
Weapons of Math Destruction
Cathy O’Neil has written several books on data science, including Weapons of Math Destruction. She formerly directed the Lede Program in Data Practices at the Columbia University Graduate School of Journalism. In the following video [13:11], she explains how algorithms are not fair and objective, and may in fact “automate the status quo” and “codify” sexism and bigotry. She concludes that these secret “black box” algorithms, created by private companies, can hide ugly truths, often with destructive results.
Sources
“Algorithms of Oppression, Faculty Focus: Safiya Umoja Noble.” YouTube, uploaded by USC Annenberg, 28 Feb. 2018.
“The Era of Blind Faith in Big Data Must End: Cathy O’Neil” by TED is licensed under CC BY-NC-ND 4.0.
Head, Alison J., Barbara Fister, and Margy MacMillan. “Information Literacy in the Age of Algorithms.” Project Information Literacy, 15 Jan. 2020. Licensed under CC BY-NC-SA 4.0.
“How I’m Fighting Bias in Algorithms: Joy Buolamwini” by TED is licensed under CC BY-NC-ND 4.0.
Lee, Nicole Turner, Paul Resnick, and Genie Barton. “Algorithmic Bias Detection and Mitigation: Best Practices and Policies to Reduce Consumer Harms.” Brookings, 22 May 2019.
Text adapted from “Digital Citizenship” by Aloha Sargent and James Glapa-Grossklag for @ONE, licensed under CC BY 4.0.
This page, “Algorithmic Bias” by Kaela Casey, is licensed under a Creative Commons Attribution 4.0 International License and is a derivative of “Algorithmic Bias” from Introduction to College Research by Walter D. Butler, Aloha Sargent, and Kelsey Smith, licensed under a Creative Commons Attribution 4.0 International License, published by Pressbooks.