Prof claims Google search algorithms 'privilege whiteness'
- A University of Southern California professor claims that Google’s search algorithms “privilege whiteness” and “discriminate against people of color.”
- Safiya Umoja Noble argues that "big data" is not "neutral" because programmers "hold all types of values, many of which openly promote racism, sexism, and false notions of meritocracy."
- According to Google, however, the algorithms in question had already been revised prior to the book's publication, though the company acknowledges that search results may still "mirror" real-world biases.
A University of Southern California professor argues in a new book that Google’s search algorithms “privilege whiteness” and “discriminate against people of color,” a claim Google says was outdated before the book was even published.
Algorithms of Oppression: How Search Engines Reinforce Racism was written by Safiya Umoja Noble, a USC professor who teaches classes such as “Interpreting Popular Culture” and “Race and Ethnicity in Arts and Entertainment.”
The book—reviewed by Campus Reform—frames Google’s search engine as carelessly racist and sexist, arguing that search queries can sometimes yield content that perpetuates stereotypes against women, particularly women of color.
“Data discrimination is a real problem,” writes Noble, who goes on to claim that Google employs a “biased set of search algorithms that privilege whiteness and discriminate against people of color, specifically women of color.”
In her book, Noble refers to this as “technological redlining” and “algorithmic oppression.”
“While we often think of terms such as ‘big data’ and ‘algorithms’ as benign, neutral, or objective, they are anything but,” she contends. “The people who make these decisions hold all types of values, many of which openly promote racism, sexism, and false notions of meritocracy.”
To bolster her claim, Noble goes on to cite James Damore’s infamous “Google Memo,” falsely claiming that Damore was “arguing that women are psychologically inferior and incapable of being good at software engineering as men.”
Rutgers University psychology professor Lee Jussim, on the other hand, reviewed the memo and found it to be “nearly” correct in its account of psychological differences between men and women, as Campus Reform previously reported.
Noble goes on to explain that “what this anti-diversity screed has underscored for me…is that some of the very people who are developing search algorithms and architecture…promote sexist and racist attitudes openly at work.”
To support her claims, Noble largely relies on an index of Google’s autofill suggestions and an assessment of what images appear when certain keywords are searched. For example, at the time Noble was writing her book, a Google search for “women cannot” allegedly auto-filled with suggestions such as “Women cannot: drive, be bishops, be trusted, speak in church.”
Searching for “women should not,” meanwhile, yielded autofill suggestions such as “have rights, vote, work, box.” Noble also recounts that a seemingly innocuous search for “black girls” once produced a “Google search results page filled with porn.”
As of press time, however, these examples no longer appear, as Google has since revised its search algorithms.
Reached by Campus Reform, Google spokeswoman Lara Levin did not deny that Noble’s findings were once true, but said that Google engineers have been working to reduce the frequency with which stereotypes and biased images show up in search results.
“We've made great strides to improve our systems and prevent the types of issues Dr. Noble describes, which is why many of the examples she reports are no longer present. This was true when the book was published; we have not manually fixed anything in response,” said Levin.
“For examples where we still have room for improvement, and for any other issues that could emerge in the future, we will continue to work on scalable solutions to address these problems,” Levin added.
Levin also suggested that any current racist or sexist results are not caused by Google deliberately, but rather are likely due to “how content producers label their images” and how Google search engines inadvertently “mirror biases” that exist in the real world.
“We understand that this can cause harm to people of all races, genders and other groups who may be affected by such biases or stereotypes, and we share the concern about this. We have worked, and will continue to work, to improve image results for all of our users,” she added.
Campus Reform asked Noble whether she was aware that Google had updated its search algorithms months, if not years, before her book went to press—as she does not appear to acknowledge this in the book—but she did not respond to repeated requests for comment.
Follow the author of this article on Twitter: @Toni_Airaksinen