
Prejudice you aren’t aware of (and what to do about it)

Employment, education, healthcare, justice, housing. These are some of the central services in society because they help people live the best life they can. But it will come as no surprise to most people that access to these services and treatment at their hands differ greatly depending on whether you are a man or a woman, the way you are racialized, your sexuality, whether or not you have a disability, and so on. In the US, defendants on trial for the murder of a white person are more likely to be sentenced to death the more stereotypically black their facial features are perceived to be. Asian Americans are less likely to be recommended for certain cancer screening tests than white Americans with the same symptoms. In the UK, half of the country’s population are women, but women make up only 38% of permanent academics at its higher education institutions, and only 22% of the professoriate.

What causes these disparities? No doubt there are many factors that interact with each other. Explicit prejudice will be part of the story. Failing to value black lives as highly as white lives likely plays a role in the first example – note that, when the murder victim is black, there is no discrepancy in the sentencing along racialized lines. And the sorts of microaggressions and sexual harassment that feed a hostile and misogynistic academic climate are sure to contribute to the under-representation of women in UK academia. But here we’d like to focus on just one factor, namely, implicit or unconscious bias. Social science research over the past few decades has led psychologists to believe that we all carry with us unconscious associations based on stereotypes; and that these associations drive our behaviour in certain circumstances. You might, for instance, carry an implicit association between being a woman and being bad at mathematics, and that might influence whom you hire for a job that requires numeracy; or between being a young black man and being aggressive, and that might affect where you choose to live; or between being an immigrant and being a benefit fraudster, and that might affect how you vote in an election.

These associations can come in different degrees of strength; and they need have no basis whatsoever in fact. They are often formed by internalizing the way that people from the groups in question are portrayed in the society you live in. When a young black person is killed in a police shooting, the media tend to choose a picture that portrays them as solitary or confrontational; they rarely choose the equally available photograph of the young man at his grandmother’s birthday party, or the young woman graduating from college. These discrepancies in the representation of certain groups in the media become fixed in our unconscious stereotypical associations, which we often call upon when we make a decision – whom to hire; whom to sit beside on the train; how to vote. What’s more, once in place, the stereotypes are reinforced because we are so susceptible to confirmation bias, a widespread psychological phenomenon that leads you to ignore evidence that tells against your favoured hypotheses and to accord great weight to evidence that supports them.

Chess by phil1256. Public domain via Pixabay.

In one important study of implicit bias, researchers submitted hundreds of fictitious applications in response to a broad range of job advertisements in the Boston and Chicago areas. Some bore typically white names; others bore typically black names. The applications with the white names received 50% more callbacks than those with black names. In another study, trauma surgeons were presented with fictitious vignettes describing clinical scenarios, accompanied by photographs of the black or white patients involved. The surgeons were much more likely to assume a history of alcohol abuse in the black patients. In a third study, people were placed in a simulated scenario and asked to identify which of the characters in that scenario were armed. They were much more likely to misidentify an item as a gun when it was held by a black character than by a white character. In each of these three studies, and in the many, many others in this area, researchers believe that at least part of the difference in the way people are treated is due to implicit bias against certain groups. Potential employers associate being white with ability and diligence, and they associate being black with laziness and incompetence. In the shooter scenarios, people associate being black with involvement in gun crime. The associations have no validity, of course; but that doesn’t weaken the biases to which they give rise.

I’ve been talking as if everybody harbours these implicit prejudices – that’s because they do. While people from the group that the bias works against will have slightly weaker biases, they will still have them. Women will have anti-women biases; and black people will have anti-black biases. And the same goes for people who are explicitly and publicly egalitarian. We publicly committed feminists and advocates for racial justice are just as likely to have some degree of implicit bias against women and people of colour as those who are not so committed. As philosophers Jennifer Saul and Michael Brownstein put it, we too are “part of the problem.”

So the phenomenon is extremely widespread. And this makes it particularly urgent that we try to find ways to reduce these biases or at least mitigate their effects. Of course, one way to do this is to remove the food that nourishes them: balanced reporting in news outlets; more nuanced, less stereotypical, and less offensive portrayals of women, people of colour, and transgender people in television and film; diverse groupings in positions of power; and so on. These are all large-scale societal interventions that would help enormously.

But at the individual level, there are also a number of effective strategies. Psychologists have found that your brain will call on your implicit associations rather than your explicit (and hopefully less prejudiced) thought processes when it is making a judgment in a rush, or under stress, or when there is little information to go on, or when it’s distracted thinking of something else. So you can try to avoid making important decisions about other people – who will be tasked with a management role, whom you will vote for, how you will treat a particular trauma victim in your clinic – when you are in such situations. Your bias towards someone from a particular social group can also be reduced by increasing your contact and interactions with people from that group, at least so long as the conditions of the contact are right: it must be informal and personal, it must be on an equal footing, and it must involve the pursuit of a shared goal. Bias is not reduced in circumstances in which white participants are in a management role while black participants play a subservient administrative role, for instance. Finally, you can reduce the strength of your bias by thinking about counter-stereotypical exemplars. These are people – like mathematicians who are women and political leaders who are African-American – who provide counters to the prejudiced negative stereotypes that are held about the groups to which they belong. Considering such people before making a decision will mitigate the effects of your bias on that decision.

What is so unsettling when we learn about implicit biases is that they control our behaviour in ways we disavow; and they do so without our conscious consent. It is as if we learn of inner demons hell-bent on sabotaging our best-laid egalitarian plans. But there are ways to quiet these monsters that lurk below the level of consciousness. And as research progresses and knowledge of these mitigation strategies increases, the effects of these demons will, we hope, diminish.

Featured image: Lego doll amphitheatre by eak_kkk. Public domain via Pixabay.

Recent Comments

  1. L. Miguel García

    Very interesting ideas, Richard. After reading it, a couple of questions occur to me. For example, if biases are unconscious or implicit, to what extent should we hope to make them conscious or explicit? Maybe there is a limit to this task. On the other hand, I think that unconscious bias can play a pragmatic role in many circumstances; for instance, if we live in a very unsafe place, maybe it is useful to have the tacit assumption that people are dangerous and commit crimes, even when we find someone who does not threaten our lives at all. Thanks in advance!

  2. ben

    Numeracy is not a word

  3. Alex

    While I agree with your comments on implicit bias in spirit, I don’t think we can talk about them without discussing the structural oppression that makes these conditions possible in the first place.

    Whereas you suggest reducing biases and mitigating their effects (liberal reform), if there is to be any meaningful change, simply adding more positive representations of POC in the media and thinking about successful counterstereotypes is hardly enough. That only allows us to keep believing we live in a world of equitable opportunity with everyone more or less on the same playing field, instead of one that fundamentally relies upon certain bodies (white, middle-class, male) being preferentially valued over and at the expense of everyone else. Which, of course, has a long history of practice, resistance, and theory.

  4. Richard Pettigrew

    L. Miguel García – thanks for the comment! On your first question: There certainly are ways to make your implicit biases explicit. You might be interested to look at the Implicit Association Test (https://implicit.harvard.edu/implicit/takeatest.html). Though it’s worth noting that there’s some controversy over the extent to which the results of these tests actually predict the bias in people’s behaviour. On your second question: Social psychologists think that the reason we have the psychological mechanism that gives rise to implicit biases is precisely because it is useful in certain situations. If we have to make a decision quickly about which berries to eat, it is useful to have an implicit association between red berries and danger, for instance, even if this isn’t universally valid. The problem arises when we apply these to individuals who are members of particular social groups, and when the associations are formed by evidence that is skewed by the sorts of structural oppressions that Alex refers to in the comment below. Implicit biases are born out of structural oppressions — we form them because of our observation of a status quo that has been formed by structural oppression — and at the same time they feed those structural oppressions.

  5. Richard Pettigrew

    Ben – thanks for the comment! I’m not sure I understand it, though. Numeracy certainly isn’t a word — it’s a skill. Did you mean that ‘numeracy’ is not a word?

  6. Richard Pettigrew

    Alex – thanks for the comment! I completely agree. It’s hard to convey this in a short blogpost. I wanted to highlight the phenomenon of implicit bias, since it’s the one that I’ve looked at most carefully myself, and since I think that it is an important factor in perpetuating the structural oppressions that you mention. But I absolutely don’t want to claim that it’s the only factor, nor even close to the most important. I also agree that, even if we could use these mitigation strategies to remove all implicit bias, there would still be enormous work to do in removing structural oppressions. And you’re absolutely right that there’s a danger that, when we employ these small fixes to implicit bias, we use this to reassure ourselves that we live in a fair world. So I take all of your points — I didn’t intend to suggest otherwise in the post; but I see that perhaps I should have emphasised this more explicitly.
