In partnership with Erik J. Girvan, JD, PhD, of the UO Law School, the Division of Equity and Inclusion hosts a series of implicit bias workshops for faculty and staff at the University of Oregon. Information about upcoming workshops and recordings of previous workshops can be found below.
What is implicit bias?
Project Implicit defines implicit social cognition as the "thoughts and feelings that occur outside of conscious awareness or control."
Try an Implicit Associations Test (IAT) for yourself at Project Implicit.
Tool for Identifying Implicit Bias: Awareness of Common Shortcuts
Shortcuts can lead to biased assessments (either positive or negative) in evaluation if we are not motivated to avoid them and skilled in doing so. These shortcuts can lead to erroneous conclusions that candidates are unqualified or a bad fit. They can also adversely affect the fairness and equity of a review process.
- Snap Judgments – Making judgments about the candidate with insufficient evidence. Dismissing a candidate for minor reasons or labeling a candidate “the best” and ignoring positive attributes of the other candidates. Having a covert agenda furthered by stressing something trivial or focusing on a few negatives rather than the overall qualifications. Often occurs when the hiring or review process feels rushed.
- Elitist Behavior (also called “Raising-the-Bar”) – Increasing expectations for women and underrepresented minority candidates because their competency doesn’t strike committee members as trustworthy. Downgrading the qualifications of women and minorities, based on accent, dress, and demeanor. In short, uneven expectations based on a candidate’s social identity.
- Negative Stereotypes – Characterized by presumptions of incompetence. Research shows that the work of women and underrepresented minorities is scrutinized much more closely than that of majority faculty, at all stages of an academic career.
- Positive Stereotypes – Dominant group members are automatically presumed to be competent. Such a member receives the benefit of the doubt: negative attributes are glossed over and success is assumed. Also called the “original affirmative action” because dominant group members are automatically presumed qualified and thereby given an unearned advantage.
- Cloning – Replicating oneself by hiring someone with similar attributes or background. Also refers to undervaluing a candidate’s research because it is not familiar, as well as expecting candidates to resemble someone whom the search committee is replacing. Cloning limits the scope and breadth of approaches and perspectives in research, teaching and service.
- Good Fit/Bad Fit – While this judgment may be about whether the person can meet the programmatic needs for the position, it often is about how comfortable and culturally at ease one feels with her/him.
- Wishful Thinking – Insisting racism, sexism, and other forms of prejudice no longer exist.
- Euphemized Bias
- Visionary: Members of dominant groups are evaluated based on their potential, whereas underrepresented groups are judged solely on their accomplishments and track record. For example: “He has vision” or “She lacks vision.”
- Star: Used when the speaker is an infatuated fan of the candidate under consideration. (For example: “It’s clear he’s a rock star”). Others should ask the speaker to explain his/her use of the term and support it with evidence.
- Committed, single-minded focus or hard-worker: These terms could be used to exclude those who have demanding family commitments, cloaking a bias against care-givers.
Related research & resources
Implicit Bias in Decision-Making: An Introduction
Thursday, February 11 | Noon–12:30 p.m.
How can someone's race, sex, age, or other characteristics influence how we see and treat them even when we are genuinely trying to be unbiased? What concrete steps can we take to help prevent this from happening? To help answer these questions, this workshop introduces the concept of implicit bias. Through a mix of short presentations, lively activities, and discussions, we will explore some harmful side effects of how our brains naturally perceive, categorize, and draw inferences about the world, including other people. We will also examine when this kind of bias is most likely to occur. And we will talk about what practical steps we can all take to try to reduce or eliminate it as well as what has been shown not to work.
Implicit Bias in Decision-Making: Specific Applications
Tuesday, February 16 | Noon–12:30 p.m.
Knowing about implicit bias is not enough to reduce it or keep it from affecting what we do. Building on the introduction to implicit bias, in this workshop participants will work with the presenters to identify specific policies and practices in their workplace that are most likely to be affected by implicit bias and brainstorm concrete changes that they can make to minimize those effects.