Natalia Washington
I am a graduate student in the Philosophy department at
Purdue University. My research interests lie at the intersection of philosophy
of mind, cognitive science, moral psychology, and scientific psychiatry—and
especially in externalist viewpoints on these subjects.
In a forthcoming paper with Dan Kelly, we defend a kind of
social externalism about moral responsibility in the case of implicit bias, a
particular kind of “imperfect cognition.” For those who aren’t familiar,
implicit biases are unconscious and automatic negative evaluative tendencies toward people based on their membership in a stigmatized social group, such as groups defined by gender, sexual orientation, race, age, or weight. Because implicit biases operate without our conscious awareness, one might worry about the prospects for holding individuals responsible for behaviors influenced by such biases, as mounting evidence suggests many of our behaviors are.
Our work addresses this challenge by applying philosophical theories of moral responsibility to behaviors influenced by implicit bias. The driving question is whether anything about the nature or operation of implicit biases themselves guarantees that behaviors influenced by them must inevitably be excused. Our answer is no. We argue that there are clear-cut cases in which an individual can be held responsible for behaviors influenced by biases she does not know she has, and which she would disavow were she made aware of them.
The key idea is that an individual’s epistemic environment
bears on her level of responsibility. By way of analogy, a student who does not know her exam date is still held responsible for the contents of the class syllabus, and a doctor is responsible for keeping up with the changing state of medical know-how and know-that. Neither is in a position to claim ignorance: the relevant information is there in her epistemic environment.
Thus consider the difference between a hiring committee
member who, under the influence of a bias she does not know she harbors,
unfairly evaluates a stack of CVs in 1988—before we knew much of anything about
implicit bias—and a hiring committee member who does the same in 2013. Each may claim that she did not know she was making biased evaluations, and should therefore be excused. But our committee member from the present day is not in a position to avail herself of this excuse.
Like her counterpart from the past, the hiring committee
member from 2013 did not know that she evaluated the CVs unfairly, but she
ought to have known. We now know much more about implicit bias, and the relevant knowledge is out there in a way that it wasn't back in 1988. As someone in charge of a hiring process in a time and place where it is known that an applicant's perceived gender or race has a biasing effect, she had a responsibility to do something to mitigate that effect, for example by evaluating the CVs anonymously.
Knowledge can affect moral responsibility. One upshot of this argument is that increases in knowledge can raise the standard of what we can be held responsible for. But the same knowledge that raises our level of responsibility can also gain us freedom from the unwanted effects of our imperfect cognitions: if we ought to mitigate the influence of implicit bias, then doing so must be within our power. Thus another upshot worth investigating, especially for egalitarians, is that being influenced by implicit bias is not inevitable. For more on overcoming bias, see recent work by Alex Madva and Michael Brownstein.