Everyday Bias - Howard J. Ross - Page 6

If You Are Human, You Are Biased


Our conscious motivations, ideas, and beliefs are a blend of false information, biases, irrational passions, rationalizations, prejudices, in which morsels of truth swim around and give the reassurance, albeit false, that the whole mixture is real and true. The thinking processes attempt to organize this whole cesspool of illusions according to the laws of plausibility. This level of consciousness is supposed to reflect reality; it is the map we use for organizing our life. —Erich Fromm, German psychologist and psychoanalyst

Interviews can be challenging for almost anybody and in almost any circumstance, but there are few circumstances more confronting than a medical school admissions interview. Imagine. You have worked hard your whole life to be a good student, even an elite student. Medical school admissions are among the most competitive processes people will ever face. Virtually every other candidate you are competing against has an outstanding résumé with exceptional grades. The interview weighs heavily in admissions decisions because it often separates the merely good students from those who have the intelligence and the presence to be a good doctor.

The challenge, of course, is that interviews are subject to many unconscious biases based on any number of extraneous factors relating to the candidate being interviewed, the interviewer, and the environment in which the interview is being conducted. Two physicians at the University of Toronto, Donald Redelmeier and Simon Baxter, decided to explore one of these more extraneous factors.[1] They were curious about the observation that prospective students interviewed on rainy days seemed to get lower ratings in their interviews than people interviewed on sunny days.

Now I’m sure anybody reading this will agree that determining whether to accept students into medical school, or any other academic program for that matter, based on the weather on the particular day they are scheduled for interviews is the height of folly. How absurd would it be to base an admission decision on something so obviously random and out of the student’s control?

Absurd, perhaps. Nonetheless, it happens.

Redelmeier and Baxter collected the results of medical school interviews that were conducted at the University of Toronto between 2004 and 2009. They compiled all of the scores from the interviews, almost all of which were conducted in the early spring. The scores ranged from 0 to 20. A score of 10 or less was considered “unsuitable,” 12 “marginal,” 14 “fair,” 16 “good,” 18 “excellent,” and 20 “outstanding.” They then searched the Canadian National Climate Archive to track the weather on the days the interviews were conducted.

Over the course of that time, Redelmeier and Baxter identified 2,926 candidates who were interviewed. The demographics of the interviewees were found to be unrelated to the results. However, those interviewed on rainy days were rated lower than those who were screened on sunny days. In fact, when they compared the results against the students’ scores on the primary admissions test, the Medical College Admission Test (MCAT), they found that the difference in interview scores was equivalent to a 10 percent reduction in MCAT scores! Given the intense competition between high-performing applicants, this is enough to determine whether or not (or perhaps, “weather or not”) a student may get accepted, or even become a doctor at all.

Is it likely that interviewers responsible for choosing students for medical school said to themselves, “It’s raining out, so I think I’ll give this student a lower score”? Or is it far more likely that they were unaware of the impact that the weather made upon their mood, and of the way their mood influenced their perceptions of students? Most of us can certainly imagine that a bad-weather day, dealing with traffic, and so forth could affect our mood, and that our mood could affect an interview, but do we consider those influences when we are evaluating the person?

It is not a far stretch to consider that similar environmental or other concerns might affect us when we are conducting hiring interviews in business or making other business decisions, grading student papers, or determining hundreds of other choices, including those that are seemingly insignificant as well as very significant.

Unconscious influences dominate our everyday life. What we react to, are influenced by, and see or don’t see are all determined by reactions that happen deep within our psyche, reactions that are largely unknown to us.

In a way, we all know this to be true. Most people have, at some point in their lives, asked themselves what made them do or not do a certain thing. We find ourselves curious as to why we don’t always act in a way that is consistent with what we would like to do. Why do we eat too much, or lose patience with our loved ones, even after we had consciously appealed to our “higher” selves to do otherwise? We often have a hard time motivating ourselves to do things, even when we have determined that they are important. The comedian Flip Wilson built a whole career in the 1960s and ’70s upon the punch line “the devil made me do it!”, a line of thinking most of us can relate to in those moments when it seems like someone or something else is dictating our actions or choices.

We are constantly making decisions that are influenced by unconscious biases. In fact, even when our biases seem conscious, they may be influenced by a pattern of unconscious assumptions that we have absorbed throughout our lives. It is like a polluted river. We may do everything we can to clean the river as it flows downstream, without having any consciousness about the pollutants that are being dumped in it by a factory or sewage plant upstream.

Consider the biases that people clearly have in our society today toward LGBTQ people. We have gone through a generation in which we have seen breakthroughs in marriage equality: the end of “Don’t Ask, Don’t Tell” in the military; a dramatic shift in the presence of LGBTQ actors and actresses and themed programs in the arts; a lesbian elected mayor of Houston, Texas, and even a gay man running for president. And yet, bias against LGBTQ people continues to proliferate.

A May 13, 2013, Gallup poll found that 45 percent of the American public believed that same-sex marriages should not be valid.[2] Even after two June 2013 rulings by a conservative U.S. Supreme Court cleared the way for same-sex marriage in California and established, by declaring the Defense of Marriage Act unconstitutional, that same-sex couples were eligible for federal benefits under the law, overt discrimination and resistance to the rights of LGBTQ people still persist. Even in the entertainment industry, where most people see a great deal of open expression of sexual orientation, a Screen Actors Guild-American Federation of Television and Radio Artists study found that “the survey, based on responses from over 5,600 union members, showed nearly half of lesbian and gay respondents and 27 percent of bisexual respondents ‘strongly agreed’ that producers and studio executives believe that lesbian and gay performers are less marketable.”[3]

However, are even these overt biases truly “conscious”? While there is no doubt many people are aware of the fact that they are uncomfortable or downright hostile to LGBTQ people, the cause for those animosities might still be unconscious. From where do these biases come? Most of us were probably quite young when we started to hear that “boys should play with these toys, but not those.” How old were most of us when we first saw modeling among the people around us about what was “normal” and what was “sick,” “sinful,” “gross,” or other such descriptors? When we started going to our places of worship and hearing about biblical readings? When we heard people telling jokes about gays or lesbians?

As Brett Pelham, the associate executive director for graduate and postgraduate education at the American Psychological Association, has said, “virtually all bias is unconscious bias. We have learned to trust women to be nurturing and men to be powerful, for example, in much the same way that Pavlov’s puppies trusted ringing bells to predict the arrival of meat powder. . . . Being biased is how we get through life without evaluating everything afresh every time we experience it.”

Even when our biases are conscious downstream, their upstream causes may be very much hidden in our unconscious. For a long time, it has been our general belief that stereotypes and biases were the purview of bigoted people. However, an explosion of studies about the unconscious over the past two decades is revealing a truth that is very uncomfortable. All people use biases and stereotypes, all of the time. And all of us do so without realizing that we are doing it.

In any case, what is bias? Why do we have it?

Bias has been defined as “a particular tendency or inclination, especially one that prevents unprejudiced consideration of a question.”[4]

While we have generally thought about bias in relationship to people and prejudice, we have biases in all aspects of our lives. We are biased toward particular kinds of television shows or movies, certain foods or kinds of foods, as well as certain kinds of books or stories. Virtually any preference we have is likely to have some bias associated with it. And these biases are, for the most part, unconscious.

This doesn’t mean that every time we make a wrong determination about somebody that it is based on bias. In that sense, it is important to distinguish between what we might call “logical fallacies” and biases. People do sometimes follow faulty logic that leads to an error in reasoning. When we take a position about something based on that faulty logic, we call that a fallacy. Biases, on the other hand, result from times when we have some kind of “glitch” in our thinking. These may result from social conditioning, belief systems that we have been taught or exposed to, particular incidents that we remember, or any number of other assumed “truths” that we have picked up along the way.

The question of bias has entered the political arena as well, along with the question of whether biases can be associated with one political philosophy or another. However, the degree to which we see ourselves as “progressive” or “liberal” on these issues, or the degree to which we may have been the victim of other people’s biases, has little or no impact on the unconscious biases we may possess. Ironically, on an unconscious level, somebody (even a person of color) who sees himself as liberal on racial issues, for example, may have unconscious biases that are not much different from those possessed by an overt racist. Or somebody who sees herself as progressive on gender issues might still have hidden gender-based biases.

For instance, consider the attitudes that people have toward men and women regarding who is more suited to a career and who is more suited to staying at home. When researchers at the University of Virginia asked men and women to respond on a conscious level as to how strongly they associated women with careers, the differences between men and women were quite pronounced. Women were almost twice as likely to see a connection between women and careers, and men almost twice as likely not to see that connection. However, when they were tested to see what their unconscious attitudes were to the same question, the disparity almost disappeared. It turns out that on an unconscious level the differential was less than 20 percent. On an unconscious level, we have all absorbed the same stereotypes and have similar internal value systems, often completely inconsistent with our conscious values!

How might this difference in perception show up on a day-to-day basis? Perhaps, in assumptions that leaders make about a woman’s willingness to travel and be away from her family or take an overseas job assignment. Or in how willing a woman might be to ask for something that she needs, or a raise in pay. Or in how much credibility we give to claims of sexual harassment. Or in how much a man might listen to a woman’s point of view. Or how comfortable men or women feel about women with children working on flextime arrangements, even when it is stated company policy to allow such arrangements! The dissonance between our conscious value systems and our unconscious drivers can cause confusion to both ourselves and other people who are observing us.

These are often subtle perceptions. Like the story about the father and son in the airplane crash, we don’t consciously say, “I’m going to ignore the possibility that the doctor could be the mother or the other gay father!” Yet, those images or thoughts don’t even occur to us as we contemplate the problem. Bias serves as a fundamental protective mechanism for human beings.

Psychologist Joseph LeDoux has referred to bias as an unconscious “danger detector” that determines the safety of a person or situation, before we even have a chance to cognitively consider it.[5] For example, in more primitive times, if we came across a group of people around the river drawing water, we had to decide instantly whether it was “them” or “us.” The wrong choice might have led to our death. We learned, through evolution, that making those determinations quickly could save our lives. Unconscious bias comes from social stereotypes, attitudes, opinions, and stigma we form about certain groups of people outside of our own conscious awareness, and can be fed by snippets of information that we might get from biased media or social media or other sources, which are often taken out of context.

The same is true when we encounter other circumstances in life. We teach our children to have a “bias” about the danger in crossing streets. We want them to instinctively stop at the curb when they are chasing a ball or walking to school. We do the same when we are determining whether a stove is hot or cold. We cautiously touch it to test it. Our minds have been wired to protect us in this way.

The important part to realize is that we have these biases for a reason. Imagine if you didn’t have any biases and you went out into the world. How would you know whether somebody approaching you was “friendly” or not? How would you determine how to relate to different circumstances? If somebody approached you with a knife in their hand, raised high in the air, would you look at them and say, “I wonder what that is and what you plan to do with it?” or would you immediately switch into “fight or flight” mode and defend yourself?

To manage and negotiate an extremely complex and busy world, we have developed the capacity to compartmentalize things and people we are exposed to on a regular basis. We put them in observable categories so we can quickly determine how they fit into our background of experience and then determine what we can expect from them in the future. Gender, race, sexual orientation, age, and so on, are all such categories. For instance, it makes it easier to know that somebody with gray hair is likely older, as opposed to not having any idea of the age of the person with whom we are dealing. It is not a big jump, then, for the mind to associate qualities and values with those categories, for example: good or bad; right or wrong; smart or stupid; safe or unsafe.

One of the most powerful ways we do this is by creating stereotypes. We begin to learn how to “read” different kinds of people. As we encounter them, we instantly compare them to other people we have encountered before. Were the others friendly, safe, and welcoming? If so, then we are likely to feel comfortable with these individuals. On the other hand, were the others hostile or unfriendly? Then the mind sends a different message: Be careful! Stereotypes provide a shortcut that helps us navigate through our world more quickly, more efficiently, and, our minds believe, more safely.

Of course, even when we haven’t encountered a particular kind of person before, we may have the same judgments and assessments based on things that we have heard or learned about “people like that.” As far back as 1906, William Graham Sumner, the first person to hold an academic chair in sociology at Yale University, identified the phenomenon of “in-group/out-group bias.” Sumner wrote that “each group nourishes its own pride and vanity, boasts itself superior, exalts its own divinities, and looks with contempt on outsiders.”[6] This phenomenon is magnified when the “in” group is the dominant or majority culture in a particular circumstance. Because the dominant cultural group in any environment usually creates the standard and acceptable norms and behaviors for that group, people from nondominant groups often will be seen as “different,” “abnormal,” “less than,” or even “sick” or “sinful.” Business cultures, to cite one example, are generally male dominant. Business leaders are overwhelmingly male, and the cultures of companies have largely been carried over from a time when even more men were in leadership. This has created a male-dominated cultural model in most businesses. And yet most men don’t look at their business cultures as wanting things to be done in “a man’s way.” They see it as wanting things to be done “the right way,” without even realizing that, in their unconscious minds, the “right way” and “the man’s way” are virtually synonymous.

If we were to look at this thinking objectively, we could see a certain logic to it. If you were creating a mind and evolving it over the course of millennia, would it make more sense for that mind to be more sensitive, in encountering new people and experiences, to things that are potentially pleasant or to things that are potentially dangerous? The obvious answer is that the one that might kill me is more important to spot than the one that might give me a “nice surprise.” When we do not know much about this person, or these people, they can become potentially dangerous to us until proven otherwise. We are programmed to notice potential “threat” before we notice “friend,” to notice potential “danger” before we notice potential “pleasure.” It helps keep us alive.

This isn’t limited to people. We stereotype all kinds of things to try to figure them out. We see something and our mind automatically sorts it, consciously or unconsciously saying, “that reminds me of . . .” as a way of identifying what we are dealing with at that moment. Pelham has studied this pattern of behavior, even as we relate to dogs.[7] If you show people pictures of a bulldog, a sheepdog, a poodle, and a pointer, and ask them which is “loyal,” “prissy,” “persistent,” or “clumsy,” you will get the same answers almost every time. Some of these stereotypes have even become part of our language (e.g., “he was as persistent as a bulldog!”). Of course we might say these are common characteristics in these breeds, but not every dog in any breed acts the same way, yet we still make the assumption. It is quicker and easier that way, and much more efficient for our brains. And it is mostly unconscious.

While we have tended to look at the dynamics of unconscious bias most particularly concerning racial and gender identity, unconscious bias patterns exist in all areas of life and are influenced by factors that might surprise us. For example, it is no surprise that we make certain decisions based on our hand dominance. We may sit in a certain place because we are right-handed or left-handed and don’t want to be constantly bumping up against the person next to us. All of that makes sense. But a study from the Max Planck Institute for Psycholinguistics in the Netherlands seems to show that our responses to hand dominance may influence us more than we think.

In the study, which was led by Daniel Casasanto, researchers found that not only do people tend to choose more toward their dominant hand (in other words, if you are right-handed, you are more likely to choose something on your right side than on your left), but that we also respond to others based on their use of one hand or another.[8] In addition, we may be able to read people’s positive and negative attitudes based on the hands they inadvertently use.

“In laboratory tests, right- and left-handers associate positive ideas like honesty and intelligence with their dominant side of space and negative ideas with their non-dominant side,” said Casasanto. The researchers also analyzed the speeches of politicians to determine whether or not this pattern played out. Studying the 2004 and 2008 American presidential elections, they tracked 3,012 spoken clauses and 1,747 gestures from the four presidential candidates, two of whom were right-handed (John Kerry and George Bush) and two of whom were left-handed (Barack Obama and John McCain). In both cases, the dominant hand was more associated with positive statements and the non-dominant hand with negative ones. In other words, if a candidate was right-handed, he tended to gesture with his right hand when making a positive statement, and vice versa.

Now imagine hiring somebody because they happen to sit in the chair on the right side of your desk versus the one on the left side of your desk. That would be kind of a crazy way to decide who to hire, wouldn’t it? And, of course, in addition to being patently unfair to the person who happened to be on “your wrong side,” it also is a terrible way to make a talent management decision. Your chances of getting the best person have been reduced to a dice roll.

For the most part, we have thought about bias in terms of incidents where people have a negative bias against somebody, which then has a destructive impact on that person’s chances to be successful (e.g., a woman who doesn’t get hired for a job because somebody has a negative gender bias about women). However, it is much more complex than that.


These destructive uses of biases against a certain group (Q1 in figure 1.1) are the ones on which we have focused most of our attention. We have, in fact, created laws to be sure that people are not discriminated against in this way. But they are not the only ways that bias plays out in our daily lives.

As odd as it may seem, there also are constructive uses of biases against certain groups (Q2 in figure 1.1). They can benefit us in many ways. We determine that people who have aggressive personality types might not be the best fit for a customer service job. Or that people who don’t have certain technology skills and background won’t be a good match for a job that requires computer proficiency. If we didn’t have these filters, hiring would be almost oppressive, because we would start with a huge number of résumés and have to look at all of them more carefully than time might allow.

I know that many people would say those are “qualifications,” and that looking for qualifications is not the same as having biases. In fact, qualifications are simply biases that we have agreed upon and codified. There are hundreds of examples of people who have performed in extraordinary ways who do not have the “normal” qualifications for their roles. If qualifications were the only measure of success, then college dropouts such as Steve Jobs, Bill Gates, and Mark Zuckerberg would still be unknown. However, understandably, we have determined that while there are occasional creative eccentrics like those three, it just doesn’t make good sense to look at 150 résumés and not take education into account. So we use biases against the lack of those characteristics to “filter out” certain people who we might have determined are not a good fit for the job. We do the same thing when we are in dangerous situations. For instance, we might be especially attentive to locking our car in a location with a higher crime rate.

All the same, it is important to note that we should be thoughtful and very conscious about how much we take these negative biases for granted. There are always exceptions, even to the most dependable of patterns (Jobs, Gates, and Zuckerberg, for example!). So, while using negative biases can be helpful, we should never assume they are absolute. The annals of sport are filled with examples of players who were “too small to be successful” by “normal” standards, yet were able to succeed far beyond expectations. The same can be said for constructive uses of biases toward a particular kind of person (Q3 in figure 1.1). We often look for certain circumstances or people because we have a history with them that tells us we can be more assured that they will meet our needs: people who have certain college degrees or went to a certain college; certain personality types that fit a particular job or situation; language skills; or any number of other “qualifications” that we have determined might make the person a better fit for the job. Once again, having these filters can be very helpful, but we have to be careful that we don’t develop blind spots that stop us from seeing exceptional people or circumstances that are “exceptions to the rule.”

Finally, we also have to be mindful of the potentially destructive effects of these “positive” biases (Q4 in figure 1.1). This can show up in several different ways. For instance, we may place unrealistic expectations on somebody from a particular group because of a positive bias we have about “that sort of people.” I remember a Chinese student once telling me, “I’m so tired of people expecting me to be good at math and sciences because I’m Asian. It’s just not my thing. I like the social sciences more. But everybody, from my parents to my teachers, seems to think I have to ‘try harder’ when my math grades aren’t straight As, even though I do really well in the courses that matter to me.”

Another way the potentially destructive effects of biases favoring a group or person can show up is when one person suffers because we have a positive bias toward somebody else. Imagine you are interviewing two candidates and something about one of them reminds you of your sister. You may not even realize it. It just occurs to you that “there is something about this person that I like.” As a result, you pay more attention to them, listen more carefully, and are even warmer toward them in the interview. The interview goes great and you want to hire the person. However, what may be lost in the “glow” of that positive bias is that the other candidate never had a fair shot because of the bias that you had in favor of the first person.

It would be great if we were totally conscious about every decision we made and never used bias. However, such a thought is not only unrealistic but impossible. Our processing would slow to a near halt. The key is to become more and more conscious about when our biases are serving our greater objectives.

We develop biases toward people and behaviors all throughout our lives. We learn that relating in a particular way is “better” than another way, or that we prefer people who act or look a particular way. We can sometimes even develop patterns of behavior that work well enough for us in one domain that we unconsciously and habitually use them in places where they do not work nearly as well.

As an example, on May 5, 2013, the Washington Post reported that:

After they leave military service, veterans of the two wars (Iraq and Afghanistan) have a 75 percent higher rate of fatal motor vehicle accidents than do civilians. Troops still in uniform have a higher risk of crashing their cars in the months immediately after returning from deployment than in the months immediately before. People who have had multiple tours in combat zones are at highest risk for traffic accidents.

This is obviously of great concern. The story went on to read:

The most common explanation is that troops bring back driving habits that were lifesaving in war zones but are dangerous on America’s roads. They include racing through intersections, straddling lanes, swerving on bridges and, for some, not wearing seat belts because they hinder a rapid escape.[9]

This is one of the great challenges we have when our biases are unconscious. Without realizing it, we can apply the same behaviors or evaluation criteria that worked in one domain and find that they are not at all helpful, or are even tragic, in another.
