Reframing Organizations, by Lee G. Bolman

MAKING SENSE OF AMBIGUITY AND COMPLEXITY


Organizations try to cope with complexity and uncertainty by getting smarter or making their worlds simpler. One approach to getting smarter is developing better systems and technology to collect and process data. Another is to hire or develop professionals with sophisticated expertise in handling thorny problems. To simplify their environment, organizations often break complex issues into smaller chunks and assign slices to specialized individuals or units. These and other methods are often helpful but not always sufficient. Despite the best efforts, as we have seen or experienced, surprising—and sometimes appalling—events still happen. We need better ways to anticipate problems and wrestle with them once they arrive.

In trying to make sense of complicated and ambiguous situations, humans are often in over their heads, their brains too taxed to decode all the complexity around them. At best, managers can hope to achieve “bounded rationality,” which Foss and Weber (2016) describe in terms of three dimensions:

1. Processing capacity: Limits of time, memory, attention, and computing speed mean that the brain can process only a fraction of the information that might be relevant in any situation.

2. Cognitive economizing: Cognitive limits force human decision makers to use short‐cuts—rules of thumb, mental models, or frames—in order to trim complexity and messiness down to manageable size.

3. Cognitive biases: Humans tend to interpret incoming information to confirm their existing beliefs, expectations, and values. They often welcome confirming information while ignoring or rejecting disconfirming signals.

Benson (2016) frames cognitive biases in terms of four broad tendencies that create a self‐reinforcing cycle (see Exhibit 2.3). To cope with information overload, we filter out most data and take in only what seems important and consistent with our current mind‐set. That gives us an incomplete picture, but we fill in the gaps to make everything fit with our current beliefs. Then, in order to act quickly instead of getting lost in thought, we favor the easy and obvious over the complex or difficult. We then code our experience into memory by discarding specifics and retaining generalities or by using a few specifics to represent a larger whole. This reinforces our current mental models, which then shape how we process experience in the future.

Exhibit 2.3. Cognitive Biases.

Challenge: Too much data to process
Solution: Filter out everything except what we see as important and consistent with our current beliefs
Risk: Miss things that are important or could help us learn

Challenge: Tough to make sense of a confusing, ambiguous world
Solution: Fill in gaps, make things fit with our existing stories and mental models
Risk: Create and perpetuate false beliefs and narratives

Challenge: Need to act quickly
Solution: Jump to conclusions—favor the simple and obvious over the messy and complex
Risk: Quick decisions and actions lead to mistakes and get us in trouble

Challenge: Memory overload
Solution: Discard specifics to form generalities, or use a few specifics to represent the whole
Risk: Error and bias in memory reinforce current mind‐sets and biases in information‐processing

To a greater or lesser degree, we all rely on these cognitive short‐cuts. President Donald Trump regularly provided visible examples in his tweet storms and off‐the‐cuff communications. In March 2017, he tweeted that his predecessor, Barack Obama, was a “bad (or sick) guy” for tapping Trump's phones prior to the election. Trump apparently based this claim on an article from the right‐wing website Breitbart. Because the charge aligned with his worldview, he figured it must be true, and he continued to insist he was right even after investigators concluded it never happened.

These biases and limits in human thinking, together with the complexity of human systems, often lead us to act before we really understand what's going on. As one highly placed female executive reported to us, “I thought I'd covered all the bases, but then I suddenly realized that the rest of my team were playing football.” Faced with an unending barrage of puzzles or “messes,” managers first need to grasp an accurate picture of what is happening in the moment. Then they must move to a deeper level of understanding, asking, “What is really going on here?” When this step is omitted, managers too often rush to judgment, forming superficial analyses and pouncing on the solutions nearest at hand or most in vogue. Market share declining? Try strategic planning. Customer complaints? Put in a quality program. Profits down? Time to reengineer or downsize. A better alternative is to think, to probe more deeply into what is really going on, and to develop an accurate diagnosis. The ability to size up a situation quickly is at the heart of leadership. Admiral Carlisle Trost, former Chief of Naval Operations, once remarked, “The first responsibility of a leader is to figure out what is going on … That is never easy to do because situations are rarely black or white, they are a pale shade of gray … they are seldom neatly packaged.”

It all adds up to a simple truth that is easy to overlook. The world we perceive is an image we construct in our minds. Ellen Langer, the author of Mindfulness (1989), captures this viewpoint succinctly: “What we have learned to look for in situations determines mostly what we see” (Langer, 2009, p. 33). The ideas or theories we hold determine whether a given situation is foggy or clear, mildly interesting or momentous, a paralyzing disaster, or a genuine learning experience. Personal theories are essential because of a basic fact about human perception: in any situation, there is simply too much happening for us to attend to everything. We drown in a sea of complexity. To help us understand what is going on and what to do next, well‐grounded, deeply ingrained personal theories offer two advantages: they tell us what is important and what is safe to ignore, and they group scattered bits of information into coherent patterns. Mental models shape reality.

Research in neuroscience has called into question the old adage, “seeing is believing.” It has been challenged by its converse: “Believing is seeing.” The brain constructs its own images of reality and then projects them onto the external world (Eagleman, 2011). “Mental models are deeply held internal images of how the world works, images that limit us to familiar ways of thinking and acting. Very often, we are not consciously aware of our mental models or the effects they have on our behavior” (Senge, 1990, p. 8). Reality is therefore what each of us believes it to be. Shermer (2012) tells us that “beliefs come first, explanations for beliefs follow.” Once we form beliefs, we search for ways to explain and defend them. Today's experience becomes tomorrow's fortified theology.

In November 2014, two police officers in Cleveland received a radio report of a “black male sitting on a swing pulling a gun out of his pants and pointing it at people” in a city park (Holloway, 2015). Arriving at the site, one officer spotted the suspect and saw him reach for his gun. The officer immediately shot and killed the suspect. He might have responded differently if the radio report had included two additional details: the caller who made the initial report had said that the suspect might be a juvenile, and that the gun was probably fake. The gun was a toy replica of a Colt semiautomatic pistol. The victim, Tamir Rice, was 12 years old but, at 195 pounds, might have looked like an adult at a quick glance. The officer who shot him was a rookie who had been hired in Cleveland after being forced out of a suburban department that rated him as unqualified for police work (Flynn, 2016).

Perception and judgment involve matching situational cues with previously learned mental models. In this case, the perceptual data were ambiguous, and expectations were prejudiced by a key missing clue—the radio operator had never mentioned the possibility of a child with a toy. The officer was expecting a dangerous gunman, and that is what he saw.
