
WHAT MAKES US SUCKERS?


The previous section referred briefly to the ‘not in our lifetime’ perspective. Let us linger on this point for a bit, as it is one of the keys to understanding Black Swans and why we are essentially born suckers. Most of us will freely admit that humanity is in for one disaster or another. Sooner or later, that asteroid will knock us out of our pants, for sure, but it is always later, somewhere out in the distant future. It is not going to happen in my lifetime. Why? Because I am somehow special. Stuff only happens to other people, whereas I am destined to lead a glorious and comfortable existence. Based on such egocentric beliefs, we might coolly concede that in the larger scheme of things, something is sure to happen, yet almost completely discount the possibility as far as our own lifetime and corner of the world are concerned. Not that we always say so publicly or even think in those terms outright; it is more of a tacit assumption.

This ‘because I'm special’ protective mechanism goes a long way in setting the stage for Black Swans. However, it is only one of the many ways in which our outlook is warped, which brings us to the long catalogue of biases that have been identified and described by scholars. A bias can be said to be a predisposition to make a mistake in a decision‐making situation, because it leads us away from the decision that would be taken by a rational and well‐informed person who diligently weighs the pros and cons. What biases tend to have in common is that they make us more of a sucker than we need to be. They are a staple of business books nowadays (those on risk management in particular) and may bore the educated reader. Since they are so fundamental to the concept of Black Swans, we must briefly review them nonetheless. What follows is a non‐exhaustive list of well‐documented biases that in various ways contribute to the Black Swan phenomenon. As is commonly pointed out, these biases were mostly to our advantage over the long evolutionary haul, but they are often liabilities in the unnatural and complex environment we find ourselves in today.

The narrative fallacy: In explaining why we are so poorly equipped to deal with randomness, Taleb focuses on what he refers to as ‘the narrative fallacy’, which he defines as ‘our need to fit a story or pattern to a series of connected or disconnected facts’ (Taleb, 2007, p. 309). We invent causes for things that we observe in order to satisfy our need for coherent explanations. It turns out that we do not suffer dissonance gladly, so our brain will supply any number of rationalizations to connect the dots. By reducing the number of dimensions of the problem at hand and creating a neat narrative, things become more orderly. Everything starts to hang together and make sense, and that is how the dissonance is resolved. Since we are lazy as well, we often converge on the rationalization that satisfies our craving with the least amount of resistance. However, when we force causal interpretations on our reality, and invent stories that satisfy our need for explanations, we make ourselves blind to powerful mechanisms that lie outside these simple narratives.

Confirmation bias: This is one of the leading causes of Swan‐blindness discussed in The Black Swan, where Taleb refers to confirmation as ‘a dangerous error’ (Taleb, 2007, p. 51). It has to do with the general tendency to adopt a theory or idea and then start looking for evidence that corroborates it. When we suffer from this bias, all the incoming data seems, as if by magic, to confirm that the belief we hold is correct; that the theory we are so fond of is indeed true. Whatever instances contradict the theory are brushed aside, ignored or re‐interpreted (tweaked) in a way that supports our pre‐existing beliefs. Out the window goes Karl Popper's idea of falsification, the true marker of science and open inquiry. Using falsification as a criterion, a theory is discarded once the evidence contradicting it becomes undeniable. In the specific context of managing risks, the confirmation bias is a problem because it makes us too prone to interpret incoming observations of stability as suggesting that the future will be similarly benign.

The optimistic bias: Research has shown that humans tend to view the world as more benign than it really is. Consequently, in a decision‐making situation, people tend to produce plans and forecasts that are unrealistically close to a best‐case scenario.14 The evidence shows that this is a bias with major consequences for risk taking. In the words of Professor Daniel Kahneman (2011): ‘The evidence suggests that an optimistic bias plays a role – sometimes the dominant role – whenever individuals or institutions voluntarily take on significant risks. More often than not, risk takers underestimate the odds they face, and do not invest sufficient effort to find out what they are.’15 Pondering extreme and possibly calamitous outcomes will clearly not be a priority for an individual with an optimistic bent. Taking a consistently rosy view distorts expectations and therefore invites the Black Swan.

The myopia bias: Myopia, in the literature on the psychology of judgement, refers to the tendency to focus more on short‐term consequences than on long‐term implications. Because of our desire for instant gratification, we tend to place much less weight on future gains and losses than on those in the near term. Professors Meyer and Kunreuther call this the most ‘crippling’ of all biases, resulting in gross underpreparedness for disasters that could have been mitigated with relatively simple measures.16 This was the case, for example, with the tsunami in the Indian Ocean in 2004. Only a few years prior, in Thailand, relatively inexpensive mitigation measures had been discussed – and dismissed. The reason? There were many reasons, but among other things, there was a worry that such measures might cause unnecessary alarm among tourists. Such minuscule short‐term benefits won out over preparing for events with colossal consequences.

The overconfidence bias: Humans are prone to overrate their own abilities and the level of control they have over a situation. The typical way of exemplifying this tendency is to point to the fact that nearly everyone considers himself an above‐average driver. Taleb prefers the more humorous example of how most French people rate themselves well above the rest in terms of the art of love‐making (Taleb, 2007, p. 153). As for the effect of overconfidence on decision‐making, it is profound – and not in a favourable way. Professor Scott Plous (1993) argues that a large number of catastrophic events, such as the Chernobyl nuclear accident and the Space Shuttle Challenger explosion, can be traced to overconfidence. He offers the following summary: ‘No problem […] in decision‐making is more prevalent and more potentially catastrophic than overconfidence.’17 Overconfidence has been used to explain a wide range of observed phenomena, such as entrepreneurial market entry and trading in financial markets despite available data suggesting high failure rates.

Considering the above, one is inclined to agree with Taleb when he remarks that ‘… it is as if we have the wrong user's manual’ (Taleb, 2007, prologue xxii) for navigating successfully in a world of wild uncertainty. We crave simple but coherent narratives. We value elegant theories and become committed to them. We think we are special and that the world around us is benign. We are equipped with a mind that was created for an existence with far fewer variables and more direct cause‐and‐effect mechanisms. Reflecting deeply about interconnected systems was not key to survival in our evolutionary past. In a somewhat shocking passage, Taleb says that ‘our minds do not seem made to think and introspect’ because, historically speaking, it has been ‘a great waste of energy’ (ibid.).

In fact, information, which potentially helps us rise above sucker‐status, is costly to acquire and process. Imagine that I bring up the possibility of nuclear terror affecting a major US city. Such a scenario involves hundreds of thousands of dead and an upheaval of life as we know it, before even considering what the countermeasures might be. Any firm with operations in the US is likely to be greatly affected by this calamity. Now what is your gut reaction to this proposed topic of conversation? In all likelihood, your kneejerk reaction is to try to shut it down immediately. The sheer unpleasantness of the topic makes us not want to go there, even for a brief moment. It is too much to take in, and frankly too boring, so, to save ourselves the mental energy, we are perfectly willing to resort to the handy tactic of denial.

As problems, extreme and abstract possibilities remote from everyday practicalities are not inspiring enough to energize us. They are out of sight and therefore out of mind. We are unable to maintain a focus on them for long enough. Our thoughts gravitate towards something more tangible, some action that yields a more gratifying sense of accomplishment here and now. It often takes a herculean effort to process remote possibilities, and we are rarely in the mood for it. They are therefore not necessarily ‘unknown unknowns’; rather, they can be thought of as ‘unknown knowables’. The term is meant to convey that it is within our reach to form an understanding of the possibility and most of its consequences, but we fail to do so because of our laziness or disinterest. That makes it, for practical purposes, a Black Swan, on a par with the unknown unknowns. At least to some, that is, because others might be prepared to take up the challenge.
