Rules of Reason - Bo Bennett, PhD

Know Thyself

Reasoning is a process that is strongly influenced by many factors that are not readily apparent to us. Both biology and environment shape who we are and how we think. While we are not in complete control of our intellect and reasoning, we do have some control, and we can gain more by knowing our cognitive limitations and keeping them in mind when making and evaluating claims.

Rule #1: Acknowledge the Limits of Your Knowledge Regarding the Claim

It has been said that a little knowledge is a dangerous thing, and the advent of the Internet has certainly provided us with many examples where this is true. Keyboard warriors who spend a few hours on Google and YouTube convince themselves that they know more than doctors, researchers, scientists, and academics who spend their lives studying a narrow field where they have attained mastery. Even the doctors, researchers, scientists, and academics can convince themselves that they know far more than they do. We all need to acknowledge the limits of our knowledge.

We don’t know what we don’t know, or to put it another way, without knowing how much there is to know about a particular topic, we have no way of knowing how much of that topic we actually know. Unfortunately for us, we grossly overestimate our knowledge and competence. This is a well-known effect in psychology, known as the Dunning-Kruger effect. The good news is that if we realize we are likely to be victims of this effect, we can take it into consideration and lower our estimate of how much we actually know. Once we have an accurate assessment of our knowledge on the topic, we can identify and defer to people who know more than we do. When we realize that there is still more we can learn on the topic, we will be less resistant to related information that could increase our understanding of it.

Even if you are confident in your level of knowledge on the topic, realize that factual information or good advice can come from people and sources less knowledgeable than you. Dismissing information based solely on its source, although reasonable at times depending on the source and situation, is a fallacy known as the genetic fallacy. To illustrate this point, just think about a time when someone tried to “educate” you on a topic about which you actually knew far more than they did. You probably felt that they were patronizing or ignorant and, as a result, resisted the information they shared. What you might not have realized at the time is that even if the information they presented was factually correct or a good suggestion, your conviction that you were right led you to dismiss it in order to preserve your sense of “rightness.” The result: you missed an opportunity to become even more knowledgeable on the topic, not to mention that you almost certainly appeared ignorant to the other person.

Rule Summary: Understand that there is likely much you don’t know on the topic and realize that even sources that are frequently wrong are sometimes right.

Rule #2: Explore Your Biases Related to the Claim

Raised as a Catholic, I attended religious school from the third grade through the eighth grade and remained a believer well into my thirties. In high school, I had a friend who was an atheist. He would often present arguments as to why the God I believed in almost certainly didn’t exist. I remember, at the time, being terrified because his arguments made sense. I also remember believing that if I didn’t believe in God, I would spend eternity being tortured in Hell, not to mention upset the God who was responsible for all the good things in my life. I defended my belief in God just as passionately as I would defend my own life, because, in a way, I was.

Cognitive psychologists refer to what I experienced as motivated reasoning, an extremely powerful phenomenon where our reasoning process is hijacked by our desires. I wasn’t pretending; I wasn’t lying; I was authentically arguing what I believed because what I believed was strongly influenced by my emotions, not reason.

When we have a vested interest in the outcome, many cognitive biases kick in and distort our reasoning. This creates a dilemma. On the one hand, passion can be a wonderful force that energizes us to effect positive change. On the other hand, passion can blind us to facts and reality when they contradict our ideological positions. There is a solution: be passionate about discovering truth. Be passionate about learning, logic, and reason, no matter where they lead.

In a generic sense, and in the context of reasoning, a bias is a tendency to favor select information in an unreasonable manner, usually because emotion interferes with reason. The reasons you personally may have these powerful emotional biases are best left to a therapist; you just need to be able to notice them, which is generally not difficult because they are accompanied by strong emotion rather than indifference. For example, if you have ever felt an immediate sense of frustration when calling technical support and being connected to a woman, that is a bias. This isn’t about sexism; it is about having a preconceived notion that men are better at technical issues than women. Okay, maybe it is about sexism.

A cognitive bias is a systematic error in thinking that affects the decisions and judgments people make. These are much more difficult to detect than our standard biases for several reasons. First, most of these errors are not the result of strong emotion, so there is no visceral indicator of the bias. Second, there are literally hundreds of known biases, which makes memorizing them challenging, to say the least. And third, there is actually a cognitive bias that prevents us from readily admitting our own cognitive biases, known as the bias blind spot. Despite these challenges, we can recognize our own biases with a cursory knowledge of the biases and some practice. Did I mention that I have an online course on cognitive biases at www.virversity.com/course/cognitivebiases?

Here are just a handful of what I have found to be the most common cognitive biases people experience when evaluating claims, along with suggestions on how you can overcome them.

1. Confirmation Bias. People choose news sources that are most likely to report information consistent with what they already believe. Likewise, they dismiss or even subconsciously forget information that contradicts their beliefs. A near-perfect example of this is how President Trump refers to news that is hostile to him as “Fake News.” This label of “fake” is applied independently of the truth value of the claims made by the source; it is a result of whether the coverage agrees with him. To combat the confirmation bias, focus on facts and reality, not on being right. Embrace being wrong; don’t deny it.

2. Courtesy Bias. This is the tendency to agree with a claim because it is the polite thing to do, not because you actually agree with it. This is a problem because it gives the appearance that the claim is supported by far more people than it actually is (not that the number of people who support a claim makes it any more true). If the goal is to accurately assess the probability of a claim, agreeing with it just to be “nice” is antithetical to that goal. I can’t tell you how many times I have read my Facebook friends’ posts and cringed. In most cases, I make the conscious decision not to engage, which is fine; this is exercising judgment and diplomacy. The problem arises when similar claims are made in friendly discussion or debate and we don’t cringe; rather, we find ourselves nodding in acceptance, or even just accepting the claim, in an unconscious attempt to avoid conflict. To combat the courtesy bias, remind yourself frequently that good people can and do say stupid things.
