Mind Macros 49: Domain specificity, confirmation bias protection, and rationalism vs. empiricism
"You have power over your mind — not outside events. Realize this, and you will find strength." — Marcus Aurelius
Welcome to another issue of Mind Macros - I hope you find something of value.
The newsletter has moved to Substack. You will not notice any differences other than a new look and the ability to like and comment on each post. By doing so, you help me gauge how much you enjoyed each issue, allowing me to tailor future content accordingly.
Food for Thought
I. Domain specificity
"This inability to automatically transfer knowledge and sophistication from one situation to another, or from theory to practice, is a quite disturbing attribute of human nature. Let us call it the domain specificity of our reactions. By domain-specific I mean that our reactions, our mode of thinking, our intuitions, depend on the context in which the matter is presented, what evolutionary psychologists call the ‘domain’ of the object or the event. The classroom is a domain; real life is another. We react to a piece of information not on its logical merit, but on the basis of which framework surrounds it, and how it registers with our social-emotional system. Logical problems approached one way in the classroom might be treated differently in daily life. Indeed they are treated differently in daily life."
"Knowledge, even when it is exact, does not often lead to appropriate actions because we tend to forget what we know, or forget how to process it properly if we do not pay attention, even when we are experts. Statisticians, it has been shown, tend to leave their brains in the classroom and engage in the most trivial inferential errors once they are let out on the streets.
"For another illustration of the way we can be ludicrously domain-specific in daily life, go to the luxury Reebok Sports Club in New York City, and look at the number of people who, after riding the escalator for a couple of floors, head directly to the StairMasters.
"This domain specificity of our inferences and reactions works both ways: some problems we can understand in their applications but not in textbooks; others we are better at capturing in the textbook than in the practical application. People can manage to effortlessly solve a problem in a social situation but struggle when it is presented as an abstract logical problem. We tend to use different mental machinery—so-called modules—in different situations: our brain lacks a central all-purpose computer that starts with logical rules and applies them equally to all possible situations. And as I've said, we can commit a logical mistake in reality but not in the classroom." — From The Black Swan by Nassim Nicholas Taleb.
Domain dependence describes any technology or technique whose effectiveness is restricted to the environment in which it operates.
One of the clearest illustrations of domain specificity is the street fight.
A martial artist may dominate in the controlled environment of their dojo, but when thrust into the unpredictable arena of the street, even experts must revamp their techniques. In the dojo, a fight is a battle of technique, like a chess match; the streets are wild and lawless, forcing a fighter to adapt their arsenal to the domain. Uncontrolled environments are rife with external variables, one of the most dangerous being hidden weapons, which no amount of technique can defend against with certainty.
In Mind Macros 48, we explored the dichotomy between rationalism and empiricism. Rationalism is classroom knowledge derived from theory, reason, and logic, while empiricism is real-life knowledge derived from experience, observation, and experimentation. Taleb advises mixing the two, combining real-life knowledge with a library.
II. Confirmation bias is a protection mechanism
"Much like physical threats, such as the risk of receiving a shock, might shut down our willingness to explore, your brain must also be motivated to protect itself from psychological threats. If this is true, maintaining your most central, identity-based beliefs must be one of the goals your prefrontal cortex would use to guide your behaviors. The power of this kind of belief structure is that rather than driving you to explore, or collect statistics and form an objective opinion about what might be true of the world 'out there,' such top-down navigation strategies would cause your basal ganglia to turn up the volume only on 'relevant' information—that is, information that is consistent with your identity-based beliefs—while turning down the volume on any information deemed 'irrelevant' because it doesn't support your worldview. In a recent opinion paper, Jay Van Bavel and Andrea Pereira outlined how such a brain-based model might be used to describe the relation between personal values, political beliefs, and partisan behaviors. Whether you're willing to believe it or not, we all do this. It keeps us feeling safe, and protected, and correct.
"Psychologists who study how we form and hold beliefs have long known that when people are surprised by information that is inconsistent with what they believe to be true, they don't often behave rationally. Instead, they ignore or even discredit evidence that is inconsistent with their beliefs—a phenomenon known as a confirmation bias. The important take-away from this is that the possibility of encountering a piece of information that is inconsistent with your centrally held beliefs may be perceived as a threat during the appraisal phase. The result would be to shut down the cycle of wondering—and wandering—that brings us to explore the unknown." — From The Neuroscience of You by Chantel Prat (view my three takeaways).
Confirmation bias is a psychological phenomenon in which people search for, interpret, focus on, and remember information that confirms their preexisting beliefs. The bias helps protect us by reducing the mental discomfort caused by holding contradictory thoughts, a concept known as cognitive dissonance. When we only seek information that agrees with our existing views, it gives us a sense of security and decreases the stress associated with uncertainty. This comfort comes at the cost of forming an inaccurate map of the world by interpreting information through the lens of existing beliefs.
I was fascinated by Prat's interpretation of confirmation bias as a safeguard against mental hazards. Before, I had thought of confirmation bias as a 'brain bug,' not a feature that supports our contentment. This makes the endeavor of counteracting confirmation bias more demanding and all the more essential.
Learning individual cognitive biases can alter how we experience the world around us. Still, these biases can only tell us so much in isolation. The brain is not a collection of isolated biases but an interconnected system of neurons that work together to enable consciousness. Describing cognitive biases without recognizing this interconnectedness of our minds would be like attempting to appreciate the grandeur of a symphony while focusing on an individual instrument. The intricate interplay of sounds makes up a symphony, just like the interconnectedness of the mind enables consciousness.
"It's like a finger pointing away to the moon.”
*Student looks at Bruce's finger*
*Bruce promptly slaps the student*
"Don't concentrate on the finger, or you will miss all that heavenly glory."
Individual biases are akin to that finger: we become so preoccupied with their form and function that we fail to behold the entwined nature of the mind.
One example of failing to see the forest for the trees is the notion that we can alter a single belief independently of its cousins. Changing one belief often requires updating many others to maintain harmony and avoid cognitive dissonance. But this chain of revisions takes time and a great deal of energy. Living with constant uncertainty would leave us exhausted, anxious, and stressed. We need mental shortcuts to navigate the world's complexity with some degree of economy. Confirmation bias is but one shortcut that allows us to consign conflicting information to oblivion. One could even argue that a certain degree of 'blindness' is warranted, as it is impossible to form a comprehensive opinion on every subject. In these moments, I am reminded of a quote from the philosopher king Marcus Aurelius:
"You always own the option of having no opinion. There is never any need to get worked up or to trouble your soul about things you can't control. These things are not asking to be judged by you. Leave them alone."
The TV show Westworld (HBO) presents a memorable metaphor for confirmation bias. When the hosts, created to be indistinguishable from humans, observe something their creators have not equipped them to handle, their response is a dismissive, "Doesn't look like anything to me." In the series, Anthony Hopkins's character, Robert Ford, the engineer behind the hosts, explains the meaning behind the phrase:
"They cannot see the things that will hurt them, I've spared them that. Their lives are blissful, in a way their existence is purer than ours, freed of the burden of self-doubt."
Ford describes this design feature as something that sets the hosts apart from humanity, yet we seem to have similar self-preservation mechanisms that shield us from perceiving certain truths that could 'hurt us.'
Quotes to Ponder
I. A Chinese parable on viewing every situation without bias:
"Once upon a time there was a Chinese farmer whose horse ran away. That evening, all of his neighbors came around to commiserate. They said, 'We are so sorry to hear your horse has run away. This is most unfortunate.' The farmer said, 'Maybe.'
The next day the horse came back bringing seven wild horses with it, and in the evening everybody came back and said, 'Oh, isn't that lucky. What a great turn of events. You now have eight horses!' The farmer again said, 'Maybe.'
The following day his son tried to break one of the horses, and while riding it, he was thrown and broke his leg. The neighbors then said, 'Oh dear, that's too bad,' and the farmer responded, 'Maybe.'
The next day the conscription officers came around to conscript people into the army, and they rejected his son because he had a broken leg. Again all the neighbors came around and said, 'Isn't that great!' Again, he said, 'Maybe.'"
II. Marcus Aurelius on our circle of control:
"You have power over your mind — not outside events. Realize this, and you will find strength."
Thank you for reading,
P.S. I'm now on Twitter, sharing daily excerpts from the books I'm enjoying.