
Why facts don’t change our minds

This post is a review of a New Yorker article by Elizabeth Kolbert titled “Why Facts Don’t Change Our Minds,” published on February 27, 2017. The article is, among other things, a review of the book The Enigma of Reason by the cognitive scientists Hugo Mercier and Dan Sperber.

I have discussed the work of Mercier and Sperber before, when I reviewed Mercier’s essay “The Argumentative Theory” on Edge.org. And I am also in the middle of reading The Enigma of Reason.

In my opinion, this whole rethinking of what human rationality actually is and why it evolved is one of the most interesting issues that modern cognitive scientists are currently investigating. Here is my bibliography of some of the main books and essays about this new understanding of human rationality. Elizabeth Kolbert’s essay, which I am addressing in this post, adds another dimension to this discussion and a couple of new books to my bibliography.

Kolbert begins her essay by noting that cognitive scientists have been conducting experiments on human rationality for quite some time. She details a Stanford University study from back in 1975 in which students were asked to study two sets of suicide notes. The students were told that some of the notes were genuine and some were fakes, and they were asked to distinguish between the real notes and the fake ones.

After completing the exercise, some of the students were told that they were geniuses at this task and that they had been right almost every time. The others were told that they were hopeless at the job and had been wrong almost every time.

Then the students were told the truth: they had been lied to. No one was really good or really bad at this job; they had all been about the same at discerning which notes were real and which were fakes. They were told that the real purpose of the experiment was to gauge their response to thinking they were right or wrong. Actually, this was a deception as well.

Then the students were asked to guess how many suicide notes they had actually gotten right. The results were surprising. Those students who had originally been told they were geniuses said they thought they had done really well, and those who had been told they were dunces thought they had done really badly. This despite the fact that they had explicitly been told that the initial feedback was a lie.

“Once formed,” the researchers observed dryly, “impressions are remarkably perseverant.”

This kind of experiment has been done over and over for many years, and the results are always the same.

As Kolbert says, “The Stanford studies became famous. Coming from a group of academics in the nineteen-seventies, the contention that people can’t think straight was shocking. It isn’t any longer. Thousands of subsequent experiments have confirmed (and elaborated on) this finding. As everyone who’s followed the research—or even occasionally picked up a copy of Psychology Today—knows, any graduate student with a clipboard can demonstrate that reasonable-seeming people are often totally irrational.”

There have been several takes on this fact. One of the most famous is that of Daniel Kahneman. He began his research on human cognitive bias in the mid-seventies and has been working on it ever since. He summed up his findings several years ago in a very well known book called Thinking, Fast and Slow. His conclusion? Human thought is shot through with hundreds of biases that operate mostly subconsciously. Everyone thinks they are totally rational, and almost no one ever is.

Just last year Michael Lewis popularized many of Kahneman’s ideas in his own book The Undoing Project. I haven’t read this one yet, but I hear that it is a good story.

And Kahneman’s ideas about human rationality have been picked up by economists, who have created a whole new discipline called Behavioral Economics. Three of the best books describing this discipline in a sprightly and readable way are Phishing for Phools: The Economics of Manipulation and Deception by George Akerlof and Robert Shiller, Misbehaving: The Making of Behavioral Economics by Richard Thaler, and Predictably Irrational by Dan Ariely. These are all central and well known books about the new ideas on human rationality as applied to economics. Because of Kahneman’s ideas about what human rationality actually is, economics, I suspect, will never be the same again.

All of these books are absolutely fascinating reading, and they detail the undeniable fact that humans think in a completely irrational manner almost all the time. And it isn’t just the dumb and uneducated who think irrationally. Almost everyone thinks irrationally, even those who are well educated, and especially those who are sure they are always rational.

There are exceptions to this rule, of course; there are definitely ways humans can make an end run around their irrationality. But these exceptions are few and far between, and so much trouble that almost no one ever goes to the bother of thinking rationally, even when they know better. And there are elements of irrationality that always remain, even when we are thinking at our best. Bottom line: it really is best to be very, very modest about one’s own thinking ability.

Getting back to Kolbert’s article.

Hugo Mercier and Dan Sperber are both cognitive scientists. Mercier works at a French research institute in Lyon, and Sperber teaches and does research at the Central European University in Budapest. Sperber is the older of the two and has been working on problems of human rationality since the 1970s. He, along with Kahneman, is a pioneer in this field. Mercier was a postdoc not too long ago and does most of the book and review writing these days. You get the feeling he is the heir apparent to the pool of ideas that Sperber has put together over the last forty years.

Here is how Kolbert describes their central argument.

“Humans’ biggest advantage over other species is our ability to coöperate. Coöperation is difficult to establish and almost as difficult to sustain. For any individual, freeloading is always the best course of action. Reason developed not to enable us to solve abstract, logical problems or even to help us draw conclusions from unfamiliar data; rather, it developed to resolve the problems posed by living in collaborative groups … Habits of mind that seem weird or goofy or just plain dumb from an “intellectualist” point of view prove shrewd when seen from a social “interactionist” perspective.”

Kolbert asks us to consider confirmation bias, one of the most famous biases in human thinking. Confirmation bias is the tendency to look only at the evidence that confirms what we already believe and to skip everything else. And those prior beliefs that we refuse to give up are almost always intuitions that we have acquired irrationally. In addition, our prior beliefs are almost always about things we think we understand well but actually do not. There have been entire libraries of studies examining confirmation bias. It appears to be something that all humans habitually do without realizing they are doing it; it operates almost entirely subconsciously.

The fact that human reason is evolutionarily acquired, and that cognitive biases like this must therefore have been selected for, led Sperber and Mercier to their theory of rationality. Consider this, Kolbert says:

“If reason is designed to generate sound judgments, then it’s hard to conceive of a more serious design flaw than confirmation bias. Imagine, Mercier and Sperber suggest, a mouse that thinks the way we do. Such a mouse, “bent on confirming its belief that there are no cats around,” would soon be dinner. To the extent that confirmation bias leads people to dismiss evidence of new or underappreciated threats—the human equivalent of the cat around the corner—it’s a trait that should have been selected against. The fact that both we and it survive, Mercier and Sperber argue, proves that it must have some adaptive function, and that function, they maintain, is related to our “hypersociability.””

This led Mercier and Sperber to realize that, since we live in societies and often think collectively, social interaction might cancel out many of our biases. The fact is, all of us are convinced that our own ideas and positions are correct, moral, and true. We find it nearly impossible to find fault with the beliefs we have created inside our own heads. In short, we are almost certainly wrong about much of what we have figured out in isolation. But, very interestingly, we are very, very good at finding mistakes, errors, and biases in everyone else’s thinking. And even stranger, if our own ideas are presented to us as belonging to someone else, we can pick them apart very easily. Endless experiments have demonstrated all of this pretty conclusively.

This asymmetry, the fact that we cannot see our own biases but have a very acute ability to see the biases of others, helped Mercier and Sperber understand what human reason really evolved to do.

“This lopsidedness, according to Mercier and Sperber, reflects the task that reason evolved to perform, which is to prevent us from getting screwed by the other members of our group.   [Those nasty freeloaders.]  Living in small bands of hunter-gatherers, our ancestors were primarily concerned with their social standing, and with making sure that they weren’t the ones risking their lives on the hunt while others loafed around in the cave. There was little advantage in reasoning clearly, while much was to be gained from winning arguments.”

So, say Mercier and Sperber, our ancestors didn’t evolve to be great logical thinkers. Reasoning built for winning arguments worked well enough in prehistoric times, but it doesn’t work so well in the modern world. They say that “this is one of many cases in which the environment changed too quickly for natural selection to catch up.” This, Kolbert implies, is part of the reason we are now stuck with Trump.

Mercier and Sperber are not the only ones thinking along these lines.  “Steven Sloman, a professor at Brown, and Philip Fernbach, a professor at the University of Colorado, are also cognitive scientists. They, too, believe sociability is the key to how the human mind functions or, perhaps more pertinently, malfunctions. They begin their book, “The Knowledge Illusion: Why We Never Think Alone”, with a look at toilets.”

Sloman and Fernbach say that everyone is familiar with toilets and with all kinds of other technology in our modern world, from jet aircraft to iPhones. But almost no one knows how these things actually work (toilets, it turns out, are more complicated than they appear), and people think they know far more than they actually do. Sloman and Fernbach call this the “illusion of explanatory depth,” and they see this bias just about everywhere.

“What allows us to persist in this belief is other people. In the case of my toilet, someone else designed it so that I can operate it easily. This is something humans are very good at. We’ve been relying on one another’s expertise ever since we figured out how to hunt together, which was probably a key development in our evolutionary history. So well do we collaborate,” Sloman and Fernbach argue, “that we can hardly tell where our own understanding ends and others’ begins.” But, unfortunately, everyone thinks they understand everything. They say we evolved to be this way.

In essence, they are saying that there is no sharp dividing line between one person’s knowledge and beliefs and those of another person. This borderlessness has a downside, though. As people created new knowledge and new ways of living, they simultaneously created new forms of ignorance. If everyone had to understand everything about everything, the human race would never have made any progress. “If everyone had insisted on, say, mastering the principles of metalworking before picking up a knife, the Bronze Age wouldn’t have amounted to much. When it comes to new technologies [and progress], incomplete understanding is empowering.”

Where this gets us in trouble nowadays is with politics. “It’s one thing for me to flush a toilet without knowing how it operates, and another for me to favor (or oppose) an immigration ban without knowing what I’m talking about.” Many studies have shown that the less we know about an issue, the more adamant our opinions about it tend to be. For instance, when people were asked how the US should react to the Russian annexation of Crimea and, at the same time, to point out Ukraine on an unlabeled map, the farther off they were from its actual location, the more favorable they were to US military intervention. The less they knew, the more extreme their opinion.

The moral of this, Sloman and Fernbach say, is that “as a rule, strong feelings about issues do not emerge from deep understanding.” They say that the more we understand things for ourselves, the more moderate our views and positions become. They “see in this result a little candle for a dark world. If we—or our friends or the pundits on CNN—spent less time pontificating and more time trying to work through the implications of policy proposals, we’d realize how clueless we are and moderate our views. This, they write, ‘may be the only form of thinking that will shatter the illusion of explanatory depth and change people’s attitudes.’”

At this point Elizabeth Kolbert turns to a third book and considers whether science might help poor human beings out of this dilemma. “In “Denying to the Grave: Why We Ignore the Facts That Will Save Us,” Jack Gorman, a psychiatrist, and his daughter, Sara Gorman, a public-health specialist, probe the gap between what science tells us and what we tell ourselves.”

Immunization is one of the great triumphs of science. Yet many people who have no understanding of the science of immunization firmly believe that it is dangerous and that it will give their children autism and all sorts of other maladies. And nothing will convince them that this is not true, that there is not a scrap of evidence for a connection between immunizations and autism.

“The Gormans, [like the authors of the other books in this article], argue that ways of thinking that now seem self-destructive must at some point have been adaptive. And they, too, dedicate many pages to confirmation bias, which, they claim, has a physiological component. They cite research suggesting that people experience genuine pleasure—a rush of dopamine—when processing information that supports their beliefs. “It feels good to ‘stick to our guns’ even if we are wrong,” they observe.”

Unfortunately, says Kolbert, giving people accurate information just doesn’t seem to help; they simply ignore it. The challenge, the Gormans say toward the end of their book, “is to figure out how to address the tendencies that lead to false scientific belief.” So far, no one seems to have come up with a way to do this.

Elizabeth Kolbert ends her essay by saying:

““The Enigma of Reason,” “The Knowledge Illusion,” and “Denying to the Grave” were all written before the November election. And yet they anticipate Kellyanne Conway and the rise of “alternative facts.” These days, it can feel as if the entire country has been given over to a vast psychological experiment being run either by no one or by Steve Bannon. Rational agents would be able to think their way to a solution. But, on this matter, the literature is not reassuring.”

Post by Fred Hanselmann

The Tetons and the Snake River at Sunset
