Cognitive Science · Human Rationality · Human Thinking

“Thinking, Fast and Slow,” by Daniel Kahneman

The following article is about the book Thinking, Fast and Slow, by Daniel Kahneman.  In my opinion this book is one of the most important books of the last 50 years.  Kahneman has been thinking about and refining the ideas in this book since the 1970s.  The basic ideas of this book have been recapitulated in Michael Lewis’s book, The Undoing Project: A Friendship That Changed Our Minds.

This post is still in fairly rough shape, to say the least.  It is basically just a set of notes that I took on the book when I read it.  You might be able to get something out of these notes with a bit of struggle though.  I apologize.   I’ll get back to whipping this post into more readable shape just as soon as I can find the time to do it.  This blog is new, and there is currently more work than I can deal with.  Thanks for your patience.

Chapter 1, The Characters of the Story.

Kahneman says there are two different types of human thinking: System 1 thinking and System 2 thinking.

The number 2 system of thinking is deliberate, effortful, orderly, and conscious. Other animals have only number 1 systems. The number 2 system requires attention, and you have an attention budget. Number 1 still sees illusions even when #2 sees through them. And there are not only visual illusions; there are many cognitive illusions out there. For example, psychopaths often fool us. This is a cognitive illusion: #2 says we shouldn’t believe what psychopaths say, but our #1 is easily fooled by their apparent niceness, physical hints, and body language. We become certain that psychopaths are nice people.

System 1 cannot be turned off. Lines still look different lengths even when we know they are the same; this is a #1 illusion. Unfortunately not all illusions are visual: there are also cognitive illusions. However, no one can live in the number 2 system only; we couldn’t survive without our #1 system. It’s always very easy to see other people’s biases but almost impossible to see our own. In actuality the number 1 and number 2 systems are fictional characters that help us understand how the brain works. It’s really important to remember that we all have limited working memory, and when it runs out we shift over to #1 without even realizing it.

 

Chapter 2, Attention and Effort

The number 2 system requires paying attention, and this takes more effort than the automatic #1 system, so we tend to prefer the #1 system to the #2 system. We have an attention budget; we can only pay so much attention to things, and then #1 takes over. The #1 system also takes over in emergencies. Talent and experience actually lead to less #2 usage: very experienced people feel they don’t need to pay attention; they can do what they do well without paying attention. People tend to gravitate toward easy options. Laziness is built into human nature. Number 2 can deal with two ideas at once, and number 2 can deal with statistics. When the number 2 system is working, usually everything else is excluded; we often don’t see or hear what is going on around us. Number 1 intuition must be examined by number 2 if we are to avoid bias.

Chapter 3, The Lazy Controller.

Number 2 thinking has a natural speed. You can stroll and use easy #2, but hard walking and number 2 are incompatible: you cannot use number 2 thinking while walking hard. Any self-control requires the number 2 system. What is often called flow is effortless number 2 concentration; flow requires no self-control. Experiencing flow is very rare. Few people can effortlessly use their number 2, though a lucky few can. Self-control and cognition usually mean hard, difficult, unfun mental work. Number 1 has more influence when number 2 is busy. Any effortful cognition is physical work, and mental energy goes down. Self-control means forcing yourself to do anything, and this is effortful. Self-control leads to ego depletion, and ego depletion leads to loss of self-control. Ego depletion is not the same thing as cognitive busyness. Hard concentration leads to large glucose usage. You can fix ego depletion by eating a candy bar; this is simply a glucose increase. Lack of glucose leaves no number 2 to control the suggestions of number 1.

 

Here is an example of how the human mind really doesn’t like to work.

A bat and a ball cost $1.10 together.

The bat costs $1 more than the ball.

How much does the ball cost?

The number 1 answer is $0.10.

The number 2 answer is $0.05.

 

The number 1 brain doesn’t like to work hard, and it quickly comes up with $0.10. #2, however, requires going back and checking your answer. You always have to check on what #1 tells you. Overconfidence in number 1 is universal.
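Here is the #2 check, sketched in Python (my own sketch, obviously, not the book’s): the arithmetic that #1 skips takes two lines.

```python
# The bat costs $1.00 more than the ball; together they cost $1.10.
def total(ball):
    bat = ball + 1.00      # "the bat costs $1 more than the ball"
    return bat + ball

print(round(total(0.10), 2))  # 1.2  -- the quick #1 answer fails the check
print(round(total(0.05), 2))  # 1.1  -- the #2 answer passes
```

Going back and plugging the answer into the question is exactly the effortful step a lazy #2 tends to skip.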

Intelligence is also the ability to remember. But memory is number 1, while searching memory for the right answer is number 2.

Some people have a very lazy number 2 system. Some people have a very weak number 2 system.

Chapter Four, Your Associative Machine.

Number 1 responses are caused by associative activation. Here’s a phrase: “Bananas Vomit.” This leads us to a number 1 reaction. Cognition is embodied; we think with our body. If someone says “Bananas Vomit” we get a bodily reaction. Associative memory happens very rapidly; one idea activates many others. Most of associative memory is subconscious. The workings of the mind are mostly subconscious. Number 1 is always subconscious.

The Priming Effect. What we recently did primes us. Priming is a ripple of associations, words, ideas, actions, emotions. We can all be primed.

When we look at words with “old” connotations we act old.  When we are amused we often smile, when we smile this makes us feel amused. Facial expressions match your moods. Facial expressions can cause moods and moods can cause facial expressions. When we nod we tend to agree. When we shake our head we tend to disagree. If we tell ourselves “Remain calm and carry on” we tend to act calm and carry on.  

Priming indicates that humans really don’t have much in the way of free will. For instance, the kind of polling station in which you vote influences your vote. When people see money or are reminded of money, they tend to act more independent, self-reliant, selfish, and separate. Living in a money-conscious culture has large influences on the way we act: surrounded by advertisements and all sorts of things that are about money, we act in certain ways that we aren’t even conscious of. Violence, God, money, respect, leaders, all prime us for different things in different cultures. Then there is the Lady Macbeth effect: people who lie tend to buy mouthwash, and people who tell untruths in emails tend to want to buy soap to wash their hands.

Our number two believes it is in charge but in fact it is not. Number two believes it has free choice and thinks it knows why it does things but it does not. This is true about everyone including me too.

Priming happens in number 1, and number 1 is subconscious. People do what they are primed to do. Violent, depressing news leads to a belief that the world is bad. You are primed by whatever you pay attention to. Your number 1 system does all of this.

 

Chapter 5, Cognitive Ease.

When we feel cognitive ease, it feels like we have no worries at all. When we feel strained, it feels like there are problems that will need number 2. Ease feels good, true, familiar, easy. The feeling of familiarity comes from #1, and number 2 uses familiarity as a test for truth. Ease leads to things feeling true, but this is unfortunately a bias. Repetition leads to believing that whatever is repeated is true. We tend to think that things we read in clear, bold type, or in red type, are true. We tend not to believe things that are wordy or written in an erudite style. We think that rhyming words are true: when we hear something like “Woes unite foes,” we tend to believe it. Words that are easy to pronounce sound true, so we believe them. #2 is lazy, so we mostly let #1 guide us.

Cognitive ease leads to acceptance of most anything, while cognitive strain leads to using #2. If the things we are reading are in hard-to-read print, we tend to use number 2 and are less likely to simply accept them as true. If our mind is at ease, this puts a smile on our faces. Words that are easily pronounced feel familiar, and this puts a smile on our faces.

The things that we have more exposure to, we become more familiar with; and the things we are familiar with, we are favorable to. This is called the mere exposure effect. All this is completely subconscious. Number 1 responds before number 2 is even aware of anything happening.

Sarnoff Mednick used linked words like Cottage – Swiss – Cake. The more at ease we are, the more intuition we have about the linkages between words. When we are in a happy mood we see more linkages; when we are in an unhappy mood we have zero intuition of linkages. Moods affect how good our number 1 is. When number 1 is happy we are intuitive, creative, and gullible. Being sad, vigilant, suspicious, and analytic, on the other hand, goes with #2; people who use their number 2 a lot tend to be this way. We tend to associate cognitive ease with pleasure. Happy, good moods lead to high intuition. Being happy and in a good mood tends to weaken #2.

 

Chapter 6, Norms, Surprises, Causes.

The associative system is the number one system. The associative system is automatic. Consciousness is the number 2 system. Number one is both wonderful and limited. The main things that number one does:  Number one shows us what is normal. We think linked associations are all normal. Normal basically means no surprises.

Number 1 makes repeated surprises seem normal. Then there is the Moses illusion: asked “How many animals of each kind did Moses take into the ark?”, most people answer without noticing it was Noah, not Moses. The biblical context of “ark” makes Moses seem normal here, so we subconsciously accept Moses. Our number 1 system links Moses and Noah. Our number 1 system detects abnormalities and violations of normality.

We have norms for a vast number of categories. Number 1 understands language normally. Number 1 understands normal cause and effect. Number 1 offers normal interpretations to number 2. #1 automatically searches for causality. We need coherence in our life and in the world. Number 1 finds coherent causes, associates words with causes, and invents probable causes. Cause links words and ideas. Number 1 sees a sequence of events as cause and effect; it has the impression of causality. In one example, number 1 sees a large triangle as a bullying person and a small triangle as a timid person. We tend to see intention and emotion everywhere; this is universal. Only autistic people don’t do this. Number 1 invents stuff out of nothing. Paul Bloom wrote an article in the Atlantic in 2005 where he shows that people invent causality and this leads to inventing religions. Number 1 is desperate to find causes. (Location 1263)

#1 sees causation wrongly. Statistics is the only real way to see causation. Very little repetition is needed to make a new experience feel normal.

Humans cannot accept random chance. They need a causal story; they cannot accept randomness. See Nassim Taleb’s Fooled by Randomness.

 

Chapter 7, A Machine for Jumping to Conclusions.

Jumping to conclusions is efficient if conclusions are likely to be correct and the costs of error are acceptable.

We jump to conclusions when we see the ambiguous letters and numbers in the book’s figure. (This is a good experiment; check the book to see it.) This is our System 1 making errors. Recent events and current context determine interpretation. Number 2 is not even aware that number 1 is doing this. Conscious doubt is not part of number 1; uncertainty and doubt are number 2 things.

Daniel Gilbert wrote a book called Stumbling on Happiness. Gilbert says we must believe before we can understand. Belief comes from the number 1 system and is automatic. Un-believing, though, is a number 2 operation.

Associative memory leads to confirmation bias. Science says we test ideas by trying to refute them, that is, by falsifying them, not by confirming them. Popper says you can test only falsifiable hypotheses.

The halo effect: this is the tendency to either like or dislike everything about a person. First impressions influence all following impressions. The context of things is very important; we are influenced by the context of a sentence. Both of the above are halo effects, and both are associative effects. The sequence of observations is often determined by chance. The halo effect increases the weight of the first impression.

Surowiecki, in his book The Wisdom of Crowds, says all estimators have to be unbiased and independent for the wisdom of crowds to work. You can’t allow the estimators to influence each other. Associated observers influence each other; thus meetings are not good: everyone influences everyone else, and bias is shared. Get brief independent summaries of people’s feelings and ideas before meetings. If you do this, then the wisdom of crowds works.
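Here is a quick simulation of Surowiecki’s point (my own sketch; all the numbers are invented): independent, unbiased guesses average out to the truth, while a bias shared by everyone in the room does not average away.

```python
# Averaging many independent, unbiased guesses converges on the truth;
# a shared bias (everyone nudged the same way, as in a meeting) does not.
import random

random.seed(1)
truth = 100.0

independent = [truth + random.gauss(0, 20) for _ in range(1000)]
shared_bias = [truth + 15 + random.gauss(0, 20) for _ in range(1000)]

print(round(sum(independent) / len(independent), 1))  # close to 100
print(round(sum(shared_bias) / len(shared_bias), 1))  # stuck near 115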

The number 1 mind is the associative mind. The associative machine represents only active ideas, stuff you have dredged up from memory, and it acts as if this is all there is. In the #1 system, coherence between memories equals success; the amount and quality of the data is totally irrelevant. When information is scarce, the number 1 mind jumps to conclusions.

A coherence-seeking number 1 and a lazy number 2 is a bad combination. Knowing only a little makes coherence easier, leading to feelings of ease. WYSIATI (“What You See Is All There Is”) leads to overconfidence and framing effects. What is really needed is looking at the base rate.

Chapter 8, How Judgments Happen.

Number 1 is at work all the time, monitoring what is going on inside and outside the mind and generating assessments of the situation. It does this effortlessly. These basic assessments are often substituted for more difficult questions; this is the essential idea of the heuristics-and-biases approach, and it happens mostly subconsciously. Stuff looks true to us, but it may be very far from the truth. System 1 evolved in us to help us survive. It is a continual assessment of the main problems we must solve to survive: Are there threats out there? Opportunities? Is this normal? Safe? Familiar? None of this has been turned off since we left the savannah.

At a glance we can tell friend from foe, trustworthy or not. Research shows voters do this: we substitute how trustworthy a face looks to us for a real number 2 assessment of a politician’s competence. This is a heuristic. We substitute a number 1 judgment for a number 2 judgment and don’t even know we have done it. Number 1 does not do well adding things up, and emotion will almost always swamp the number 1 system. Questions of intensity (“more”) are understood by number 1 easily, and number 1 can switch from one scale to a different scale, though this matching is statistically incorrect. System 1 can carry out many computations at one time: we compute 3D shape, position, identity, speed, etcetera, all subconsciously. Other assessments are done only when needed.

We often do excess computation. This is called the mental shotgun: the intention to do one computation evokes other computations. Evaluating people as attractive is something number 1 does automatically, whether we want to or not.

 

Chapter 9, Answering an Easier Question.

We are never stumped; we always have lots of answers and opinions about all kinds of complex stuff we really don’t understand at all. Number 1 does this: it will find a related question and answer it. This is called substitution. Number 1 answers complex questions by substituting simpler questions and answering those. When called on to judge probability, people judge something else and think they have judged probability. This happens because of the mental shotgun. Number 2 is lazy, so number 1 just answers an easy question.

Intensity matching: “How much would you spend to stop climate change” becomes “I really am angry about the death of polar bears.”

The 3D heuristic: your perceptual system is easily fooled by a 3D picture. There is a picture of men in a tunnel shown in the book (page 100; it is a very good illusion, look at it). When we see perspective on paper we are always fooled. Substitution occurs: we substitute 3D size for 2D size. This happens automatically and subconsciously; even when we know what is happening we still see the illusion. Objects drawn as distant appear larger to us on the 2D page. This happens so deep in the perceptual system we can’t help it.

The order in which two questions are asked influences how you answer them.

Also, your present mood determines if you think you are happy in general or not.

Affect heuristic: people let their likes and dislikes and emotions determine their beliefs about the world. Current emotion influences decisions. You believe arguments whose conclusions you like, without really examining the arguments themselves. System 2, when it comes to stuff we like and dislike, is not a very good controller; it usually just goes along with what number 1 feels and tends to confirm what number 1 says. Number 2 is pretty undemanding.

On page 105 there is a good list of characteristics of the number 1 system, what it does, and how it works.

 

Chapter 10, The Law of Small Numbers.

Kidney cancer incidence is lowest in rural areas. So we infer that clean living and low pollution keep the incidence of kidney cancer very low. However, if you look deeper, the areas where kidney cancer incidence is highest are also rural. The same kind of country is both high and low for kidney cancer. The real answer is simply that rural areas have small populations. Rural or urban doesn’t cause cancer; it’s just that statistics don’t work well with small samples.

Collections of random events behave in a very regular fashion. One of the rules of statistics is that small samples yield extreme results more often than large samples do, and there are mathematical laws that predict exactly how this happens. Small samples don’t cause or prevent cancer; they just allow the measured incidence of cancer to be much higher or lower than in a huge sample. The incidence of cancer is not actually higher or lower; it just appears to be, because of an accident of sampling. These things are called artifacts: results caused by some aspect of the method of research. Knowing is not a yes-or-no affair. Understanding sampling effects is very important.
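You can see the law of small numbers with a quick simulation (my own sketch): flip a fair coin in small vs. large “counties” and count how often the heads rate looks extreme.

```python
# Small samples produce extreme-looking rates far more often than large ones,
# even though the underlying process (a fair coin) is identical.
import random

random.seed(0)

def extreme_rate(sample_size, trials=2000):
    """Fraction of samples whose heads rate is above 70% or below 30%."""
    extreme = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        rate = heads / sample_size
        if rate > 0.7 or rate < 0.3:
            extreme += 1
    return extreme / trials

print(extreme_rate(10))    # small samples: extreme results are common (~10%)
print(extreme_rate(1000))  # large samples: extreme results essentially vanish
```

The rural counties in the kidney-cancer story are the ten-flip samples: they land at both extremes simply because they are small.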

System 1 believes all the way or not at all; number 1 does not like doubt or ambiguity. Number 2 can deal with doubt and uncertainty. Certainty is very easy for the number 1 system, so we naturally favor certainty over doubt. This is behind the law of small numbers. We also exaggerate the truth of what we see; this is similar to the halo effect. #1 runs ahead of the facts.

Cause and chance

The associative machinery seeks causes. Chance means there are no causes. Statistics deals with what could have happened instead; it does not deal with causes. Statistics deals with chance and the probability of certain things happening.

We are hard wired for causal thinking.   This causes us to make mistakes in evaluating the randomness of truly random events. Humans are pattern seekers. We don’t want things to be random. We are believers in a coherent world that makes sense. We want to believe in mechanical causality or in someone’s intention. Causality and intentionality and pattern and regularity are what human beings want. These are the opposite of randomness.  The ability to look for patterns of danger helped us to survive.

Misunderstanding of randomness has consequences. People see patterns where none exist. Randomness appears in clusters. Randomness is basically blind luck.

The so-called hot hand in basketball is a cognitive illusion: hot-hand shooting streaks pass all tests for randomness, and you can also figure the odds for this. The illusion of pattern affects our lives in many ways. Actually, if you follow your intuition that there are hidden causes, you will err more often than if you had believed in randomness. In general we pay more attention to the content of a message than to the reliability of the message. We like simple, highly coherent messages, but the actual world is more complex. Statistics seem to beg for causal explanation, but causal explanation of chance, random events is always wrong. This knowledge is fairly new.
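The hot-hand point is easy to demonstrate with a simulation (again my own sketch, not the book’s): a purely random 50% shooter still produces long streaks, because randomness comes in clusters.

```python
# A 50% shooter taking 100 independent shots: how long is the longest
# "hot" streak of consecutive makes, purely by chance?
import random

random.seed(2)

def longest_streak(n=100, p=0.5):
    best = cur = 0
    for _ in range(n):
        if random.random() < p:
            cur += 1
            best = max(best, cur)
        else:
            cur = 0
    return best

runs = [longest_streak() for _ in range(1000)]
print(sum(runs) / len(runs))  # averages around 6 makes in a row, from pure chance
```

A six-shot streak feels like a hot hand, but blind chance produces it routinely; that is why intuiting hidden causes makes you err more often than believing in randomness.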

 

Chapter 11, Anchoring

The anchoring effect is huge. When estimating numbers, people guess a number close to the last number they considered. What you pay for a house depends on the asking price. There is a huge amount of evidence for this. Anchoring happens via two mechanisms, number 1 and number 2.

When we are aware of an anchor we can move away from it and adjust our answer with number 2. Or the anchor can be a subconscious number 1 priming effect; this is a case of suggestion. Suggest a person’s leg is numb and it will feel numb.

We test facts by trying to make them true. But even if we reject an idea, we still have the feelings in our minds and these can create a whole series of number 1 errors. This is priming, associative coherence.

The anchoring index: you can actually measure the effect of anchoring. The asking price of a house radically influences its selling price. This point is also discussed by Dan Ariely in his book Predictably Irrational. In one experiment, the anchoring index was 41%. Professional realtors deny that anchoring like this actually happens. Even obviously random anchors work just as well as meaningful anchors, and this influences everyone, no matter how experienced or educated they are. Anchoring effects are everywhere. Here is an example: you could make a sign that says “Sale: limit of 12,” or one that says “No limit.” People will buy many more if there is a higher limit; that number gets anchored in their minds.
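The anchoring index, as the chapter describes it, is the difference between the average estimates of the high-anchor and low-anchor groups divided by the difference between the anchors. Here is a sketch of the calculation; the house prices and estimates below are made-up illustrative numbers, not the ones from the realtor study.

```python
# Anchoring index = (difference between group mean estimates)
#                   / (difference between the anchors).
def anchoring_index(high_anchor, low_anchor, mean_est_high, mean_est_low):
    return (mean_est_high - mean_est_low) / (high_anchor - low_anchor)

# Hypothetical: the same house shown with a $200k vs. a $160k asking price,
# and the average appraisals each group then produced.
idx = anchoring_index(200_000, 160_000, 186_000, 169_600)
print(round(idx * 100))  # 41 -- an index like the one reported for the realtors
```

An index of 0% would mean the anchor was ignored; 100% would mean estimates moved in lockstep with the anchor.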

Even System 2 is affected by anchoring. Number 2 depends on memory, and memory is an automatic, involuntary, subconscious operation of number 1. We are not in rational control of what memories come to mind, of what pops up. We just remember the last number we heard and we apply it. Example: if you ask, “Did Gandhi die at age 144? If not, when did he die?”, the effect of mentioning 144 can be measured; the mention of 144 primes your answer. Another example: we remember a story of a dramatic mountain rescue, and whether it comes from a dumb movie or an accurate book, it still affects us. Even random anchors (for example, numbers that come from a wheel of fortune in experiments) have a huge effect on us, and random anchors carry zero information. We associate with all kinds of dumb things without even knowing it. We are influenced by the environment of the moment without realizing it. Priming and anchoring are scary things: both affect us in ways we don’t understand. We are never aware of priming happening; we are aware of anchors but don’t realize they are affecting our judgment. We have to assume that any number on the table has an effect on us. The only way to deal with this is to mobilize your number 2 system. We should never anchor on a best or worst case scenario.

Chapter 12, The Science of Availability

The availability heuristic: when it is easy to remember examples of something, we think these things are common, i.e., that there are a lot of them in the real world. Easy memory of things equals, in our minds, lots of things, high frequency. Things seem frequent when instances are easy to remember; ease of recall equals frequent. Both number 1 and number 2 are involved in this.

The availability heuristic substitutes one question for another: you are asked about size or frequency but substitute ease of recall. This leads to error.

Salient, sexy, scandalous events are easy to remember and you can think of examples quickly. Thus you think these things are frequent, that they happen often. Anything with lots of media attention we think is very frequent:  terrorism, murder, car accidents.

Personal experience is easier to remember than words, statistics, or other people’s experiences. You remember what happens to you much better than a newspaper article.

The more easily something comes to mind, the more frequent you think it is. We don’t get a flu shot because no one we know got the flu last year. Awareness of your own biases leads to better marriages and other joint projects. We remember the good things we do better than the good things our spouses do. When husbands and wives were asked what percentage they each contribute to tidiness, the husband’s percent plus the wife’s percent always came to more than 100%.

Kahneman is not optimistic about personal control of bias, but he thinks we can control availability bias.

Experiment: ask people if they are meek or assertive. People thought they were assertive when asked for 6 examples of their assertiveness, because it was easy to come up with 6 examples. When asked for 12 examples, it was hard to find that many, so they thought they were meek. [Rationality has very, very little to do with how people make decisions.]

People who deliberately frown work harder when looking for assertive examples; these people thought they were not very assertive. Any time people have to work hard at thinking, they are less convinced something is true.

But sometimes this isn’t true: sometimes people go by content rather than ease of retrieval. People are surprised by how hard it is to find examples 7 through 12. If this surprise is explained away, then they don’t use the ease-of-retrieval heuristic: if they are told distraction is the reason they can’t think of more examples of assertiveness, they go by the content instead and don’t conclude they are meek.

People rely more on #1 and on intuition if they are powerful or are made to feel powerful. George W. Bush said he just had to know how he felt to make a decision.

Here are some examples of the availability heuristic: Two planes crashed last month, so she now takes the train; the risk really hasn’t changed, though. This is just silly thinking. Or: he underestimates the risk of indoor pollution because there are few articles about it. Or: she watches spy movies and now sees conspiracy everywhere. Or: the CEO has had several successes and is now overconfident.

Here are some examples of the availability heuristic from my own life.

  • Ron watches murder and mayhem and now sees murder as a real danger.
  • Ron watches violent news and thinks we live in a violent society.
  • Trump supporters are told America is falling apart, and they watch easy-to-understand news programs, so now they see decline everywhere.
  • We see as real what we were exposed to recently or what is easy to understand.  We don’t see as real anything that takes hard thinking to figure out.

 

Chapter 13, Availability, Emotion, and Risk.

Problems of human judgment of risk: Kunreuther studied risk and insurance. People buy insurance after disasters; then memory dims and diligence drops. Public perception of risk depends on availability. People have no idea of the true causes of most deaths; statistics is the only way to know true causes. Ideas about causes of death are warped by media coverage, and media coverage is biased toward novelty, poignancy, sensationalism, and drama. Our ideas of what the world is like are very inaccurate; they are mostly associative bias and substitution. Scary thoughts and images come very easily to us. The affect heuristic comes from emotions: things we strongly hate, fear, or love. We react to these things unconsciously. People make judgments and decisions by consulting their emotions. We subconsciously avoid what we fear and are attracted by what we love. This is also substitution, where we answer hard questions with the answers to easy questions. This relates to Damasio’s idea that emotion is directly involved in decision-making.

The affect heuristic skews risk/benefit analysis. When people hate something they can see only huge risks and small benefits. Jonathan Haidt says “the emotional tail wags the rational dog.” “The affect heuristic simplifies our lives by creating a world that is much tidier than reality.”

[Confirmation bias: You only look at evidence that confirms an opinion you already hold.]

[Humans are everything but rational.  We have always known this.  However, in the past, the fix was to consciously avoid emotion and consciously use formal logic and education to avoid pitfalls.  This kind of works, but the real problem is that biases are subconscious.  We don’t know they are happening; we don’t know they are affecting us.  Biases affect the most highly educated and those who consciously struggle to be rational.  Biases are like invisible, undetectable poison gas.]

Paul Slovic knows a lot about the problems of human judgment and risk. Slovic says risk assessment is associated with power: for instance, the risk of toxic air contamination can be measured as deaths per dollar of production, and this expert risk assessment may be wrong. Risk is subjective; it depends on who is doing the study. Cass Sunstein disagrees. He says experts always judge risks better than the public, and that rational weighing of costs and benefits should determine risk assessment. He says the number of lives and the money saved are what is important, and that these can be measured objectively. He says law-makers and regulators are overly concerned with the irrational concerns of citizens.

Sunstein and Timur Kuran invented a name for how biases flow into policy: the availability cascade. This is a self-sustaining chain of events. Media concern over a small risk is often deliberately magnified by those who work to ensure a continuous flow of worrying news. The danger and risks are increasingly exaggerated, and those who insist the danger is exaggerated are accused of cover-ups. Thus the availability cascade resets priorities. Love Canal is a good example.

This illustrates the fact that the human mind cannot deal with very small risks rationally: we either ignore them totally or give them far too much weight. Worry about children is like this; we cannot prevent images of disaster from coming to mind even when the chances are minimal. Risk is presented as a fraction with a numerator (the part) and a denominator (the whole); we worry about the numerator and forget about the denominator. Sunstein called this probability neglect. Nowadays terrorists are best at inducing availability cascades. Deaths from terrorism are tiny compared to all deaths; traffic deaths are almost always higher. The difference in reaction is due to the different availability of the two risks. Terrorism speaks directly to number 1.

Both the Slovic and the Sunstein views are correct. Government must protect the public from fear even when it is irrational.

 

Chapter 14, Tom W’s Specialty

Base rate: in a jar of marbles, the proportion of red marbles to total marbles is the red base rate. In guessing which marble is most likely to be chosen, you should always think about the base rate. If there are more red marbles, guess red and ignore all other factors.

If you are trying to guess the major of a student, you should always guess the major that has the largest number of students, and ignore any hints about the student’s character and likes and dislikes. Most people get this wrong: they try to match the student’s description with a major whose students seem similar to the student.
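The base-rate logic is just counting. Here is a minimal sketch (the enrollment numbers are made up for illustration):

```python
# With no reliable individual evidence, the major with the largest
# enrollment is the best guess, whatever the student "seems like".
enrollment = {
    "business": 4000,
    "humanities": 2500,
    "library science": 50,
}
total = sum(enrollment.values())

# The base rate: probability a randomly chosen student is in each major.
for major, n in enrollment.items():
    print(major, round(n / total, 3))

best_guess = max(enrollment, key=enrollment.get)
print(best_guess)  # business -- the base-rate answer, stereotype or not
```

A description that “sounds like” a library-science student would have to be extremely diagnostic to overcome an 80-to-1 base-rate disadvantage.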

Nobody except statisticians, and not always even statisticians, looks at base rates when making predictions. People just conform to stereotypes; they substitute stereotypes for statistical probability. We tend to answer the easier question.

Non-statisticians think that the word probability equals resemblance. When people are asked to judge probability, they want to answer an easier question, so they revert to stereotypes. This is the representativeness heuristic. They tend to think that orderliness represents librarians, or maybe accountants or computer nerds, while academics, dentists, and lawyers are not, by stereotype, represented by tattoos. Prediction by representativeness is a very common bias, but it is highly inaccurate. System #1 generates an impression of similarity subconsciously. Michael Lewis's Moneyball is about how inaccurate prediction by representativeness is: statistics on actual performance work much better than representative stereotypes.

Prediction by representativeness works better than chance, i.e., better than random guessing. Stereotypes are often correct. Unfortunately, they are also often wrong, especially when base rates are involved.

Here are some sins of representativeness:

Predicting the occurrence of low-base-rate events. Example: a subway rider is reading the New York Times. Is it more likely that a) she has a Ph.D., or b) she has no college degree? The base rate chooses b), and this is correct: there are few Ph.D.s on the subway and many riders with no college degree.

Always look at the base rate. Representativeness is very misleading.

People ignore base rates out of both laziness and ignorance.

Frowning activates the #2 system. This means wrinkling the brow, pursing the lips, and looking over one's glasses.

A second sin of representativeness: insensitivity to the quality of evidence. If you doubt the quality of the evidence, stick close to the base rate.

How to discipline intuition: intuition is not trustworthy on its own. "To be useful, your beliefs should be constrained by the logic of probability."

Bayesian statistics says that base rates matter even in the presence of other evidence. It is also important to remember that the new, intuitive evidence you feed into a Bayesian calculation may itself be biased.
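Here is a minimal sketch of that Bayesian point, using made-up numbers for a Tom W-style question (what is the chance a student is in computer science, given a "nerdy" description?). The 3% base rate and the 4-to-1 likelihood ratio are assumptions for illustration, not figures from the book:

```python
def posterior(prior, p_evidence_if_true, p_evidence_if_false):
    """Bayes' rule for a yes/no hypothesis: P(H | E)."""
    num = prior * p_evidence_if_true
    return num / (num + (1 - prior) * p_evidence_if_false)

# Base rate: assume only 3% of students are in computer science.
# Assume the description fits a CS student 4x better than anyone else.
p = posterior(0.03, p_evidence_if_true=0.8, p_evidence_if_false=0.2)
print(round(p, 2))  # -> 0.11
```

Even evidence that favors the stereotype four to one cannot overcome a 3% base rate; the posterior is still only about 11%.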


Chapter 15, Linda: Less is More


When you specify an event in greater detail, you can only lower its probability. If set 2 is included in set 1, then set 1 has to be at least as large, by definition, and thus it is more likely: it has a greater probability of being true or of happening.

Here is an example. Is a person more likely to be a bank teller or a feminist bank teller? Obviously there are many more tellers than feminist tellers, so the former is more probable. It is just like marbles in a jar: if there are twice as many red marbles as white marbles, it is more probable that a red marble will be randomly drawn. This sets up a conflict between the intuition of representativeness and the logic of probability. Intuition is much easier than logic, and if you pit representativeness against logic, the easy intuitive representativeness always wins in experiments. This is a failure of system #2.
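The conjunction rule itself is one line of arithmetic. With made-up proportions (these shares are invented just to state the rule concretely):

```python
# Made-up proportions, just to state the conjunction rule concretely.
p_teller = 0.10                  # share of people who are bank tellers
p_feminist_given_teller = 0.20   # share of tellers who are feminists

# P(teller AND feminist) = P(teller) * P(feminist | teller)
p_feminist_teller = p_teller * p_feminist_given_teller
print(p_feminist_teller <= p_teller)  # -> True, always
```

Since a conditional probability is never greater than 1, the conjunction can never be more probable than either of its parts.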

Here is the short version of the Linda test:

Which is more probable:

Linda is a bank teller.

Linda is a bank teller and is active in the feminist movement.

In the short version, most people see that "Linda is a bank teller" is the more probable statement.

Here is the longer version of the  Linda test:

Linda is 31 years old, single, outspoken and very bright. She majored in philosophy. As a student she was deeply concerned with issues of discrimination and social justice, and she also participated in antinuclear demonstrations. Which is more probable: 1) Linda is a bank teller, or 2) Linda is a bank teller and is active in the feminist movement?

Almost everyone gets this longer version wrong and chooses that Linda is a bank teller and is active in the feminist movement. 89% of undergraduates get it wrong, and so do 85% of Ph.D. students specializing in statistics, probability, and decision-making. Kind of discouraging.

This is called the conjunction fallacy. A conjunction of two events, like bank teller and feminist, seems to system #1 more probable than just one of those events alone.

The Linda fallacy is like the Müller-Lyer illusion with the two lines: it still seems true even after you see that it is a fallacy.

In the longer version of the Linda test, with more possible Linda vocations listed, fewer people got the correct answer, because people judged each item independently.

In the shorter version of the test, more people got it right because they saw the two choices right together and judged them together.

Notions of coherence, plausibility, and probability are very easily confused.

Things that are familiar to us seem very plausible even if not logical. Also, very detailed and rich descriptions seem more plausible even when they are less probable. Ditto if they make good stories.

System #1 averages things in order to judge them; system #2 (and economic logic) adds things up. If we are judging the value of one thing on its own, we use system #1 and sense the average value of its pieces. When we compare two things side by side to decide which is more valuable, we tend to add up the pieces and see which item has more.

For example: baseball cards. We have two sets of cards; set A has 10 high-value cards, and set B has the same 10 high-value cards plus 3 low-value cards. Show A and B together and B wins. Show each set separately and A wins. From the economic point of view, B is obviously more valuable, since it contains everything in A and more.
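A quick sketch of the two evaluation modes, using hypothetical card values (10 for the high-value cards, 1 for the low-value ones):

```python
set_a = [10] * 10          # 10 high-value cards, value 10 each
set_b = set_a + [1, 1, 1]  # the same cards plus 3 low-value cards

# Joint evaluation (system #2) adds: B clearly contains more value.
print(sum(set_a), sum(set_b))  # -> 100 103

# Single evaluation (system #1) averages: A "feels" better on its own.
def avg(cards):
    return sum(cards) / len(cards)

print(avg(set_a) > avg(set_b))  # -> True
```

The sum says B wins; the average says A wins. Which answer you get depends on which question your mind quietly answers.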

The sum-like nature of things is less obvious for probability than for money. Joint evaluation eliminates the error in the card problem, but not in the Linda problem.

Failing to understand that adding one probability to another results in a larger probability (the whole set is at least as probable as any of its subsets) is a common error.

The conjunction fallacy is due to a misinterpretation of probability: assuming that several specific conditions together are more probable than a single general one, i.e., thinking it is more probable that Linda is a feminist bank teller than that she is a bank teller. (Actually it is more probable that she is a bank teller, because bank tellers are the larger set.)


This article is still in progress.  It will be finished soon.

Buoys and nets on a seashore shack, near Bath, Maine. Picture by Hanselmann Photography.
