We have two different systems of thinking -- automatic (fast) and considered (slow). An example of automatic thinking (System 1) is when we hear a loud noise and we immediately turn towards it. An example of considered thinking (System 2) is when we try to find someone specific in the crowd.
Answer this question: A bat and ball cost $1.10. The bat costs one dollar more than the ball. How much does the ball cost?
Most people's immediate answer is $0.10, but the actual answer is $0.05: if the ball cost $0.10, the bat would cost $1.10, and the total would be $1.20. This is an example of the intuitive, impulsive System 1 taking over while the more careful System 2 stays disengaged.
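The arithmetic behind the correct answer can be checked directly. A quick sketch in Python (working in cents to avoid floating-point rounding):

```python
# If the ball costs x and the bat costs x + 100 (in cents),
# then x + (x + 100) = 110, so x = (110 - 100) / 2 = 5 cents.
total, difference = 110, 100   # $1.10 total; bat costs $1.00 more
ball = (total - difference) // 2
bat = ball + difference

assert bat - ball == difference
assert ball + bat == total
print(ball)  # 5 cents, i.e. $0.05
```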
The law of least effort states that we tend to use the minimum amount of work needed to achieve a task. In the bat-and-ball problem, our automatic system was misled into thinking it could solve the problem on its own. This is an example of our inherent mental laziness. We should make an effort to overcome this laziness, as doing so helps us avoid similar errors. Research also suggests that exercising System 2 leads to higher intelligence.
Priming is when exposure to one stimulus affects our later thoughts or actions. For example, consider the word SHOWER, and now complete the word SO_P. You probably thought of SOAP, not SOUP. Had you seen the word FOOD before this exercise, you probably would have answered SOUP.
One study showed that participants primed with ideas associated with the elderly (e.g., Florida, wrinkles) actually walked more slowly afterward. This shows that priming affects our actions as well as our thoughts.
Priming is often a subconscious process. This suggests that we are not always fully in control of our thoughts and actions; we are perpetually being primed by our environment. Priming has implications for how our society and culture are shaped.
Exaggerated emotional coherence, also known as the halo effect, is when we inflate a limited impression of something or someone into a broad judgment, making quick decisions that are prone to error. For example, you meet Bob at a party and find him easy to talk to. Later, someone asks you if you know who'd like to contribute to a charity, and you immediately think of Bob. This is the halo effect at work, because you don't know much about Bob beyond the fact that he's affable.
Confirmation bias is our tendency to more readily accept information that conforms to our previously held beliefs.
Priming, the halo effect, and confirmation bias are all cognitive processes we use unconsciously to reduce mental effort, and all of them can lead to errors.
Heuristics are mental shortcuts we take to make decisions easier. These are helpful but sometimes overused to our detriment.
The substitution heuristic is when we replace a harder question with an easier one. For example, if you were shown a picture of a basketball player and someone asked you if you think he's good, you might quickly replace the question with "Does he look like a good basketball player?"
The availability heuristic is when we overestimate the probability of events that are easier to recall or that we hear about more often. For example, a study found that 80% of people think an accidental death is more likely than dying from a stroke, even though strokes actually cause many more deaths than accidents do. This is because people hear about accidents in the media far more than they hear about strokes.
Base-rate neglect is a mental bias where we disregard the established statistical facts (the base rate) and judge by recent or vivid impressions instead. For example, if we know 80% of taxis are yellow and 20% are red, and we see five red taxis pass by us, we will be tempted to guess that the next taxi will also be red if we don't keep the base rate in mind.
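Because each taxi's colour is an independent draw, a streak of red taxis does not change the odds for the next one. A small simulation (a sketch using the 80/20 numbers from the example above) makes the point:

```python
import random

random.seed(1)
BASE_RATE_RED = 0.20  # 20% of taxis are red, 80% yellow

def next_taxi():
    """Each taxi is an independent draw from the same base rate."""
    return "red" if random.random() < BASE_RATE_RED else "yellow"

# No matter what the previous five taxis looked like, each new draw
# still comes up red about 20% of the time.
draws = [next_taxi() for _ in range(100_000)]
print(draws.count("red") / len(draws))  # close to 0.20
```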
Regression to the mean says that individual outcomes fluctuate, but results tend to drift back toward the average over time. For example, if you flip a coin five times and get all heads, that does not mean the probability is no longer 50% heads and 50% tails. Flip the coin enough times and the proportion will regress toward the 50/50 average.
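A short simulation (a sketch using only the standard library) shows how the proportion of heads can swing wildly in a handful of flips yet settles near 50% as the number of flips grows:

```python
import random

random.seed(0)

def heads_fraction(n):
    """Proportion of heads in n fair coin flips."""
    return sum(random.random() < 0.5 for _ in range(n)) / n

# Small samples swing widely; large samples settle near the 0.5 average.
for n in (5, 100, 10_000, 1_000_000):
    print(n, round(heads_fraction(n), 3))
```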
We record our experiences in two different ways, through two selves -- the experiencing self and the remembering self.
The experiencing self records how we feel during the moment; the remembering self records how we feel after the event. The experiencing self is the more accurate of the two -- the remembering self is distorted by duration neglect and the peak-end rule. Yet most of our memories are recalled through the remembering self.
Duration neglect refers to how we tend to disregard how long an event lasted when judging it, overemphasizing certain moments instead. The peak-end rule refers to how we tend to judge an event mostly by its most intense moment and by how it ended.
An experiment followed two groups of patients undergoing colonoscopies. One group's procedure was longer than necessary; the other group's procedure was short but with the most pain experienced at the end. When asked about the pain during the procedure, the groups gave accurate answers -- the group with the longer procedure felt worse. But when asked about the pain after the procedure was over, the group with the shorter procedure remembered it as worse, a clear example of duration neglect and the peak-end rule at work.
When we are in a state of cognitive ease, we're using the intuitive System 1. We expend little mental energy, we're more relaxed and creative, but we're more prone to mistakes.
When we are in a state of cognitive strain, we're using the considered System 2. We use more mental energy, we heighten our awareness, and we're less prone to mistakes.
We can influence what state of mind we're in to better perform certain tasks.
For example, to be more persuasive and natural at delivering a message, repeat the message to yourself as often as you can. This will put you in a state of cognitive ease, because we've evolved to react positively to what is familiar.
To enter a state of cognitive strain and better study a mathematical problem, try setting the text in a less legible font. The extra effort of reading pushes the mind into strain, increasing mental energy and helping us focus.
The way statistics are presented to us affects our judgments even though the facts remain the same. For example, in one experiment, one group was told a psychiatric hospital patient had a 10% probability of committing an act of violence, and a second group was told that of every 100 similar patients, 10 are estimated to commit an act of violence. Even though the probability is the same in both statements, almost twice as many respondents were against the patient's discharge in the second group.
Denominator neglect is when we favor memorable images over relevant statistics. For example, the statement "One of 100,000 children who take this drug will be permanently disfigured" gives a more negative impression of the drug than the statement "The drug protects children from this disease but has a 0.001% chance of permanent disfigurement," even though the two describe the same risk.
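That the two statements describe the same risk is easy to verify with exact fractions:

```python
from fractions import Fraction

# "1 in 100,000" and "0.001%" are the same probability.
one_in_100k = Fraction(1, 100_000)
as_percent = Fraction("0.001") / 100   # 0.001% written as a fraction

print(one_in_100k == as_percent)  # True
```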
Utility theory suggests that we make decisions based on rational facts and choose the option with the best outcome for us, thus maximizing utility. The author challenges utility theory by proposing prospect theory, which suggests that our decisions are usually influenced by emotions.
Loss aversion refers to our tendency to weigh losses more heavily than gains. For example, an experiment showed that we would rather gain $1,000 than gain $2,000 and then lose $1,000, even though, if we were rational, we should feel the same about both scenarios. This experiment showed that reference points matter: starting from $1,000 versus $2,000 affected the decision even though the end result was the same.
The principle of diminishing sensitivity says that the subjective impact of a change shrinks the further we are from our reference point, so perceived value differs from actual worth. For example, losing $100 after you won $1,000 won't feel as bad as losing $100 after you won $200.
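Both effects can be illustrated with the value function Kahneman and Tversky estimated empirically (exponent 0.88, loss-aversion coefficient 2.25, from their 1992 paper). The sketch below reuses the dollar amounts from the examples above:

```python
ALPHA, LAMBDA = 0.88, 2.25  # estimates from Tversky & Kahneman (1992)

def value(x):
    """Subjective value of a gain or loss x, measured from the reference point."""
    if x >= 0:
        return x ** ALPHA                 # concave curve for gains
    return -LAMBDA * (-x) ** ALPHA        # steeper curve for losses

# Loss aversion: a $100 loss hurts more than a $100 gain pleases.
print(abs(value(-100)) > value(100))  # True

# Diminishing sensitivity: the step from $900 to $1,000 adds less felt
# value than the step from $100 to $200, though both are $100 gains.
print(value(1000) - value(900) < value(200) - value(100))  # True
```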
We create general images in our mind to simplify problems; this is called cognitive coherence. For example, we might hold an image of a bright sun and hot weather for the summer season, and we might regrettably wear a T-shirt outside even when the forecast predicts cooler weather, because we tend to be overconfident in our mental images.
To counter overreliance on our mental images, we can apply reference class forecasting, which means consulting historical examples, e.g., recalling what happened the last time the forecast predicted a cooler summer day.
Having a long-term risk policy means planning for multiple scenarios. This, too, helps mitigate judgment mistakes, e.g., bringing a sweater just in case it's too cold for a T-shirt.