david mcraney 2013
9781101621783
“bias is a tendency to think in one way when other options are just as good, if not better. For instance, if you tend to take a right turn every time you walk into a grocery store when turning to the left would be no better, you have a right-turn bias in your own behavior. Most people are biased in this way, and most large chain stores develop displays and lay out their interiors with this in mind. Most cognitive biases are completely natural and unlearned. They can be teased out of every person with a functioning brain. So, no matter if you were born in Egypt or Alabama, in 1902 or 2002, you still have the same collection of inherited cognitive biases every other human must deal with. Scientists speculate that most biases are adaptive, which just means that over millions of years they served as dependable fallback positions when you were unsure how to act or feel. For instance, you have a hindsight bias that makes you believe your predictions about the future are usually accurate because you falsely assume you’ve been able to predict the outcome of events all your life. The truth, however, is that you are terrible at making predictions but great at rewriting your memories to make it seem as if you were right all along. You also suffer from a confirmation bias that causes you to seek out information that confirms your worldview while avoiding and ignoring threatening information. Over time, this creates a bubble in which it seems there is a monumental amount of consensus for your beliefs.”
confirmation bias
you seek out information that confirms your worldview while avoiding and ignoring threatening information. Over time, this creates a bubble in which it seems there is a monumental amount of consensus for your beliefs.
narrative bias
the misconception: you make sense of life through rational contemplation.
the truth: you make sense of life through narrative.
conjunction fallacy
your narrative bias is bolstered when you are presented with an abundance of information. the more info you get about a statement, the more likely you are to believe that statement.
feminist bank teller
the researchers assumed the subjects were answering the question as asked. this assumption was wrong. the subjects were instead answering “which is more likely: that she is a feminist and a bank teller, or that she is a feminist and not a bank teller?”
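The arithmetic behind the Linda problem can be made concrete with a toy sketch. The probabilities below are invented for illustration; only the inequality matters: for any two attributes, the conjunction can never be more probable than either attribute alone.

```python
# Toy illustration of the conjunction fallacy (all numbers are invented).
# For any events A and B, P(A and B) can never exceed P(B) alone.

p_feminist = 0.8           # assumed chance Linda is a feminist
p_teller_given_fem = 0.05  # assumed chance she's a bank teller, if a feminist
p_teller_given_not = 0.05  # assumed chance she's a bank teller otherwise

p_teller = (p_feminist * p_teller_given_fem
            + (1 - p_feminist) * p_teller_given_not)   # = 0.05
p_fem_and_teller = p_feminist * p_teller_given_fem     # = 0.04

# The conjunction is less likely than the single attribute,
# no matter how "representative" the conjunction feels.
assert p_fem_and_teller <= p_teller
print(p_teller, p_fem_and_teller)
```

However you shuffle the assumed numbers, "feminist bank teller" stays at or below "bank teller", which is exactly what the subjects' intuition got backwards.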
representativeness heuristic
tendency to ignore odds and instead judge the likelihood of something based on how similar an example is to an imagined archetype.
narrative psychology
you do not use logic and careful analysis to unravel the mysteries of who you are and what you want. You do not hypothesize and test. You don’t study, record, and contemplate the variables of life and the people you meet along the way. Objectivity and rationality find it difficult to thrive in your intellectual ecosystem. You perceive time as a path from the past to the present with all the events of your life in between. You imagine life begins in one place and ends in another, and there are obstacles and climaxes along the way. You need a narrator in your head to make sense of the buzz generated by your giant network of neurons. You search for causes and effects that will explain the world in such a way that benefits your self-image. Over your bloodline’s long history, the narrative evolved as the best method by which to pass meaning from one person to another, and it remains true inside this sentence.
Narratives are meaning transmitters. They are history-preservation devices. They create and maintain cultures, and they forge identities that emerge out of the malleable, imperfect memories of life events. It makes sense, then, that every aspect of humanity concerned with meaning, with cause and effect, will lean heavily on narratives
hindsight bias
since you rarely record your predictions, you rarely notice how wrong they tend to be. as a result, you tend to trust your current predictions far more than you should
post hoc rationalization
explanation after the fact that makes enough sense to you that you can move on and not get stalled second-guessing your own motivation
“without episodic memories, there is no narrative; and without any narrative, there is no self”
is it because I have no episodic memory, and therefore no personal narrative, that I have little sense of self?
eagleman incognito: he speaks of the conscious part of the brain as only a tiny portion of the whole, and when ideas arrive in the mind, the conscious portion tends to take credit for something that was growing without its knowledge in the ocean of the unconscious, maybe for years. you don’t have access to the truth of what has happened, but that doesn’t stop you from coming up with a story to explain it. in that story, you mistake awareness for creation. in reality, the part of you that is aware is not the sole proprietor of your brain. to paraphrase psychologist george miller, you don’t experience thinking; you experience the result of thinking.
narratives that keep you bound together are nearly impervious to direct attack
Truth and accuracy usually lose when pitted against a riveting account—even when that account is coming from inside your noggin. …Whenever things start to get just a little difficult to understand, you replace that anxiety with false understanding in story form.
“Logical fallacies appear during arguments with yourself and others. You often begin with a conclusion already in mind and then work toward proving that you were not stupid to have drawn that conclusion in the first place. This sort of motivated reasoning often depends on warping logic to make things work out in your head. For instance, you might say hot dogs are a disgusting manufactured food product, and you can’t believe your cousin is serving them to his children, because no child should be forced to eat gross food. You’ve just committed a fallacy because your assumption was in your original statement: hot dogs are nasty. You’ve proved nothing. Your argument didn’t make the case about the nastiness of digestible casings filled with beef trimmings and fat. You’ve only stated what you believe and then said that what you believe informs your opinions. You can untangle this fallacy by rewording it like so: Kids shouldn’t be forced to eat food I believe is gross. You get confused in your own logic all the time and end up twisting language to make the world line up with your preconceived notions.”
common belief fallacy
the misconception: the larger the consensus, the more likely it is correct.
the truth: a belief is not more likely to be accurate just because many people share it.
most of what we presume to be evidence of our intelligence is just part of a vast cultural inheritance
scientific method
looking for disconfirming evidence was a better way to conduct research than proceeding from common belief
null hypothesis
seek out evidence to the contrary to see how it matches up with your assumption
Corporations and other institutions rarely set aside a division tasked with paying attention to the faults of the agency
the benjamin franklin effect
the misconception: you do nice things for the people you like and bad things to the people you hate.
the truth: you grow to like people for whom you do nice things and hate people you harm.
attitude
the bundle of beliefs and feelings you experience toward a person, topic, idea, etc., without having to form concrete thoughts.
Franklin sent a letter to the hater asking if he could borrow a specific selection from his library, one that was a “very scarce and curious book.” The rival, flattered, sent it right away. Franklin sent it back a week later with a thank-you note. Mission accomplished. The next time the legislature met, the man approached Franklin and spoke to him in person for the first time. Franklin said the man “ever after manifested a readiness to serve me on all occasions, so that we became great friends, and our friendship continued to his death.”
kurt vonnegut: “we are what we pretend to be, so we must be careful about what we pretend to be.” when you become a member of a group, or the fan of a genre, or the user of a product—those things have more influence on your attitudes than your attitudes have on them
self-perception theory
your attitudes are shaped by observing your own behavior, being unable to pinpoint the cause, and trying to make sense of it. you look back on a situation as if part of an audience, trying to understand your own motivations. you act as observer of your actions, a witness to your thoughts, and you form beliefs about your self based on those observations.
memories divided into declarative, or accessible to the conscious mind, and nondeclarative, that which you store unconsciously. you intuitively understand how declarative memories shape, direct, and inform you … nondeclarative memories are just as powerful. you can’t access them, but they pulsate through your nervous system. your posture, the temperature of the room, the way the muscles of your face tense—these things inform your perception of who you are and what you think. drawing near is positive; pushing away is negative. self-perception theory shows that you unconsciously observe your own actions and then explain them in a pleasing way without ever realizing it. benjamin franklin’s enemy observed himself performing a generous and positive act by offering the treasured tome to his rival, and then he unconsciously explained his own behavior to himself. he must not have hated franklin after all, he thought; why else would he have done something like that?
we associate flexing (pulling) with positive experiences and extension (pushing) with negative
cognitive dissonance
Sometimes you can’t find a logical, moral, or socially acceptable explanation for your actions. Sometimes your behavior runs counter to the expectations of your culture, your social group, your family, or even the person you believe yourself to be. In those moments, you ask, “Why did I do that?” and if the answer damages your self-esteem, a justification is required. You feel as if a bag of sand has ruptured in your head, filling all the nooks and crannies of your brain, and you want relief. You can see the proof in an MRI scan of someone presented with political opinions that conflict with her own. The brain scans of a person shown statements that oppose her political stance show that the highest areas of the cortex, the portions responsible for providing rational thought, get less blood until another statement is presented that confirms her beliefs. Your brain literally begins to shut down when you feel your ideology is threatened
Festinger: you make “your view of the world fit with how you feel or what you’ve done.” When you feel anxiety over your actions, you will seek to lower the anxiety by creating a fantasy world in which your anxiety can’t exist, and then you come to believe the fantasy is reality, just as Benjamin Franklin’s rival did. He couldn’t possibly have lent a rare book to a guy he didn’t like, so he must actually like him. Problem solved.
You tend to like the people to whom you are kind and to dislike the people to whom you are rude … behaviors create attitudes when harming just as they do when helping.
franklin: “This is another instance of the truth of an old maxim I had learned, which says, ‘He that has once done you a kindness will be more ready to do you another, than he whom you yourself have obliged.’ And it shows how much more profitable it is prudently to remove, than to resent, return, and continue inimical proceedings.”
the post hoc fallacy
the misconception: you notice when effect doesn’t follow cause.
the truth: you find it especially difficult to believe a sequence of events means nothing.
post hoc ergo propter hoc: “after this therefore because of this.”
the halo effect
the misconception: you objectively appraise the individual attributes of other people.
the truth: you judge specific qualities of others based on your global evaluation of their character and appearance.
affect heuristic
when you make decisions and kindle beliefs based on innate sensations, psychologists say you are using the affect heuristic. An affect, in psychological terms, is a feeling that needs no further analysis. It isn’t a coherent thought with words and symbols attached, but rather, a raw emotional state, a twinge or a jolt or just a general sensation that sets a tone or a mood. You can glide through an affect, or slowly submerge into one, and sometimes an affect slaps you like a fish to the forehead. When it comes to keeping you alive over the long run, the affect heuristic serves you well. It keeps you out of unfamiliar and seemingly dangerous situations and it tells you to avoid weirdos and creeps. It’s a blunt tool, though, and very inaccurate from one specific situation to the next.
halo effect makes you believe taller people are more desirable for things in which height doesn’t necessarily matter
The halo effect causes one trait about a person to color your attitude and perceptions of all her other traits. Even stranger, the more noticeable the aspect is when you form your first impression, the more difficult it becomes to change your attitude about that aspect. So, for example, if you are bowled over by the warmth and kindness of a coworker in your first week at a new job, you’ll let him get away with a host of obnoxious behaviors later on, maybe even for years. If the first year of a relationship is stellar and life-altering, it can take a long time to notice if things turn sour later. If you like specific aspects of an individual, the halo effect causes the positive appraisal to spread to other measurements and to resist attack. Beautiful people seem more intelligent, strong people seem nobler, friendly people seem more trustworthy, and so on. When they fall short, you forgive and defend them, sometimes unconsciously.
sappho: “what is beautiful is good, and who is good will soon be beautiful.”
affect heuristic unconsciously tells you to seek or avoid that which is considered beautiful in your culture and era, and you then follow that response with a rationalization as to why you’ve been struck by those feelings, whether or not you truly know the source.
beautiful people don’t just have the advantage of beauty, but you treat them as if they have a host of other presumed advantages that compound that advantage … after years of walking through life receiving treatment as though they possess the personality traits we like to see in others, beautiful people tend to believe and act as though they truly possess those attributes. Pretty people believe they are kind, smart, decent, and whatever else the halo effect produces in the eyes of their audience—whether or not those things are true.
landy and sigall: you expect more from pretty people well before you know anything else about them, and when they fall short of your expectations, you give them more of a chance to prove themselves
Realize those closest get the benefit of the doubt and so do the most beautiful and radiant among us. Know the halo effect causes you to see a nice person as temporarily angry and an angry person as temporarily nice. Know that one good quality, or a memory of several, can keep in your life people who may be doing you more harm than good … people in your life possess or lack virtues colored by the radiance or gloom of the halo you create for them early on.
ego depletion
the misconception: willpower is just a metaphor.
the truth: willpower is a finite resource.
cookie consumption based on social acceptance — impaired self-regulation
not just restraint in the face of desire that could deplete your ego, but any choice at all.
focused concentration made people less eager to make active choices later.
process model of ego depletion
although you have the glucose to spend, your brain becomes frugal after mental exertion and dampens your motivation. reward cues become more salient in the environment, and tasks requiring self-control become less attractive. if at that moment the brain becomes highly motivated, it will happily use its available glucose
may be a mechanism to avoid overspecialisation or using too many resources on one path
only way to avoid this state of mind is to predict what might cause it in your daily life and to avoid those things when you need the most volition
misattribution of arousal
the misconception: you always know why you feel the way you feel.
the truth: you can experience emotional states without knowing why, even if you believe you can pinpoint the source.
the illusion of external agency
the misconception: you always know when you are making the best of things.
the truth: you often incorrectly give credit to outside forces for providing your optimism.
agency
notion that there is a thinking being behind an event
affective forecasting
predicting the future of your emotions
impact bias
overestimate the impact of both good and bad outcomes on your emotional hereafter.
impact bias contaminates your affective forecasting, making that forecasting much less accurate.
hedonic treadmill
regression to the mean
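Regression to the mean, the statistical half of this pairing, can be demonstrated with a small simulation. The model and its numbers are invented for illustration: an observed score is stable skill plus random luck, and re-testing the top scorers pulls their average back toward the population mean with no cause involved at all.

```python
# Regression to the mean, sketched with a toy model (all numbers invented):
# observed score = stable skill + random luck. Re-test the top scorers and
# their average falls back toward the population mean, for no reason at all.
import random

random.seed(0)
skills = [random.gauss(100, 10) for _ in range(10_000)]
test1 = [s + random.gauss(0, 10) for s in skills]

# pick the indices of the top 5% on the first test
top = sorted(range(len(test1)), key=lambda i: test1[i], reverse=True)[:500]
test2_top = [skills[i] + random.gauss(0, 10) for i in top]

mean_t1 = sum(test1[i] for i in top) / len(top)
mean_t2 = sum(test2_top) / len(top)
# the second-test average of the top group drops back toward 100
print(round(mean_t1, 1), round(mean_t2, 1))
```

This is why a run of bad luck followed by any intervention at all looks like the intervention worked, and why the post hoc fallacy and the psychological immune system have so much raw material to work with.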
psychological immune system
when you feel damaged by rejection, loss, shame, humiliation, powerlessness, and all the other dark emotional states, or what psychology calls negative affects, your mind has a tremendous capacity for psychological healing
subjective optimization
seeing life as it is as being the best it could be
lemons into lemonade
more likely to subjectively optimize, to make lemonade out of lemons, in an unchangeable situation. The people without a choice were later happy. The people with a choice were later sad. Getting locked into a situation with no hope of escape activates subjective optimization
immune neglect
system works well, in part because it happens unconsciously. The downside is that you find it difficult to predict your own unconscious systems because you do not factor them into your predictions
illusion of external agency
subjective optimization can lead you to believe in invisible forces, forces controlling your happiness and fate
since you are always generating artificial sunshine to brighten up your day, yet you remain unaware of that process, when you reflect on your life and wonder where all the brightness is coming from, you rarely point the finger at your own head. Instead, you point toward whatever invisible forces match up with your beliefs. You see an intangible benevolence that harmonizes with your worldview as the source of your good fortune.
Even though everyone in the study was responsible for how they viewed reality and how happy they were with their fates, when given the option to place responsibility for that fate outside themselves, they took it
Before you buy into any system, before you place trust in any external agent real or invisible, ask yourself if the positive emotions you are feeling come from within or without.
So much of life is impossible to do over, so many situations are impossible to change without harming people important to you, that you learn to change the way you see the world because it is easier than changing the world itself. As the scientists put it, you have a “healthy capacity for generating satisfaction” within the tribulations of an imperfect life.
when you argue whether your friends, family, or acquaintances should have gone down one path or another, you can be disappointed and upset on the outside, but they are not. They always seem pleased—blessed, even—that things worked out the way they did. That’s because they have the same capacity to alter their view of reality that you possess. Your personal historian is usually quite happy with the outcome of your life so far. Sure, it could have gone this way or that, but then you wouldn’t have all the nice things you are thankful for, no matter whom you thank when you count your blessings.
this may be a source of the unwillingness to change, to learn, to grow — and why people who are vulnerable or dissatisfied are more able to change, as they are not content with their situation. but what about people who are open, or children — why do they want to learn?
the backfire effect
the misconception: you alter your opinions and incorporate the new information into your thinking after your beliefs are challenged with facts.
the truth: when your deepest convictions are challenged by contradictory evidence, your beliefs get stronger.
Once something is added to your collection of beliefs, you protect it from harm. You do this instinctively and unconsciously when confronted with attitude-inconsistent information. Just as confirmation bias shields you when you actively seek information, the backfire effect defends you when the information seeks you, when it blindsides you. Coming or going, you stick to your beliefs instead of questioning them. When someone tries to correct you, tries to dilute your misconceptions, it backfires and strengthens those misconceptions instead. Over time, the backfire effect makes you less skeptical of those things that allow you to continue seeing your beliefs and attitudes as true and proper.
narrative scripts
stories that tell you what you want to hear, stories that confirm your beliefs and give you permission to continue feeling as you already do.
you can never win an argument online. When you start to pull out facts and figures, hyperlinks and quotes, you are actually making the opponent feel even surer of his position than before you started the debate. As he matches your fervor, the same thing happens in your skull. The backfire effect pushes both of you deeper into your original beliefs.
climate scientist John Cook and psychologist Stephan Lewandowsky write in their pamphlet, The Debunking Handbook, “A simple myth is more cognitively attractive than an over-complicated correction.” Multiple lines of research back up this advice. The more difficult it becomes to process a series of statements, the less credit you give them overall. During metacognition, the process of thinking about your own thinking, if you take a step back and notice that one way of looking at an argument is much easier than another, you will tend to prefer the easier way to process information and then leap to the conclusion that it is also more likely to be correct
you spend much more time considering information you disagree with than you do information you accept. Information that lines up with what you already believe passes through the mind like a vapor, but when you come across something that threatens your beliefs, something that conflicts with your preconceived notions of how the world works, you seize up and take notice. Some psychologists speculate there is an evolutionary explanation. Your ancestors paid more attention and spent more time thinking about negative stimuli than positive because bad things required a response. Those who failed to address negative stimuli failed to keep breathing.
when your beliefs are challenged, you pore over the data, picking it apart in search of weaknesses. The cognitive dissonance locks up the gears of your mind until you deal with it. In the process, you form more neural connections, build new memories, and put out effort—once you finally move on, your original convictions are stronger than ever.
people can come around to the truth, but it takes a long time, and you shouldn’t expect one argument or one conversation to make much of an impact.
pluralistic ignorance
the misconception: many of your private beliefs are in disagreement with what most people think.
the truth: on certain issues, the majority of the people believe that the majority of the people in a group believe what, in truth, the minority of the members believe.
belief that the majority is acting in a way that matches its internal philosophies, and that you are one of a small number of people who feel differently, when in reality the majority agrees with you on the inside but is afraid to admit it outright or imply such through its behavior.
much used by abusers, who try to amplify this
was lucky to be able to grow some of my own beliefs without too much influence from others, thanks to my parents’ parenting style and lifestyle — parental neglect, and moving away from places and friends every few years as expats in a world without internet, was sad but had a silver lining: I could conform to behaviours yet remain (relatively) free to question what those behaviours meant, having lived outside them
pluralistic ignorance can also oppress whole nations and cause social change to stagnate for generations.
Pluralistic ignorance keeps people on the fringe, the sort of people who will be phased out by progress, clinging to their outdated beliefs for longer than they should. It keeps their opponents feeling less supported than they truly are while keeping people in the middle favoring the status quo. In the end, a make-believe status quo changes the way everyone acts and thinks. As the sociologists put it, people often “unintentionally serve as cultural carriers of cognitive error.”
Sociologists Damon Centola, Robb Willer, and Michael Macy applied game theory to this concept, plugging data into computer models, and showed that unpopular norms are likely to pop up in just about any human social situation in which there is a palpable fear of retribution. As they pointed out, “an outbreak of enforcement” will cascade through a social movement when it is beneficial for members to act or cease to act out of fear. They point out that the literal witch hunts of the early American colonies and the metaphorical witch hunts of the McCarthy era share similarities with the bizarre irony of closeted homosexual men committing acts motivated by homophobia. Politicians known for their antigay legislation have boggled the minds of their constituents by getting caught in gay sex scandals. You might wonder why someone who is homosexual would work so hard to make it difficult to be homosexual, but Centola, Willer, and Macy say it is pretty simple. One of the common strategies to avoid embarrassment and punishment for disagreeing with a norm is actively to enforce it. People often become norm enforcers to prove their loyalty and head off any suspicion. Whenever a person imagines that violating a norm would result in serious consequences, such as eternal damnation in a pit of fiery torture or the complete social abandonment of friends, family, and his brothers and sisters in the church, that person may opt to become an enforcer of norms instead of just simply complying with them. As the researchers put it, “enforcing norms provides a low-cost way to fake sincerity, to signal that one complies—not as an opportunist seeking approval—but as a true believer.”
conservative bias
most people falsely assume their culture is less progressively tilted than it truly is, and thus the institutions and media of the culture will present themselves as more conservative than necessary. In addition, its programming will consist of content designed to appeal to a public far more prudish than the actual audience consuming it.
perhaps one of the reasons revolutions happen — the status quo exists only in people’s beliefs, and once those beliefs change, society changes, seemingly abruptly for the few who actually held the old “majority” belief
the fix: encourage individuals to speak out and reveal their private thoughts. the researchers, along with Miller and McFarland, advocate support groups and other forms of intimate gatherings in which people are encouraged to open up to others, so the rest of the group can reveal in turn that what everyone thought were their own deviant thoughts or actions or beliefs were not only normal but also the true majority sentiment. In Kitts’s study, the one about the vegetarians, he found that the more aberrant your friends and family, the more deviant the people in your inner circles, the less power pluralistic ignorance has over your perceptions.
the no true scotsman fallacy
the misconception: you honestly define that which you hold dear.
the truth: you will shift your definitions to protect your ideologies
begging the question fallacy
something along the lines of “What proof do you have that your assumption is true? Are there any exceptions?”
looks like an argument seeking to establish facts, but it is actually a statement of belief
statements that assume something to be true without giving that thing a chance to be disproved before linking it to another assumption.
to keep pristine the groups to which you belong, the ideas to which you cling, and the institutions to which you pledge allegiance, you use the no true Scotsman fallacy to ensure that those things are exactly as you expect them to be no matter what sort-of-imperfect instance appears. Simply toss out that instance. You can’t improve the things you love if you never allow them to be imperfect. Thinking in this way, if you looked hard enough so that you saw every flaw in every example, you would soon find that nothing matched your expectations or deserved your definitions, and the membership of every group and category you hold dear would drop to zero.
the illusion of asymmetric insight
the misconception: you celebrate diversity and respect others’ points of view.
the truth: you are driven to create and form groups and then believe others are wrong just because they are others.
You can, you believe, put yourself in their shoes and predict their behavior in just about any situation. You believe every person except you is an open book. Of course, the research shows they believe the same thing about you.
You can’t see internal states of others, so you generally don’t use those states to describe their personalities.
seems that you know everyone else far better than they know you, and not only that, you know them better than they know themselves. You believe the same thing about groups of which you are a member. As a whole, your group understands outsiders better than outsiders understand your group, and you understand the group better than its members know the group to which they belong.
this could be how you arrive at believing your thoughts and perceptions are true, accurate, and correct, therefore if someone sees things differently from you or disagrees with you in some way, it is the result of a bias or an influence or a shortcoming. You often feel the other person must have been tainted in some way, otherwise he would see the world the way you do—the right way. The illusion of asymmetrical insight clouds your ability to see the people you disagree with as nuanced and complex. You tend to see yourself and the groups you belong to in shades of gray, but others and their groups as solid and defined primary colors lacking nuance or complexity.
When you feel the warm comfort of belonging to a team, a tribe, a group—to a party, an ideology, a religion, or a nation—you instinctively turn others into members of out-groups, into outsiders. Just as soldiers come up with derogatory names for enemies, every culture and subculture has a collection of terms for outsiders so as to better see them as a single-minded collective.
They must not understand, because if they did, they wouldn’t think the things they think. By contrast, you believe you totally get their point of view and you reject it. You don’t need to hear them elaborate on it because you already know it better than they do. So each side believes it understands the other side better than the other side understands both its opponents and itself.
solve this by asking questions about their beliefs
corollary is we need to also question our own beliefs, to confront our own ignorance
enclothed cognition
the misconception: clothes as everyday objects are just fabrics for protection and decoration of the body.
the truth: the clothes you wear change your behavior and can either add to or subtract from your mental abilities.
embodied cognition
your physical state is translated into words, and those words initiate a cascade of associations
priming
every idea you experience now unconsciously influences all the ideas you experience later. Those ideas then influence your behavior without your realizing it.
Adam and Galinsky split people into three groups. One group wore white coats, and the scientists told them it was the same coat that doctors wore. Another group wore the same white coats, but the scientists told them they were painters’ smocks. In a third group, the scientists presented a coat described as the sort a medical doctor would wear and asked the subjects to write an essay on what images and ideas a doctor’s coat conjured up when they considered its meaning. This third group then moved on to the next task with the coat lying on a desk nearby. Members of all three groups then looked at photos positioned side by side, each with four minor difficult-to-spot differences. The psychologists measured how long it took the subjects to find the disparities and how many they found in each set. This is where the most interesting aspect of enclothed cognition appeared. The people who wore what they believed was a painter’s smock spotted the fewest differences in the photos. The people who merely thought deeply about a doctor’s coat spotted more, and the people who wore what they assumed was a doctor’s white coat spotted the most. Priming alone was not as powerful as actually putting on the coat, but even more astounding was that the same clothes can affect you more or less powerfully depending on the symbolic meaning you ascribe to those threads.
The white lab coats pinged the semantic nets of each person in the study, but like the Baker/baker experiment, when a person believed the coat was that of a painter, the web of ideas and associations that flooded into his unconscious was completely different from those of a person who believed she was wearing what doctors wear. For most people, the network of concepts that emerge when pondering a doctor’s costume is richer and more powerful than the network surrounding the ideas, schemas, memories, and other assorted concepts related to painters and painting. One symbol is more potent than the other. Priming you to think about doctors is a compelling enough experience to influence you to outperform people who believe they are wearing painters’ smocks, but if you go a step further and slip your arms into what you believe to be a doctor’s coat, feel it against your skin, notice its weight as you move around, the effect is far more powerful. Adam and Galinsky showed that enclothed cognition isn’t just priming. It is something more. They wrote in their study that they speculated the uniforms worn by police, judges, priests, firefighters, soldiers, sports teams, doctors, and so on may do more than communicate the role played by the wearer to other members of society. Those clothes may also strongly alter the wearers’ attitudes and behaviors, making them more courageous or ethical or aggressive or compassionate or attentive.
As if their fibers were enchanted, the things you wear cast a spell over your persona. The trick is that it’s you who is doing the enchanting, and you do it unconsciously.
Symbols, like rituals, are important not because they encourage superstitious behavior or obsolete beliefs, but because they naturally plug into the way your brain works. Everything has a symbolic power, a communicative potential to evoke memories and ideas in your brain like a beehive reacting to a thrown rock.
deindividuation
the misconception: people who riot and loot are scum who were just looking for an excuse to steal and be violent.
the truth: under the right conditions, you are prone to losing your individuality and becoming absorbed into a hive mind.
the sunk cost fallacy
the misconception: you make rational decisions based on the future value of objects, investments, and experiences.
the truth: your decisions are tainted by the emotional investments you accumulate, and the more you invest in something, the harder it becomes to abandon it.
Kahneman says organisms that placed more urgency on avoiding threats than on maximizing opportunities were more likely to pass on their genes. So, over time, the prospect of losses has become a more powerful motivator of your behavior than the promise of gains. Whenever possible, you try to avoid losses of any kind, and when comparing losses to gains you don’t treat them equally. The results of Kahneman and Tversky’s experiments, and of the many others who’ve replicated and expanded on them, have teased out an inborn loss-aversion ratio: when offered a chance to accept or reject a gamble, most people refuse to make a bet unless the possible payoff is around double the potential loss.
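that roughly two-to-one threshold can be sketched as a toy value function (a linear simplification of Kahneman and Tversky's prospect theory; the coefficient of 2.0 and the 50/50 gamble framing are illustrative assumptions, not their exact model):

```python
def subjective_value(x, loss_aversion=2.0):
    """Perceived value of gaining or losing x dollars: losses loom
    about twice as large as equivalent gains (linear simplification)."""
    return x if x >= 0 else loss_aversion * x

def accept_gamble(win, lose, p_win=0.5):
    """Take the bet only if its expected subjective value is positive."""
    expected = p_win * subjective_value(win) + (1 - p_win) * subjective_value(-lose)
    return expected > 0

# a coin flip to win $100 or lose $100 feels like a losing proposition:
print(accept_gamble(win=100, lose=100))   # False
# the payoff must be roughly double the potential loss before it feels fair:
print(accept_gamble(win=201, lose=100))   # True
```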
Behavioral economist Dan Ariely adds a fascinating twist to loss aversion in his book Predictably Irrational. He writes that when factoring the costs of any exchange, you tend to focus more on what you may lose in the bargain than on what you stand to gain. The “pain of paying,” as he puts it, arises whenever you must give up anything you own. The precise amount doesn’t matter at first. You’ll feel the pain no matter what price you must pay, and it will influence your decisions and behaviors.
When anything is offered free of charge, Ariely believes your loss aversion system remains inactive. Without it, you don’t weigh the pros and cons with as much attention to detail as you would if you had to factor in potential losses.
Kahneman and Tversky also conducted an experiment to demonstrate the sunk cost fallacy. See how you do with this one. Imagine you go see a movie that costs ten dollars for a ticket. When you open your wallet or purse you realize you’ve lost a ten-dollar bill. Would you still buy a ticket? You probably would. Only 12 percent of subjects said they wouldn’t. Now imagine you go to see the movie and pay ten dollars for a ticket, but right before you hand it over you realize you’ve lost the ticket. Would you go back and buy another one? Maybe, but it would hurt a lot more. In the experiment, 54 percent of people said they would not. The situation is exactly the same—you lose ten dollars and then must pay ten dollars to see the movie—but the second scenario feels different. It seems that the money was assigned to a specific purpose and then lost, and loss sucks.
abusers use this too
what is the relation to persistence?
the overjustification effect
the misconception: there is nothing better in the world than getting paid to do what you love.
the truth: getting paid for doing what you already enjoy will sometimes cause your love for the task to wane because you attribute your motivation as coming from the reward, not your internal feelings.
the self-enhancement bias
the misconception: you set attainable goals based on a realistic evaluation of your strengths and weaknesses.
the truth: you protect unrealistic attitudes about your abilities in order to stay sane and avoid despair.
illusory superiority bias, an unrealistically positive view of yourself; the illusion of control, the belief that you have command over the chaos that awaits you every day; and optimism bias, “the odds will always be in your favour”
Confirmation bias
tendency to notice and remember when information and events match your expectations and confirm your beliefs, but to ignore and forget when the world challenges your preconceived notions.
If you never look for disconfirmation of your beliefs, especially the ones that make you feel special and above average, you can proceed unchallenged and deluded.
hindsight bias
presented with new information or an outcome you could not possibly have foretold, you have a tendency to look back on your memories and assume that, back before you knew what was going to happen, you accurately predicted what has just now unfolded.
self-serving bias
provides you with credit for all the things in life that worked out in your favor, and it absolves you of responsibility for those times you fell short. The self-serving bias makes it difficult for you to acknowledge the help of others, or luck, or an unfair advantage. It isn’t a malicious defect of your personality; it’s just your brain’s way of framing things so that you don’t stop moving forward.
realistic expectations (as opposed to positive expectations) are denigrated in the book, under the assumption that anyone who was realistic would be worse-equipped to compete or survive. they mistake delusion for motivation. but these are speculations, not fact: computer simulations of single aspects of life should not be used to draw broad conclusions about the worth of people’s outlooks. they also assume that competitive situations are the only arena for adaptive advantage, that this holds at all times for the same individual, and that situations are symmetric: that very bad situations are compensated for by very good ones, and that the advantages from many good situations are not destroyed by the disadvantages from a few very bad ones. I want to know about the ~20% of the population they say is not susceptible to this bias.
the evolution of overconfidence
dominic johnson, james fowler 2011
http://dx.doi.org/10.1038/nature10384
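the kind of evolutionary argument Johnson and Fowler make can be sketched as a toy payoff simulation (the parameter values, the uniform capability and noise distributions, and the claim rule below are my own simplifications, not the paper's exact model):

```python
import random

def fitness(bias, r=2.0, c=1.0, noise=0.5, rounds=20000, seed=0):
    """Average payoff for an agent whose self-assessment is inflated by
    `bias`, facing unbiased rivals. If both sides claim a contested
    resource they fight: the truly stronger side takes the prize r,
    and both pay the conflict cost c."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(rounds):
        me, other = rng.random(), rng.random()        # true capabilities
        # each side sizes up its rival with error; I also inflate myself
        i_claim = me + bias > other + rng.uniform(-noise, noise)
        they_claim = other > me + rng.uniform(-noise, noise)
        if i_claim and they_claim:                    # fight
            total += (r if me > other else 0.0) - c
        elif i_claim:                                 # unopposed claim
            total += r
    return total / rounds

# when the resource is worth enough relative to the cost of conflict,
# a moderately inflated self-assessment out-earns an accurate one:
print(fitness(0.3) > fitness(0.0))   # True
```

this is the pattern the paper reports: overconfidence pays whenever the contested resource is sufficiently valuable compared with the cost of fighting over it, because the biased agent claims marginal resources a realist would concede.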
forming beliefs: why valence matters
tali sharot, neil garrett 2015
http://dx.doi.org/10.1016/j.tics.2015.11.002
“One of the most salient attributes of information is valence: whether a piece of news is good or bad. Contrary to classic learning theories, which implicitly assume beliefs are adjusted similarly regardless of valence, we review evidence suggesting that different rules and mechanisms underlie learning from desirable and undesirable information. For self-relevant beliefs this asymmetry generates a positive bias, with significant implications for individuals and society. We discuss the boundaries of this asymmetry, characterize the neural system supporting it, and describe how changes in this circuit are related to individual differences in behavior.”
“Humans update self-relevant beliefs to a greater extent in response to good news than bad news. This asymmetry is mediated by differential neural representation of desirable and undesirable estimation errors. The extent of this asymmetry varies with age and mental health.”
“desirable information is integrated into prior beliefs more readily than undesirable information, resulting in positively biased beliefs. The difference is robust and is observed in approximately 80% of the population, regardless of country, and gender” I want to know about the ~20% of the population they say is not susceptible to this, and I don't mean the depressive people — what makes these people unsusceptible?
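the asymmetric updating Sharot and Garrett describe can be sketched as two learning rates, one for desirable and one for undesirable estimation errors (the rates 0.75/0.25 and the risk numbers below are illustrative assumptions, not values from the paper):

```python
def update_belief(belief, evidence, lr_good=0.75, lr_bad=0.25):
    """Move a self-relevant risk estimate toward new evidence, shifting
    further when the news is desirable (risk lower than feared) than
    when it is undesirable (risk higher than feared)."""
    error = evidence - belief
    lr = lr_good if error < 0 else lr_bad   # lower risk estimate = good news
    return belief + lr * error

# a person who estimates their risk of some misfortune at 40%:
print(update_belief(40, 10))   # good news, large update: 17.5
print(update_belief(40, 70))   # bad news, small update: 47.5
```

iterated over a mixed stream of good and bad news, this rule settles on an estimate below the evidence’s true average, which is the positively biased belief the abstract describes.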
how unrealistic optimism is maintained in the face of reality
tali sharot et al 2011
http://dx.doi.org/10.1038/nn.2949
papers
Alicke, Mark D. “Global Self-evaluation as Determined by the Desirability and Controllability of Trait Adjectives.” Journal of Personality and Social Psychology 49, no. 6 (1985): 1621–30.
Alicke, Mark D., and Olesya Govorun. “The Better-Than-Average Effect.” In The Self in Social Judgment. Ed. by Mark D. Alicke, David Dunning, and Joachim I. Krueger. New York: Psychology Press, 2005, pp. 85–108.
Bushman, Brad J., Scott J. Moeller, and Jennifer Crocker. “Sweets, Sex, or Self-esteem? Comparing the Value of Self-Esteem Boosts with Other Pleasant Rewards.” Journal of Personality 79, no. 5 (2011): 993–1012.
Bushman, Brad J., Scott J. Moeller, Sara Konrath, and Jennifer Crocker. “Investigating the Link Between Liking Versus Wanting Self-Esteem and Depression in a Nationally Representative Sample of American Adults.” Journal of Personality (2012). Web: July 2012.
Campbell, W. Keith, and Constantine Sedikides. “Self-threat Magnifies the Self-serving Bias: A Meta-analytic Integration.” Review of General Psychology 3, no. 1 (1999): 23–43.
Colvin, C. Randall, Jack Block, and David C. Funder. “Overly Positive Self-evaluations and Personality: Negative Implications for Mental Health.” Journal of Personality and Social Psychology 68, no. 6 (1995): 1152–62.
Cummins, Robert A., and Helen Nistico. “Maintaining Life Satisfaction: The Role of Positive Cognitive Bias.” Journal of Happiness Studies 3, no. 1 (2002): 37–69.
Dahl, Melissa. “Most of Us Think We’re Hotter Than Average, Survey Says; 60 Percent Satisfied with Appearance in a New Survey from Msnbc.com, ELLE.” MSNBC, Sept. 8, 2010. Web: July 2012, www.msnbc.msn.com/id/39044399/ns/health-skin_and_beauty/t/most-us-think-were-hotter-average-survey-says/.
Dell’Amore, Christine. “Evolution of Narcissism: Why We’re Overconfident, and Why It Works; Overestimating Our Abilities Can Be a Strategy for Success, Model Shows.” National Geographic News, Sept. 14, 2011. Web: July 2012, news.nationalgeographic.com/.
Diekmann, Kristina A., Steven M. Samuels, Lee Ross, and Max H. Bazerman. “Self-interest and Fairness in Problems of Resource Allocation: Allocators Versus Recipients.” Journal of Personality and Social Psychology 72, no. 5 (1997): 1061–74.
Dobson, Keith, and Renée-Louise Franche. “A Conceptual and Empirical Review of the Depressive Realism Hypothesis.” Canadian Journal of Behavioural Science 21, no. 4 (1989): 419–33.
Dunning, David, Judith A. Meyerowitz, and Amy D. Holzberg. “Ambiguity and Self-evaluation: The Role of Idiosyncratic Trait Definitions in Self-serving Assessments of Ability.” Journal of Personality and Social Psychology 57, no. 6 (1989): 1082–90.
Gray, Janice D., and Roxane C. Silver. “Opposite Sides of the Same Coin: Former Spouses’ Divergent Perspectives in Coping with Their Divorce.” Journal of Personality and Social Psychology 59, no. 6 (1990): 1180–91.
Grove, J. R., S. J. Hanrahan, and A. McInman. “Success/Failure Bias in Attributions Across Involvement Categories in Sport.” Personality and Social Psychology Bulletin 17, no. 1 (1991): 93–97.
Headey, Bruce, and Alex Wearing. “The Sense of Relative Superiority: Central to Well-being.” Social Indicators Research 20, no. 5 (1988): 497–516.
Hoorens, Vera. “Self-enhancement and Superiority Biases in Social Comparison.” European Review of Social Psychology 4, no. 1 (1993): 113–39.
Jalonick, Mary Clare. “Americans Are Still Getting Fatter: Obesity Rates Increased in 28 States, Report Finds.” MSNBC, June 29, 2010. Web: July 2012, www.msnbc.msn.com/id/37996593/ns/health-diet_and_nutrition/t/americans-are-still-getting-fatter/#.UP7bkCfBdBE.
Johnson, Dominic. “Overconfidence in Tanks and Banks.” Blog post, Dominic DP Johnson, Aug. 14, 2009. Web: July 2012, www.dominicdpjohnson.com/blog/?p=17.
Johnson, Dominic D. P., and James H. Fowler. “The Evolution of Overconfidence.” Nature 477, no. 7364 (2011): 317–20.
Jones, Edward E., and Richard E. Nisbett. “The Actor and the Observer: Divergent Perceptions of the Causes of Behavior.” In Attribution: Perceiving the Causes of Behavior. Ed. by Edward Ellsworth Jones. Morristown, NJ: General Learning, 1972, pp. 79–94.
Kruger, Justin, and Thomas Gilovich. “‘Naive Cynicism’ in Everyday Theories of Responsibility Assessment: On Biased Assumptions of Bias.” Journal of Personality and Social Psychology 76, no. 5 (1999): 743–53.
Langer, Ellen J. “The Illusion of Control.” Journal of Personality and Social Psychology 32, no. 2 (1975): 311–28.
Lassiter, G. Daniel, and Patrick J. Munhall. “The Genius Effect: Evidence for a Nonmotivational Interpretation.” Journal of Experimental Social Psychology 37, no. 4 (2001): 349–55.
Loughnan, Steve, Bernhard Leidner, Guy Doron, Nick Haslam, Yoshihisa Kashima, Jennifer Tong, and Victoria Yeung. “Universal Biases in Self-perception: Better and More Human Than Average.” British Journal of Social Psychology 49, no. 3 (2010): 627–36.
McKenna, Frank P., and Lynn B. Myers. “Illusory Self-assessments: Can They Be Reduced?” British Journal of Psychology 88, no. 1 (1997): 39–51.
Major, Brenda, Cheryl R. Kaiser, and Shannon K. McCoy. “It’s Not My Fault: When and Why Attributions to Prejudice Protect Self-esteem.” Personality and Social Psychology Bulletin 29, no. 6 (2003): 772–81.
Markus, Hazel R., and Shinobu Kitayama. “Culture and the Self: Implications for Cognition, Emotion, and Motivation.” Psychological Review 98, no. 2 (1991): 224–53.
Moore, Michael T., and David M. Fresco. “Depressive Realism: A Meta-analytic Review.” Clinical Psychology Review 32, no. 6 (2012): 496–509.
Pronin, E., D. Y. Lin, and L. Ross. “The Bias Blind Spot: Perceptions of Bias in Self Versus Others.” Personality and Social Psychology Bulletin 28, no. 3 (2002): 369–81.
Rabin, Roni Caryn. “Choosing Self-esteem over Sex or Pizza.” The New York Times, Well (blog), Jan. 11, 2011. Web: July 2012, well.blogs.nytimes.com/2011/01/11/choosing-self-esteem-over-sex-or-pizza/.
Ross, Michael, and Fiore Sicoly. “Egocentric Biases in Availability and Attribution.” Journal of Personality and Social Psychology 37, no. 3 (1979): 322–36.
Sharot, Tali. “The Optimism Bias.” TED Talks. May 2012. Web: July 2012, www.ted.com/talks/tali_sharot_the_optimism_bias.html.
Shea, Christopher. “The Power of Positive Illusions.” The Boston Globe, Boston.com, Sept. 26, 2004. Web: July 2012, www.boston.com/news/globe/ideas/articles/2004/09/26/the_power_of_positive_illusions/?page=full.
Taylor, Shelley E. Positive Illusions: Creative Self-deception and the Healthy Mind. New York: Basic, 1989.
Taylor, Shelley E., and Jonathon D. Brown. “Positive Illusions and Well-being Revisited: Separating Fact from Fiction.” Psychological Bulletin 116, no. 1 (1994): 21–27.
Twenge, Jean M. Generation Me: Why Today’s Young Americans Are More Confident, Assertive, Entitled—and More Miserable Than Ever Before. New York: Free Press, 2006.
Twenge, Jean M., and W. Keith Campbell. The Narcissism Epidemic: Living in the Age of Entitlement. New York: Free Press, 2009.
Twenge, Jean M., Sara Konrath, Joshua D. Foster, W. Keith Campbell, and Brad J. Bushman. “Egos Inflating over Time: A Cross-temporal Meta-analysis of the Narcissistic Personality Inventory.” Journal of Personality 76, no. 4 (2008): 875–901.