How We Know What Isn't So
About this ebook
When can we trust what we believe—that "teams and players have winning streaks," that "flattery works," or that "the more people who agree, the more likely they are to be right"—and when are such beliefs suspect? Thomas Gilovich offers a guide to the fallacy of the obvious in everyday life. Illustrating his points with examples, and supporting them with the latest research findings, he documents the cognitive, social, and motivational processes that distort our thoughts, beliefs, judgments and decisions. In a rapidly changing world, the biases and stereotypes that help us process an overload of complex information inevitably distort what we would like to believe is reality. Awareness of our propensity to make these systematic errors, Gilovich argues, is the first step to more effective analysis and action.
Thomas Gilovich
Thomas Gilovich is a professor of psychology at Cornell University and author of The Wisest One in the Room (with Lee Ross), How We Know What Isn’t So, Why Smart People Make Big Money Mistakes, and Social Psychology. He lives in Ithaca, New York.
How We Know
What Isn’t So
The Fallibility of Human
Reason in Everyday Life
Thomas Gilovich
THE FREE PRESS
A Division of Simon & Schuster Inc.
1230 Avenue of the Americas
New York, NY 10020
www.SimonandSchuster.com
Copyright © 1991 by Thomas Gilovich
All rights reserved,
including the right of reproduction
in whole or in part in any form.
THE FREE PRESS and colophon are trademarks
of Simon & Schuster Inc.
First Free Press Paperback Edition 1993
Manufactured in the United States of America
20 19 18 17
Library of Congress Cataloging-in-Publication Data
Gilovich, Thomas.
How we know what isn’t so: the fallibility of human reason in everyday life / Thomas Gilovich.
p. cm.
Includes bibliographical references and index.
ISBN 0-02-911706-2
ISBN-13: 978-0-0291-1706-4
eISBN-13: 978-1-4391-0674-7
1. Reasoning (Psychology)
2. Judgment.
3. Evidence.
4. Error.
5. Critical thinking.
6. Fallacies (Logic)
I. Title.
BF442.G55 1991
153.4′3-dc20 90-26727
CIP
To Karen and Ilana
Contents
Acknowledgments
1. Introduction
PART ONE
Cognitive Determinants of Questionable Beliefs
2. Something Out of Nothing: The Misperception and Misinterpretation of Random Data
3. Too Much from Too Little: The Misinterpretation of Incomplete and Unrepresentative Data
4. Seeing What We Expect to See: The Biased Evaluation of Ambiguous and Inconsistent Data
PART TWO
Motivational and Social Determinants of Questionable Beliefs
5. Seeing What We Want to See: Motivational Determinants of Belief
6. Believing What We Are Told: The Biasing Effects of Secondhand Information
7. The Imagined Agreement of Others: Exaggerated Impressions of Social Support
PART THREE
Examples of Questionable and Erroneous Beliefs
8. Belief in Ineffective Alternative Health Practices
9. Belief in the Effectiveness of Questionable Interpersonal Strategies
10. Belief in ESP
PART FOUR
Where Do We Go from Here?
11. Challenging Dubious Beliefs: The Role of Social Science
Notes
Index
Acknowledgments
Four people made unusually significant contributions to this work and deserve special thanks. Lee Ross commented on drafts of many of the chapters and provided a number of his uniquely illuminating insights on the phenomena at hand. Beyond that, I would like to thank Lee simply for being Lee—for being the most interesting intuitive psychologist I know, and for making the discussion of people and their commerce through everyday life so enjoyable. Karen Dashiff Gilovich read every word of this book and at times seemed to have something to say about nearly every one. She was in many respects my most challenging critic, but, as always, she delivered her critiques in the most loving, disarming, and helpful ways. I owe Dennis Regan and Daryl Bem a great debt for the helpful feedback they provided on earlier drafts and for their encouragement throughout the project.
Various chapters were improved by the comments of numerous people, and I would like to express my sincere thanks to all: Robert Frank, Mark Frank, David Hamilton, Robert Johnston, David Myers, James Pennebaker, Barbara Strupp, Richard Thaler, and Elaine Wethington. To protect them from blame for any wrong-headed ideas presented in this book, the usual disclaimers about ultimate responsibility apply.
Finally, I would like to thank the National Institute of Mental Health for the generous financial support that made possible much of my own research that is reported in this book, and Susan Milmoe of The Free Press for her enthusiasm and assistance during the past eighteen months.
1
Introduction
It ain’t so much the things we don’t know that get us into trouble. It’s the things we know that just ain’t so.
Artemus Ward
It is widely believed that infertile couples who adopt a child are subsequently more likely to conceive than similar couples who do not. The usual explanation for this remarkable phenomenon involves the alleviation of stress. Couples who adopt, it is said, become less obsessed with their reproductive failure, and their new-found peace of mind boosts their chances for success.
On closer inspection, however, it becomes clear that the remarkable phenomenon we need to explain is not why adoption increases a couple’s fertility; clinical research has shown that it does not.¹ What needs explanation is why so many people hold this belief when it is not true.
People who are charged with deciding who is to be admitted to a distinguished undergraduate institution, a prestigious graduate school, or a select executive training program all think they can make more effective admissions decisions if each candidate is seen in a brief, personal interview. They cannot. Research indicates that decisions based on objective criteria alone are at least as effective as those influenced by subjective impressions formed in an interview.² But then why do people believe the interview to be informative?
Nurses who work on maternity wards believe that more babies are born when the moon is full. They are mistaken.³ Again, why do they believe it if it just ain’t so?
This book seeks to answer these questions. It examines how questionable and erroneous beliefs are formed, and how they are maintained. As the examples above make clear, the strength and resiliency of certain beliefs cry out for explanation. Today, more people believe in ESP than in evolution,⁴ and in this country there are 20 times as many astrologers as there are astronomers.⁵ Both formal opinion polls and informal conversation reveal widespread acceptance of the reality of astral projection, of the authenticity of "channeling," and of the spiritual and psychic value of crystals. This book attempts to increase our understanding of such beliefs and practices, and, in so doing, to shed some light on various broader issues in the study of human judgment and reasoning.
Several things are clear at the outset. First, people do not hold questionable beliefs simply because they have not been exposed to the relevant evidence. Erroneous beliefs plague experienced professionals and less informed laypeople alike. In this respect, the admissions officials and maternity ward nurses should know better. They are professionals. They are in regular contact with the data. But they are mistaken.
Nor do people hold questionable beliefs simply because they are stupid or gullible. Quite the contrary. Evolution has given us powerful intellectual tools for processing vast amounts of information with accuracy and dispatch, and our questionable beliefs derive primarily from the misapplication or overutilization of generally valid and effective strategies for knowing. Just as we are subject to perceptual illusions in spite of, and largely because of, our extraordinary perceptual capacities, so too are many of our cognitive shortcomings closely related to, or even an unavoidable cost of, [our] greatest strengths.⁶ And just as the study of perceptual illusions has illuminated general principles of perception, and the study of psychopathology has enhanced our knowledge of personality, so too should the study of erroneous beliefs enlarge our understanding of human judgment and reasoning. By design, then, this book dwells on beliefs that are wrong, but in doing so we must not lose sight of how often we are right.
As these remarks suggest, many questionable and erroneous beliefs have purely cognitive origins, and can be traced to imperfections in our capacities to process information and draw conclusions. We hold many dubious beliefs, in other words, not because they satisfy some important psychological need, but because they seem to be the most sensible conclusions consistent with the available evidence. People hold such beliefs because they seem, in the words of Robert Merton, to be "the irresistible products of their own experience."⁷ They are the products, not of irrationality, but of flawed rationality.
So it is with the erroneous belief that infertile couples who adopt are subsequently more likely to conceive. Our attention is automatically drawn to couples who conceive after adopting, but not to those who adopt but do not conceive, or those who conceive without adopting. Thus, to many people, the increased fertility of couples who adopt a child is "a fact" of everyday experience. People do not hold this belief because they have much of an emotional stake in doing so; they do so because it seems to be the only sensible conclusion consistent with the information that is most available to them.
Many of these imperfections in our cognitive and inferential tools might never surface under ideal conditions (just as many perceptual illusions are confined to impoverished settings). But the world does not play fair. Instead of providing us with clear information that would enable us to "know better," it presents us with messy data that are random, incomplete, unrepresentative, ambiguous, inconsistent, unpalatable, or secondhand. As we shall see, it is often our flawed attempts to cope with precisely these difficulties that lay bare our inferential shortcomings and produce the facts we know that just ain't so.
Returning to the infertility example once again, we can readily see how the world does not play fair. Couples who conceive after adopting are noteworthy. Their good fortune is reported by the media, transmitted by friends and neighbors, and therefore is more likely to come to our attention than the fate of couples who adopt but do not conceive, or those who conceive without adopting. Thus, even putting our own cognitive and inferential limitations aside, there are inherent biases in the data upon which we base our beliefs, biases that must be recognized and overcome if we are to arrive at sound judgments and valid beliefs.
In tackling this subject of questionable and erroneous beliefs, I continue the efforts of many social and cognitive psychologists who in the past several years have sought to understand the bounded rationality of human information processing. Part I of this book, "Cognitive Determinants of Questionable Beliefs," contains three chapters that analyze our imperfect strategies for dealing with the often messy data of the real world. Chapter 2 concerns random data and our tendency to see regularity and order where only the vagaries of chance are operating. Chapter 3 deals with incomplete and unrepresentative data and our limited ability to detect and correct for these biases. Chapter 4 discusses our eagerness to interpret ambiguous and inconsistent data in light of our pet theories and a priori expectations.
Although an examination of these cognitive biases is enormously helpful in understanding questionable and erroneous beliefs, the richness and diversity of such beliefs require a consideration of other factors as well. Accordingly, Part II contains three chapters on the "Motivational and Social Determinants of Questionable Beliefs." Chapter 5 locates the roots of erroneous belief in wishful thinking and self-serving distortions of reality. This chapter provides a revisionist interpretation of motivational effects by examining how our motives collude with our cognitive processes to produce erroneous, but self-serving, beliefs. Chapter 6 examines the pitfalls of secondhand information and the distortions introduced by communicators—including the mass media—who are obliged to summarize and tempted to entertain. Chapter 7 takes a psychological truism, "we tend to believe what we think others believe," and turns it around: We tend to think others believe what we believe. This chapter examines a set of cognitive, social, and motivational processes that prompt us to overestimate the extent to which others share our beliefs, further bolstering our credulity.
Part III adopts a case study approach by bringing all the mechanisms introduced in Parts I and II together in an attempt to understand the origins and durability of several widely held but empirically dubious beliefs. These include beliefs in the efficacy of untested or ineffective health practices (Chapter 8), in the effectiveness of self-defeating interpersonal strategies (Chapter 9), and in the existence of ESP (Chapter 10). These chapters necessarily tread more lightly at times, for it cannot always be said with certainty that the beliefs under examination are false. Nevertheless, there is a notable gap in all cases between belief and evidence, and it is this gap that these chapters seek to explain.
Part IV ends the book with a discussion of how we might improve the way we evaluate the evidence of everyday life, and thus how we can steer clear of erroneous beliefs.
WHY WORRY ABOUT ERRONEOUS BELIEFS?
It is a great discredit to humankind that a species as magnificent as the rhinoceros can be so endangered. Their numbers thinned by the encroachment of civilization in the first half of this century, they now face the menace of deliberate slaughter. In the last 15 years, 90% of the rhinos in Africa have been killed by poachers who sell their horns on the black market. The horns fetch a high price in the Far East where they are used, in powdered form, to reduce fevers, cure headaches, and (less commonly) increase sexual potency. As a consequence of this senseless killing, there are now only a few thousand black rhinos left in Africa, and even fewer in Asia and Indonesia.⁸
Unhappily, the rhinoceros is not alone in this plight. Six hundred black bears were killed in the Great Smoky Mountains during the last three years, their gall bladders exported to Korea where they are thought to be an effective aid for indigestion (bears, the logic runs, are omnivores and are rarely seen to be ill). To understand the severity of this slaughter, it should be noted that the entire bear population in the Great Smoky Mountains at any one time is estimated to be approximately six hundred. A recent raid of a single black-market warehouse in San Francisco uncovered 40,000 seal penises that were to be sold, predictably, for use as aphrodisiacs. The Chinese green-haired turtle has been trapped to near extinction, in part because the Taiwanese believe that it can cure cancer. The list of species that have been slaughtered in the service of human superstition could go on and on.⁹
I mention these depressing facts to provide an unconventional answer to the familiar questions of "What's wrong with a few questionable beliefs?" or "Why worry about a little superstition?" This senseless killing makes it clear that the costs of our superstitions are real and severe, and that they are paid for not only by ourselves but by others—including other species. That our mistaken beliefs about aphrodisiacs and cancer cures have brought a number of species to the brink of extinction should challenge our own species to do better—to insist on clearer thinking and the effort required to obtain more valid beliefs about the world. A "little superstition" is a luxury we should not be allowed and can ill afford.
Of course, there are other, more conventional answers to this question of what is wrong with having a few questionable beliefs, answers that focus more on the costs to the believers themselves. The most striking are those cases we all hear about from time to time in which someone dies because a demonstrably effective medical treatment was ignored in favor of some quack therapy. Consider the fate of seven-year-old Rhea Sullins.¹⁰ Her father was once president of the American Natural Hygiene Society, which advocates "natural" cures such as fasting and the consumption of fruit and vegetable juices in lieu of drugs and other conventional treatments. When Rhea became ill, her father put her on a water-only fast for 18 days and then on a diet of fruit juice for 17 more. She died of malnutrition at the end of this regimen. I trust the reader has read about a number of similar cases elsewhere. Is there anything more pitiful than a life lost in the service of some unsound belief? As the tragedies of people like Rhea Sullins make clear, there are undeniable benefits in perceiving and understanding the world accurately, and terrible costs in tolerating mistakes.
There is still another, less direct price we pay when we tolerate flawed thinking and superstitious belief. It is the familiar problem of the slippery slope: How do we prevent the occasional acceptance of faulty reasoning and erroneous beliefs from influencing our habits of thought more generally? Thinking straight about the world is a precious and difficult process that must be carefully nurtured. By attempting to turn our critical intelligence off and on at will, we risk losing it altogether, and thus jeopardize our ability to see the world clearly. Furthermore, by failing to fully develop our critical faculties, we become susceptible to the arguments and exhortations of those with other than benign intentions. In the words of Stephen Jay Gould, "When people learn no tools of judgment and merely follow their hopes, the seeds of political manipulation are sown."¹¹ As individuals and as a society, we should be less accepting of superstition and sloppy thinking, and should strive to develop those "habits of mind" that promote a more accurate view of the world.
ONE
Cognitive Determinants
of Questionable Beliefs
2
Something Out of Nothing
The Misperception and Misinterpretation of Random Data
The human understanding supposes a greater degree of order and equality in things than it really finds; and although many things in nature be sui generis and most irregular, will yet invest parallels and conjugates and relatives where no such thing is.
Francis Bacon, Novum Organum
In 1677, Baruch Spinoza wrote his famous words, "Nature abhors a vacuum," to describe a host of physical phenomena. Three hundred years later, it seems that his statement applies as well to human nature, for it too abhors a vacuum. We are predisposed to see order, pattern, and meaning in the world, and we find randomness, chaos, and meaninglessness unsatisfying. Human nature abhors a lack of predictability and the absence of meaning. As a consequence, we tend to "see" order where there is none, and we spot meaningful patterns where only the vagaries of chance are operating.
People look at the irregularities of heavenly bodies and see a face on the surface of the moon or a series of canals on Mars. Parents listen to their teenagers’ music backwards and claim to hear Satanic messages in the chaotic waves of noise that are produced.¹ While praying for his critically ill son, a man looks at the wood grain on the hospital room door and claims to see the face of Jesus; hundreds now visit the clinic each year and confirm the miraculous likeness.² Gamblers claim that they experience hot and cold streaks in random rolls of the dice, and they alter their bets accordingly.
The more one thinks about Spinoza's phrase, the better it fits as a description of human nature. Nature does not "abhor" a vacuum in the sense of "to loathe" or "to regard with extreme repugnance" (Webster's definition). Nature has no rooting interest. The same is largely true of human nature as well. Often we impose order even when there is no motive to do so. We do not "want" to see a man in the moon. We do not profit from the illusion. We just see it.
The tendency to impute order to ambiguous stimuli is simply built into the cognitive machinery we use to apprehend the world. It may have been bred into us through evolution because of its general adaptiveness: We can capitalize on ordered phenomena in ways that we cannot on those that are random. The predisposition to detect patterns and make connections is what leads to discovery and advance. The problem, however, is that the tendency is so strong and so automatic that we sometimes detect coherence even when it does not exist.
This touches on a theme that will be raised repeatedly in this book. Many of the mechanisms that distort our judgments stem from basic cognitive processes that are usually quite helpful in accurately perceiving and understanding the world. The structuring and ordering of stimuli is no exception. Ignaz Semmelweis detected a pattern in the occurrence of childbed fever among women who were assisted in giving birth by doctors who had just finished a dissection. His observation led to the practice of antisepsis. Charles Darwin saw order in the distribution of different species of finches in the Galapagos, and his insight furthered his thinking about evolution and natural selection.
Clearly, the tendency to look for order and to spot patterns is enormously helpful, particularly when we subject whatever hunches it generates to further, more rigorous test (as both Semmelweis and Darwin did, for example). Many times, however, we treat the products of this tendency not as hypotheses, but as established facts. The predisposition to impose order can be so automatic and so unchecked that we often end up believing in the existence of phenomena that just aren’t there.
To get a better sense of how our structuring of events can go awry, it is helpful to take a closer look at a specific example. The example comes from the world of sports, but the reader who is not a sports fan need not be dismayed. The example is easy to follow even if one knows nothing about sports, and the lessons it conveys are quite general.
THE MISPERCEPTION OF RANDOM EVENTS
If I'm on, I find that confidence just builds…. You feel nobody can stop you. It's important to hit that first one, especially if it's a swish. Then you hit another, and … you feel like you can do anything.
—World B. Free
I must caution the reader not to construe the sentences above as two distinct quotations, the first a statement about confidence, and the second an anti-imperialist slogan. Known as Lloyd Free before legally changing his first name, World B. Free is a professional basketball player. His statement captures a belief held by nearly everyone who plays or watches the sport of basketball, a belief in a phenomenon known as the "hot hand."
The term refers to the putative tendency for success (and failure) in basketball to be self-promoting or self-sustaining. After making a couple of shots, players are thought to become relaxed, to feel confident, and to get "in a groove" such that subsequent success becomes more likely. In contrast, after missing several shots a player is considered to have gone "cold" and is thought to become tense, hesitant, and less likely to make his next few shots.
The belief in the hot hand, then, is really one version of a wider conviction that "success breeds success" and "failure breeds failure" in many walks of life. In certain areas it surely does. Financial success promotes further financial success because one's initial good fortune provides more capital with which to wheel and deal. Success in the art world promotes further success because it earns an artist a reputation that exerts a powerful influence over people's judgments of inherently ambiguous stimuli. However, there are other areas—gambling games immediately come to mind—where the belief may be just as strongly held, but where the phenomenon simply does not exist. What about the game of basketball? Does success in this sport tend to be self-promoting?
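Before turning to the data, it helps to fix the chance baseline. The following is my own rough sketch, not the study's actual method (the published research analyzed real players' shooting records): it simulates a "shooter" whose every attempt is an independent 50-50 event, then compares the hit rate immediately after a hit with the hit rate immediately after a miss.

```python
import random

def conditional_hit_rates(shots):
    """Given a 0/1 sequence of misses and hits, return the hit rate
    right after a hit and the hit rate right after a miss."""
    after_hit = [b for a, b in zip(shots, shots[1:]) if a == 1]
    after_miss = [b for a, b in zip(shots, shots[1:]) if a == 0]
    return (sum(after_hit) / len(after_hit),
            sum(after_miss) / len(after_miss))

random.seed(0)
# A purely random shooter: each attempt is an independent 50% chance.
shots = [1 if random.random() < 0.5 else 0 for _ in range(100_000)]
p_after_hit, p_after_miss = conditional_hit_rates(shots)
print(p_after_hit, p_after_miss)
```

For such a chance shooter the two conditional rates converge on the same value, even though the sequence contains plenty of runs that look like streaks to the eye; a genuine hot hand would have to show up as a reliable gap between them.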
My colleagues and I have conducted a series of studies to answer this question.³ The first step, as always, involved translating the idea of the hot hand into a testable hypothesis. If a player’s performance is subject to periods of hot and cold shooting, then he should be more likely to make a shot after making his previous shot (or previous several shots) than after missing his previous shot. This implies, in turn, that a player’s hits (and misses) should cluster together more than one would expect by chance. We