Thinking
The human brain is indeed a remarkable thinking machine, capable of amazing, complex,
creative, logical thoughts. Why, then, are we telling you that you need to learn how to think? Mainly
because one major lesson from cognitive psychology is that these capabilities of the human brain are
relatively infrequently realized. Many psychologists believe that people are essentially “cognitive
misers.” It is not that we are lazy, but that we have a tendency to expend the least amount of mental
effort necessary. Although you may not realize it, it actually takes a great deal of energy to think.
Careful, deliberative reasoning and critical thinking are very difficult. Because we seem to be
successful without going to the trouble of using these skills well, it feels unnecessary to develop
them. As you shall see, however, there are many pitfalls in the cognitive processes described in this
module. When people do not devote extra effort to learning and improving reasoning, problem
solving, and critical thinking skills, they make many errors.
As is true for memory, if you develop the cognitive skills presented in this module, you will be
more successful in school. It is important that you realize, however, that these skills will help you
far beyond school, even more so than a good memory will. Although it is somewhat useful to have a
good memory, ten years from now no potential employer will care how many questions you got right on multiple-choice exams during college. All of them will, however, recognize whether you are a
logical, analytical, critical thinker. With these thinking skills, you will be an effective, persuasive
communicator and an excellent problem solver.
This material begins by describing different kinds of thought and knowledge, especially
conceptual knowledge and critical thinking. An understanding of these differences will be valuable
as you progress through school and encounter different assignments that require you to tap into
different kinds of knowledge. The second section covers deductive and inductive reasoning, which
are processes we use to construct and evaluate strong arguments. They are essential skills to have
whenever you are trying to persuade someone (including yourself) of some point, or to respond to
someone’s efforts to persuade you. The material ends with a section about problem solving. A solid
understanding of the key processes involved in problem solving will help you to handle many daily
challenges.
1. Different kinds of thought
2. Reasoning and Judgment
3. Problem Solving
Remember and Understand
By reading this material, you should be able to remember and describe:
• Concepts and inferences
• Procedural knowledge
• Metacognition
• Characteristics of critical thinking: skepticism; the ability to identify biases, distortions, omissions, and assumptions; and reasoning and problem-solving skills
• Reasoning: deductive reasoning, deductively valid argument, inductive reasoning,
inductively strong argument, availability heuristic, representativeness heuristic
• Fixation: functional fixedness, mental set
• Algorithms, heuristics, and the role of confirmation bias
• Effective problem-solving sequence
Factual and conceptual knowledge
Under the topic of memory, the idea of declarative memory, which is composed of facts and episodes, was introduced. If you have ever played a trivia game or watched Jeopardy on TV, you realize
that the human brain is able to hold an extraordinary number of facts. Likewise, you realize that
each of us has an enormous store of episodes, essentially facts about events that happened in our
own lives. It may be difficult to keep that in mind when we are struggling to retrieve one of those
facts while taking an exam, however. Part of the problem is that many students continue to try to
memorize course material as a series of unrelated facts (picture a history student simply trying to
memorize history as a set of unrelated dates without any coherent story tying them together). Facts
in the real world are not random and unorganized, however. It is the way that they are organized that constitutes a second key kind of knowledge: conceptual knowledge.
Concepts are nothing more than our mental representations of categories of things in the
world. For example, think about dogs. When you do this, you might remember specific facts about
dogs, such as they have fur and they bark. You may also recall dogs that you have encountered and
picture them in your mind. All of this information (and more) makes up your concept of dog. You
can have concepts of simple categories (e.g., triangle), complex categories (e.g., small dogs that
sleep all day, eat out of the garbage, and bark at leaves), kinds of people (e.g., psychology
professors), events (e.g., birthday parties), and abstract ideas (e.g., justice).
Gregory Murphy (2002) refers to concepts as the “glue that holds our mental life together”.
Very simply, summarizing the world by using concepts is one of the most important cognitive tasks
that we do. Our conceptual knowledge is our knowledge about the world. Individual concepts are
related to each other to form a rich interconnected network of knowledge. For example, think about
how the following concepts might be related to each other: dog, pet, play, Frisbee, chew toy, shoe.
Or, of more obvious use to you now, think about how these concepts are related: working memory, long-term memory, declarative memory, procedural memory, and rehearsal. Because our minds have a natural
tendency to organize information conceptually, when students try to remember course material as
isolated facts, they are working against their strengths.
One last important point about concepts is that they allow you to instantly know a great deal
of information about something. For example, if someone hands you a small red object and says,
“here is an apple,” they do not have to tell you, “it is something you can eat.” You already know
that you can eat it because it is true by virtue of the fact that the object is an apple; this is called
drawing an inference, assuming that something is true on the basis of your previous knowledge (for
example, of category membership or of how the world works) or logical reasoning.
Procedural knowledge
Physical skills, such as tying your shoes, doing a cartwheel, and driving a car (or doing all three
at the same time) are certainly a kind of knowledge. They are procedural knowledge, the same idea
as procedural memory. Mental skills, such as reading, debating, and planning a psychology
experiment, are procedural knowledge. In short, procedural knowledge is the knowledge how to do
something (Cohen & Eichenbaum, 1993).
Metacognitive knowledge
Floyd used to think that he had a great memory. Now, he has a better memory. Why? Because
he finally realized that his memory was not as great as he once thought it was. Because Floyd
eventually learned that he often forgets where he put things, he finally developed the habit of
putting things in the same place. Because he finally realized that he often forgets to do things, he
finally started using the To Do list app on his phone. And so on. Floyd’s insights about the real
limitations of his memory have allowed him to remember things that he used to forget.
All of us have knowledge about the way our own minds work. You may know that you have a
good memory for people’s names and a poor memory for math formulas. Someone else might realize
that they have difficulty remembering to do things, like stopping at the store on the way home.
Still others know that they tend to overlook details. This knowledge about our own thinking is actually
quite important; it is called metacognitive knowledge, or metacognition. Like other kinds of thinking
skills, it is subject to error. For example, in unpublished research, one of the authors surveyed about
120 General Psychology students on the first day of the term. Among other questions, the students
were asked to predict their grade in the class and to report their current Grade Point Average.
Two-thirds of the students predicted that their grade in the course would be higher than their GPA.
The reality is that students tend to earn lower grades in psychology than their overall GPA. Another
example: Students routinely report that they thought they had done well on an exam, only to
discover, to their dismay, that they were wrong. Both errors reveal a breakdown in metacognition.
The Dunning-Kruger Effect
In general, most college students probably do not study enough. For example, using data from
the National Survey of Student Engagement, Fosnacht, McCormick, and Lerma (2018) reported that
first-year students at 4-year colleges in the U.S. averaged less than 14 hours per week preparing for
classes. The typical suggestion is that you should spend two hours outside of class for every hour in
class, or 24 – 30 hours per week for a full-time student. Clearly, students in general are nowhere
near that recommended mark. Many observers, including some faculty, believe that this shortfall is
a result of students being too busy or lazy. Now, it may be true that many students are too busy,
with work and family obligations, for example. Others are not particularly motivated in school, and
therefore might correctly be labeled lazy. A third possible explanation, however, is that some
students might not think they need to spend this much time. And this is a matter of metacognition.
Consider the scenario mentioned above: students thinking they had done well on an exam, only to discover that they did not. Justin Kruger and David Dunning examined scenarios very
much like this in 1999. Kruger and Dunning gave research participants tests measuring humor, logic,
and grammar. Then, they asked the participants to assess their own abilities and test performance
in these areas. They found that participants in general tended to overestimate their abilities, already
a problem with metacognition. Importantly, the participants who scored the lowest overestimated
their abilities the most. Specifically, students who scored in the bottom quarter (averaging in the
12th percentile) thought they had scored in the 62nd percentile. This has become known as
the Dunning-Kruger effect. Many individual faculty members have replicated these results with their
own students on their course exams. Think about it. Some students who just took an exam and
performed poorly believe that they did well before seeing their score. It seems very likely that these
are the very same students who stopped studying the night before because they thought they were
“done.” Quite simply, it is not just that they did not know the material. They did not know that they
did not know the material. That is poor metacognition.
In order to develop good metacognitive skills, you should continually monitor your thinking
and seek frequent feedback on the accuracy of your thinking (Medina, Castleberry, & Persky 2017).
For example, get in the habit of predicting your exam grades. As soon as possible after taking an
exam, try to find out which questions you missed and try to figure out why. If you do this soon
enough, you may be able to recall the way it felt when you originally answered the question. Did you feel confident that you had answered the question correctly, even though you missed it? If so, you have just discovered an opportunity to improve your metacognition. Be on the lookout for that unwarranted feeling of confidence and respond with caution.
Terminology
concept: a mental representation of a category of things in the world
Dunning-Kruger effect: individuals who are less competent tend to overestimate their abilities
more than individuals who are more competent do
inference: an assumption about the truth of something that is not stated. Inferences come from
our prior knowledge and experience, and from logical reasoning
metacognition: knowledge about one’s own cognitive processes; thinking about your thinking
Critical thinking
One particular kind of knowledge or thinking skill that is related to metacognition is critical
thinking (Chew, 2020). You may have noticed that critical thinking is an objective in many college
courses, and it is particularly appropriate in psychology. As the science of (behavior and) mental
processes, psychology is obviously well suited to be the discipline through which you should be
introduced to this important way of thinking.
More importantly, there is a particular need to use critical thinking in psychology. We are all,
in a way, experts in human behavior and mental processes, having engaged in them literally since
birth. Thus, perhaps more than in any other class, students typically approach psychology with very
clear ideas and opinions about its subject matter. That is, students already “know” a lot about
psychology. The problem is, “it isn’t so much the things we don’t know that get us into trouble. It’s
the things we know that just isn’t so” (Ward, quoted in Gilovich 1991). Indeed, many of students’
preconceptions about psychology are just plain wrong. Randolph Smith (2002) wrote a book about
critical thinking in psychology called Challenging Your Preconceptions, highlighting this fact. On the
other hand, many of students’ preconceptions about psychology are just plain right! But, how do you
know which of your preconceptions are right and which are wrong? And when you come across a
research finding or theory in this class that contradicts your preconceptions, what will you do? Will
you stick to your original idea, discounting the information from the class? Will you immediately
change your mind? Critical thinking can help us sort through this confusing mess.
What is critical thinking? The goal of critical thinking is simple to state (but extraordinarily
difficult to achieve): it is to be right, to draw the correct conclusions, to believe in things that are
true and to disbelieve things that are false. This material will provide you with two definitions of
critical thinking. First, a more conceptual one: Critical thinking is thinking like a scientist in your
everyday life (Schmaltz, Jansen, & Wenckowski, 2017). The second definition is more operational;
it is simply a list of skills that are essential to be a critical thinker. Critical thinking entails solid
reasoning and problem-solving skills; skepticism; and an ability to identify biases, distortions,
omissions, and assumptions. Excellent deductive and inductive reasoning and problem-solving skills all contribute to critical thinking.
Scientists form hypotheses, or predictions about possible future observations. Then, they collect data, or information. They do their best to make unbiased observations using reliable techniques that have been verified by others. Then, and only then, do they draw a conclusion about what those observations mean. Do not forget that last part. Note, too, that “conclusion” is probably not the most appropriate word, because a scientific conclusion is only tentative. A scientist is always prepared that someone else might come along and produce new observations that would require that a new conclusion be drawn.
A Critical Thinker’s Toolkit
Good critical thinkers (and scientists) rely on a variety of tools to evaluate information.
Perhaps the most recognizable tool for critical thinking is skepticism (and this term provides the
clearest link to the thinking like a scientist definition). Some people intend it as an insult when they
call someone a skeptic. But if someone calls you a skeptic, if they are using the term correctly, you
should consider it a great compliment. Simply put, skepticism is a way of thinking in which you refrain
from drawing a conclusion or changing your mind until good evidence has been provided. As a skeptic,
you are not inclined to believe something just because someone said so, because someone else
believes it, or because it sounds reasonable. You must be persuaded by high quality evidence.
If that evidence is produced, you have a responsibility as a skeptic to change your belief.
Failure to change a belief in the face of good evidence is not skepticism; skepticism has open-mindedness at its core. M. Neil Browne and Stuart Keeley (2018) use the term weak sense critical
thinking to describe critical thinking behaviors that are used only to strengthen a prior belief. Strong
sense critical thinking, on the other hand, has as its goal reaching the best conclusion. Sometimes
that means strengthening your prior belief, but sometimes it means changing your belief to
accommodate the better evidence.
Many times, a failure to think critically or weak sense critical thinking is related to a bias, an
inclination, tendency, leaning, or prejudice. Everybody has biases, but many people are unaware of
them. Awareness of your own biases gives you the opportunity to control or counteract them.
Unfortunately, however, many people are happy to let their biases creep into their attempts to
persuade others; indeed, it is a key part of their persuasive strategy.
Here are some common sources of biases:
• Personal values and beliefs. Some people believe that human beings are basically driven to
seek power and that they are typically in competition with one another over scarce resources.
These beliefs are similar to the world-view that political scientists call “realism.” Other people
believe that human beings prefer to cooperate and that, given the chance, they will do so.
These beliefs are similar to the world-view known as “idealism.” For many people, these
deeply held beliefs can influence, or bias, their interpretations of such wide-ranging situations
as the behavior of nations and their leaders or the behavior of the driver in the car ahead of
you. For example, if your worldview is that people are typically in competition and someone
cuts you off on the highway, you may assume that the driver did it purposely to get ahead of
you. Other types of beliefs about the way the world is or the way the world should be, for
example, political beliefs, can similarly become a significant source of bias.
• Racism, sexism, ageism and other forms of prejudice and bigotry. These are, sadly, a common
source of bias in many people. They are essentially a special kind of “belief about the way the
world is.” These beliefs—for example, that women do not make effective leaders—lead people
to ignore contradictory evidence (examples of effective women leaders, or research that
disputes the belief) and to interpret ambiguous evidence in a way consistent with the belief.
• Self-interest. When particular people benefit from things turning out a certain way, they can
sometimes be very susceptible to letting that interest bias them. For example, a company
that will earn a profit if they sell their product may have a bias in the way that they give
information about their product. A union that will benefit if its members get a generous
contract might have a bias in the way it presents information about salaries at competing
organizations. Home buyers are often dismayed to discover that they purchased their dream
house from someone whose self-interest led them to lie about flooding problems in the
basement or back yard. This principle, the biasing power of self-interest, is likely what led to
the famous phrase Caveat Emptor (let the buyer beware).
Knowing that these types of biases exist will help you evaluate evidence more critically. Do not
forget, though, that people are not always keen to let you discover the sources of biases in their
arguments. For example, companies or political organizations can sometimes disguise their support
of a research study by contracting with a university professor, who comes complete with a seemingly
unbiased institutional affiliation, to conduct the study.
People’s biases, conscious or unconscious, can lead them to make omissions, distortions, and
assumptions that undermine our ability to correctly evaluate evidence. It is essential that you look
for these elements. Always ask, what is missing, what is not as it appears, and what is being assumed
here? In order to be a critical thinker, you need to learn to pay attention to the assumptions that
underlie a message. Let us briefly illustrate the role of assumptions by touching on some people’s
beliefs about the criminal justice system. Some believe that a major problem with our judicial system
is that many criminals go free because of legal technicalities. Others believe that a major problem
is that many innocent people are convicted of crimes. The simple fact is, both types of errors occur.
A person’s conclusion about which flaw in our judicial system is the greater tragedy is based on an
assumption about which of these is the more serious error (letting the guilty go free or convicting
the innocent). This type of assumption is called a value assumption (Browne and Keeley, 2018). It
reflects the differences in values that people develop, differences that may lead us to disregard valid
evidence that does not fit in with our particular values.
skepticism: a way of thinking in which you refrain from drawing a conclusion or changing your mind
until good evidence has been provided
bias: an inclination, tendency, leaning, or prejudice
Problem Solving
Mary has a problem. Her daughter, ordinarily quite eager to please, appears to delight in being
the last person to do anything. Whether getting ready for school, going to piano lessons or karate
class, or even going out with her friends, she seems unwilling or unable to get ready on time. Other
people have different kinds of problems. For example, many students work at jobs, have numerous
family commitments, and are facing a course schedule full of difficult exams, assignments, papers,
and speeches. How can they find enough time to devote to their studies and still fulfill their other
obligations? Speaking of students and their problems: Show that a ball thrown vertically upward with
initial velocity v0 takes twice as much time to return as to reach the highest point (from Spiegel,
1981).
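For readers curious about that last problem, it yields to a short derivation. This is a sketch using standard constant-acceleration kinematics, with v_0 the initial upward velocity and g the gravitational acceleration:

```latex
% Velocity under gravity: v(t) = v_0 - g t.
% The highest point is reached when v(t) = 0:
t_{\text{up}} = \frac{v_0}{g}
% Height: y(t) = v_0 t - \tfrac{1}{2} g t^2.
% The ball returns when y(t) = 0 again (t > 0):
v_0 t - \tfrac{1}{2} g t^2 = 0
\quad\Rightarrow\quad
t_{\text{return}} = \frac{2 v_0}{g} = 2\, t_{\text{up}}
```

So the total flight time is indeed twice the time needed to reach the highest point.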
These are three very different situations, but we have called them all problems. What makes
them all the same, despite the differences? A psychologist might define a problem as a situation
with an initial state, a goal state, and a set of possible intermediate states. Somewhat more
meaningfully, we might consider a problem a situation in which you are here, in one state (e.g., daughter is always late), you want to be there, in another state (e.g., daughter is not always late), and there is no obvious way to get from here to there. Defined this way, each of the three situations we
outlined can now be seen as an example of the same general concept, a problem. At this point, you
might begin to wonder what is not a problem, given such a general definition. It seems that nearly
every non-routine task we engage in could qualify as a problem. As long as you realize that problems
are not necessarily bad, this may be a useful way to think about it.
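The psychologist's three-part definition can be made concrete with a small sketch. The field names and example states below are invented for illustration; they are not standard notation:

```python
# A problem, in the psychologist's sense: an initial state, a goal state,
# and possible intermediate states connecting them (names are illustrative).
lateness_problem = {
    "initial_state": "daughter is always late",
    "goal_state": "daughter is ready on time",
    "intermediate_states": [
        "set an earlier alarm",
        "remove distracting toys from her room",
        "lay out clothes the night before",
    ],
}

# The study-time and physics problems fit this same three-part structure,
# which is what licenses calling all three situations "problems."
```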
Can we identify a set of problem-solving skills that would apply to these very different kinds
of situations? Let us try to begin to make sense of the wide variety of ways that problems can be
solved with an important observation: the process of solving problems can be divided into two key
parts. First, people have to notice, comprehend, and represent the problem properly in their minds
(called problem representation). Second, they have to apply some kind of solution strategy to the
problem. Psychologists have studied both of these key parts of the process in detail.
When you first think about the problem-solving process, you might guess that most of our
difficulties would occur because we are failing in the second step, the application of strategies.
Although this can be a significant difficulty much of the time, the more important source of difficulty
is probably problem representation. In short, we often fail to solve a problem because we are looking
at it, or thinking about it, the wrong way.
problem: a situation in which we are in an initial state, have a desired goal state, and there is a
number of possible intermediate states (i.e., there is no obvious way to get from the initial to the
goal state)
problem representation: noticing, comprehending and forming a mental conception of a problem
Defining and Mentally Representing Problems in Order to Solve Them
The main obstacle to solving a problem is that we do not clearly understand exactly what the
problem is. Recall the problem with Mary’s daughter always being late. One way to represent, or to
think about, this problem is that she is being defiant. She refuses to get ready in time. This type of
representation or definition suggests a particular type of solution. Another way to think about the
problem, however, is to consider the possibility that she is simply being sidetracked by interesting
diversions. This different conception of what the problem is (i.e., different representation) suggests
a very different solution strategy. For example, if Mary defines the problem as defiance, she may be
tempted to solve the problem using some kind of coercive tactics, that is, to assert her parental authority and force her daughter to listen. On the other hand, if Mary defines the problem as distraction,
she may try to solve it by simply removing the distracting objects.
Unfortunately, however, changing a problem’s representation is not the easiest thing in the
world to do. Often, problem solvers get stuck looking at a problem one way. This is called fixation.
Most people who settle on one way of representing a problem do not pause to reconsider, and consequently change, that representation. A parent who thinks her daughter is being
defiant is unlikely to consider the possibility that her behavior is far less purposeful.
Problem-solving fixation was examined by a group of German psychologists called Gestalt
psychologists during the 1930s and 1940s. Karl Duncker, for example, discovered an important type
of failure to take a different perspective called functional fixedness. Imagine being a participant in
one of his experiments. You are asked to figure out how to mount two candles on a door and are
given an assortment of odds and ends, including a small empty cardboard box and some thumbtacks.
Perhaps you have already figured out a solution: tack the box to the door so it forms a platform, then
put the candles on top of the box. Most people are able to arrive at this solution. Imagine a slight
variation of the procedure, however. What if, instead of being empty, the box had matches in it?
Most people given this version of the problem do not arrive at the solution given above. Why? Because
it seems to people that when the box contains matches, it already has a function; it is a matchbox.
People are unlikely to consider a new function for an object that already has a function. This is
functional fixedness.
Mental set is a type of fixation in which the problem solver gets stuck using the same solution
strategy that has been successful in the past, even though the solution may no longer be useful. It is
commonly seen when students do math problems for homework. Often, several problems in a row
require the reapplication of the same solution strategy. Then, without warning, the next problem in
the set requires a new strategy. Many students attempt to apply the formerly successful strategy on
the new problem and therefore cannot come up with a correct answer.
The thing to remember is that you cannot solve a problem unless you correctly identify what
it is to begin with (initial state) and what you want the end result to be (goal state). That may mean
looking at the problem from a different angle and representing it in a new way. The correct
representation does not guarantee a successful solution, but it certainly puts you on the right track.
A bit more optimistically, the Gestalt psychologists discovered what may be considered the
opposite of fixation, namely insight. Sometimes the solution to a problem just seems to pop into
your head. Wolfgang Kohler examined insight by posing many different problems to chimpanzees,
principally problems pertaining to their acquisition of out-of-reach food. In one version, a banana
was placed outside of a chimpanzee’s cage and a short stick inside the cage. The stick was too short
to retrieve the banana, but was long enough to retrieve a longer stick also located outside of the
cage. This second stick was long enough to retrieve the banana. After trying, and failing, to reach
the banana with the shorter stick, the chimpanzee would try a couple of random-seeming attempts,
react with some apparent frustration or anger, then suddenly rush to the longer stick, the correct
solution fully realized at this point. This sudden appearance of the solution, observed many times
with many different problems, was termed insight by Kohler.
fixation: when a problem solver gets stuck looking at a problem a particular way and cannot
change his or her representation of it (or his or her intended solution strategy)
functional fixedness: a specific type of fixation in which a problem solver cannot think of a new
use for an object that already has a function
mental set: a specific type of fixation in which a problem solver gets stuck using the same solution
strategy that has been successful in the past
insight: a sudden realization of a solution to a problem
Solving Problems by Trial and Error
Correctly identifying the problem and your goal for a solution is a good start, but recall the
psychologist’s definition of a problem: it includes a set of possible intermediate states. Viewed this
way, a problem can be solved satisfactorily only if one can find a path through some of these
intermediate states to the goal. Imagine a fairly routine problem, finding a new route to school when
your ordinary route is blocked (by road construction, for example). At each intersection, you may
turn left, turn right, or go straight. A satisfactory solution to the problem (of getting to school) is a
sequence of selections at each intersection that allows you to wind up at school.
If you had all the time in the world to get to school, you might try choosing intermediate
states randomly. At one corner you turn left, the next you go straight, then you go left again, then
right, then right, then straight. Unfortunately, trial and error will not necessarily get you where you
want to go, and even if it does, it is not the fastest way to get there. Trial and error is not all bad.
B.F. Skinner, a prominent behaviorist psychologist, suggested that people often behave
randomly in order to see what effect the behavior has on the environment and what subsequent
effect this environmental change has on them. This seems particularly true for the very young person.
Picture a child filling a household’s fish tank with toilet paper, for example. To a child trying to
develop a repertoire of creative problem-solving strategies, an odd and random behavior might be
just the ticket. Eventually, the exasperated parent hopes, the child will discover that many of these
random behaviors do not successfully solve problems; in fact, in many cases they create problems.
Thus, one would expect a decrease in this random behavior as a child matures. You should realize,
however, that the opposite extreme is equally counterproductive. If children become too rigid, never trying anything unexpected and new, their problem-solving skills can become too limited.
Effective problem solving seems to call for a happy medium, a balance between using well-founded old strategies and trying new ones. The individual who recognizes a situation in which an old problem-solving strategy would work best, and who can also recognize a situation in which a new, untested strategy is necessary, is halfway to success.
Solving Problems with Algorithms and Heuristics
For many problems there is a possible strategy available that will guarantee a correct solution.
For example, think about math problems. Math lessons often consist of step-by-step procedures that
can be used to solve the problems. If you apply the strategy without error, you are guaranteed to
arrive at the correct solution to the problem. This approach is called using an algorithm, a term that
denotes the step-by-step procedure that guarantees a correct solution. Because algorithms are
sometimes available and come with a guarantee, you might think that most people use them
frequently. Unfortunately, however, they do not. As the experience of many students who have
struggled through math classes can attest, algorithms can be extremely difficult to use, even when
the problem solver knows which algorithm is supposed to work in solving the problem. In problems
outside of math class, we often do not even know if an algorithm is available. It is probably fair to
say, then, that algorithms are rarely used when people try to solve problems.
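The guarantee an algorithm provides can be made concrete with a short sketch. The Python fragment below is purely illustrative (the equation, the function name, and the steps are our own, not anything from the text): it walks through the fixed steps for solving a*x + b = c, and following those steps without error always yields the correct answer.

```python
def solve_linear(a, b, c):
    """Algorithm for solving a*x + b = c.

    The steps are fixed and, applied without error, they
    guarantee the correct value of x (whenever a != 0).
    """
    if a == 0:
        raise ValueError("no unique solution when a is 0")
    # Step 1: subtract b from both sides: a*x = c - b.
    rhs = c - b
    # Step 2: divide both sides by a: x = (c - b) / a.
    return rhs / a

print(solve_linear(3, 4, 19))  # 3*x + 4 = 19  ->  prints 5.0
```

Contrast this with the problems of everyday life, where no such fixed recipe is usually available.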
Because algorithms are so difficult to use, people often pass up the opportunity to guarantee
a correct solution in favor of a strategy that is much easier to use and yields a reasonable chance of
coming up with a correct solution. These strategies are called problem-solving heuristics. A problem-solving heuristic is a shortcut strategy that people use when trying to solve problems. It usually works
pretty well, but does not guarantee a correct solution to the problem. For example, one problem
solving heuristic might be “always move toward the goal” (so when trying to get to school when your
regular route is blocked, you would always turn in the direction you think the school is). A heuristic
that people might use when doing math homework is “use the same solution strategy that you just
used for the previous problem.”
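The "always move toward the goal" heuristic can also be sketched in code. The grid, the function name, and the stuck-detection rule below are our own illustrative assumptions; the point is only that the shortcut is cheap to apply but carries no guarantee.

```python
def greedy_walk(start, goal, blocked, max_steps=50):
    """'Always move toward the goal' heuristic on a street grid.

    At each corner, step one unit in whichever direction most
    reduces the remaining distance to the goal. Fast and simple,
    but no guarantee: a wall between you and the goal traps it.
    """
    x, y = start
    gx, gy = goal
    for _ in range(max_steps):
        if (x, y) == goal:
            return True
        # Candidate moves, sorted so the one closest to the goal comes first.
        moves = sorted(
            [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)],
            key=lambda p: abs(p[0] - gx) + abs(p[1] - gy),
        )
        step = next((m for m in moves if m not in blocked), None)
        here = abs(x - gx) + abs(y - gy)
        if step is None or abs(step[0] - gx) + abs(step[1] - gy) >= here:
            return False  # no move gets us closer; the heuristic gives up
        x, y = step
    return False

print(greedy_walk((0, 0), (3, 0), blocked=set()))                       # True
print(greedy_walk((0, 0), (3, 0), blocked={(1, 0), (1, 1), (1, -1)}))   # False
```

On an open grid the heuristic reaches the goal quickly, but a wall between start and goal defeats it, which is exactly the sense in which heuristics trade certainty for ease of use.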
Although it is probably not worth describing a large number of specific heuristics, two
observations about heuristics are worth mentioning. First, heuristics can be very general or they can
be very specific, pertaining to a particular type of problem only. For example, “always move toward
the goal” is a general strategy that you can apply to countless problem situations. On the other hand,
“when you are lost without a functioning GPS, pick the most expensive car you can see and follow it”
is specific to the problem of being lost. Second, not all heuristics are equally useful. One heuristic
that many students know is “when in doubt, choose c for a question on a multiple-choice exam.”
This is a dreadful strategy because many instructors intentionally randomize the order of answer
choices. Another test-taking heuristic, somewhat more useful, is “look for the answer to one question
somewhere else on the exam.”
You really should pay attention to the application of heuristics to test taking. Imagine that
while reviewing your answers for a multiple-choice exam before turning it in, you come across a
question for which you originally thought the answer was c. Upon reflection, you now think that the
answer might be b. Should you change the answer to b, or should you stick with your first impression?
Most people will apply the heuristic strategy of “stick with your first impression.” What they do not
realize, of course, is that this is a very poor strategy (Lilienfeld et al., 2009). Most of the errors on
exams come on questions that were answered wrong originally and were not changed (so they remain
wrong). There are many fewer errors where we change a correct answer to an incorrect answer. And,
of course, sometimes we change an incorrect answer to a correct answer. In fact, research has shown
that it is more common to change a wrong answer to a right answer than vice versa (Bruno, 2001).
The belief in this poor test-taking strategy (stick with your first impression) is based on
the confirmation bias (Nickerson, 1998; Wason, 1960). People have a bias, or tendency, to notice
information that confirms what they already believe. Somebody at one time told you to stick with
your first impression, so when you look at the results of an exam you have taken, you will tend to
notice the cases that are consistent with that belief. That is, you will notice the cases in which you
originally had an answer correct and changed it to the wrong answer. You tend not to notice the
other two important (and more common) cases, changing an answer from wrong to right, and leaving
a wrong answer unchanged. Because heuristics by definition do not guarantee a correct solution to
a problem, mistakes are bound to occur when we employ them. A poor choice of a specific heuristic
will lead to an even higher likelihood of making an error.
algorithm: a step-by-step procedure that guarantees a correct solution to a problem
problem-solving heuristic: a shortcut strategy that we use to solve problems. Although heuristics are
easy to use, they do not guarantee correct judgments or solutions
confirmation bias: people’s tendency to notice information that confirms what they already
believe
An Effective Problem-Solving Sequence
You may be left with a big question: If algorithms are hard to use and heuristics often don’t work,
how am I supposed to solve problems? Robert Sternberg (1996), as part of his theory of what makes
people successfully intelligent, described a problem-solving sequence that has been shown to work
rather well:
• Identify the existence of a problem. In school, problem identification is often easy; problems
that you encounter in math classes, for example, are conveniently labeled as problems for
you. Outside of school, however, realizing that you have a problem is a key difficulty that you
must get past in order to begin solving it. You must be very sensitive to the symptoms that
indicate a problem.
• Define the problem. Suppose you realize that you have been having many headaches recently.
Very likely, you would identify this as a problem. If you define the problem as “headaches,”
the solution would probably be to take aspirin or ibuprofen or some other anti-inflammatory
medication. If the headaches keep returning, however, you have not really solved the
problem—likely because you have mistaken a symptom for the problem itself. Instead, you
must find the root cause of the headaches. Stress might be the real problem. For you to
successfully solve many problems it may be necessary for you to overcome your fixations and
represent the problems differently. One specific strategy that you might find useful is to try
to define the problem from someone else’s perspective. How would your parents, spouse,
significant other, doctor, etc. define the problem? Somewhere in these different perspectives
may lurk the key definition that will allow you to find an easier and permanent solution.
• Formulate a strategy. Now it is time to begin planning exactly how the problem will be solved.
Is there an algorithm or heuristic available for you to use? Remember, heuristics by their very
nature guarantee that occasionally you will not be able to solve the problem. One point to
keep in mind is that you should look for long-range solutions, which are more likely to address
the root cause of a problem than short-range solutions.
• Represent and organize information. Similar to the way that the problem itself can be
defined, or represented in multiple ways, information within the problem is open to different
interpretations. Suppose you are studying for a big exam. You have chapters from a textbook
and from a supplemental reader, along with lecture notes that all need to be studied. How
should you (represent and) organize these materials? Should you separate them by type of
material (text versus reader versus lecture notes), or should you separate them by topic? To
solve problems effectively, you must learn to find the most useful representation and
organization of information.
• Allocate resources. This is perhaps the simplest principle of the problem-solving sequence,
but it is extremely difficult for many people. First, you must decide whether time, money,
skills, effort, goodwill, or some other resource would help to solve the problem. Then, you
must make the hard choice of deciding which resources to use, realizing that you cannot
devote maximum resources to every problem. Very often, the solution to a problem is simply to
change how resources are allocated (for example, spending more time studying in order to
improve grades).
• Monitor and evaluate solutions. Pay attention to the solution strategy while you are applying
it. If it is not working, you may be able to select another strategy. Another fact you should
realize about problem solving is that it never does end. Solving one problem frequently brings
up new ones. Good monitoring and evaluation of your problem solutions can help you to
anticipate and get a jump on solving the inevitable new problems that will arise.