Unit 3

Biopsychology

Strategies of Effective Problem Solving

Problem-solving is a mental process that involves discovering, analyzing, and solving problems. The
ultimate goal of problem-solving is to overcome obstacles and find a solution that best resolves the
issue.

The best strategy for solving a problem depends largely on the unique situation. In some cases,
people are better off learning everything they can about the issue and then using factual knowledge
to come up with a solution. In other instances, creativity and insight are the best options.

Generate-and-Test Technique

It consists of generating possible solutions and then testing them. Generate-and-test is a technique
that loses its effectiveness very rapidly when there are many possibilities and when there is no
particular guidance for the generation process. If you forget the combination to your locker, for
instance, the technique will eventually work, but your frustration level by that time might exceed
your willingness to persevere with the task. Moreover, if you don’t have a way to keep track of the
possibilities you have tried, along with the ones you have yet to try, you might be in real trouble.

Generate-and-test can be useful, however, when there aren’t a lot of possibilities to keep track of. If
you’ve lost your keys somewhere between the cafeteria and your room and you made intermediate
stops in a classroom, the snack bar, and the bookstore, you can use this technique to help you
search.
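The locker example above can be sketched as a short program. This is an illustrative sketch only (the two-digit code and the function names are hypothetical, not from the source): generate candidates systematically, test each one, and let the generator itself keep track of what has and has not been tried.

```python
from itertools import product

def generate_and_test(candidates, test):
    """Try each candidate in turn; return the first that passes the test."""
    for candidate in candidates:
        if test(candidate):
            return candidate
    return None  # every possibility exhausted

# Hypothetical example: a forgotten two-digit locker code.
# product() enumerates all 100 codes systematically, so no
# possibility is tried twice or skipped.
candidates = ("".join(digits) for digits in product("0123456789", repeat=2))
found = generate_and_test(candidates, lambda code: code == "42")
```

With 100 possibilities this finishes almost instantly; with a real lock's thousands of combinations the same loop still works, but the frustration the text describes grows with the size of the search space.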

Means–Ends Analysis

Means–ends analysis is a problem-solving technique that identifies the current state, defines the end
goal, and determines an action plan to reach the end state in a modular way. The end goal is split into
sub-goals and sub-sub-goals; action plans are then drawn up to achieve the sub-goals first, moving
progressively toward the main goal.

Suppose you want to visit a friend who lives in Summit, New Jersey, and you are currently residing in
Pomona, California. There are several possible means of transportation: walking, bicycling, taking a
taxi, taking a bus, taking a train, driving your car, or taking a plane or helicopter. The most practical
means might be to fly on a commercial airline; it’s the fastest and fits your budget. Here, you used
means–ends analysis to work out how to reach a destination.

Means–ends analysis is not always the optimal way to reach a solution, however, because sometimes
the optimal way involves taking a temporary step backward or further from the goal. For example,
imagine you live in an eastern suburb of Los Angeles but want to take a flight from Los Angeles to
Denver. To do so, you have to move, temporarily, a greater distance (further west) from your goal
than your current distance. Means–ends analysis can make it more difficult to see that the most
efficient path toward a goal isn’t always the most direct one.
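As a rough sketch (the numeric states and operators below are invented for illustration, not taken from the source), means–ends analysis can be modeled as repeatedly applying whichever operator most reduces the difference between the current state and the goal:

```python
def means_ends(state, goal, operators):
    """Greedily apply the operator that most reduces the
    difference between the current state and the goal."""
    path = [state]
    while state != goal:
        # Choose the operator whose result lands closest to the goal.
        state = min((op(state) for op in operators),
                    key=lambda s: abs(goal - s))
        path.append(state)
    return path

# Invented example: reach 10 from 1 using "double" and "add one".
steps = means_ends(1, 10, [lambda x: x * 2, lambda x: x + 1])
```

Because the reduction is greedy, this sketch can stall on problems where the best route temporarily increases the distance to the goal, which is exactly the Denver-flight limitation described above.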

Working-Backward Approach

In the working-backward approach, the problem solver starts at the end and works toward the
beginning. Working backward is a very important technique for solving many problems, including the
famous Towers of Hanoi problem.

Working backward is most effective when the backward path is unique, which makes the process
more efficient than working forward. And, as you may have noticed, working backward shares with
means–ends analysis the technique of reducing differences between the current state and the goal
state.
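The Towers of Hanoi makes the idea concrete. The sketch below assumes one standard formulation of the puzzle (it is not taken from the source): for the largest disk to reach the target peg, the n − 1 smaller disks must first be out of the way, and that requirement, reasoned backward from the goal, yields the whole recursive plan.

```python
def hanoi(n, source, target, spare):
    """Plan Towers of Hanoi moves by working backward from the goal:
    the largest disk can reach the target only after the n-1 smaller
    disks have been parked on the spare peg."""
    if n == 0:
        return []
    return (hanoi(n - 1, source, spare, target)     # clear the way
            + [(source, target)]                    # move the largest disk
            + hanoi(n - 1, spare, target, source))  # rebuild on top of it

moves = hanoi(3, "A", "C", "B")  # 2**3 - 1 = 7 moves
```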

Backtracking

In solving a problem, you often need to make certain provisional assumptions. Sometimes they turn
out to be wrong and need to be “unmade.” In those instances, it is useful to have some means of
keeping track of when and which assumptions were made so you can back up to certain points of
choice and start over, a process known as backtracking. The key to backtracking, then, is that the
problem solver keep close track of choice points—places where he/she made a provisional
assumption—so that, if subsequent work leads to a dead end, he/she can “back up” to that choice
point and make a different assumption.
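A classic programming illustration of this idea, offered here as a hypothetical sketch rather than anything from the source, is the four-queens puzzle: each queen placement is a provisional assumption, and a dead end sends the solver back to the most recent choice point to try a different assumption.

```python
def solve_queens(n, placed=()):
    """Place one queen per row. Each placement is a provisional
    assumption; when no column works, the function returns None,
    which backs the caller up to its last choice point."""
    row = len(placed)
    if row == n:
        return placed  # every assumption held up
    for col in range(n):
        # A column is safe if no earlier queen shares it or a diagonal.
        if all(col != c and abs(col - c) != row - r
               for r, c in enumerate(placed)):
            solution = solve_queens(n, placed + (col,))
            if solution is not None:
                return solution
    return None  # dead end: back up and make a different assumption

solution = solve_queens(4)  # one queen's column per row
```

The call stack does the bookkeeping the text describes: every recursive call records a choice point, so the solver always knows which assumptions are still in force and which remain to be tried.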

Reasoning by Analogy

If you want to persuade a friend to watch a movie you enjoyed, the easiest way to persuade them
may be to compare the movie to other movies you know that they've watched. Using a comparison
between something new and something known is analogical reasoning, where we draw conclusions
by comparing two things.

Reasoning by analogy is a way to help others understand, to persuade, and to reason. In law, the use
of analogical reasoning is using precedent, where conclusions reached in one court case are applied
to another. Analogies are a tool in which two things are compared and conclusions are drawn based
on their similarities. So if p and q are similar in several ways, an analogical argument might use p to
explain q, or to reason about what's likely true about q based upon what's known about p.

Obstacles/Barriers to Problem Solving

Mental Sets, Entrenchment, and Fixation

One factor that can hinder problem solving is mental set—a frame of mind involving an existing
model for representing a problem, a problem context, or a procedure for problem solving. Another
term for mental set is entrenchment. When problem solvers have an entrenched mental set, they
fixate on a strategy that normally works well in solving many problems but that does not work well in
solving this particular problem.

Another type of mental set involves fixation on a particular use (function) for an object. Specifically,
functional fixedness is the inability to realize that something known to have a particular use may also
be used for performing other functions (German & Barrett, 2005; Rakoczy et al., 2009).

Another type of mental set is considered an aspect of social cognition. Stereotypes are beliefs that
members of a social group tend more or less uniformly to have particular types of characteristics.

Negative and Positive Transfer

Often, people have particular mental sets that prompt them to fixate on one aspect of a problem, or
on one problem-solving strategy, to the exclusion of other possibly relevant ones. In doing so, they
carry over knowledge and strategies for solving one kind of problem to a different kind of problem.

Transfer is any carryover of knowledge or skills from one problem situation to another (Detterman &
Sternberg, 1993; Gentile, 2000).

Transfer can be either negative or positive.


Negative transfer occurs when solving an earlier problem makes it harder to solve a later one.
Sometimes an early problem gets an individual on a wrong track. For example, police may have
difficulty solving a political crime because such a crime differs so much from the kinds of crime that
they typically deal with. Or when presented with a new tool, a person may operate it in a way similar
to the way in which he or she operated a tool with which he or she was already familiar (Besnard &
Cacitti, 2005).

Positive transfer occurs when the solution of an earlier problem makes it easier to solve a new
problem. That is, sometimes the transfer of a mental set can be an aid to problem solving. For
instance, one may transfer early math skills, such as addition, to advanced math problems of the
kinds found in algebra or physics (Bassok & Holyoak, 1989; Chen & Daehler, 1989; see also Campbell
& Robert, 2008).

Incubation

Incubation—putting the problem aside for a while without consciously thinking about it—offers one
way in which to minimize negative transfer. It involves taking a pause from the stages of problem
solving. For example, suppose you find that you are unable to solve a problem. None of the
strategies you can think of seem to work. Try setting the problem aside for a while to let it incubate.
During incubation, you must not consciously think about the problem. You do, however, allow for the
possibility that the problem will be processed subconsciously. Some investigators of problem solving
have even asserted that incubation is an essential stage of the problem-solving process (e.g., Cattell,
1971; von Helmholtz, 1896).

Cognitive Illusions and Biases in Decision Making

Research on people’s decision-making skills and styles has consistently demonstrated the existence
of certain systematic and common biases, ways of thinking that lead to systematic errors. Typically,
the biases are understandable and often justifiable ways of thinking under most conditions but can
lead to error when misapplied. These systematic biases have been labeled cognitive illusions (von
Winterfeldt & Edwards, 1986b). The term itself is meant to invoke the analogy to perceptual illusions:
errors of cognition that come about for understandable reasons and that provide information
relevant to understanding normal functioning.

Availability

Tversky and Kahneman (1973) argued that when faced with the task of estimating probability,
frequency, or numerosity, people rely on shortcuts or rules of thumb, known as heuristics, to help
make these judgments easier.

One such heuristic is known as the availability heuristic—“assessing the ease with which the
relevant mental operation of retrieval, construction, or association can be carried out”. In other
words, instances (for example, particular words, particular committees, or particular paths) that are
more easily thought of, remembered, or computed stand out more in one’s mind. Those instances
are particularly salient and hence are deemed to be more frequent or probable.

For example, it turns out to be easier to think of words that begin with l (such as lawn, leftover, and
licorice) than to think of words that have l as the third letter (bell, wall, ill). The reason for this may
stem from the way our lexicons, or “mental dictionaries,” are organized, or from how we learn and
practice words—alphabetically by the first letter. As with paper or electronic dictionaries, it’s
relatively easier to search for words by initial letter than by “interior” letters.

Everyday analogs that involve the use of the availability heuristic have also been reported. Ross and
Sicoly (1979), for instance, surveyed 37 married couples (husbands and wives separately and
independently) about the estimated extent of their responsibility for various household activities,
such as making breakfast, shopping for groceries, and caring for children. Both husbands and wives
were more likely to say they had greater responsibility than their spouse for 16 of the 20 activities.
Moreover, when asked to give examples of their own and their spouse’s contributions to each
activity, each spouse listed more of his or her own activities than those of the spouse.
Ross and Sicoly (1979) explained these findings in terms of the availability heuristic. Our own efforts
and behaviors are more apparent and available to us than are the efforts and behaviors of others.

Representativeness

Under the representativeness heuristic, specific scenarios appear more likely than general ones
because they are more representative of how we imagine particular events.

Representativeness heuristic example

You are sitting at a coffee shop and you notice a person in eccentric clothes reading a poetry book. If
you had to guess whether that person is an accountant or a poet, most likely you would think that
they are a poet. In reality, there are more accountants in the population than poets, which means
that such a person is more likely to be an accountant.
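The base-rate reasoning in this example can be made concrete with a small Bayes' rule calculation. All the numbers below are invented for illustration; the point is that even when eccentric dress is far more typical of poets, the much larger number of accountants can dominate the final probability.

```python
# Invented base rates and likelihoods (not real statistics).
p_accountant, p_poet = 0.95, 0.05                 # base rates in the population
p_ecc_given_acct, p_ecc_given_poet = 0.02, 0.30   # chance of eccentric dress

# Bayes' rule: P(accountant | eccentric dress)
evidence = p_ecc_given_acct * p_accountant + p_ecc_given_poet * p_poet
p_acct_given_ecc = p_ecc_given_acct * p_accountant / evidence
# With these numbers, the person is still more likely an accountant
# (roughly 0.56), despite the stereotypically "poetic" evidence.
```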

Representativeness heuristic vs. availability heuristic

Although both the representativeness heuristic and the availability heuristic play a role in our
decision-making and help us estimate how likely something is, they are two different types of
heuristics.

 The representativeness heuristic is a mental shortcut for judging the probability of an
outcome in terms of how well it seems to represent or match a particular prototype.
 The availability heuristic is a mental shortcut for judging the probability of an outcome in
terms of how easy it is to bring similar outcomes to mind.

In other words, representativeness causes us to miscalculate probability by paying more attention to
similarity, while availability causes us to focus on ease of recall.

Framing Effects

The framing effect is the cognitive bias wherein an individual’s choice from a set of options is
influenced more by how the information is worded than by the information itself.

For example- While looking for a disinfectant, you choose a product that claims to kill 95% of all
germs over one that claims that 5% of germs will survive, even though the two claims are equivalent.

Anchoring

Anchoring bias is a cognitive bias that causes us to rely too heavily on the first piece of information
we are given about a topic. When we are setting plans or making estimates about something, we
interpret newer information from the reference point of our anchor, instead of seeing it objectively.
This can skew our judgment, and prevent us from updating our plans or predictions as much as we
should.
Imagine you’re out shopping for a present for a friend. You find a pair of earrings that you know
they’d love, but they cost $100, way more than you budgeted for. After putting the expensive
earrings back, you find a necklace for $75—still more than your budget, but hey, it’s cheaper than
the earrings. The $100 anchor makes the necklace feel like a bargain, even though it still exceeds
your budget.

Sunk Cost Effects

A sunk cost is any cost that’s already been invested and can’t be retrieved. The sunk cost fallacy
(sometimes called the sunk cost trap) is a cognitive bias that causes people to stick with a
plan, course, or approach that isn’t working because of how much has already been invested in it.

Example- Eating more than you need to because you’ve unintentionally ordered or prepared too
much food.

Hindsight bias

Hindsight bias is the tendency, upon learning an outcome of an event—such as an experiment, a
sporting event, a military decision, or a political election—to overestimate one’s ability to have
foreseen that outcome. It is colloquially known as the “I knew it all along” phenomenon and is a
type of confirmation bias.

Confirmation bias

Confirmation bias is the tendency of people to favor information that confirms their existing beliefs
or hypotheses. Confirmation bias happens when a person gives more weight to evidence that
confirms their beliefs and undervalues evidence that could disprove it. People display this bias when
they gather or recall information selectively, or when they interpret it in a biased way. The effect is
stronger for emotionally charged issues and for deeply entrenched beliefs.

For instance, on topics such as abortion and transgender rights, people whose religious beliefs
oppose these practices will interpret new information differently than others and will look for
evidence that validates what they already believe. Similarly, parents go wrong if they seek only
information that would confirm their hunch that a particular option is the best.

Overconfidence

Overconfidence bias is a general tendency of people to overestimate their skills, authority, and
knowledge due to excessive confidence. It can affect their thoughts, decisions, and strategies
associated with particular tasks and outcomes.

In finance, overconfidence based on past and present success can lead to poor decision-making. The
overconfidence effect can undermine investors’ rationality and influence them to trade actively in
pursuit of higher returns, which they often fail to achieve.

Models of decision making

Image Theory

This theory states that people make decisions by asking themselves whether a new goal, plan, or
alternative is compatible with three images: the value image (containing the decision maker’s values,
morals, and principles), the trajectory image (containing the decision maker’s goals and aspirations
for the future), and the strategic image (the way in which the decision maker plans to attain his or
her goals).
According to image theory, options judged incompatible with one or more of these three images
(value, trajectory, strategic) are dropped from further consideration. This prechoice screening
process is noncompensatory: Violations of any image are enough to rule out that option, and no
tradeoffs are made.

For example- The student might quickly reject certain majors because they aren’t perceived as fitting
well with the student’s values or principles (for example, “I can’t major in economics because all
econ majors care about is money”).

Recognition Primed Decision Making

Recognition-Primed Decision Making (RPD) is a model of how people make quick and effective
decisions in complex situations.

The model shows that the decision maker first generates a possible course of action, compares it
with the constraints imposed by the situation, and then chooses the first course of action that is not
rejected.

The model was proposed by Gary A. Klein, in his book Sources of Power. It has since proven to be a
legitimate model for how people make difficult decisions.

Ways to improve decision making

In general, simply telling people about biases in decision making and planning (including
overconfidence) results in little or no improvement (Arkes, 1986; Fischhoff, 1982a). Real
improvement in reducing bias seems to require extensive practice with the task, individual feedback
about one’s performance, and some means of making the statistical and/or probabilistic aspects of
the decisions clearer. Under some of these conditions, substantial reductions in bias have been
reported (Arkes, 1986; Nisbett, Krantz, Jepson, & Kunda, 1983).

Contrary to our (strong) intuitions, it is often better, fairer, more rational, and in the long run more
humane to use decision aids than to rely exclusively on human impressions or intuitions (Kleinmuntz,
1990).

1. Take Note of Your Overconfidence- Good decision-makers recognize areas in their lives where
overconfidence could be a problem. Then they adjust their thinking and their behavior
accordingly.
2. Identify the Risks You Take

Familiarity breeds comfort. And there’s a good chance you make some poor decisions simply because
you’ve grown accustomed to your habits and you don’t think about the danger you’re in or the harm
you’re causing.

For example, you might speed on your way to work every day. Each time you arrive safely without a
speeding ticket, you become a little more comfortable with driving fast. But clearly, you’re
jeopardizing your safety and taking a legal risk.

Or maybe you eat fast food for lunch every day. Since you don’t suffer any immediate signs of ill
health, you might not see it as a problem. But over time, you may gain weight or experience other
health issues as a consequence.

3. Frame Your Problems In a Different Way


The way you pose a question or a problem plays a major role in how you’ll respond and how you’ll
perceive your chances of success. So when you’re faced with a decision, frame the issue differently.
Take a minute to think about whether the slight change in wording affects how you view the
problem.

4. Stop Thinking About the Problem

When you’re faced with a tough choice, like whether to move to a new city or change careers,
you might spend a lot of time thinking about the pros and cons or the potential risks and
rewards. And while science shows there is plenty of value in thinking about your options,
overthinking your choices can actually be a problem. Weighing the pros and cons for too long
may increase your stress level to the point that you struggle to make a decision.

Studies show there’s a lot of value in letting an idea “incubate.” Non-conscious thinking is
surprisingly astute. So consider sleeping on a problem. Or get yourself involved in an activity that
takes your mind off a problem. Let your brain work through things in the background and you’re
likely to develop clear answers.

5. Set Aside Time to Reflect on Your Mistakes


6. Acknowledge Your Shortcuts- In fact, your mind has created mental shortcuts—referred to as
heuristics—that help you make decisions faster. And while these mental shortcuts keep you
from toiling for hours over every little choice you make, they can also steer you wrong.

The availability heuristic, for example, involves basing decisions on examples and information
that immediately spring to mind. So if you watch frequent news stories that feature house
fires, you’re likely to overestimate the risk of experiencing a house fire.
7. Consider the Opposite

Once you’ve decided something is true, you’re likely to cling to that belief. It’s a psychological
principle known as belief perseverance. It takes more compelling evidence to change a belief
than it did to create it, and there’s a good chance you’ve developed some beliefs that don’t
serve you well.

For example, you might assume you’re a bad public speaker, so you avoid speaking up in
meetings. Or you might believe you are bad at relationships, so you stop going on dates.

You’ve also developed beliefs about certain groups of people. Perhaps you believe, “People
who work out a lot are narcissists,” or “Rich people are evil.”

Those beliefs that you assume are always true or 100 percent accurate can lead you astray.
The best way to challenge your beliefs is to argue the opposite.

Considering the opposite will help break down unhelpful beliefs so you can look at situations
in another light and decide to act differently.
