Solution Analysis and Management: Module 8: Decision Making


Solution Analysis and Management

Module 8: Decision Making

The content of this module has been adapted from:

Gino, Bazerman and Shonk. Decision Making. Harvard Business Publishing, 2016.
Learning Outcomes

• First, we will consider the importance and history of research into decision
making.
• Next, we will look at common biases in decision making to watch out for – both
in ourselves and in others – using the online quiz you completed prior to class.
• Finally, we will look at a three-step framework for improved decision making.

Source: maxpixel.net
Introduction to Decision Making

You now understand the overall Business Analysis process for solution evaluation, how
measurements work and how to conduct a cost-benefit analysis.

Now we need to consider: given all this information, how do we make sure that the
right decision is made?

You should know by now that this is not as easy as adding up numbers based on entirely
objective criteria! Think about the ranking, scoring, and weighting done in Multiple
Objective Analysis – and how two people could generate very different outcomes.

As humans, we have biases based on how we think about the world around us, and
these can influence personal, group, and organizational decision making. In fact,
people can make a series of what appear to be the ‘right’ decisions at the time – only to
add up to a very bad situation.
History of Decision Making Theory

• Decision making is a key investigative area of economics. Historically, there was
a key assumption that people make decisions using rationality and logic.
• Herbert Simon challenged this assumption in the 1950s – he suggested instead
that humans don’t rely exclusively on rationality and logic.
• Two decades later, Amos Tversky and Daniel Kahneman began researching the
systematic biases in our decision making that can hamper logic.
• Since then, there has been a lot of research into the common biases, and how to
address them in practice.
• Biases detract from optimal decision making – although we like to rely on things
like intuition (“I follow my gut instinct!”), evidence suggests that reason and
analysis are superior to intuition.
How We (think we) Make Decisions

Define the Problem: Are you looking at the real problem? Have all aspects of the problem been identified?
Generate Alternatives: What is the optimal search time devoted to determining an appropriate number of alternatives?
Identify Relevant Criteria: Ensure that the correct criteria have been identified.
Score the Criteria: Create a weighting and scoring system based on tangible and intangible criteria.
Evaluate Alternatives: How well does each alternative meet the criteria? Is there a lot of variance in people’s scores?
Choose Best Alternative: Choose the alternative that best meets the criteria. Is there a clear winner? What process exists for ‘ties’?
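
The scoring and selection steps can be made concrete with a small sketch. Below is a minimal Python illustration of weighted scoring; the criteria, weights, and scores are invented purely for illustration:

def weighted_score(scores, weights):
    # Sum of score x weight across all criteria for one alternative.
    return sum(scores[c] * weights[c] for c in weights)

# Hypothetical criteria weights (agreed before scoring).
weights = {"cost": 0.4, "usability": 0.35, "risk": 0.25}

# Hypothetical 1-5 scores for two alternatives.
alternatives = {
    "Option A": {"cost": 4, "usability": 3, "risk": 5},
    "Option B": {"cost": 3, "usability": 5, "risk": 2},
}

ranked = sorted(alternatives.items(),
                key=lambda kv: weighted_score(kv[1], weights),
                reverse=True)
for name, scores in ranked:
    print(name, round(weighted_score(scores, weights), 2))

Note how sensitive the ranking is to the chosen weights – this is exactly where two people can reach very different outcomes.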
Bounded Rationality

• Herbert Simon suggested in 1957 that humans do not follow a rational and
logical process in decision making.
• The basic idea is that humans become overwhelmed when trying to use logic to
solve complex problems. We therefore rely on shortcuts to help make the
decision making process manageable.
• Pressures that increase the reliance on shortcuts include limits on the time and
resources needed to gather the right amount, and the right quality, of information
required to make a purely objective assessment.
• Even if we can gather all the information, our ability to cognitively process this
information is very limited!
• Therefore, the objective approach can only take us so far in decision making.
Cognitive Bias

• Cognitive bias refers to the set of flawed mental shortcuts that become part of a
decision making process.
• We agree that bounded rationality exists – therefore what are the common
shortcuts that people use? If we can identify them, we can be aware of them
and reflectively adjust our decision making process.
• Tversky and Kahneman began to research this in the 1970s. They were looking
for specific, systematic and predictable cognitive biases. In other words, what is
it that ‘bounds’ rationality?
Ideal vs Reality

Assumption of the Rational Model: When making a decision, we consider all relevant alternatives and accurately assess and compare their probable outcomes.
Evidence from Organizations: Due to our limited information-processing capabilities, we typically only consider a small set of alternatives.

Assumption: We use absolute standards and factual information to evaluate and choose among alternatives.
Evidence: We often have an implicit favourite choice and bend the ‘facts’ to meet this preference.

Assumption: We evaluate all alternatives simultaneously using objective measures and choose the one that has the highest payoff.
Evidence: We frequently evaluate alternatives sequentially and choose the one that is ‘good enough’.

Source: Gino, Bazerman and Shonk. Decision Making. Harvard Business Publishing, 2016.
System 1 and System 2 Thinking

[Figure: System 1 thinking is fast, automatic, and intuitive; System 2 thinking is slow, effortful, and analytical.]

Source: upfrontanalytics.com
Seven Common Traps

1. Framing
2. The confirmation trap
3. The availability heuristic
4. The anchoring effect
5. Overconfidence
6. The representativeness heuristic
7. The escalation of commitment
1 - Framing

• People typically prefer A over B (72%) and D over C (78%).
• Programs A and C are the same: both result in 200 lives saved and 400 lost.
• Programs B and D are the same: a 33% chance of saving everyone and a 67% chance
of losing everyone (an expected-value check follows below).
• If A is preferred over B, then you should also prefer C over D.

• Decisions are affected when a choice is framed as a gain or a loss.
• We don’t like to lose a ‘sure thing’, but tend to be risk-seeking towards losses.
• Therefore people tend to prefer A (definite about saving 200 people – a gain) and D
(more risky regarding lives lost).
• We evaluate options based on the status quo. If positive, we are risk-averse. If negative,
we will take on risk in order to remove the loss – even when the outcomes would be
the same!
• The reason for this is that we feel the pain of loss more than the pleasure of gain.
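
The equivalence of the programs is easy to verify with expected values, assuming the classic 600-person version of the problem: EV(A) = 200 saved for certain; EV(B) = (1/3 × 600) + (2/3 × 0) = 200 saved; EV(C) = 600 − 400 lost = 200 saved; EV(D) = (2/3 × 600) = 400 expected lost, i.e. 200 saved. All four programs have the same expected outcome – only the framing differs.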
2 – The Confirmation Trap

• There tend to be a small number of consistent rules:
• Numbers go up by 2.
• The difference between the first two numbers equals the difference between the last two numbers.

• The broader rule: any three ascending whole numbers.
• We tend to look for information that confirms, rather than disconfirms, the
initial theory.
• The confirmation trap is the tendency to look for information that confirms our
beliefs even when information disconfirming them would be more helpful.
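
A small sketch makes the trap concrete. Suppose the hidden rule really is ‘any three ascending numbers’; tests that fit the pet theory ‘numbers go up by 2’ can only ever confirm it, while one deliberately disconfirming test settles the question:

def hidden_rule(a, b, c):
    # The actual rule: any three ascending numbers.
    return a < b < c

# Confirming tests: every triple also fits 'numbers go up by 2',
# so each returns True and the narrow theory survives unchallenged.
for triple in [(2, 4, 6), (10, 12, 14), (100, 102, 104)]:
    print(triple, hidden_rule(*triple))   # all True

# A disconfirming test: deliberately break the pet theory.
print((1, 2, 50), hidden_rule(1, 2, 50))  # True, so 'up by 2' is falsified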
3 – The Availability Heuristic

• Most people will guess that more words start with ‘K’.
• However, twice as many words typically have ‘K’ as the third letter.
• It is easier to think of words that start with ‘K’ than where ‘K’ is the third letter!
• The availability heuristic refers to a general rule we follow that is based on ease
of recall.
• This can impact emotional aspects of decision making, since we more easily
remember events that have an emotional component.
• For example, consumers tend to be more confident in brands that are familiar.
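
You can probe the ‘K’ claim yourself with a few lines of Python. This assumes a Unix-style word list at /usr/share/dict/words (the path varies by platform, and a dictionary is only a rough proxy for word frequency in real text, which is what the original claim concerns):

# Count words starting with 'k' versus words with 'k' as third letter.
with open("/usr/share/dict/words") as f:
    words = [w.strip().lower() for w in f if len(w.strip()) >= 3]

starts_with_k = sum(w[0] == "k" for w in words)
k_third = sum(w[2] == "k" for w in words)
print(f"starts with k: {starts_with_k}, k third: {k_third}")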
4 – The Anchoring Effect

• A common response is to answer around 500,000. There are 1.5 million!
• The first piece of information made available strongly affects our judgements –
even when we know the anchor is arbitrary.
• For example, if we had anchored the question at 10 million, your answer would
likely have been higher than the one you gave when anchored at 500,000.
• Therefore we tend to start with a potentially arbitrary anchor, and then adjust
from there.
• For example, sales targets are often based on the previous year’s number – but
this may not offer relevant guidance for the future.
• Other research has indicated that even having expert knowledge will not
eliminate the anchoring effect.
5 - Overconfidence

• Actual quantities:
• 1: 73
• 2: 491
• 3: $4.1 billion
• 4: 80 million
• 5: 1,264,360,000
• Most people get only 2-3 answers inside their 95% confidence intervals – if your
intervals were truly calibrated, nearly all five should fall within them (quantified below).
• Overconfidence has very strong effects because we think we are right, and that
we will succeed. E.g. satellite phones, BlackBerry.
• It can also support (multiply the effect of) other biases, such as the anchoring
effect.
• Three types of overconfidence: overprecision, overestimation, overplacement
(we’re better than others on certain dimensions).
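
The calibration claim can be quantified. If your 95% intervals were genuinely calibrated, each answer would land inside its interval with probability 0.95 independently, making 3-or-fewer hits rare:

from math import comb

# P(at most 3 of 5 intervals contain the true value) for a calibrated judge.
p = 0.95
p_at_most_3 = sum(comb(5, k) * p**k * (1 - p)**(5 - k) for k in range(4))
print(round(p_at_most_3, 4))  # ~0.0226: under a 3% chance if truly calibrated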
6 – The Representativeness Heuristic

• Many think Linda is more likely to be a ‘bank teller who is an active feminist’
rather than just a ‘bank teller’.
• A subset cannot be more likely than the larger set.
• P(bank teller) = P(bank teller and feminist) + P(bank teller and not feminist),
so ‘bank teller’ must be at least as likely as ‘bank teller and feminist’.
• There is a tendency when making initial judgements (can be about a person,
object or event) to look for information that is consistent with our stereotypes.
• We think about how the individual represents a larger group.
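
Even with generous hypothetical numbers, the ordering cannot flip: if P(bank teller) = 0.02 and P(feminist | bank teller) = 0.9, then P(bank teller and feminist) = 0.02 × 0.9 = 0.018, still below 0.02. The conjunction is always discounted by the conditional probability.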
7 – Escalation of Commitment

• There is no clear correct answer.
• From a cost-benefit perspective, we should not allow further poor performance.
• However, we tend to want to continue investing in spite of evidence to the
contrary.
• This can be particularly true in initiatives – which are a series of decisions that
build upon one another.
• Past investments should be considered sunk costs – costs that should not be
considered in future decision making.
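
A minimal sketch of the right framing, with all figures hypothetical: only the incremental cost and benefit of continuing should enter the decision.

def should_continue(incremental_cost, incremental_benefit):
    # Sunk costs are deliberately absent from this signature:
    # money already spent cannot be recovered by any choice made now.
    return incremental_benefit > incremental_cost

sunk = 400_000  # already spent; shown only to stress that it plays no role
print(should_continue(incremental_cost=100_000, incremental_benefit=80_000))
# False: stop investing, no matter how large 'sunk' is.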
Overcoming Biases
Framing: Look at the problem from different perspectives. Rephrase choices in terms of what you would lose and what you would gain.
Confirmation Trap: Actively seek out data that challenges your beliefs.
Availability Heuristic: Conduct appropriate research to gather all relevant information.
Anchoring Effect: Focus on previously determined alternatives (prior to the anchoring data).
Overconfidence: Question your beliefs and seek objective feedback. Be aware of, and plan for, ‘worst-case’ possibilities.
Representativeness Heuristic: Ensure you are using methodical analysis.
Escalation of Commitment: Recognize which costs are ‘sunk’. Pause and think: are you overly committed?
Decision Making in Groups

• Individual biases can be amplified in groups.
• Groups tend to rely more on the representativeness heuristic, overconfidence,
and framing.
• Planning fallacy: groups make less accurate (overly optimistic) estimates of
when they will complete a task.
• Groups have the same likelihood of escalating commitment – but when they do
it tends to be more extreme.
• Groups can help mitigate the availability heuristic.
• The same strategies can be used in groups to help avoid biases.
Issues Unique to Groups

Failure to Capitalize on Diversity
A key reason to engage with groups is to consider problems and solutions from
different perspectives. But, in practice, groups tend to discuss information that all
group members already possess. Also, they prefer to discuss information that
supports pre-existing views.
Information Silos
Organizations tend to hold information that is only available to specific groups.
Information has value (organizational currency) and it can be difficult to efficiently
share information between groups.
Groupthink
There are social pressures in the group that work to enforce group harmony.
Individuals may suppress information and opinion. It can result in a feeling of
group invulnerability. It can be beneficial to assign the role of ‘devil’s advocate’ –
this explicit role can help overcome social pressure to conform.
Better Decision Making

1. A bat and a ball cost $1.10 in total. The bat costs $1.00 more than the ball.
How much does the ball cost?
2. If it takes five machines five minutes to make five widgets, how long would it
take 100 machines to make 100 widgets?

These questions and answers are sourced from:

Shane Frederick, “Cognitive Reflection and Decision Making.” Journal of Economic Perspectives 19 (Fall 2005): 25-42.
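
For checking after class: the intuitive answers (10 cents; 100 minutes) are wrong. If the ball costs x, then x + (x + 1.00) = 1.10, so 2x = 0.10 and x = $0.05. For the widgets, each machine makes one widget in five minutes, so 100 machines make 100 widgets in the same five minutes.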
Last Question

3. In a lake, there is a patch of lily pads. Every day, the patch doubles in size.
If it takes 48 days for the patch to cover the entire lake, how long would it
take for the patch to cover half the lake?

Source: pxhere.com
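
Again for checking: the intuitive answer (24 days) is wrong. Because the patch doubles daily, it must be half the lake’s size exactly one day before covering it fully – day 47.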
Three Step Framework

Lever One: Leverage System 1 Thinking
• We can use System 1 to simplify the
decision making process (avoid ‘analysis
paralysis’).
• We can trigger particular emotions and
biases to help ‘nudge’ people toward
better decisions.
• E.g. power consumption fell by 2.4 - 10% more among homeowners who were
shown their usage compared against the neighbourhood average than among
households given generic ‘energy saving tips’.
Three Step Framework

Lever Two: Implement System 2 Thinking
• Engage in greater deliberation and analysis.
• Actively switch from evaluating options individually to evaluating them simultaneously.
• Make enough time available to properly use system 2 thinking.
• Talk decisions through with an outsider – the outsider will have distance from the
problem. Individually, we can also consider a problem from an outsider’s
perspective.
• Reduce bias by increasing accountability for decisions in the organization.
• Overcome confirmation bias and escalation of commitment by actively looking for
information that disconfirms your point of view.
• Require employees to use decision trees and scoring systems to analyze choices.
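
As a sketch of the decision-tree idea (probabilities and payoffs invented purely for illustration), each option’s expected value is the probability-weighted sum of its branch payoffs:

# Hypothetical two-option decision tree: expected value per option.
options = {
    "Launch now": [(0.6, 500_000), (0.4, -200_000)],    # (probability, payoff)
    "Delay 6 months": [(0.9, 300_000), (0.1, -50_000)],
}

for name, branches in options.items():
    ev = sum(p * payoff for p, payoff in branches)
    print(f"{name}: expected value = {ev:,.0f}")
# Launch now: 220,000; Delay 6 months: 265,000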
Three Step Framework

Lever Three: Bypass Both Systems
• Avoid using ‘either system 1 or system 2’ thinking.
• Simplify the decision-making environment to encourage greater deliberation.
Because a key factor in all of this is our inability to process a number of complex
options, work to reduce the number of options or break them up into components
that are easier to process.
• Change the default option (the option if we do nothing). For example, making
organ donation opt-out rather than opt-in.
• Use a choice architecture to make automatic adjustments that compensate for
predictably biased decisions. For example, always adding appropriate buffer time
to projects due to chronic under-estimation of task times.
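
A sketch of that last adjustment – an automatic buffer baked into the planning tool; the 1.5x factor is purely illustrative and would be tuned to past estimation error:

def buffered_estimate(raw_days: float, buffer: float = 1.5) -> float:
    # Choice architecture: the plan always uses the buffered figure,
    # so the correction for chronic under-estimation happens by default.
    return raw_days * buffer

print(buffered_estimate(10))  # a 10-day raw estimate is planned as 15.0 days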
Self-directed Learning

This week, watch the video posted on the FOL site.
