Meet Daniel Kahneman and Amos Tversky
Daniel Kahneman and Amos Tversky were a pair of Israeli-American psychologists who changed the way we think about thinking, particularly in the areas of human judgement and decision-making.
Think Michael Jordan and Scottie Pippen in the world of psychology and sociology. (The comparison is especially apt: Kahneman, like Jordan, is the more widely recognized of the two today, yet he attributes his success directly to the collaboration with Tversky.)
Their joint work gave rise to the field of Behavioral Economics and led to Kahneman’s Nobel Memorial Prize in 2002. (Sadly, Tversky passed away in 1996, and the award is not granted posthumously; there is no doubt, however, that he would have earned it.)
Kahneman and Tversky’s theories and concepts have helped economists and policymakers better understand and improve their decision-making. The goal of this profile is to extend that benefit to everyday people like you and me, so we can make better decisions in our daily lives.
Key Contributions from Daniel Kahneman and Amos Tversky
Below is a concise list of major topics and ideas they are known for:
Prospect Theory
Definition: Describes how people make choices under risk, valuing gains and losses relative to a reference point.
Example: Preferring a sure gain of $100 over a 50% chance to win $200, even though the expected value is the same (see the sketch after this list).
Heuristics and Biases
Definition: Mental shortcuts that simplify decision-making but can lead to systematic errors.
Example: Judging a person’s intelligence by how well they match the stereotype of a professor.
Loss Aversion
Definition: Tendency to feel losses more intensely than equivalent gains.
Example: Rejecting a 50/50 bet to win $150 or lose $100, despite the favorable odds.
Framing Effects
Definition: Changes in decision outcomes based on how choices are presented.
Example: Choosing a surgery with a 90% survival rate over one with a 10% mortality rate, even though they’re the same.
Anchoring and Adjustment
Definition: The tendency for an initial value to anchor subsequent estimates, even when it is irrelevant.
Example: Guessing higher real estate prices after seeing a high listing price.
Availability Heuristic
Definition: Estimating likelihood based on how easily examples come to mind.
Example: Thinking shark attacks are more common after seeing one in the news.
Representativeness Heuristic
Definition: Judging probability by how much something resembles a typical case.
Example: Assuming someone who loves books is more likely to be a librarian than a salesperson.
Cognitive Biases and Errors
Definition: Systematic deviations from rational judgment or logical inference.
Example: Believing a hot streak in gambling will continue, despite random chance.
Decision Under Uncertainty
Definition: Making choices without knowing outcomes, often using heuristics.
Example: Choosing between two job offers without knowing future success in either.
System 1 and System 2 Thinking
Definition: System 1 is fast and intuitive; System 2 is slow and analytical.
Example: Automatically reading a sign (System 1) vs. solving a math problem (System 2).
Planning Fallacy
Definition: Underestimating the time or resources needed to complete a task.
Example: Thinking you’ll finish a project in one week when it usually takes three.
Endowment Effect
Definition: Overvaluing something simply because you own it.
Example: Refusing to sell a coffee mug for $5 that you wouldn’t pay $5 to buy.
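Two of the examples above - the sure $100 versus a 50% shot at $200, and the 50/50 bet to win $150 or lose $100 - can be checked with simple expected-value arithmetic, the baseline a purely "rational" decision-maker would use. The minimal Python sketch below only illustrates that baseline; the dollar amounts come from the examples above, and everything else is plain arithmetic.

```python
# Expected value: the probability-weighted average of the outcomes.
def expected_value(outcomes):
    """outcomes: a list of (probability, dollar amount) pairs."""
    return sum(p * x for p, x in outcomes)

# Prospect Theory example: a sure $100 vs. a 50% chance at $200.
sure_thing = expected_value([(1.0, 100)])             # 100.0
gamble = expected_value([(0.5, 200), (0.5, 0)])       # 100.0 -- identical expected value

# Loss Aversion example: a 50/50 bet to win $150 or lose $100.
bet = expected_value([(0.5, 150), (0.5, -100)])       # +25.0 -- favorable, yet commonly declined

print(sure_thing, gamble, bet)
```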
Prospect Theory
Important statements from the 1979 paper, Prospect Theory: An Analysis of Decision under Risk:
“In human decision making, gains and losses loom larger than final assets.”
“A common observation is that people prefer risk-averse choices in the domain of gains and risk-seeking choices in the domain of losses.”
“A change from zero to 5% is perceived to be more significant than a change from 50% to 55%.”
Prospect Theory was introduced in 1979 as an alternative to the long-standing and dominant model of decision-making known as expected utility theory, which assumed people weigh outcomes according to their actual probabilities and perceived final wealth.
Through a series of thoughtfully designed experiments, in which Kahneman and Tversky asked thousands of people to make hypothetical decisions involving gains, losses, and probabilities, their research showed that predictable flaws occur when we make decisions under risk (i.e. when judging the likelihood of outcomes).
On the whole, we tend to “systematically violate the axioms of expected utility theory.” We are not, as it turns out, the rational decision-makers we once believed ourselves to be.
Instead, our choices are shaped by internal and external factors that we often fail to recognize, and our decisions depend heavily on how information is presented to us. Even when we are supplied with sufficient evidence to draw a rational conclusion, we remain steered by these hidden cognitive processes and routinely misjudge probabilities in predictable ways.
The research in this paper offered a compelling alternative to a long-standing belief, opening the door to a new field of study known as Behavioral Economics.
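To make the idea concrete, here is a minimal sketch of a prospect-theory-style value function, building on the expected-value sketch above. The curvature and loss-aversion parameters below are the estimates from Tversky and Kahneman's later (1992) cumulative prospect theory work, not from the 1979 paper, and probability weighting is omitted for simplicity; treat it as an illustration rather than the authors' exact model.

```python
# A prospect-theory-style value function: gains and losses are measured
# relative to a reference point ($0 here), the curve is concave for gains,
# and losses are weighted more heavily (loss aversion).
ALPHA, BETA, LAMBDA = 0.88, 0.88, 2.25  # 1992-era estimates; illustrative only

def value(x):
    return x ** ALPHA if x >= 0 else -LAMBDA * (-x) ** BETA

def prospect_value(outcomes):
    """outcomes: list of (probability, amount); probability weighting omitted."""
    return sum(p * value(x) for p, x in outcomes)

# A sure $100 beats a 50% chance at $200, even though the expected values tie:
print(prospect_value([(1.0, 100)]))              # ~57.5
print(prospect_value([(0.5, 200), (0.5, 0)]))    # ~52.9

# And the favorable 50/50 bet (+$150 / -$100) looks unattractive, because the
# potential $100 loss is felt more than twice as strongly as an equal gain:
print(prospect_value([(0.5, 150), (0.5, -100)])) # ~ -24, so the bet is rejected
```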
Heuristics and Cognitive Biases
Important statements from the 1974 paper, Judgment under Uncertainty: Heuristics and Biases:
“People rely on a limited number of heuristic principles which reduce the complex tasks of assessing probabilities and predicting values to simpler judgmental operations.”
“These heuristics are highly economical and usually effective, but they lead to systematic and predictable errors.”
“In many situations, an event is judged more probable if it is more representative of the class to which it belongs.”
“The availability of consequences associated with an action is often a more compelling guide to behavior than objective likelihood.”
“Adjustments are typically insufficient. That is, different starting points yield different estimates, which are biased toward the initial values.”
Heuristics and Cognitive Biases are the mental shortcuts we rely on (often without realizing it) when we make decisions under uncertainty. These shortcuts help us simplify complex decisions, but often lead us astray.
Our minds and decisions are often influenced by factors that seem relevant but have no real bearing on the choice at hand. We often zoom in too closely on the immediate choice and miss “the bigger picture” that would help us decide more rationally.
Three of the most well-documented mental shortcuts, each illustrating how easily our reasoning can be nudged, are Representativeness, Availability, and Anchoring.
The Representativeness Heuristic describes how we are influenced when something resembles or is similar to a stereotype. “That bottle of wine has a formal label, a French-sounding name, and is more expensive than the others - it must be very good.” The representativeness heuristic is what leads us to make quick, gut decisions based on how well something matches our mental image of the ideal, without checking alternative sources of information (e.g. researching the bottle to see its relative quality) - a point the base-rate sketch after these three shortcuts makes concrete.
The Availability Heuristic describes how we judge the likelihood of something by the ease with which examples come to mind. “I keep seeing vandalized car dealerships in the news. The country must be on the verge of a riot.” The availability heuristic is powerful because it leads us to ignore the cases that never make the news, along with the statistics that may contradict the conclusion we jumped to (e.g. the thousands of dealerships that were not vandalized and thus never reported on). The more vivid or sensational an example, the more likely it is to dominate our perception - even if it is rare - which is why this is sometimes described as a media amplification effect.
Anchoring is the phenomenon where we latch onto an initial value (even if it’s random or unrelated) and let it shape our judgment. “The hostess said weekend wait times are usually 2 hours, so I was happy when we got a table after only 45 minutes last Tuesday.” Anchoring is a powerful tool of salespeople, hiring managers, and marketers. Use it ethically to your advantage, and be wary of those who use it to gain an unfair one.
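The librarian-versus-salesperson example from the list above shows why base rates matter for the representativeness heuristic: even if book lovers “look like” librarians, there are far more salespeople than librarians. The sketch below runs that logic through Bayes’ rule; all of the counts and probabilities are hypothetical numbers chosen for illustration, not figures from Kahneman and Tversky.

```python
# Hypothetical base rates and likelihoods (illustrative, not real statistics).
librarians = 150_000        # assumed number of librarians
salespeople = 15_000_000    # assumed number of salespeople
p_books_given_librarian = 0.90    # assumed share of librarians who love books
p_books_given_salesperson = 0.20  # assumed share of salespeople who love books

# Bayes' rule: P(librarian | loves books)
book_loving_librarians = librarians * p_books_given_librarian      # 135,000
book_loving_salespeople = salespeople * p_books_given_salesperson  # 3,000,000
p_librarian = book_loving_librarians / (book_loving_librarians + book_loving_salespeople)

print(f"{p_librarian:.1%}")  # ~4.3% -- the book lover is still almost certainly not a librarian
```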
Note that these are not the only heuristics and biases (see the list above, as well as additional resources online, like The Decision Lab).
The Framing Effect
A now-famous example of the Framing Effect appeared in Kahneman and Tversky’s 1981 study, in what they called the “Asian Disease Problem” (prescient, right?).
The study goes like this:
Imagine the U.S. is preparing for the outbreak of an unusual Asian disease expected to kill 600 people. Two programs are proposed:
If Program A is adopted, 200 people will be saved.
If Program B is adopted, there is a 1/3 chance that all 600 will be saved, and a 2/3 chance that no one will be saved.
(Positive frame: lives saved)
Now consider the same problem framed differently:
If Program C is adopted, 400 people will die.
If Program D is adopted, there is a 1/3 chance that no one will die, and a 2/3 chance that all 600 will die.
(Negative frame: lives lost)
In reality, Programs A and C are equivalent, as are B and D. Rationally, choices should be consistent.
But they’re not.
When framed in terms of lives saved (A vs. B), most people chose the sure option—Program A.
When framed in terms of lives lost (C vs. D), most people chose the risky option—Program D.
Same outcomes. Different wording. Reversed preferences.
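The equivalence of the two frames, and the reversal itself, can be checked numerically. The sketch below first confirms that Programs A/C and B/D have the same expected outcome, then applies the same illustrative value function as the earlier sketch (concave for gains, convex for losses, with the hedged 1992-era parameters) to show how a gains frame favors the sure thing while a losses frame favors the gamble.

```python
# Expected outcomes: both frames describe the same two programs.
ev_A = 200                        # 200 saved for sure
ev_B = (1/3) * 600 + (2/3) * 0    # 200 saved on average
ev_C = 600 - 400                  # 400 die for sure -> 200 survive
ev_D = 600 - (2/3) * 600          # 200 survive on average
print(ev_A, ev_B, ev_C, ev_D)     # all effectively 200 expected survivors

# Illustrative value function (same hedged parameters as the earlier sketch):
ALPHA, LAMBDA = 0.88, 2.25

def value(x):
    return x ** ALPHA if x >= 0 else -LAMBDA * (-x) ** ALPHA

# Gains frame: outcomes counted as lives saved.
A = value(200)            # ~106
B = (1/3) * value(600)    # ~93  -> A preferred (risk averse in the domain of gains)

# Losses frame: outcomes counted as deaths.
C = value(-400)           # ~ -438
D = (2/3) * value(-600)   # ~ -417 -> D preferred (risk seeking in the domain of losses)
print(A > B, D > C)       # True True
```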
The Framing Effect describes how we tend to arrive at different conclusions based on how information is presented to us, even if the information is factually identical.
Consider the term “tax refund.” It’s often positioned as a bonus, despite the fact that it’s your money being returned to you after the government over-taxed you. The word “refund” makes it feel more like a gift and less like the correction of an overpayment.
Or consider how patients are much more likely to try a procedure with a “90% success rate” than one with a “10% death rate.” Same fact, same procedure, but a very different emotional response.
Much like loss aversion, framing taps into our tendency to chase gains and avoid losses, even when the underlying reality doesn’t change.
And, similar to anchoring, the way information is framed can be intentional, often used to sway our decisions one way or the other.
Read more about The Framing Effect in Kahneman and Tversky’s 1981 paper, The Framing of Decisions and the Psychology of Choice.
These “mind quirks,” uncovered by Daniel Kahneman and Amos Tversky, show up all around us. Recognizing them can help us make clearer, more grounded decisions in an increasingly complex world.
Mental shortcuts like heuristics and framing can be helpful—but they can also lead us astray, especially when the stakes are high or the information is overwhelming.
You'll find these patterns at play in the news, on social media, in advertising, healthcare, personal finance, home buying, job searching, and beyond.
Becoming aware of them doesn’t mean becoming cynical—it just means developing a sharper lens. You begin to notice when something feels off, when messaging is nudging rather than informing, and when your own instincts might need a second look.
Thank you to these references & sources
- Tversky, A. & Kahneman, D. (1974). Judgment under Uncertainty: Heuristics and Biases. Science, 185(4157), 1124–1131. Full text at UCI. The classic paper introducing the representativeness, availability, and anchoring heuristics.
- Kahneman, D. & Tversky, A. (1979). Prospect Theory: An Analysis of Decision under Risk. Econometrica, 47(2), 263–292. Full text at MIT. The seminal paper outlining prospect theory and concepts like loss aversion and the certainty effect.
- Tversky, A. & Kahneman, D. (1981). The Framing of Decisions and the Psychology of Choice. Science, 211(4481), 453–458. Full text at PMC. The key study demonstrating framing effects (e.g., the Asian Disease Problem).
- Kahneman, D. (2011). Thinking, Fast and Slow. Farrar, Straus and Giroux. A bestselling book that compiles decades of Kahneman and Tversky’s research, introduces System 1 and System 2, and explores dozens of biases.
- Interview with Daniel Kahneman (2002). NobelPrize.org. Kahneman reflects on Tversky’s role and their unique collaboration.
- Lewis, M. (2017). The Undoing Project: A Friendship That Changed Our Minds. A biographical account of Kahneman and Tversky’s intellectual partnership and impact.
- Kahneman, D. (2010). “The Riddle of Experience vs. Memory.” TED Talk, watch on TED.com. A talk that explores happiness, perception, and memory with cognitive insights.
- Character and Context Blog (2019). “The Planning Fallacy: An Inside View.” Read on Conversable Economist. Roger Buehler’s summary of planning fallacy research and tips to avoid it.
- Quick bias references: The Decision Lab’s Biases Index and Wikipedia’s List of Cognitive Biases. Great overviews of common biases, many rooted in Kahneman and Tversky’s work.