A Lesson in Incentives
The Cobra Effect
British India had a cobra problem. The government paid a bounty for dead cobras. The cobra population increased. Why?
The British colonial government in Delhi was concerned about venomous cobras. The solution seemed obvious: pay people for every dead cobra they brought in.
At first, it worked. Cobras flooded into the bounty offices. The government congratulated itself on an elegant market-based solution.
Then something strange happened. The cobra bodies kept coming, but sightings of wild cobras stopped declining.
People were breeding cobras for the bounty.
When the government discovered this and cancelled the program, the breeders released their now-worthless snakes into the streets.
Final cobra population: higher than before.
This is Goodhart's Law weaponized:
“When a measure becomes a target, it ceases to be a good measure.”
Once you incentivize a metric, people optimize for the metric, not the underlying goal. The metric and the goal, once correlated, begin to diverge.
Watch the Cobra Effect Unfold
Adjust the bounty amount and watch what happens to the cobra population. Notice the three phases: hunting success, farming exploitation, and post-cancellation release.
Try setting a higher bounty. The stronger the incentive, the more farming behavior you create.
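If you would rather read the mechanics than drag a slider, here is a minimal Python sketch of the same three phases. Every number in it (starting population, breeding rate, hunting and farming rates, the week the bounty ends) is an illustrative assumption, not historical data, and the function name simulate_cobra_bounty exists only for this example.

```python
# A toy model of the cobra bounty in three phases.
# All parameters are illustrative assumptions, not historical figures.

def simulate_cobra_bounty(weeks=30, bounty_active_until=20):
    wild = 1000          # wild cobra population (assumed starting value)
    farmed = 0           # cobras bred purely for the bounty
    history = []

    for week in range(weeks):
        wild = int(wild * 1.02)              # natural breeding, ~2% per week (assumption)

        if week < bounty_active_until:
            if week < 5:
                # Phase 1: hunting works, the wild population falls.
                wild -= min(wild, 60)
            else:
                # Phase 2: hunters discover farming pays better than hunting.
                farmed += 80                 # the farms grow each week
                wild -= min(wild, 10)        # wild hunting tapers off
        elif farmed > 0:
            # Phase 3: bounty cancelled, the worthless farmed cobras are released.
            wild += farmed
            farmed = 0

        history.append((week, wild, farmed))
    return history


for week, wild, farmed in simulate_cobra_bounty():
    print(f"week {week:2d}  wild={wild:5d}  farmed={farmed:4d}")
```

The shape is what matters: the wild population falls while hunting pays, plateaus once farming takes over, and jumps above its starting level the week the bounty is cancelled.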
The Pattern: Metric Does Not Equal Goal
The Cobra Effect isn't a historical curiosity. It's a fundamental pattern that appears whenever we try to measure and incentivize human behavior.
The problem isn't bad intentions. The people gaming the system are responding rationally to the incentives we created. The problem is that any metric we choose will only be a proxy for the goal we actually care about.
Interactive readout: Correlation 0.97 · Gaming Agents 0
Slide the incentive level and watch the correlation break down. Initially, the metric (horizontal axis) and the true goal (vertical axis) are aligned. As incentives increase, rational actors start gaming the metric, and the relationship falls apart.
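For readers who want the mechanism in code rather than on a slider, the sketch below reproduces the same breakdown. It is a toy model, not the visualization's source: each agent has a true goal value, the metric starts as a noisy proxy for it, and we assume the probability that an agent games the metric rises with the incentive level. The names run and pearson are just for this example.

```python
import random
import statistics


def pearson(xs, ys):
    """Plain Pearson correlation, to avoid external dependencies."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)


def run(incentive, n_agents=200, seed=42):
    """Return (correlation between metric and goal, number of gaming agents)."""
    rng = random.Random(seed)
    goals, metrics, gamers = [], [], 0
    for _ in range(n_agents):
        true_goal = rng.gauss(50, 10)          # what we actually care about
        metric = true_goal + rng.gauss(0, 3)   # the proxy: noisy, but aligned

        # Assumption: the chance of gaming rises linearly with the incentive.
        if rng.random() < incentive / 100:
            metric = rng.uniform(80, 120)      # fabricated number, unrelated to the goal
            gamers += 1

        goals.append(true_goal)
        metrics.append(metric)
    return pearson(metrics, goals), gamers


for incentive in (0, 25, 50, 75, 100):
    corr, gamers = run(incentive)
    print(f"incentive={incentive:3d}  correlation={corr:+.2f}  gaming agents={gamers}")
```

With the incentive at zero the correlation sits in the mid-0.9s, close to the readout above; once most agents are gaming, the metric says almost nothing about the goal.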
Case Studies in Perverse Incentives
The Cobra Effect has appeared throughout history, across cultures and industries. Click each case to see how the intended outcome diverged from reality.
Cobra Bounty
Colonial India, 1800s. Metric: dead cobras submitted.
Rat Tail Bounty
French Vietnam, 1902. Metric: rat tails collected.
Soviet Nail Factory
USSR, 1930s. Metric: weight of nails (tons) or count of nails.
Wells Fargo Accounts
United States, 2016. Metric: new accounts opened per employee.
School Test Scores
United States, 2001-present. Metric: standardized test scores.
Hospital Wait Times
United Kingdom, 2000s. Metric: time from admission to treatment.
Experience It Yourself
The Soviet nail factory problem is famous because it shows how even well-intentioned quotas create perverse outcomes. Play as the factory manager and see how natural it feels to game the metric.
You are a Soviet factory manager.
The central planners have given you a quota. Meet it and earn your bonus. How will you optimize production?
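Before you play, here is a back-of-the-envelope Python sketch of why the quota is so easy to game. The steel budget and nail weights are invented for illustration; the point is only that each way of stating the quota invites its own degenerate production plan.

```python
# Two ways the planners might state the quota, and the degenerate plan each invites.
# All figures are invented for illustration.

STEEL_BUDGET_G = 10_000_000     # grams of steel available this quarter (10 tonnes, assumed)

TINY_NAIL_G = 1                 # a 1-gram pin: useless, but it counts as "a nail"
GIANT_NAIL_G = 50_000           # a 50 kg spike: useless, but it weighs a lot


def plan_for_count_quota():
    """Quota stated as number of nails: make the smallest nail possible."""
    return STEEL_BUDGET_G // TINY_NAIL_G        # 10,000,000 tiny pins


def plan_for_weight_quota():
    """Quota stated as tonnage: make the heaviest nail possible."""
    return STEEL_BUDGET_G // GIANT_NAIL_G       # 200 giant spikes


print(f"Count quota  -> {plan_for_count_quota():,} tiny pins")
print(f"Weight quota -> {plan_for_weight_quota():,} giant spikes")
# Either way the quota is met, and almost nothing a carpenter could use leaves the factory.
```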
Design Your Own Disaster
Think you can design an incentive that won't be gamed? Pick a goal and a metric, and we'll show you how the Cobra Effect will emerge.
How to Avoid the Cobra Effect
The Cobra Effect seems inevitable, but there are strategies that reduce its impact:
Use Multiple Metrics
Gaming becomes harder when you must optimize several measures simultaneously. If one can be gamed, the others act as checks.
Measure Outcomes, Not Outputs
Instead of counting dead cobras, measure cobra bite incidents. Outputs are easy to fake; true outcomes are harder to game.
Rotate Metrics Unpredictably
When people don't know what will be measured, they can't optimize for it. Random audits are more effective than predictable reviews.
Monitor for Gaming Behavior
Look for statistical anomalies: distributions that don't match natural patterns, suspicious timing, or results that are "too good." A sketch of one such check follows this list.
Align Incentives with True Goals
When possible, pay for outcomes rather than activities. The bounty should have been for documented cobra-free neighborhoods, not dead snakes.
Consider the Meta-Game
Before implementing any incentive, ask: "If I were trying to game this, how would I do it?" Then design against those strategies.
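Of these strategies, monitoring for gaming behavior is the easiest to make concrete. The Python sketch below shows one crude version under assumed data: compare recent submission counts against a baseline period believed to be clean, and flag anything wildly out of line. The function name, threshold, and numbers are placeholders, not a production fraud detector.

```python
import statistics


def flag_suspicious_counts(baseline, recent, z_threshold=4.0):
    """Flag recent submission counts that sit far outside the baseline pattern.

    `baseline` is a period believed to be clean (e.g. before the bounty changed);
    `recent` is what we want to screen. A crude screen only: real monitoring
    would also look at timing, per-submitter distributions, and comparisons
    against the outcome that matters (cobra bites, not cobra bodies).
    """
    mean = statistics.fmean(baseline)
    stdev = statistics.pstdev(baseline) or 1.0
    flags = []
    for week, count in enumerate(recent, start=len(baseline)):
        z = (count - mean) / stdev
        if abs(z) > z_threshold:
            flags.append((week, count, round(z, 1)))
    return flags


# Illustrative numbers: weekly dead-cobra submissions. The late surge is the
# kind of "too good to be true" pattern worth auditing.
before = [48, 52, 50, 47, 51, 49, 53, 50]
after = [52, 55, 180, 220]
for week, count, z in flag_suspicious_counts(before, after):
    print(f"week {week}: {count} submissions (z = {z}) -- audit this")
```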
The fundamental lesson:
Any system that can be gamed, will be gamed. The question isn't whether to measure, but how to measure in ways that stay aligned with your true goals as people optimize against them.
Every metric is a proxy. Never forget what the metric is supposed to represent.
Want More Explainers Like This?
We build interactive, intuition-first explanations of complex concepts in economics, statistics, and system design.
References: Goodhart (1975); Strathern (1997)