Goodhart's Law Corrupts Every Metric

When you optimize hard enough for any proxy measure, the underlying thing you actually care about does not merely stagnate; it starts getting worse. This is the strong version of Goodhart's Law.

"When a measure becomes a target, if it is effectively optimized, then the thing it is designed to measure will grow worse." Marilyn StrathernBritish social anthropologist (b. 1941) at Cambridge, known for her work on audit culture. This formulation, from her 1997 paper Improving Ratings, is a sharpened restatement of Charles Goodhart's original 1975 observation about monetary policy targets in the UK.

Strathern's formulation is well known. But the strong version, drawn from machine learning's concept of overfitting, is more alarming. Overfitting occurs when a model learns the noise in its training data rather than the underlying pattern, so it performs well on training examples but poorly on new data. The analogy to Goodhart is precise: it is not just that the proxy stops tracking the goal; continued optimization actively degrades the goal. Standardized testing was meant to measure educational quality; optimized aggressively, it produces schools that teach test-taking at the expense of genuine learning. Publication counts were meant to proxy scientific progress; cash bonuses per paper produced paper mills and fraudulent results.
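
To see this in miniature, consider a classic overfitting demonstration. The sketch below is illustrative rather than anything from the essay: it assumes only numpy, and the target function, noise level, and polynomial degrees are arbitrary choices. Training error (the proxy being optimized) only improves as model capacity grows; test error (the thing we actually care about) eventually grows worse.

```python
# The strong version of Goodhart's Law as overfitting: fit polynomials of
# increasing degree (harder optimization) to noisy samples of sin(x).
# Training error (the proxy) keeps falling; test error (the real objective)
# eventually rises.
import numpy as np

rng = np.random.default_rng(0)

x_train = rng.uniform(-3, 3, 30)
y_train = np.sin(x_train) + rng.normal(0, 0.3, x_train.size)  # noisy proxy data
x_test = np.linspace(-3, 3, 200)
y_test = np.sin(x_test)  # the thing we actually care about

for degree in (1, 3, 9, 15):
    coeffs = np.polyfit(x_train, y_train, degree)  # optimize the proxy harder
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree:2d}: proxy (train) MSE {train_mse:.3f}, "
          f"goal (test) MSE {test_mse:.3f}")
```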

The pattern repeats everywhere. Optimize social media for engagement and you get conspiracy theories and tribalism. Optimize capitalism as a proxy for efficient resource allocation and you get wealth disparities that undermine the system's own legitimacy. The mechanism is always the same: the agent finds ways to juice the proxy that are uncorrelated with, or actively hostile to, the underlying goal.

The machine learning field has developed mitigations worth studying: early stopping (halt optimization before it overfits), injecting noise (making the system harder to game), and regularization (penalizing extreme or unusual behavior). These translate surprisingly well to institutional design. Term limits are a form of early stopping. Randomized audits inject noise. Progressive taxation penalizes extreme concentration.
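
For concreteness, here is a minimal sketch of two of these mitigations in their native ML form, early stopping and L2 regularization. Everything in it (the data, the constants, the patience threshold) is invented for illustration; it is one way to implement the idea, not the only one.

```python
# Early stopping and L2 regularization on a linear model with many spurious
# features. Training loss is the proxy; a held-out validation set stands in
# for the real objective. All constants here are illustrative.
import numpy as np

rng = np.random.default_rng(1)
n, d = 60, 40
X = rng.normal(size=(n, d))
w_true = np.zeros(d)
w_true[:3] = [2.0, -1.0, 0.5]            # only three features carry signal
y = X @ w_true + rng.normal(0, 0.5, n)

X_tr, y_tr = X[:40], y[:40]              # the proxy we optimize
X_val, y_val = X[40:], y[40:]            # stand-in for the real objective

def mse(w, X, y):
    return np.mean((X @ w - y) ** 2)

w = np.zeros(d)
lr, l2 = 0.01, 0.1                       # step size, regularization strength
best_val, best_w, patience = np.inf, w.copy(), 0
for step in range(5000):
    # Gradient of MSE plus the L2 penalty (regularization: penalize extremes).
    grad = 2 * X_tr.T @ (X_tr @ w - y_tr) / len(y_tr) + 2 * l2 * w
    w -= lr * grad
    val = mse(w, X_val, y_val)
    if val < best_val - 1e-6:
        best_val, best_w, patience = val, w.copy(), 0
    else:
        patience += 1
        if patience >= 50:               # early stopping: halt before overfit
            print(f"stopped at step {step} with validation MSE {best_val:.3f}")
            break
w = best_w                               # keep the best weights, not the last
```

Note the design choice on the last line: the loop keeps the best weights seen so far rather than the final ones, which is the code-level version of stopping before optimization overshoots.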

The deepest lesson is that efficiency is not always your friend. Some slack, friction, and apparent waste in a system may be precisely what prevents the proxy from decoupling from reality.

Takeaway: Whenever you set a target, monitor what happens to the thing the target is supposed to represent, and be prepared to stop optimizing before you overshoot into harm.


See also: Efficiency Is The Enemy of Resilience | Chesterton's Fence Before You Tear It Down | The Ludic Fallacy: Life Is Not a Casino | The Streetlight Effect Distorts What We Know | Wittgenstein's Ruler Measures the Measurer