The Self-Defeating Prophecy (and How it Works)


I write these posts to call attention to common phenomena that make the world work differently than we might think. One less discussed type of unintended consequence is the “self-defeating prophecy,” or “self-negating prediction,” where the existence of a prediction or belief ultimately leads to the opposite of what is expected.

I think of this phenomenon as a tortoise-and-hare situation. Not the “slow and steady” part, but the fact that the surprise outcome of a tortoise-hare race is baked into the participants’ behavior. The hare only stops to rest because it seems so obvious that it will win, which in turn becomes the reason it loses.

That’s what happens with self-defeating predictions: the belief that X will happen in the future leads to behavior that produces the opposite of X. The prediction itself changes behavior, and the changed behavior changes the outcome. This is a characteristic of what are sometimes called “level two chaotic systems,” systems that react to the predictions made about them. Let’s look at some examples.
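To make the feedback loop concrete, here is a minimal sketch in Python. The function names and numbers are hypothetical, chosen only to illustrate the tortoise-and-hare dynamic above: a confident prediction lowers the hare’s effort, and the lowered effort reverses the predicted result.

```python
# Hypothetical illustration of a self-defeating prediction (level two feedback).
# A prediction about the outcome changes behavior, and the changed behavior
# flips the outcome. None of these numbers model anything real.

def wins_race(effort: float) -> bool:
    """The hare wins if it keeps up enough effort."""
    return effort > 0.5

def effort_after_prediction(predicted_to_win: bool, base_effort: float) -> float:
    """If victory looks certain, the hare relaxes and naps; otherwise it keeps running."""
    return base_effort * (0.4 if predicted_to_win else 1.0)

base_effort = 0.9
prediction = wins_race(base_effort)                      # belief: "the hare will win"
actual = wins_race(effort_after_prediction(prediction, base_effort))

print(f"predicted win: {prediction}, actual win: {actual}")  # predicted win: True, actual win: False
```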

Economics

An introduction to the concept comes from Robert Merton’s essay “The Unanticipated Consequences of Purposive Social Action,” which lists five sources of unintended consequences. They are ignorance, error, immediacy of interest, basic values, and the self-defeating prediction. It’s a list combining human psychology and sociology. In other writing, Merton went on to discuss a wide range of topics, including how economic improvements hold the kernel of their deterioration (Hutber’s Law).

In a longer-run self-defeating prophecy, Merton mentioned how the idea and outcome of Weber’s Protestant Ethic and the Spirit of Capitalism can eventually reverse. People work hard when they are poor. Their hard work pays off. They become wealthy. Once wealthy, they no longer need to work hard, which leads to them (or a future generation) becoming poor again. This is partly why there is economic mobility (or, shirtsleeves to shirtsleeves in three generations).

Even when people attempt to avoid this phenomenon, it happens. Central bankers and investors dance around economic management risks again and again (though they might not personally pay a price for failure). During the US housing bubble, people “decided that the usual rules didn’t apply because home prices nationwide had never fallen before. Based on that idea, prices rose ever higher — so high… that they were destined to fall.”

Environmental Resource Concerns

How many examples are there of environmental crises that never emerge? Some environmental crises are slow-moving and never fully arrive because action blunts their impact.

In his 1968 best-selling book, The Population Bomb, Paul Ehrlich made a strong case for how population growth would outstrip food and resource supply. Even if humans procreated less, they were doomed to mass starvation. From the Prologue: “The battle to feed all of humanity is over. In the 1970’s the world will undergo famines—hundreds of millions of people are going to starve to death in spite of any crash programs embarked upon now.”

Years later, when these predictions did not come true thanks to better distribution and increased food production, Ehrlich suggested that his predictions were not wrong, but may even have helped create the solution. (His reading of his own past may owe more to personal cognitive dissonance.)

The end of oil as a resource has been predicted many times over the past century, including in 1909, 1937, 1945, 1966, 1972, 1980, and 2007. Each time, the predicted end was paired with attempts to move industrial use to other fuel types. Yet with the exception of a few price shocks, the inflation-adjusted price of oil has trended only slightly upward over the last 70 years, even as demand has clearly risen over the same period. Where is the end? Did the belief that oil was running out lead to new methods of exploration and production?

A go-to heuristic for avoiding unintended consequences is to avoid unnecessary action, which might just make things worse. From The Precautionary Principle in Environmental Science, the principle means “taking preventive action in the face of uncertainty; shifting the burden of proof to the proponents of an activity; exploring a wide range of alternatives to possibly harmful actions; and increasing public participation in decision making.”

How much should environmental scientists or campaigners hype future crises? If experts believe a problem is coming (species extinction, climate change, resource depletion, etc.), should they tell the worst-case story in order to draw attention? Should they exaggerate, or encourage others to exaggerate, in order to generate action?

If worst-case, or worse-than-worst-case, stories are told and believed, and the crisis is then averted, how is that interpreted afterward? In the long run this is a problem: it breeds distrust. Yet some promote showing the worst-case scenario, with climate change for example, as a different application of the precautionary principle. The trouble is that the principle is then applied as much for the behavior change as out of belief in the outcome. If crisis situations pass without incident, will people notice that their behavior was hacked in a direction one group wanted, and become less likely to trust future worst-case scenarios? This is a risk.

The Y2K Bug and Aftermath

The Y2K bug (a design flaw in older software and systems that stored years as two digits, leaving them unable to handle the calendar shift to the year 2000) was an immense business cost of the late 1990s.
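To see the mechanics of the flaw, here is a minimal sketch in Python. The function and figures are hypothetical, not drawn from any real Y2K-era system; they only show how storing two-digit years makes date arithmetic break once the calendar rolls past 1999.

```python
# Hypothetical sketch of the two-digit-year flaw behind the Y2K bug.
# Legacy systems often stored "1985" as just "85" to save space.

def years_between(start_yy: int, end_yy: int) -> int:
    """Interval computed on two-digit years, as much legacy code did."""
    return end_yy - start_yy

# A loan issued in 1985 ("85") and checked in 1999 ("99") looks 14 years old: correct.
print(years_between(85, 99))  # 14

# The same loan checked in 2000 ("00") appears to be -85 years old: broken.
print(years_between(85, 0))   # -85
```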

The risks associated with Y2K led to global preparatory spending of hundreds of billions of dollars, with possibly $100 billion going to upgrade computer systems in the US alone. During the same time, many developing economies did little to upgrade, an inaction that, by the logic of the dire predictions, should have led to outright chaos.

But instead, there was no massive shutdown of computer systems on January 1, 2000. Why not? The many, though mostly minor, bugs that did appear after January 1, 2000 include errors in nuclear reactor power production, credit card processing, gas pumps, slot machines, stock exchange record-keeping, and weather reporting, along with scattered power outages, and more.

It seems that some of the self-defeat of the Y2K prediction came from concerted efforts to fix the bug in advance. But some of the effects of Y2K also seem to have been hyped in advance to fuel the consulting businesses that popped up to help client companies remediate, or to let those companies tell their shareholders the story that they were Y2K ready. There are even stock performance effects, non-obvious to me, following announcements of a company’s Y2K readiness or planned Y2K contracting.

One interpretation is that the preparation in the years leading up to 2000 avoided a disaster. That makes sense in countries where there was serious time and resource investment to fix the bug. But what about other countries?

I am surprised that, even now, the debate over whether those hundreds of billions of dollars of Y2K spending were necessary quietly continues.

Elections

This quote, from March 2016, is an interesting take on inaction related to Trump. “[T]he fact that so many people believed Trump would be prevented from becoming the nominee—whether by the voters suddenly, magically becoming more “serious” or right-leaning fat cats deciding to spend tens of millions on attack ads against him—made it become a self-defeating prophecy. The candidates, donors, and their consultant remoras never bothered to take action because each of them figured that someone else would do it. Everyone figured out that the party would decide, somehow, that it would get rid of the interloper.”

There are even concepts, probably ever-changing (and incorrect) ones, for how to arrange primary elections in favor of one party or another.

I don’t have good data on this, but I wonder if there could be a political benefit to “seeming” to be the underdog. Perhaps it was the believed impossibility of Trump’s election that helped him win. As in, if it is “impossible” for Trump to win the primaries, then focus on other candidates. If it is “impossible” for Trump to win the general election, then, if you’re on the other side, relax. Years earlier, perhaps it was the believed impossibility of Obama’s election that helped him win. Elections are an interesting test case for the self-defeating prophecy, since even in the most contested US elections, nothing like a majority of the population votes.

Doomsday Cults

I had the fortunate experience of seeing members of one of these up close. Years ago, when I was running a startup, our team shared a floor with another company for a time. Some members of the other firm belonged to the Family Radio movement that believed the Rapture was coming on May 21, 2011, followed by the end of the world in October 2011. I can tell you it was awkward around the water cooler during that time. But people in that cult and others, most notably the cult run by Dorothy Martin that believed the end of the world was coming on December 21, 1954, can have a different interpretation of events.

Sure, the end of the world didn’t come as they predicted. But perhaps it didn’t come because of the group members’ faith. That’s the belief shift that tends to happen in these cults. In Martin’s case, the new belief became that “[t]he little group, sitting all night long, had spread so much light that God had saved the world from destruction.”

You cannot disprove this belief. Not a bit. You can only have great stories to tell.

Conclusions

Sometimes you should want a prediction to be defeated, such as when the prediction is one of catastrophe (or the end of the world). But what protects against the self-defeating prediction?

Use bets, like larger versions of the Simon–Ehrlich wager, to hedge against these predictions.

Write down your predictions, especially if you keep them private, in order to train your own analytical models or intuition.

Otherwise, don’t make public predictions that may be self-defeating, in order to avoid scale effects from changed behavior. If the prediction is public, call attention to the ways the prediction can fail in order to head off some of those counteractions. In The Alchemy of Thought, the “Paradox of Prediction” is summed up as “If you want your predictions to come true, keep them to yourself.”