Origins of Error

Error is one source of unintended consequences.

But where do errors come from? What kinds of errors are there? What can we do to minimize exposure to error?

This is one of the causes of unintended consequences listed by Robert Merton in “The Unanticipated Consequences of Purposive Social Action” (the others being ignorance, short-term vs long-term interests, basic values, and the self-defeating prophecy).

Over more than 20 years of on-and-off work, William Shanks calculated Pi to 707 decimal places. He published his results in 1873, but he had made an error: at the 530th decimal place his calculation went wrong, and the digits that followed were incorrect. It wasn’t until 1944 that another mathematician, D.F. Ferguson, noticed the mistake. After all his work, Shanks became best known for making that error calculating Pi.

Pi after the 530th decimal place seems like a small issue. But maybe it’s not. The bigger problem is the relationship between the creators of mathematical and statistical tables and their much more numerous users. Tables of published figures have power over future actions. They were produced at great cost of time and money, and an error might go unnoticed for years. Without accuracy, tables produce a cascade of incorrect results.

The publication of incorrect tables of calculations (many more than the Pi example) used to be a big problem. In other forms, a lack of trust in the accuracy of calculated numbers drives extreme behavior. More than 100 years after Shanks’ mistake, Intel would go on to spend $475M and rework its designs in order to remove a small calculation error produced by its Pentium processors.

Let’s look at the impact of error on unintended consequences.

To start, we’ll go back to Robert Merton’s paper “The Unanticipated Consequences of Purposive Social Action,” which I mentioned in an earlier post on Self-Defeating Prophecies. Another of Merton’s factors contributing to unintended consequences is error. He describes several main types.

Merton on Errors

According to Merton, we make errors:

  • In “our appraisal of the present situation.” That is, all the ways we can misunderstand what is actually happening. Given complexity and complications, these misunderstandings are easy to make. For example, did a food writer’s rave review really destroy a popular burger bar, or were other factors at work?
  • In “our inference… to the future…” Now there are even more things that can go wrong. We take an incorrect appraisal and add future projections on top of it. Humans are notoriously poor at predicting the way we will act in a theoretical future state. Consider low-accuracy (but high-precision) financial projections, even where there is significant data to analyze.
  • In “our selection of a course of action.” How do we make the selection? What biases do we bring to the selection?
  • In the “execution of the action chosen.” Even if we have appraised and chosen well, there are often problems carrying out the intended action. Think of cost and time overruns in infrastructure projects. Implementation can be harder than theorizing.
  • Assuming actions that previously led to the “desired outcome will continue to do so.” This is where habits play into future outcomes. This is also where we might see the self-defeating prophecy.
  • Neglecting the “systematic thoroughness in examining the situation.” It is much easier to think in direct, first-order ways. Understanding how the system works is difficult or impossible.
  • In a “determined refusal or inability to consider certain elements of the problem.” Cultural, social, political, religious, and group backgrounds all play into what we are even willing to think or consider. This is a reason that “outsiders” can develop different solutions to problems faced by “insiders.”

Other Classifications of Errors

Beyond Merton’s causes, there are other ways to make and evaluate errors. Errors are commonly classified as systematic errors, random errors, and blunders.

Systematic errors are those that come from incorrect calibration (using the term loosely) of a measuring device. That device will consistently provide measurements that are inaccurate in the same direction: your bathroom scale shows that you are five pounds lighter than you really are, and your friend who steps on it also gets a too-light reading.

Random errors are those that come from the difficulty of taking measurements. Random errors tend to fluctuate around the actual value.
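To make the distinction concrete, here is a minimal sketch in Python (the scale numbers are made up for illustration). A systematic error stays off in the same direction no matter how many readings you take, while random error tends to average out toward the true value.

```python
import numpy as np

rng = np.random.default_rng(0)
true_weight = 150.0   # the actual value, in pounds
n_readings = 1000

# Systematic error: every reading is biased in the same direction
# (the bathroom scale reads about five pounds light for everyone).
biased = true_weight - 5.0 + rng.normal(0.0, 0.5, n_readings)

# Random error: readings fluctuate around the actual value with no bias.
noisy = true_weight + rng.normal(0.0, 2.0, n_readings)

print(f"biased scale mean: {biased.mean():.2f}  (stays ~5 lb off, however many readings)")
print(f"noisy scale mean:  {noisy.mean():.2f}  (averaging pulls it toward {true_weight})")
```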

Blunders are unintended actions that lead to an incorrect result. Blunders include Shanks performing the wrong calculation at the 530th decimal place of Pi, and the wrong turn by Archduke Franz Ferdinand’s driver that set off the cascade leading to WWI.

We have incentives to see things in certain ways.

“It is difficult to get a man to understand something when his salary depends upon his not understanding it.” — Upton Sinclair

“P-hacking,” “data dredging,” or correlation hacking is the practice of selectively analyzing data in different ways to produce a desired result. Related to this, we have Goodhart’s Law: when a measure of success becomes a target, it ceases to be a good measure, since people will start to game target outcomes.
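What data dredging looks like in practice can be shown with a minimal sketch, assuming nothing beyond NumPy and made-up data: correlate an outcome that is pure noise against enough unrelated variables and something will clear a conventional significance threshold by chance alone.

```python
import numpy as np

rng = np.random.default_rng(42)
n_subjects, n_variables = 50, 200

outcome = rng.normal(size=n_subjects)                    # pure noise
predictors = rng.normal(size=(n_variables, n_subjects))  # also pure noise

# Correlate the outcome with every unrelated predictor and keep the "best" one.
correlations = np.array([np.corrcoef(outcome, p)[0, 1] for p in predictors])
best = int(np.argmax(np.abs(correlations)))

print(f"'discovered' predictor #{best} with r = {correlations[best]:.2f}")
hits = int(np.sum(np.abs(correlations) > 0.28))  # |r| > 0.28 is roughly p < 0.05 for n = 50
print(f"{hits} of {n_variables} variables look 'significant' despite being random noise")
```

Run enough comparisons and a few will always look meaningful; the error is in reporting only the ones that do.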

There is also our ability to pull “insight” from any damn thing. Go back to the tables from above — the more numbers the better — and your brain will find some sort of pattern that seems to reveal an undiscovered insight.

There are many ways to err. Consider a well-known and well-meaning, but incorrect, quote:

“The good thing about Science is that it’s true whether or not you believe in it.” — Neil deGrasse Tyson

The quote itself contains an error. First, “science” has historically been pushed one way or another by ideology, culture, and politics (both national politics and the politics of scientific communities). As Max Planck said: “Truth never triumphs—its opponents just die out,” and also, “Science advances one funeral at a time.”

Second, while a scientific hypothesis can be rejected, revised, and tested again, gradually describing the way the world works more accurately, a hypothesis can never be proven. Science has a long list of “truths” that were eventually disproven when we learned more.

This was the problem behind a book like The Bible Code, and behind Family Radio founder Harold Camping’s belief that the Rapture would come on May 21, 2011 (with the end of the world to follow on October 21, 2011).

These were incorrect “insights” from years of “study” that impacted thousands of people. I know, having interacted with some members of the Family Radio group before their big day. When I expressed polite skepticism that the world would end later that week, I was met with confidence. When I asked one member what he used to do before he joined Family Radio, he told me, “Oh, before I found this religion I used to march in the streets for communism.”

We want to believe.

Considerations

  • Be aware of Merton’s types of error that lead to unintended consequences; awareness is a step toward making fewer mistakes.
  • Accept that there will always be some error.
  • Look to minimize the impact of error by avoiding high-impact, sudden changes in areas of complexity.