The Cobra Effect (Part 2)

When I started this project to learn about unintended consequences, my first post to go viral (top page of Hacker News) was about the Cobra Effect. The Cobra Effect is another name for “perverse results”: when we want more (or less) of something, we sometimes instead create the conditions that produce the opposite of our intended outcome. In that post I took three well-known examples of the Cobra Effect and invented antidotes for them.

Those well-known Cobra Effect examples all involved animals (cobras, rats, and pigs) and so my antidotes were based around the animals’ reproductive cycles. I made the claim that those animal examples had the solution built into the problem. Readers loved it (creative look at an old topic!) and readers hated it (you can’t stop the Cobra Effect!).

Since the Cobra Effect is a type of unintended consequence that keeps coming up, I decided to write part two.

My first post dealt with two actual events (trying to reduce the population of rats in colonial Vietnam and of feral pigs in the US) and one invented example (the cobra bounty in colonial India). In this post I’m going in a different direction.

Copy and Paste

The top five most common online account passwords are: 123456, password, 123456789, 12345678, and 12345. That list seems ridiculous, but with perhaps 191 passwords required for many businesspeople to function, it’s reasonable for users to try to simplify things. (That number may be skewed, but I think we can agree that people must manage many passwords today.) That’s too many for anyone to remember if they use a different password for each account.

As a result, some people use passwords that are easy to remember — often short and simple.

And then, as a result of that, many user logins started to require that people adhere to some minimal level of “difficulty” when choosing a password. For example, in some cases passwords needed to be a minimum number of characters, include special characters, avoid consecutive numbers, and avoid dictionary words.
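To make those rules concrete, here’s a minimal sketch of the kind of check a registration form might run. The thresholds, character set, and word list are my own assumptions for illustration, not any particular site’s real policy.

```typescript
// Hypothetical password-policy check: minimum length, a special
// character, no consecutive digits, no common dictionary words.
// Every threshold and list here is an illustrative assumption.
function meetsPolicy(password: string): boolean {
  const longEnough = password.length >= 8;
  const hasSpecial = /[!@#$%^&*]/.test(password);
  const hasConsecutiveDigits = /012|123|234|345|456|567|678|789/.test(password);
  const hasCommonWord = /password|qwerty|letmein/i.test(password); // stand-in for a real dictionary
  return longEnough && hasSpecial && !hasConsecutiveDigits && !hasCommonWord;
}

console.log(meetsPolicy("12345"));       // false: too short, consecutive digits
console.log(meetsPolicy("tr0ub4dor&3")); // true under these toy rules
```

The stricter the rules, the harder the resulting passwords are to remember, which is exactly what pushes users toward the workaround described next.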

Then, after all that, some logins started to disallow copy and paste, even impacting some password managers.
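For anyone curious about the mechanism, blocking paste usually amounts to intercepting the browser’s paste event, roughly like this (the element id is hypothetical):

```typescript
// Sketch of how a page might disable pasting into a password field.
// Swallowing the "paste" event is also what breaks password managers
// that fill fields via the clipboard.
const field = document.getElementById("password") as HTMLInputElement;

field.addEventListener("paste", (event: ClipboardEvent) => {
  event.preventDefault(); // discard the clipboard contents
  console.warn("Pasting is disabled for this field.");
});
```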

Eliminating copy and paste is an interesting choice because that’s how the average user dealt with the requirement for a long, complicated password. Type something long and complicated, copy it, and then paste it into password fields one and two (many online registrations require the password to be entered twice). Afterward, depend on the browser to save it, or on the “forgot password” feature to generate a new one.

The intention behind eliminating copy and paste was to improve security. But it can have the opposite effect.

Once copy and paste was disabled, some users went back to passwords that were easy to remember, just to be able to retype them into the two password fields. They also wrote down their passwords so they could refer to them elsewhere.

Could we reinvent the user behavior helpfully?

Demand: A more secure password process. Result: Complications for users, producing weaker passwords that are often simpler and reused.

Reinvention: This is a tough one. The classic Cobra Effect examples involved just one group making the change in behavior. But the password situation is more complicated. There are easily thousands of people setting password policy for their respective sites. This is a situation where promoting a set of best practices will trickle down somewhat but not totally, leaving people with different password policies in different places.

Even Communist Crops Are For Consumption

I’ve written about a few instances where top-down government intervention attempted to improve crop yields. These were situations in which the theory may have made sense on paper but fell apart when applied. In other words, the theories were wrong.

Four Pests Campaign. Top-down governmental actions create these Cobra Effects, so many of the extreme examples come from autocratic governments with large populations. As I wrote in “Eradication’s Good Intentions,” during China’s Great Leap Forward there were country-wide efforts to “eliminate sparrows” in order to reduce the amount of grain lost to them. But since “sparrows also eat insects, the sparrow elimination was followed by a surge in locusts — which ate even more grain.” Result: even less grain.

Demand: Eradicate sparrows to get higher crop yields and healthier populations. Result: Lower crop yields and mass starvation.

Reinvention: Test in small regions rather than nationally. Hold off on a full national rollout for a couple of years, giving the test results a chance to be seen and a scapegoat a chance to be found to take the blame.

Deep-plow farming. As I wrote in “Food From Thought,” deep plowing “came from Soviet scientist (or pseudoscientist) Terentiy Maltsev (and colleague of Stalin’s friend Trofim Lysenko) and was forced top-down on farmers throughout China.” The intent was to increase crop yields, as the theory promised. But the theory didn’t work, and the practice ultimately led to dramatic falls in crop yields. Millions starved as a result.

Demand: Improve crop yields. Feed more people. Result: Decreased crop yields and starving people.

Reinvention: Small test plots. Let the political leaders farm and eat what they produce. Likelihood of success: near zero.

What is Safety?

What about the belief that the world today is safer than it has ever been before?

The belief comes from noting the drop in rates of crime, murder, poverty, and deaths from war in recent history. Notable proponents of this idea include Steven Pinker. Here’s why this idea is wrong.

As I wrote in “Autonomous Vehicles and Scaling Risk,” our attempts to make a safer system can lead to more danger if that system impacts more people than the one it replaces. Yes, you can find results that show the rate of death falling. But that assesses the situation by past data instead of by the potential outcomes that the new, more interconnected system holds.

I’ll go with a war example. The impact of the atomic bombings of Hiroshima and Nagasaki has been studied for almost 75 years.

What happens when technology develops that enables death at great scale? When the US dropped the atomic bombs, “some 70,000–80,000 people, or around 30% of the population of Hiroshima, were killed by the blast and resultant firestorm.” In Nagasaki, “at least 35,000–40,000 people were killed and 60,000 others injured” (source).

The impact of almost instantaneous events like the atomic bombings stands out because such events were not possible in the past. But if the US had developed the bombs and never dropped them, their potential impact would still exist. Think of how many new ways there are to destroy life on this planet, by high-impact actions from militaries, microbes, and economic systems. The way that Pinker and others look at the question of safety ignores the whole system. The greater connectedness of many parts of the world means that, while the day-to-day trend may currently be toward safety, the potential for great disruptions to that safety can impact many more people.

There are other places where we see ideas like this. The next example is another large-scale, top-down concept currently being discussed.

Demand: Policies and actions that improve safety and health. Result: Those improvements are counteracted by the greater systemic risk created by a more connected system.

Reinvention: First, update your understanding of safety from looking at the past to looking at the future. Then, consider where connectedness exposes people to a quick disruption to their safety.

Reversing the Effects of Climate Change

There are proposals to reduce the impact of global warming by injecting particles into the upper atmosphere to reflect sunlight. One of the proponents of this plan is David Keith of Harvard, who calculates that:

“[I]f operations were begun in 2020, it would take 25,000 metric tons of sulfuric acid to cut global warming in half after one year. Once under way, the injection of sulfuric acid would proceed continuously.”

But if the plan doesn’t work as expected, how do you reverse it? What other effects will it produce by reducing the amount of sunlight that reaches the surface of the earth?

If the plan does work, who gets to decide the degree to which sunlight is reflected to cool the earth? People in different locations may want different temperatures. Should humans even make this decision?

Demand: Lower temperatures and fewer climate-related environmental problems. Possible Result: Unpredictable temperature changes, unpredictable results from those changes, and other impacts of less sunlight.

Reinvention: …Stay away from this one…

We Want Startups. But Why?

One of my pet peeves is the startup pitch event, as typically run. If you’re not familiar with the term, a pitch event brings a group of startups together in front of an audience and a panel of judges. There may be prizes, or the award may simply be applause. Judges often give feedback after hearing the short pitches and Q&A. Whether the judges can provide helpful feedback and whether the audience is appropriate are other questions.

But why do we want more startups? That is what these plentiful events seem designed to produce. And why put so much attention on the very early stage (the startups most likely to pitch at an event)? The pre-customer, pre-investment startups are risky. Perhaps they are better as side projects first? Perhaps their founders should work in a later-stage business first? Should they even be pursuing the business at all?

But the pitch event, once announced, must produce a winner. Maybe there’s a way to do that while also getting quality output.

Demand: More startups. Result: More things that look like startups but are actually just pitches for things that will never be built.

Reinvention: Only run (or participate in) pitch events that have cash prizes. Then deliver the prizes after the teams have met certain milestones.

This solution is difficult. Who is going to monitor the startups? Will pictures with a big check that says “potential cash prize” have the same PR impact? What happens if the money is not available to be disbursed at that later date?

Considerations

  • The Cobra Effect happens when we get a perverse result as an outcome of our actions.
  • Complex systems fail in complex ways. They produce unexpected outcomes. But that doesn’t mean we should give up trying to avoid likely failure points. That’s why I take an aggressive approach to avoiding the Cobra Effect. Ignoring the potential consequences of our changes would be foolhardy.
  • Look for the Cobra Effect where change is top-down, where many people are affected, and where decisions are difficult to reverse.