During World War II, the US Army assigned statistician Abraham Wald the task of statistically figuring out where extra armor should be added to American bombers.
After analyzing the evidence and sharing it with the Army, he recommended the exact opposite of what the Army assumed. The reason was that the Army had engaged in a logical fallacy.
Learn more about survivorship bias and how it manifests itself in everyday thinking, on this episode of Everything Everywhere Daily.
This episode is sponsored by Scottevest.
Autumn is here, and with it comes cooler weather and jackets.
If you are looking for a jacket this season, I highly recommend checking out Scottevest. They have pockets to hold all of the gear you carry with you.
I’ll often go to a cafe to write and work, and I can take everything I need in the pockets of my Scottevest jacket. It can easily hold my MacBook Air, charging cables, iPhone, and AirPods, and no one is the wiser that I have anything on me.
In addition to jackets, they also have a great selection of shirts, pants, and dresses, all of which have extra pockets.
You can get 15% off all Scottevest products by going to Scottevest.com and using coupon code “EverythingEverywhere”, all one word, at checkout.
Once again that is Scottevest.com.
Let’s start this discussion with the definition of survivor bias. The simple definition is that it is a logical fallacy, a statistical error, or a cognitive shortcut in which you don’t look at an entire population when making an inference, because some part of the population didn’t meet certain criteria.
The classic example that illustrates the point is the one I mentioned in the introduction.
In World War II, the Army Air Corps wanted to provide additional armor to their bombers that were flying missions over Germany. Bomber crews had the highest fatality rate of any type of service group of the Allies during World War II.
In an attempt to reduce their losses, the Americans wanted to put more armor on bombers. The problem was, armor is heavy and the more armor a bomber had, the fewer bombs they could drop.
The solution was to just provide armor to the most vulnerable parts of an aircraft.
The Army began a study to analyze where the planes were being hit. When a plane returned to base after a mission, all the bullet holes would be documented and tallied.
A pattern soon emerged. When all the bullet holes were mapped out, there were clear clusters of them on the tips of the wings, the back part of the fuselage, and the tail.
The Army determined that these spots on a plane with the most bullet holes were where the armor should go.
Before they went ahead, they ran the data past a statistician by the name of Abraham Wald, who worked at Columbia University. He saw the same data and came to the exact opposite conclusion as the Army.
He noticed that there were no bullet holes recorded on the engines or the cockpit. He realized that they were only recording bullet holes on the planes that made it back to base. What they weren’t counting were all the planes that were shot down.
Bullet holes on the tips of the wings and on the tail didn’t indicate that those places should get extra armor. What it meant is that a plane could get shot in those places and still make it back to base.
The Army was only getting data from the planes which survived, not all the planes which actually flew. This became known as survivor bias.
Survivor bias can be found all around us. It can result in coming to the wrong conclusion about things because you aren’t looking at all the data.
There are several examples of survivor bias that creep up in business. One is trying to use success stories to guide your decisions.
Bill Gates, Steve Jobs, and Mark Zuckerberg were college dropouts who went on to become billionaires. Many people have used their examples to justify dropping out of college or never attending at all.
It is true that these men did drop out of college.
However, just focusing on these exceptions ignores the thousands of people who dropped out but didn’t achieve the same level of success. In fact, if you look at the entire population of people who have dropped out, overall they do more poorly than those who stay in school and complete their degrees.
The Dow Jones Industrial Average is claimed to have positive returns most years, and it is said that if you invest in the companies which make up the Dow Jones, you will most probably see decent returns over time.
What this ignores is that the companies that actually make up the Dow Jones Industrial Average change every few years. If you remember back to my episode on the companies which made up the original Dow Jones, had you bought stock in them when the index was created, you would have watched many of them, like the United States Leather Company, go out of business.
These underperforming companies are removed from the Dow Jones before they can ever bring the index down, and they are replaced with more successful companies.
The same is true with the S&P 500 and other indexes.
In the 1982 best-selling book In Search of Excellence, Tom Peters and Robert Waterman analyzed 40 companies to distill 8 rules for what makes a successful company. Of the 35 companies that were publicly traded, most have since underperformed the market.
When creating their rules of success, the book ignored all the companies which had failed. As a result, the rules of success turned out to not actually bring about success.
Survivor bias can find its way into scientific studies as well. In one classic study conducted in the 1930s, a researcher named Joseph Banks Rhine claimed to have discovered extrasensory perception or ESP in some people.
His study had people try to guess the order of cards that appeared on the other side of a screen. He tested a large number of people and then eliminated those who performed poorly, on the grounds that they didn’t have ESP.
He kept narrowing down the group until he had one that he was quite sure exhibited ESP. What really happened is that he started with a very large number of people, and some of them got lucky.
If you had a million people flipping coins, the odds are that one of them will toss 20 heads in a row. However, finding that one person doesn’t mean they have special powers.
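The intuition checks out with a quick back-of-the-envelope calculation. This isn’t from the episode, just an illustration of the arithmetic:

```python
# Probability that one fair coin comes up heads 20 times in a row
p_single = 0.5 ** 20  # 1 in 1,048,576

# Probability that at least one of a million flippers manages it
n_flippers = 1_000_000
p_at_least_one = 1 - (1 - p_single) ** n_flippers  # about 61%

print(f"One person's odds: 1 in {round(1 / p_single):,}")
print(f"Chance someone in a million does it: {p_at_least_one:.0%}")
```

So even though any individual has roughly a one-in-a-million shot, it is more likely than not that somebody in the crowd pulls it off, and that somebody will look like they have special powers.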
This is the same principle behind the famous investment newsletter scam. The way this highly fraudulent scam works is very simple.
Send a free investment newsletter to 10,000 people. Tell half of them a particular stock will go up, and the other half that it will go down.
Once you have a result, send out another similar letter to the 5,000 people who got the newsletter with the correct prediction.
Do this six times and you’ll eventually have a list of about 150 people. Those 150 people will then think that you’ve correctly picked stocks six times in a row without error. You then ask them to invest their money with you because they now think that you are a genius.
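The arithmetic behind the scam is nothing more than repeated halving. A sketch, using the numbers from the episode:

```python
# Each round, half the recipients receive a prediction that turns out
# to be correct; only they get the next letter.
recipients = 10_000
for round_number in range(1, 7):
    recipients //= 2
    print(f"After round {round_number}: {recipients} people have seen only correct calls")
# After six rounds, 156 people have watched you go six-for-six.
```

Those final 156 people have no idea about the other 9,844 letters; from their vantage point, the scammer has a perfect track record.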
One of the most stunning implications of survivor bias might be the conclusion reached by Stanford Medical researcher John Ioannidis, who concluded that 85% of published scientific research findings might be wrong.
Studies that find a negative result are almost never published. They aren’t sexy, and they don’t generate funding or attention. Most published studies can’t be replicated by other researchers. His conclusion is that an abnormally high number of false-positive results is being published. Much of this, but not all of it, is due to the fact that negative results, which fail to show that something is true, are usually ignored.
Survivor bias also comes into play with how we see the past.
Music from a certain period often seems better the further we get away from it. The reason is that we only hear good music, and most of the bad music has been forgotten.
Music from the 18th and 19th centuries is dominated by a few composers like Mozart and Beethoven. However, there were a LOT of composers back then. We never hear the others’ music because it isn’t played anymore; it simply wasn’t as good.
The classic example of this can be seen in the movie Amadeus, where Salieri lives long enough to see his music become forgotten.
Whenever someone talks about a golden age of anything, it is usually because they are only focusing on the good stuff which survived and ignoring all the bad stuff which was eliminated or forgotten.
Sometimes you can have a sort of reverse survivor bias. Instead of ignoring part of a population, you can count only part of it and come to a wrong conclusion.
World War I was the first major conflict that saw the widespread use of helmets.
One of the things medical staff noticed was that the number of head wound cases actually increased when soldiers began wearing helmets. Most of these injuries were due to shrapnel. This initially led many top military officials to consider getting rid of the helmets.
However, it was eventually pointed out that the increase in head wounds was due to men surviving head wounds that would have killed them if they hadn’t been wearing helmets. Dead soldiers don’t go to the medic.
In this case, the increase in injuries, which is normally a bad thing, was actually a good thing. They only knew it was a good thing by looking at the entire population of soldiers, not just the ones who went to the hospital.
I’ve seen people engage in survivor bias many times. Many people say that the only way to be successful in podcasting is to have started years ago. They look at the most successful podcasts, which did indeed start years ago.
However, they are only seeing the podcasts that survived, not the podcasts that started at the same time that are no longer around.
Survivor bias is something you should always be on the lookout for, even in your own thinking. If you don’t, you might end up putting your armor in the places that don’t need it.