By Eric Hahn

Survivorship Bias Explained


Long Term Capital Management is a perfect example of how hubris can derail even the most brilliant investors. The hedge fund was filled with PhDs and Nobel Laureates, but it was highly leveraged and relied far too heavily on quantitative risk models that didn't account for the possibility of unexpected events. When Russia defaulted on its debt in 1998, LTCM was toast, requiring a Fed-orchestrated bailout and losing nearly all of its investors' capital in the process.

Not only is LTCM a lesson in the dangers of overconfidence in the markets, but the way its demise was handled in the hedge fund index world is also instructive about the prevalence of survivorship bias in financial data.

Yale’s David Swensen explains in his book Unconventional Success:

Consider the record of Long Term Capital Management (LTCM), the infamous hedge fund that nearly brought down the world’s financial system. According to the New York Times, the database of Tremont Capital Management, a leading purveyor of hedge fund data, contains LTCM’s performance only through October 1997, nearly a year prior to the firm’s collapse.

Inception-to-date-of-reporting-cessation performance (March 1994 through October 1997) for LTCM stood at 32.4 percent per year net to investors, representing an impressive return on a large amount of capital. Obviously, Long Term Capital’s early record inflated the hedge fund industry’s aggregate results. From the point in October 1997 that Long Term Capital stopped reporting results to the point of the firm’s October 1998 demise, returns (if they can be called returns) amounted to -91.8 percent. The staggering loss appears nowhere in Tremont’s treasure trove of data.

The yawning chasm between Tremont’s reported account of 32.4 percent per annum and LTCM’s actual record of -27.0 percent per annum produces a staggering gap between perception and reality.

This means that all of the gains in LTCM's fund before it collapsed were included in these hedge fund index results, but its enormous losses after the fact were withheld. (All of the gains but none of the losses is a pretty good representation of most investor expectations, as well.) The reason is that hedge fund index performance numbers are self-reported by the managers. Once a fund shuts down, reporting fund-killing losses to an index provider is probably not at the top of its managers' to-do list. This means the actual performance numbers are likely not as good as they appear on paper.
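To see how Swensen's two figures reconcile, here is a quick back-of-the-envelope compounding check in Python. The return figures come straight from his quote; the period lengths (roughly 3.7 years of reporting, roughly 4.6 years of total life) are my approximations from the dates he cites.

```python
# Back-of-the-envelope check of the LTCM figures cited by Swensen.
# The period lengths are rough approximations from the dates in the quote.

annual_return = 0.324      # 32.4% per year net to investors while LTCM reported
collapse_loss = -0.918     # -91.8% from October 1997 through the October 1998 demise
reporting_years = 3.67     # ~March 1994 through October 1997
total_years = 4.58         # ~March 1994 through October 1998

# Growth of $1 over the period the fund reported to the database (~2.8x)
growth_while_reporting = (1 + annual_return) ** reporting_years

# Growth of $1 over the fund's full life, including the collapse (~0.23x)
growth_full_life = growth_while_reporting * (1 + collapse_loss)

# Annualizing the full-life number lands close to Swensen's -27% per annum
actual_annualized = growth_full_life ** (1 / total_years) - 1

print(f"Reported annualized return:  {annual_return:.1%}")       # 32.4%
print(f"Full-life annualized return: {actual_annualized:.1%}")   # roughly -27%
```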

This is a perfect example of the problem of survivorship bias in reported performance numbers. If you only focus on the winners and ignore the losers, your entire frame of reference gets thrown off and the data you're trying to assess can lead to false conclusions. Mutual fund companies are guilty of this as well: they promote the funds that have performed well lately and quietly ignore, or even shutter, the funds that are performing poorly.

Just because failed companies, mutual funds or hedge funds don’t exist anymore doesn’t mean they should be excluded from the historical data set, but that’s often what ends up happening.
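To make the effect concrete, here is a minimal sketch with purely hypothetical numbers: a made-up universe of five funds, two of which shut down after heavy losses, with the average return computed both with and without the dead funds.

```python
# Toy illustration of survivorship bias using made-up fund returns.
# Funds D and E shut down after heavy losses and stopped reporting.

funds = {
    "Fund A": 0.09,    # annualized return, still operating
    "Fund B": 0.06,    # still operating
    "Fund C": 0.11,    # still operating
    "Fund D": -0.22,   # shut down, dropped from the database
    "Fund E": -0.35,   # shut down, dropped from the database
}
survivors = {"Fund A", "Fund B", "Fund C"}

# The survivorship-biased average only counts funds that are still around
biased_avg = sum(funds[name] for name in survivors) / len(survivors)

# The honest average keeps the dead funds in the data set
true_avg = sum(funds.values()) / len(funds)

print(f"Average of surviving funds only: {biased_avg:.1%}")  # looks impressive
print(f"Average including dead funds:    {true_avg:.1%}")    # far less flattering
```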

And most investors don’t understand this concept — they take all facts and figures at face value.

Beyond survivorship bias, here are a few more considerations when dealing with financial data:

  • Money manager returns. It makes sense to judge a potential money manager based on their track record, but you still have to look beyond the numbers when assessing performance history. How much money was in the strategy when performance was the greatest? Have there been any changes in methodology? Are the same portfolio managers still in place who achieved those results? Were all accounts invested the same way or is this an aggregate roll-up? Is the strategy still viable going forward?

  • Backtests. Quants and academics are quick to point out how easy it should be to beat a market-cap-weighted index such as the S&P 500. In reality, few manage to pull it off. Why? Beyond the fact that markets are hard, these investors are rarely as disciplined when implementing their strategies under actual market conditions as they are on a spreadsheet looking at a backtest. Models, algorithms and systematic strategies can be a great way to invest in a rule-based fashion, but problems tend to arise when those executing the strategies step in and change the rules once things don't work out as planned. Backtests are easy; real-world implementation of backtests is hard (the sketch after this list illustrates the gap). One of the reasons indexes are so hard to beat is that they are far more disciplined than investors.

  • Historical market data. We have maybe 100 years' worth of decent financial market data in the U.S. to work with. There's no way that the data in the majority of those early decades is completely clean. And even if it were, investors back then didn't have access to it, so the fact that we now know what they didn't know then changes how the markets function. Markets and investors adapt and evolve as they continue to learn. Historical market data tells us more about potential risks than anything else.
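Here is the stylized sketch referenced in the backtests bullet above. The annual returns are invented for illustration; the point is simply that the same rule-based return stream compounds very differently for an investor who follows the rules every year versus one who sits in cash for a year after every losing year.

```python
# Hypothetical annual returns of a simple rule-based strategy.
# These numbers are invented for illustration, not a real backtest.
strategy_returns = [0.12, -0.08, 0.15, 0.10, -0.15, 0.22, 0.09, -0.05, 0.18, 0.07]

def compound(returns):
    """Growth of $1 from a sequence of annual returns."""
    wealth = 1.0
    for r in returns:
        wealth *= 1 + r
    return wealth

# The backtest follows the rules every single year.
backtest_growth = compound(strategy_returns)

# The "lived" version abandons the rules after any losing year and sits in
# cash (0% return) the following year before getting back in.
lived_returns = []
in_cash = False
for r in strategy_returns:
    if in_cash:
        lived_returns.append(0.0)  # sitting out this year
        in_cash = False            # re-enters the year after
    else:
        lived_returns.append(r)
        in_cash = r < 0            # a losing year shakes the investor out

lived_growth = compound(lived_returns)

print(f"Backtest growth of $1:     {backtest_growth:.2f}")
print(f"Lived-result growth of $1: {lived_growth:.2f}")
```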

Here are some of my rules of thumb on how to use data in the financial world:

  • Understand your sources of data.

  • Never blindly accept facts and figures about the markets just because they contain decimal places.

  • All numbers require context.

  • Look for holes in the data and your own thinking.

  • Don’t forget about implementation costs and behavioral issues.

  • You can use historical data to guide your actions, but always be aware of the limitations.
