Recommendation
According to critic Harold Bloom, Hamlet’s predicament is not “that he thinks too much” but rather that “he thinks too well,” being ultimately “unable to rest in illusions of any kind.” The same could be said for philosopher, essayist and trader Nassim Nicholas Taleb, who finds something rotten in misguided yet supremely confident investment gurus, traders, hedge fund managers, Wall Street bankers, M.B.A.s, CEOs, Nobel-winning economists and others who claim that they can predict the future and explain the past. Like everyone else, says Taleb, these so-called “experts” fail to appreciate “black swans”: highly consequential but unlikely events that render predictions and standard explanations worse than worthless. Taleb’s style is personal and literary, but his heterodox insights are rigorous (if sometimes jolted by authorial filigree). This combination makes for a thrilling, disturbing, contentious and unforgettable book on chance and randomness. Taleb offers strong, even somewhat bitter, medicine, but it is worth taking for anyone who wants a powerful inoculation against gullibility.
SUMMARY
When All Swans Were White
Before 1697, teachers confidently taught European schoolchildren that all swans were white. They had little reason to think otherwise since every swan ever examined had the same snowy plumage. But then Dutch explorer Willem de Vlamingh landed in Australia. Among the many unlikely creatures down under – odd, hopping marsupials called kangaroos, furry duck-billed platypuses, teddy bear-like koalas – de Vlamingh found dark-feathered birds that looked remarkably like swans. Black swans? Yes. Once observed, they were as unmistakable as they had been unimaginable, and they forced Europeans to revise forever their concept of “swan.” In time, black swans came to seem ordinary.
This pattern is common. Just because you haven’t seen a black swan doesn’t mean that there are no black swans. Unlikely events seem impossible when they lie in the unknown or in the future. But after they happen, people assimilate them into their conception of the world. The extraordinary becomes ordinary, and “experts” such as policy pundits and market prognosticators kick themselves because they didn’t predict the (now seemingly obvious) occurrence of the (then) unlikely event. Think of the advent of World Wars I and II, the terrorist attacks of 9/11, the popping of the 1990s Internet stock bubble, or world-changing inventions like the internal combustion engine, the personal computer and the Internet. Cultural fads like the Harry Potter books are the same. These events and inventions came out of nowhere, yet in hindsight, they seem almost inevitable. Why?
The human mind is wonderful at simplifying the onslaught of today’s “blooming, buzzing confusion” of data. This makes perfect sense: After all, the brain is the product of evolution, which works with what it has, and so it has not crafted some new, ideal cognitive mechanism. The human brain is a marvel, but it is built for living in hunter-gatherer groups on the African savanna 200,000 years ago. Then, it just needed to be good enough to allow humans to survive until they reached reproductive age. Simplifications, mental schemas, heuristics, biases, self-deception – these are not “bugs” in the cognitive system, but useful features that allow the human mind to concentrate on the task at hand and not get overwhelmed by a virtually infinite amount of data. But human simplifying mechanisms are not without their costs. Take stories, for example.
The Narrative Fallacy
Stories help people remember and make sense of the past. Think of a typical business magazine profile of a successful businessman. The story begins in the present after he has become rich beyond his wildest dreams. The story then cuts back to his humble beginnings. He started with nothing and wanted to get rich (in terms of story structure, his “dramatic need”). He faced obstacle after obstacle (perhaps he had a rival – the “antagonist”). But he made shrewd decisions and flouted the wisdom of the Cassandras who counseled caution (“Idiots!”). As success built on success, he amassed a fortune. He retired early, married a model and now has brilliant children who play Chopin blindfolded and will all attend Ivy League colleges. His virtues will be extolled in a B-School case study. Wide-eyed M.B.A. students will sit rapt at his feet when he visits their schools on a lecture tour promoting his latest book. He is a superman, an inspiration.
Now consider an alternative hypothesis: He got lucky. His putative “virtues” had nothing to do with his success. He is, essentially, a lottery winner. The public looks at his life and concocts a story about how brilliant he was, when, in fact, he was merely at the right place at the right time. This is the “ludic fallacy” (ludus is Latin for “game”): People underestimate luck in life – though they ironically overestimate it in certain games of “chance.” Even the businessman himself falls victim to flawed thinking through the self-sampling bias. He looks at himself, a sample of one, and draws a sweeping conclusion, such as, “If I can do it, anyone can!” Notice that the same reasoning would apply had he merely bought a winning lottery ticket. “I’m a genius for picking 3293927! Those long odds didn’t mean a darn thing. I mean, after all, I won, didn’t I?”
Not all success is luck. In some professions, skill matters (for example, if you are a dentist), but luck dominates in others. In the case of the inspiring businessman, consider his population cohort. Where are all the similarly situated people who started out like him and have the same attributes? Are they also rich? Or homeless? Usually, you can’t find this sort of “silent” disconfirming evidence. Artistic success provides a perfect illustration. While Balzac is famous now, perhaps countless other equally talented writers were producing comparable work at the same time. Yet their writings are lost to posterity because they did not succeed. Their “failure” hides the evidence that would undercut Balzac’s “success” as a uniquely great writer. The evidence is silent, lost in the graveyard of history.
The mind uses many more simplifying schemas that can lead to error. Once people have theories, they seek confirming evidence; this is called “confirmation bias.” They fall victim to “epistemic arrogance,” becoming overconfident about their ideas and failing to account for randomness. To make their theories work, people “smooth out” the “jumps” in a time series or historical sequence, looking for and finding patterns that are not there. Conceptual categories limit what people see; this is called “tunneling.” They turn to “experts” for help, but these expert opinions are often no better – and sometimes worse – than the “insights” gained from flipping a coin or hiring a trained chimp to throw darts at the stock listings. Worst of all, people consistently fail to consider “black swans,” the highly consequential rare events that drive history.
“Mediocristan” or “Extremistan”?
So the human mind tends to smooth away the rough features of reality. Does this matter? It can matter a great deal, depending on whether you’re in “Mediocristan” or “Extremistan.” Where are these strange places? Nowhere. They are memorable metaphors for two wildly different classes of natural phenomena. Mediocristan refers to phenomena you can describe with standard statistical concepts, like the Gaussian distribution, known as the “bell curve.” Extremistan refers to phenomena where a single, curve-distorting event or person can radically skew the distribution. Imagine citing Bill Gates in a comparison of executive incomes.
To understand the difference, think about human height versus movie ticket sales. While a sample of human beings may contain some very tall people (perhaps someone eight feet tall) and some very short people (perhaps someone two feet tall), you wouldn’t find anyone 3,000 feet tall or an inch tall. Nature limits the heights in the sample. Now consider movie ticket sales. One hit movie can have sales that exceed the median film’s by such a radical extent that modeling the sample with a Gaussian curve is misleading – the single outlier dominates the total, rendering the average nearly meaningless as a summary. You’d be better off using a different kind of curve for such data, for instance, the “power law” curve from the work of Vilfredo Pareto (of 80/20 “law” fame). In a power law-modeled distribution, extreme events are not treated as outliers. In fact, they determine the shape of the curve.
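A minimal sketch makes the contrast concrete. It is not from the book; the distributions and parameters (Gaussian heights, a Pareto tail exponent of 1.2 for ticket sales) are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Mediocristan: human heights in inches, roughly Gaussian
# (mean 67, standard deviation 4 are illustrative assumptions).
heights = rng.normal(loc=67, scale=4, size=n)

# Extremistan: hypothetical per-film ticket sales drawn from a
# power-law (Pareto) distribution; alpha = 1.2 is an assumed tail exponent.
sales = (rng.pareto(1.2, size=n) + 1) * 1_000_000

for name, data in (("heights", heights), ("ticket sales", sales)):
    share = data.max() / data.sum()
    print(f"{name}: largest single observation is {share:.4%} of the total")
```

In the Gaussian sample, no single person contributes meaningfully to the total; in the Pareto sample, one “hit” can account for a sizable share of all sales.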
Social phenomena resist modeling with the Gaussian normal distribution because these phenomena exhibit “social contagion,” that is, abundant feedback loops. For instance, one reason you want to see a hit movie is that everyone else has seen it and is talking about it. It becomes a cultural event that you don’t want to miss. And neither does anyone else. In these situations, the “rich get richer”: The hit film gets increasingly popular because of its popularity until some arbitrarily large number of people have seen it. And speaking of rich, wealth follows this pattern, too. The extremely wealthy are not just a little bit wealthier than normal rich people; they are so much wealthier that they skew the distribution. If you and Bill Gates share a cab, the average wealth in the cab can be north of $25 billion. But the distribution is not bell-shaped. When this happens, odds are you’re no longer in Kansas. You’re in Extremistan.
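A tiny worked example, with hypothetical numbers, shows how one passenger wrecks the average while the median barely notices:

```python
# Hypothetical cab: nine passengers worth $100,000 each plus one worth
# $50 billion (a stand-in for Bill Gates; all figures are illustrative).
wealth = [100_000] * 9 + [50_000_000_000]

mean = sum(wealth) / len(wealth)            # dominated by the single outlier
median = sorted(wealth)[len(wealth) // 2]   # barely notices him

print(f"mean wealth:   ${mean:,.0f}")    # $5,000,090,000
print(f"median wealth: ${median:,.0f}")  # $100,000
```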
Phony Forecasting (or Nerds and Herds)
Extremistan might not be so bad if you could predict when outliers would occur and what their magnitude might be. But no one can do this precisely. Consider hit movies. Screenwriter William Goldman is famous for describing the “secret” of Hollywood hits: Nobody can predict one. Similarly, no one knew whether a book by a mother on welfare about a boy wizard with a lightning-shaped scar would flop or make the author a billionaire. Stock prices are the same way. Anyone who claims to be able to predict the price of a stock or commodity years in the future is a charlatan. Yet the magazines are filled with the latest “insider” advice about what the market will do. Ditto for technology. Do you know what the “next big thing” will be? No. No one does. Prognosticators generally miss the big important events – the black swans that impel history.
Chalk these errors up to “nerds and herds.” Nerds are people who can only think in terms of the tools they have been taught to use. When all you have is a hammer, everything becomes a nail. If all you have are Gaussian curves, sigma (standard deviation) and mild, ordinary randomness, you’ll see bell curves everywhere and explain away disconfirming data as “outliers,” “noise” or “exogenous shocks.” (The proliferation of Excel spreadsheets allowing every user to fit a regression line to any messy series of data doesn’t help.) Further, humans follow the herd and look to “experts” for guidance. Yet, some domains can’t have experts because the phenomena the expert is supposed to know are inherently and wildly random. Of course, this discomforting thought invites a palliative: the belief that the world is far more orderly and uniform than it actually is. This soothing belief usually serves people well. Then comes a stock market drop or 9/11 (on the downside), or Star Wars and the Internet (on the upside), and the curve is shot.
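A minimal sketch of the nerd’s mistake, under the assumption that real data are fat-tailed (here a Student-t distribution stands in for market returns; nothing about it comes from the book): a Gaussian model fitted to the same standard deviation predicts extreme moves orders of magnitude less often than they actually occur.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 1_000_000

# Assumption: fat-tailed "returns" drawn from a Student-t distribution
# with 3 degrees of freedom, a common stand-in for market data.
returns = rng.standard_t(df=3, size=n)

sigma = returns.std()
observed = np.mean(np.abs(returns) > 5 * sigma)  # empirical 5-sigma frequency
predicted = 2 * norm.sf(5)                       # Gaussian two-sided 5-sigma tail

print(f"observed frequency of >5-sigma moves: {observed:.2e}")
print(f"Gaussian model predicts:              {predicted:.2e}")
```

The “outliers” the bell-curve user throws away are exactly the events that matter most.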
Befriending Black Swans
Even given these grim facts, the world need not become, in Hamlet’s words, “a sterile promontory,” nor need a beautiful sky appear “a foul and pestilent congregation of vapors.” You can tame, if not befriend, the black swan by cultivating some “epistemic virtues”:
Keep your eyes open for black swans – Look around and realize when you are in Extremistan rather than Mediocristan. Social contagion and rich-get-richer phenomena are clues that you’ve just gotten off the bus in Extremistan.
Beliefs are “sticky,” but don’t get glued to them – Revise your beliefs when confronted with contrary evidence. Dare to say, “I don’t know,” “I was wrong” or “It didn’t work.”
Know where you can be a fool and where you can’t – Are you trying to predict what sort of birthday cake your daughter wants? Or the price of oil in 17 years after investing your life’s savings in oil futures? You can’t help being foolish – no one can. But sometimes foolishness is dangerous, and sometimes it is benign.
Know that in many cases, you cannot know – Think outside your usual, customary conceptual categories. Eliminate alternatives that you know are wrong rather than always trying to find out what is right.
As a forecasting period lengthens, prediction errors grow exponentially – Suspend judgment where evidence is lacking and be wary of overly precise predictions. “Fuzzy” thinking can be more useful. Often you should focus only on consequences, not overly precise probabilities.
Expose yourself to “positive black swans” – And, at the same time, hedge against negative ones. “Bet pennies to win dollars.” Look for asymmetries where favorable consequences are greater than unfavorable ones (a worked example follows this list). Maximize the possibilities of serendipity by, say, living in a city and having a wide circle of diverse friends and business associates.
Look for the non-obvious – Seek out disconfirming evidence for pet theories. Ask, “What event would refute this theory?” rather than just stacking up confirming evidence for the sake of consistency and tuning out any evidence that contradicts your notion. In other words: Amassing confirming evidence doesn’t prove a theory or a mental model.
Avoid dogmatism – “De-narrate” the past and remember that stories mislead. That’s the whole point: They are psychological armor against the “slings and arrows of outrageous fortune.” Think for yourself. Avoid nerds and herds.

This universe, this planet and your life were highly unlikely. But they happened. Enjoy your good fortune and remember that you are a black swan.
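As promised above, here is a minimal worked illustration of the “bet pennies to win dollars” asymmetry; all the numbers are hypothetical:

```python
# Hypothetical asymmetric bet: lose $1 with probability 0.99,
# win $500 with probability 0.01 (illustrative figures only).
p_win, win, lose = 0.01, 500.0, -1.0

expected_value = p_win * win + (1 - p_win) * lose
print(f"expected value per bet: ${expected_value:.2f}")  # +$4.01

# You lose 99% of the time, yet the rare favorable outcome dominates:
# the strategy profits on average despite being "usually wrong."
```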
Quotes
“We humans are an extremely lucky species, and...we got the genes of the risk takers. The foolish risk takers, that is.”
“We like stories, we like to summarize, and we like to simplify, i.e., to reduce the dimension of matters.”
“Now, I do not disagree with those recommending the use of a narrative to get attention...It is just that narrative can be lethal when used in the wrong places.”
“Notice that close to two centuries ago people had an idealized opinion of their own past, just as we have an idealized opinion of today's past.”
“I know that history is going to be dominated by an improbable event, I just don't know what that event will be.”
“Prediction, not narration, is the real test of our understanding of the world.”
“I find it scandalous that in spite of the empirical record we continue to project into the future as if we were good at it, using tools and methods that exclude rare events.”
“What matters is not how often you are right, but how large your cumulative errors are.”
“Put yourself in situations where favorable consequences are much larger than unfavorable ones.”
“We misunderstand the logic of large deviations from the norm.”
“Every morning the world appears to me more random than it did the day before, and humans seem to be even more fooled by it than they were the previous day.”
About The Author
Nassim Nicholas Taleb, a former derivatives trader, is Dean’s Professor in the Sciences of Uncertainty at the University of Massachusetts and teaches at New York University’s Courant Institute of Mathematical Sciences. He also wrote Fooled by Randomness.