Another reprint from Mormon Matters. I’m sticking with the ‘what is history?’ theme here.
In my last post I talked about how God helped me develop a more realistic, though uncomfortable, world view that excluded faith in myself. As it turns out, there is scientific backing for this view. The first book that introduced me to that science is called The Black Swan: The Impact of the Highly Improbable by Nassim Nicholas Taleb (whom I’ll abbreviate as NNT from here on).
The book’s name comes from the idea that the human brain is not wired to deal with improbable events so we simply discount their possibility:
Before the discovery of Australia, people in the Old World were convinced that all swans were white, an unassailable belief as it seemed completely confirmed by empirical evidence. [i.e. no one had ever seen a Black Swan to that date] …[this story] illustrates a severe limitation to our learning from observations or experience and the fragility of our knowledge. One single observation can invalidate a general statement derived from millennia of confirmatory sightings of millions of white swans. All you need is one single… black bird. (p. xvii)
More to the point: “Black Swan logic makes what you don’t know far more relevant than what you do know.” (p. xix)
A “Black Swan Event” has three attributes:
- “…it is an outlier, as it lies outside the realm of regular expectations, because nothing in the past can convincingly point to its possibility”
- “it carries an extreme impact.”
- “in spite of its outlier status, human nature makes us concoct explanations for its occurrence after the fact, making it explainable and predictable.” (p. xvii)
The book makes the case that everything of significance in life is due to Black Swan events, yet our brains are wired to pretend they don’t exist and won’t happen again.
“Black Swan blindness” plays out as a major part of our lives. It is the reason why we often get into debt to the point where even a few weeks without a job would destroy us financially. We are all wired to ignore Black Swans despite their overwhelming impact on all our lives.
…our minds are wonderful explanation machines, capable of making sense out of almost anything, capable of mounting explanations for all manner of phenomena, and generally incapable of accepting the idea of unpredictability. [Black Swan] events [are] unexplainable, but intelligent people thought they were capable of providing convincing explanations for them – after the fact. Furthermore, the more intelligent the person, the better sounding the explanation. What’s more worrisome is that all these beliefs and accounts appeared to be logically coherent and devoid of inconsistencies. (p. 10)
Financial Markets and Financial Experts
Finance theory is the first widely accepted “science” the book debunks with Black Swan logic. Financial markets are driven primarily by Black Swan events, not by the bell-curve-based statistical measurements of modern portfolio theory.
…the application of the sciences of uncertainty to real-world problems has had ridiculous effects …Go ask your portfolio manager for his definition of “risk,” and odds are that he will supply you with a measure that excludes the possibility of the Black Swan – hence one that has no better predictive value for assessing the total risks than astrology. (p. xviii)
Yet we continue to use modern portfolio theory as if it meant something. This faith in a model that is a mismatch to real life reaches hilarious proportions at times.
How likely is a 10 sigma event on a bell curve? It’s so infinitesimally small that there is no point in considering it. That is why every mainstream mathematical financial model effectively assumes there will be no 10 sigma events: under the bell curve, they just aren’t worth considering.
But how often do 10 sigma events happen in the stock market? Did you know that if you remove the ten biggest one-day moves for the U.S. stock market over the past 50 years, your returns are cut in half? Half the gains in the stock market are directly due to Black Swan events that the mathematical models we rely on assume will never happen.
And consider the stock market crash of 1987. That was a 20 sigma event! “If the world of finance were Gaussian [i.e. a bell curve statistical model as we assume], an episode such as the crash [of 1987] (more than 20 standard deviations) would take place every several billion lifetimes of the universe.” (p. 276)
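To get a feel for just how impossible a Gaussian model says these moves are, here is a quick Python sketch of my own (a back-of-the-envelope illustration, not something from the book) that computes the one-sided tail probability of a 10 sigma and a 20 sigma daily move and turns it into an expected waiting time, assuming roughly 252 trading days per year:

```python
# Illustration only (not from the book): how a Gaussian (bell curve) model
# rates extreme daily market moves.
from math import erfc, sqrt

TRADING_DAYS_PER_YEAR = 252  # rough convention, an assumption for this sketch

for sigma in (10, 20):
    # One-sided Gaussian tail probability: P(a move worse than `sigma` standard deviations)
    p = 0.5 * erfc(sigma / sqrt(2))
    years_per_event = 1 / (p * TRADING_DAYS_PER_YEAR)
    print(f"{sigma} sigma: daily probability ~ {p:.2e}, "
          f"expected roughly once every {years_per_event:.2e} years")
```

For scale, the universe is only about 14 billion years old, so under a bell curve a 20 sigma day should never show up in any recorded history. Yet 1987 happened.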
Note: I wrote this article before the current stock market crash, where we ended up with multiple many-sigma events all in one week.
So why do we continue to use a mathematical model that apparently has negative value? Because “people want a number to anchor on. Yet the two methods are logically incompatible.” (p. 276)
On Melting Ice Cubes
Any process that is based on retroactively explaining a set of data suffers from similar problems. NNT challenges us to imagine a melting ice cube and to predict what the puddle will look like after it melts. “If you have the right models… you can predict with great precision” (p. 196)
Now imagine instead a puddle of water on the floor. “Now try to reconstruct in your mind’s eye the shape of the ice cube it may once have been. Note that the puddle may not have necessarily originated from an ice cube.”
The first direction, from the ice cube to the puddle, is called the forward process. The second direction, the backward process, is much, much more complicated. The forward process is generally used in physics and engineering; the backward process in nonrepeatable, nonexperimental historical approaches. In a way, the limitations that prevent us from unfrying an egg also prevent us from reverse engineering history. (p. 196)
NNT applies this concept to trying to determine history. The issue with history is that it’s impossible to reconstruct what really happened; we only know what the end result was:
This brings me to a greater problem with the historian’s craft. I will state the fundamental problem of practice as follows: while in theory randomness is an intrinsic property, in practice, randomness is incomplete information… (p. 198)
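Here is a toy example of my own (not NNT’s) of why the backward direction is so much harder: running a history forward gives you exactly one outcome, but a single observed outcome is compatible with many different histories, so the outcome alone can’t tell you which one actually happened.

```python
# Toy illustration (mine, not from the book): the map from history to outcome is many-to-one.
# Forward: pick a sequence of +1/-1 steps and you get exactly one final position.
# Backward: given only the final position, many different sequences could have produced it.
import itertools

STEPS = 6    # number of +1/-1 steps in each hypothetical history
OUTCOME = 4  # the observed end result (final position after the steps)

compatible_histories = [
    moves for moves in itertools.product((+1, -1), repeat=STEPS)
    if sum(moves) == OUTCOME
]

print(f"{len(compatible_histories)} distinct histories all end at {OUTCOME}:")
for moves in compatible_histories:
    print(moves)
```

Even with only six steps there are already several equally valid “histories”; with anything like real-world complexity the number explodes. That is the puddle-to-ice-cube problem in miniature.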
Narrative Fallacy
But we can’t help but believe that we can reverse engineer history. Why? Due to something called the Narrative Fallacy. The Narrative Fallacy relates to the third attribute of a Black Swan: “human nature makes us concoct explanations for its occurrence after the fact, making it explainable and predictable.”
I have noticed that we come up with simple explanations for every complex thing around us. Unfortunately these explanations are actually fiction.
The narrative fallacy addresses our limited ability to look at sequences of facts without weaving an explanation into them, or, equivalently, forcing a logical link, an arrow of relationship, upon them. Explanations bind facts together. They make them all the more easily remembered; they help them make more sense. (p. 64)
And why do we naturally do this? NNT’s explanation is that evolution built our brains this way. However, he points out that this isn’t the whole story. “The problem of narrativity… is not so ‘psychological’… it is more generally a problem of information. …Information wants to be reduced.” (p. 64)
He gives some humorous examples of how the narrative fallacy often leads us astray:
In an experiment, psychologists asked women to select from among twelve pairs of nylon stockings the ones they preferred. The researchers then asked the women their reasons for their choices. …All the pairs of stockings were, in fact, identical. The women supplied backfit, post hoc explanations. (p. 65)
He also uses the famous example of split-brain patients. (p. 65) In certain epilepsy patients an operation is performed to separate the two halves of the brain. It is then possible to communicate with each half of the brain separately by showing written instructions to only one side of the patient’s visual field. When the person performs the act requested, the half of the brain that talks and explains things doesn’t know why, yet it invariably concocts an immediate explanation that is made up but that the person is convinced is true. For example, the researcher might ask the patient’s right hemisphere to walk across the room. When the patient does so, the researcher then asks the person (who talks and explains through the left hemisphere) why they just walked across the room. They might respond, “I wanted to get a Coke.”
NNT explains this is because “you interpret pretty much as you perform other activities deemed automatic and outside your control, like breathing.” (p. 66) This is why I can recognize narrative fallacies in others but not in myself, and why I can’t stop myself from falling into them.
And if I could stop narrative fallacies, it might not turn out to be a good thing. Narrative Fallacies are a survival technique that lets us cope with the complexities of the world. “The same condition that makes us simplify [i.e. so that we can deal with our complex lives] pushes us to think that the world is less random than it actually is.” (p. 69)
Does Having More Information Make You Smarter?
In Psych 101 I remember learning about an experiment where subjects were shown an out-of-focus image that was slowly brought into focus. The subjects who watched the image from the beginning were worse at correctly identifying it than subjects brought in partway through the focusing process. NNT uses a similar experiment to show that more information is actually harmful to comprehension:
Show two groups of people a blurry image of a fire hydrant… For one group, increase the resolution slowly, in ten steps. For the second, do it faster, in five steps. Stop at a point where both groups have been presented an identical image…. The members of the group that saw fewer intermediate steps are likely to recognize the hydrant much faster. Moral? The more information you give someone, the more hypotheses they will formulate along the way, and the worse off they will be. They see more random noise and mistake it for information. The problem is that our ideas are sticky: once we produce a theory, we are not likely to change our minds – so those who delay developing their theories are better off. When you develop your opinions on the basis of weak evidence, you will have difficulty interpreting subsequent information that contradicts these opinions, even if this new information is obviously more accurate. (p. 144)
This is why people who continuously listen to the news on radio or television, getting up-to-the-minute information on a subject, will be at a disadvantage compared to someone who, say, reads a weekly magazine. (p. 144) Having more information isn’t the same as knowing more.
It also means one person’s theory on an unknowable truth isn’t really measurably better or worse than anyone else’s theory on the same subject. As NNT puts it:
Consider that two people can hold incompatible beliefs based on the exact same data. …One may have a million ways to explain things, but the true explanation is unique, whether or not it is within our reach. (p. 72)
I originally intended to end this post here. But I then came across another book that explained that the basis for our problems with Black Swans is actually very much biological. I will post the rest later this weekend. And I apologize for having so many posts so close together. This is such an important subject that affects all of us. But it’s a complex subject that really does need several posts and time to digest.