The Black Swan Synopsis
Nassim Taleb examines the impact and human psychology of extreme events. Broadly, he shows that humans tend to create environments where extreme events are more likely and then underestimate their probability. This results in everything from world wars to major market meltdowns.
It is impossible to avoid Black Swans, but it is possible to become robust to negative ones and gain exposure to positive ones.
The Black Swan Quotes and Notes
Black Swan logic makes what you don’t know far more relevant than what you do know.
Notes: 1) Or they make what most people don’t know more relevant. Taleb saw fragility in the 2008 market when others didn’t. You can measure fragility.
The next killing in the restaurant industry needs to be an idea that is not easily conceived of by the current population of restaurateurs. It has to be at some distance from expectations.
Notes: 1) Black Swan-potential ideas will feel risky because they are different
The payoff of a human venture is, in general, inversely proportional to what it is expected to be.
Notes: 1) tinkering leads to unicorns
The strategy for the discoverers and entrepreneurs is to rely less on top-down planning and focus on maximum tinkering and recognizing opportunities when they present themselves. So I disagree with the followers of Marx and those of Adam Smith: the reason free markets work is because they allow people to be lucky, thanks to aggressive trial and error, not by giving rewards or “incentives” for skill. The strategy is, then, to tinker as much as possible and try to collect as many Black Swan opportunities as you can.
Notes: 1) you want a portfolio of black swans fitting the Kelly criterion. they should be big enough bets that the returns are meaningful.
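A rough sketch of that sizing using the Kelly formula; the 2%-chance / 100x-payoff numbers are hypothetical, just to show why the right stake on any single long shot is small but not zero:

```python
# Minimal sketch: Kelly fraction for a long-shot, Black Swan-style bet.
# All numbers are hypothetical illustrations, not recommendations.

def kelly_fraction(p_win: float, payoff_multiple: float) -> float:
    """Kelly criterion for a bet that pays `payoff_multiple` times the stake
    on a win and loses the stake otherwise: f* = p - (1 - p) / b."""
    b = payoff_multiple
    return p_win - (1.0 - p_win) / b

# A hypothetical venture-style bet: 2% chance of a 100x payoff.
f = kelly_fraction(p_win=0.02, payoff_multiple=100)
print(f"Full Kelly stake: {f:.2%} of bankroll")                       # ~1.02%
print(f"Half Kelly (more robust to estimation error): {f / 2:.2%}")
```

A portfolio of many such small stakes is what makes the total exposure meaningful.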
We tend to learn the precise, not the general. What did people learn from the 9/11 episode? Did they learn that some events, owing to their dynamics, stand largely outside the realm of the predictable? No. Did they learn the built-in defect of conventional wisdom? No. What did they figure out? They learned precise rules for avoiding Islamic proto-terrorists and tall buildings. Many keep reminding me that it is important for us to be practical and take tangible steps rather than to “theorize” about knowledge. The story of the Maginot Line shows how we are conditioned to be specific. The French, after the Great War, built a wall along the previous German invasion route to prevent reinvasion—Hitler just (almost) effortlessly went around it. The French had been excellent students of history; they just learned with too much precision. They were too practical and exceedingly focused for their own safety.
Notes: 1) this is my beef with actionable content
recursive environment.
Notes: 1) increasing recursion fattens the power-law distribution, e.g., the Amazon bestseller list shifts the distribution from 80/20 toward 90/10
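A quick check of that note: under a Pareto model, the 80/20 and 90/10 rules correspond to tail exponents of roughly 1.16 and 1.05, so a small drop in the exponent is a big jump in concentration. Minimal sketch:

```python
import math

def pareto_alpha(top_fraction: float, share_of_total: float) -> float:
    """Tail exponent alpha of a Pareto distribution in which the top
    `top_fraction` of the population holds `share_of_total` of the total,
    using share = top_fraction ** (1 - 1/alpha)."""
    exponent = math.log(share_of_total) / math.log(top_fraction)
    return 1.0 / (1.0 - exponent)

print(f"80/20 rule -> alpha ~ {pareto_alpha(0.20, 0.80):.2f}")  # ~1.16
print(f"90/10 rule -> alpha ~ {pareto_alpha(0.10, 0.90):.2f}")  # ~1.05
```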
Platonicity, after the ideas (and personality) of the philosopher Plato, is our tendency to mistake the map for the territory, to focus on pure and well-defined “forms,” whether objects, like triangles, or social notions, like utopias (societies built according to some blueprint of what “makes sense”), even nationalities. When these ideas and crisp constructs inhabit our minds, we privilege them over other less elegant objects, those with messier and less tractable structures (an idea that I will elaborate progressively throughout this book). Platonicity is what makes us think that we understand more than we actually do.
Notes: 1) Plato really fucked us.
to avoid dullness may help to filter out the nonessential.
There is a contradiction; this book is a story, and I prefer to use stories and vignettes to illustrate our gullibility about stories and our preference for the dangerous compression of narratives.
People were suddenly brainwashed to believe in the nation-state as an entity.
Notes: 1) There’s no such thing
You can afford to be compassionate, lax, and courteous if, once in a while, when it is least expected of you, but completely justified, you sue someone, or savage an enemy, just to show that you can walk the walk.
The triplet of opacity. They are: the illusion of understanding, or how everyone thinks he knows what is going on in a world that is more complicated (or random) than they realize; the retrospective distortion, or how we can assess matters only after the fact, as if they were in a rearview mirror (history seems clearer and more organized in history books than in empirical reality); and the overvaluation of factual information and the handicap of authoritative and learned people, particularly when they create categories—when they “Platonify.”
History Does Not Crawl, It Jumps
Notes: 1) punctuated equilibrium
I did not grasp much, except that history had some logic and that things developed through contradiction (or opposites) in a way that elevated mankind into higher forms of society
Notes: 1) This is determinism and Cartesian dualism. Our philosophical baggage.
Platonicity, the desire to cut reality into crisp shapes.
Notes: 1) this is a great idea and it happens everywhere
Categorizing is necessary for humans, but it becomes pathological when the category is seen as definitive, preventing people from considering the fuzziness of boundaries, let alone revising their categories.
Notes: 1) the calculus of grit. we are all specialists but not necessarily along acknowledged lines of partition.
someone with plans to become a “philosopher” or a “scientific philosopher of history” would wind up in business school, and the Wharton School no less, still escapes me.
Notes: 1) this shaped me a lot when I first read it. I loved Marx and wanted to be a scientific philosopher of history and went into business
the most potent country in the history of the world, the executives of the most powerful corporations were coming to describe what they did for a living, and it was possible that they too did not know what was going on.
My idea is that not only are some scientific results useless in real life, because they underestimate the impact of the highly improbable (or lead us to ignore it), but that many of them may be actually creating Black Swans.
Notes: 1) the efficient market hypothesis ironically increases the number and magnitude of black swans
take a sabbatical year for every three on average to fill up gaps in my scientific and philosophical culture.
Notes: 1) mini retirements
slowly distill my single idea, I wanted to become a flâneur, a professional meditator, sit in cafés, lounge, unglued to desks and organization structures, sleep as long as I needed, read voraciously, and not owe any explanation to anybody. I wanted to be left alone in order to build, small steps at a time, an entire system of thought based on my Black Swan idea.
the central distinction between the Black Swan–generating province of Extremistan and the tame, quiet, and uneventful province of Mediocristan.
Notes: 1) he gave clever names to Gaussian and fat-tailed distributions
If I myself had to give advice, I would recommend someone pick a profession that is not scalable! A scalable profession is good only if you are successful; they are more competitive, produce monstrous inequalities, and are far more random, with huge disparities between efforts and rewards—a few can take a large share of the pie, leaving others out entirely at no fault of their own.
Notes: 1) pick something nonscalable to start which forms your floor then go after upside
believe that the big transition in social life came not with the gramophone, but when someone had the great but unjust idea to invent the alphabet, thus allowing us to store information and reproduce it. It accelerated further when another inventor had the even more dangerous and iniquitous notion of starting a printing press, thus promoting texts across boundaries and triggering what ultimately grew into a winner-take-all ecology.
Now, what was so unjust about the spread of books? The alphabet allowed stories and ideas to be replicated with high fidelity and without limit, without any additional expenditure of energy on the author’s part for the subsequent performances. He didn’t even have to be alive for them—death is often a good career move for an author. This implies that those who, for some reason, start getting some attention can quickly reach more minds than others and displace the competitors from the bookshelves. In the days of bards and troubadours, everyone had an audience.
A storyteller, like a baker or a coppersmith, had a market, and the assurance that none from far away could dislodge him from his territory. Today, a few take almost everything; the rest, next to nothing.
Notes: 1) what is the relationship between black swans and disintermediation? it makes the world more just and therefore more unequal?
people do not fall in love with works of art only for their own sake, but also in order to feel that they belong to a community.
Notes: 1) why you have to build a community around your product
When your sample is large, no single instance will significantly change the aggregate or the total. The largest observation will remain impressive, but eventually insignificant, to the sum.
In Extremistan, inequalities are such that one single observation can disproportionately impact the aggregate, or the total.
Mediocristan is where we must endure the tyranny of the collective, the routine, the obvious, and the predicted; Extremistan is where we are subjected to the tyranny of the singular, the accidental, the unseen, and the unpredicted.
[In Mediocristan] the most typical member is mediocre. [In Extremistan] the most “typical” is either giant or dwarf, i.e., there is no typical member
Notes: 1) organized vs. disorganized complexity. averages work for disorganized complexity but are harmful for organized complexity
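A minimal simulation of the difference, assuming Gaussian heights for Mediocristan and a Pareto(1.1) wealth distribution for Extremistan (both are stand-ins, not Taleb's numbers):

```python
import random

def max_share(sample):
    """Fraction of the total contributed by the single largest observation."""
    return max(sample) / sum(sample)

random.seed(0)
n = 10_000

# Mediocristan: height-like data, every value is of comparable size.
heights = [random.gauss(170, 10) for _ in range(n)]

# Extremistan: wealth-like data with a fat tail.
wealth = [random.paretovariate(1.1) for _ in range(n)]

print(f"Largest height as share of total height: {max_share(heights):.4%}")  # a tiny fraction
print(f"Largest fortune as share of total wealth: {max_share(wealth):.2%}")  # typically a double-digit share
```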
takes a long time to know what’s going on
Notes: 1) this is why it takes 5-20 years to get rich as an entrepreneur. it takes a lot of data points to understand and you will have many misses but only need one hit
History makes jumps
Notes: 1) there are weeks in which decades happen
Extremistan does not always imply Black Swans. Some events can be rare and consequential, but somewhat predictable, particularly to those who are prepared for them and have the tools to understand them (instead of listening to statisticians, economists, and charlatans of the bell-curve variety).
They are near–Black Swans. They are somewhat tractable scientifically—knowing about their incidence should lower your surprise; these events are rare but expected. I call this special case of “gray” swans Mandelbrotian randomness. This category encompasses the randomness that produces phenomena commonly known by terms such as scalable, scale-invariant, power laws, Pareto-Zipf laws, Yule’s law, Paretian-stable processes, Levy-stable, and fractal laws,
Notes: 1) these are where many people get rich because they are scalable and somewhat foreseeable
The überphilosopher Bertrand Russell presents a particularly toxic variant of my surprise jolt in his illustration of what people in his line of business call the Problem of Induction or Problem of Inductive Knowledge
Notes: 1) he took the turkey story from Bertrand Russell. market tested by the Lindy effect.
The turkey problem can be generalized to any situation where the same hand that feeds you can be the one that wrings your neck
Notes: 1) anytime you have centralization you have the turkey problem
Consider that the turkey’s experience may have, rather than no value, a negative value. It learned from observation, as we are all advised to do (hey, after all, this is what is believed to be the scientific method). Its confidence increased as the number of friendly feedings grew, and it felt increasingly safe even though the slaughter was more and more imminent. Consider that the feeling of safety reached its maximum when the risk was at the highest!
But the problem is even more general than that; it strikes at the nature of empirical knowledge itself. Something has worked in the past, until—well, it unexpectedly no longer does, and what we have learned from the past turns out to be at best irrelevant or false, at worst viciously misleading.
Notes: 1) if you are in Extremistan then looking at data can be harmful: it increases your confidence as risk increases. You are most confident when risk levels are highest, and least confident when risk levels are lowest. 2) this is what LTCM looked like.
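A toy version of the turkey's inference, using a naive empirical estimate of tomorrow's risk (Laplace's rule of succession, my choice of estimator). The estimate falls as the friendly days pile up, which is exactly when the real risk peaks:

```python
# Minimal sketch of the turkey problem: the bird estimates the risk of harm
# from its own history and grows most confident just before the day that matters.

def estimated_risk(harmful_days: int, total_days: int) -> float:
    """Naive empirical estimate of P(harm tomorrow) from past observations."""
    return (harmful_days + 1) / (total_days + 2)

for day in (1, 10, 100, 1000):
    print(f"Day {day:>4}: estimated risk of harm = {estimated_risk(0, day):.4f}")

# Day 1001 is not in the data at all: the estimate is lowest exactly when
# the real risk is highest.
```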
after the event you start predicting the possibility of other outliers happening locally, that is, in the process you were just surprised by, but not elsewhere. After the stock market crash of 1987 half of America’s traders braced for another one every October
Notes: 1) we draw overly specific conclusions
their profits were simply cash borrowed from destiny with some random payback time.
Notes: 1) if you are fragile then profits are always borrowed from the future.
you can create Black Swans with science, by giving people confidence that the Black Swan cannot happen—this is when science turns normal citizens into suckers.
Notes: 1) history is a list of surprises. vonnegut quote.
Some Black Swans can come from the slow building up of incremental changes in the same direction, as with books that sell large amounts over years, never showing up on the bestseller lists, or from technologies that creep up on us slowly, but surely.
Notes: 1) Why it takes 5-10 years to get rich
In general, positive Black Swans take time to show their effect while negative ones happen very quickly—it is much easier and much faster to destroy than to build.
an erudite can be dissatisfied with his own knowledge, and such dissatisfaction is a wonderful shield against Platonicity,
Notes: 1) superforecasting
We focus on preselected segments of the seen and generalize from it to the unseen: the error of confirmation. We fool ourselves with stories that cater to our Platonic thirst for distinct patterns: the narrative fallacy. We behave as if the Black Swan does not exist: human nature is not programmed for Black Swans. What we see is not necessarily all that is there. History hides Black Swans from us and gives us a mistaken idea about the odds of these events: this is the distortion of silent evidence. We “tunnel”: that is, we focus on a few well-defined sources of uncertainty, on too specific a list of Black Swans (at the expense of the others that do not easily come to mind).
Notes: 1) wysiati from Kahneman – what you see is all there is
someone who observed the turkey’s first thousand days (but not the shock of the thousand and first) would tell you, and rightly so, that there is no evidence of the possibility of large events, i.e., Black Swans. You are likely to confuse that statement, however, particularly if you do not pay close attention, with the statement that there is evidence of no possible Black Swans.
Notes: 1) turkey problem. this also happens because black swans are hard to visualize. they are complex and the cause-effect relationship is messy. easy to imagine being hit by a train. There is also social stigma. That’s why I sound crazy for the prepping stuff.
“I never meant to say that the Conservatives are generally stupid. I meant to say that stupid people are generally Conservative,” John Stuart Mill once complained.
Our inferential machinery, that which we use in daily life, is not made for a complicated environment in which a statement changes markedly when its wording is slightly modified. Consider that in a primitive environment there is no consequential difference between the statements most killers are wild animals and most wild animals are killers. There is an error here, but it is almost inconsequential. Our statistical intuitions have not evolved for a habitat in which these subtleties can make a big difference.
Notes: 1) that’s big. most saber-toothed tigers are killers and most killers are saber-toothed tigers
By domain-specific I mean that our reactions, our mode of thinking, our intuitions, depend on the context in which the matter is presented, what evolutionary psychologists call the “domain” of the object or the event. The classroom is a domain; real life is another. We react to a piece of information not on its logical merit, but on the basis of which framework surrounds it, and how it registers with our social-emotional system. Logical problems approached one way in the classroom might be treated differently in daily life. Indeed they are treated differently in daily life.
Doctors in the midst of the scientific arrogance of the 1960s looked down at mothers’ milk as something primitive, as if it could be replicated by their laboratories—not realizing that mothers’ milk might include useful components that could have eluded their scientific understanding—a simple confusion of absence of evidence of the benefits of mothers’ milk with evidence of absence
Notes: 1) hence lindy is a defense against the errors of high modernism
But it remains the case that you know what is wrong with a lot more confidence than you know what is right.
Notes: 1) decide what not to do by emulating great men and women
there is no such animal as corroborative evidence.
So it seems that we are endowed with specific and elaborate inductive instincts showing us the way. Contrary to the opinion held by the great David Hume, and that of the British empiricist tradition, that belief arises from custom, as they assumed that we learn generalizations solely from experience and empirical observations, it was shown from studies of infant behavior that we come equipped with mental machinery that causes us to selectively generalize from experiences (i.e., to selectively acquire inductive learning in some domains but remain skeptical in others). By doing so, we are not learning from a mere thousand days, but benefiting, thanks to evolution, from the learning of our ancestors—which found its way into our biology.
Notes: 1) we generalize along dimensions which existed in our ancestral conceptual framework
pattern perception increases along with the concentration in the brain of the chemical dopamine.
Notes: 1) why you do good work after a workout
the Black Swan is what we leave out of simplification.
Notes: 1) the black swan is in the information that gets lost in compressing the raw data into a narrative.
Platonicity affects us here once again. The very same desire for order, interestingly, applies to scientific pursuits—it is just that, unlike art, the (stated) purpose of science is to get to the truth, not to give you a feeling of organization or make you feel better. We tend to use knowledge as therapy.
Notes: 1) instead of living with the randomness, we platonify it which suits our biology but not our reality.
we will tend to more easily remember those facts from our past that fit a narrative, while we tend to neglect others that do not appear to play a causal role in that narrative.
Notes: 1) the black swan is by definition excluded from the narrative.
they wanted to be wrong with infinite precision (instead of accepting being approximately right, like a fable writer).
Notes: 1) approximately right vs precisely wrong
Empirically, sex, social class, and profession seem to be better predictors of someone’s behavior than nationality
there are two varieties of rare events: a) the narrated Black Swans, those that are present in the current discourse and that you are likely to hear about on television, and b) those nobody talks about, since they escape models—those that you would feel ashamed discussing in public because they do not seem plausible. I can safely say that it is entirely compatible with human nature that the incidences of Black Swans would be overestimated in the first case, but severely underestimated in the second one.
Notes: 1) if everyone’s talking about it then it is probably already priced in.
Greg Barron and Ido Erev provide experimental evidence that agents underweigh small probabilities when they engage in sequential experiments in which they derive the probabilities themselves, when they are not supplied with the odds. If you draw from an urn with a very small number of red balls and a high number of black ones, and if you do not have a clue about the relative proportions, you are likely to underestimate the number of red balls. It is only when you are supplied with their frequency—say, by telling you that 3 percent of the balls are red—that you overestimate it in your betting decision.
Notes: 1) intuitively we underweight unlikely events.
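A sketch of the Barron-Erev effect under assumed numbers (a 3% event, 20 draws per person): most people's own sample contains no rare event at all, so the typical judgment formed from experience is zero:

```python
import random
from statistics import median

random.seed(1)
TRUE_P = 0.03      # true fraction of red balls (assumed)
DRAWS = 20         # draws each person makes before judging the odds (assumed)
PEOPLE = 10_000

estimates = []
for _ in range(PEOPLE):
    reds = sum(random.random() < TRUE_P for _ in range(DRAWS))
    estimates.append(reds / DRAWS)

no_hit = sum(e == 0 for e in estimates) / PEOPLE
print(f"Share of people who never saw a red ball: {no_hit:.1%}")   # ~54%
print(f"Median experiential estimate: {median(estimates):.3f} (true value {TRUE_P})")
```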
Events that are nonrepeatable are ignored before their occurrence, and overestimated after (for a while). After a Black Swan, such as September 11, 2001, people expect it to recur when in fact the odds of that happening have arguably been lowered.
Notes: 1) right after an event of type X, type X is the least likely type of black swan because volatility has expressed itself and we are taking countermeasures
like to think about specific and known Black Swans when in fact the very nature of randomness lies in its abstraction.
I prefer the experiments of empirical psychology to the theories-based MRI scans of neurobiologists, even if the former appear less “scientific” to the public.
Notes: 1) neurology may be precisely wrong. better to be empirical than precise.
The way to avoid the ills of the narrative fallacy is to favor experimentation over storytelling, experience over history, and clinical knowledge over theories.
You work on a project that does not deliver immediate or steady results; all the while, people around you work on projects that do.
Notes: 1) just the cost of living in extremistan?
But now you have a slow-thinking thirty-year-old security analyst at a downtown Manhattan firm who “judges” your results and reads too much into them.
Notes: 1) public companies are forced into fragility b/c they don’t understand punctuated equilibrium?
Our intuitions are not cut out for nonlinearities. Consider our life in a primitive environment where process and result are closely connected. You are thirsty; drinking brings you adequate satisfaction. Or even in a not-so-primitive environment, when you engage in building, say, a bridge or a stone house, more work will lead to more apparent results, so your mood is propped up by visible continuous feedback. In a primitive environment, the relevant is the sensational.
The economist William Baumol calls this “a touch of madness.” This may indeed apply to all concentrated businesses: when you look at the empirical record, you not only see that venture capitalists do better than entrepreneurs, but publishers do better than writers, dealers do better than artists, and science does better than scientists (about 50 percent of scientific and scholarly papers, costing months, sometimes years, of effort, are never truly read). The person involved in such gambles is paid in a currency other than material success: hope.
Notes: 1) the problem is that mediocristan professions are all dull. Hence the barbell?
So from a narrowly defined accounting point of view, which I may call here “hedonic calculus,” it does not pay to shoot for one large win. Mother Nature destined us to derive enjoyment from a steady flow of pleasant small, but frequent, rewards. As I said, the rewards do not have to be large, just frequent—a little bit here, a little bit there. Consider that our major satisfaction for thousands of years came in the form of food and water (and something else more private), and that while we need these steadily, we quickly reach saturation. The problem, of course, is that we do not live in an environment where results are delivered in a steady manner—Black Swans dominate much of human history. It is unfortunate that the right strategy for our current environment may not offer internal rewards and positive feedback.
Notes: 1) we want to get constant little rewards like a paycheck but modernity is made of extremes
If you engage in a Black Swan–dependent activity, it is better to be part of a group.
Notes: 1) why entrepreneurs need communities
some business bets in which one wins big but infrequently, yet loses small but frequently, are worth making if others are suckers for them and if you have the personal and intellectual stamina. But you need such stamina.
Notes: 1) marketing and business stamina.
You can rationalize all you want; the hippocampus takes the insult of chronic stress seriously, incurring irreversible atrophy. Contrary to popular belief, these small, seemingly harmless stressors do not strengthen you; they can amputate part of your self. It was the exposure to a high level of information that poisoned Nero’s life. He could sustain the pain if he saw only weekly performance numbers, instead of updates every minute. He did better emotionally with his own portfolio than with those of clients, since he was not obligated to monitor it continuously.
Notes: 1) if you are making black swan bets then you actually want less data because the data will drive you to quit.
Silent evidence is what events use to conceal their own randomness, particularly the Black Swan type of randomness.
Notes: 1) the dead don’t write history
If both the positive and the negative consequences of an action fell on its author, our learning would be fast. But often an action’s positive consequences benefit only its author, since they are visible, while the negative consequences, being invisible, apply to others, with a net cost to society.
Notes: 1) healthcare in the U.S.
Clearly there is an element of the surviving Casanovas in us, that of the risk-taking genes, which encourages us to take blind risks, unaware of the variability in the possible outcomes. We inherited the taste for uncalculated risk taking. Should we encourage such behavior?
Notes: 1) it’s not risk taking, it’s wysiati and we think the odds of success are higher?
we generally take risks not out of bravado but out of ignorance and blindness to probability!
that we got here by accident does not mean that we should continue to take the same risks. We are mature enough a race to realize this point, enjoy our blessings, and try to preserve, by becoming more conservative, what we got by luck. We have been playing Russian roulette; now let’s stop and get a real job.
Notes: 1) we should have blown ourselves up but we haven’t yet.
ludic fallacy—the attributes of the uncertainty we face in real life have little connection to the sterilized ones we encounter in exams and games.
unknown unknown
Notes: 1) black swan synonym
In real life you do not know the odds; you need to discover them, and the sources of uncertainty are not defined.
Notes: 1) figuring out the odds is the hard part, not calculating the EV.
Economists, who do not consider what was discovered by noneconomists worthwhile, draw an artificial distinction between Knightian risks (which you can compute) and Knightian uncertainty (which you cannot compute), after one Frank Knight, who rediscovered the notion of unknown uncertainty and did a lot of thinking but perhaps never took risks, or perhaps lived in the vicinity of a casino. Had he taken economic or financial risks he would have realized that these “computable” risks are largely absent from real life! They are laboratory contraptions!
Notes: 1) parallel with Mandelbrot. all the statistics we learn are a trivial part of real life.
Probability is a liberal art;
“We are dogma-prone from our mother’s wombs.”
the cosmetic and the Platonic rise naturally to the surface. This is a simple extension of the problem of knowledge. It is simply that one side of Eco’s library, the one we never see, has the property of being ignored. This is also the problem of silent evidence. It is why we do not see Black Swans: we worry about those that happened, not those that may happen but did not. It is why we Platonify, liking known schemas and well-organized knowledge—to the point of blindness to reality. It is why we fall for the problem of induction, why we confirm. It is why those who “study” and fare well in school have a tendency to be suckers for the ludic fallacy. And it is why we have Black Swans and never learn from their occurrence, because the ones that did not happen were too abstract.
Notes: 1) This is Seeing Like a State. The cosmetic and platonic that rise to the surface are the things which the central planners see on their spreadsheet. “Because the ones that did not happen were too abstract” – the Maginot Line problem.
If you want a simple step to a higher form of life, as distant from the animal as you can get, then you may have to denarrate, that is, shut down the television set, minimize time spent reading newspapers, ignore the blogs. Train your reasoning abilities to control your decisions; nudge System 1 (the heuristic or experiential system) out of the important ones. Train yourself to spot the difference between the sensational and the empirical. This insulation from the toxicity of the world will have an additional benefit: it will improve your well-being. Also, bear in mind how shallow we are with probability, the mother of all abstract notions. You do not have to do much more in order to gain a deeper understanding of things around you. Above all, learn to avoid “tunneling.”
Notes: 1) Harari is original in part because he takes long meditation retreats.
When I ask people to name three recently implemented technologies that most impact our world today, they usually propose the computer, the Internet, and the laser. All three were unplanned, unpredicted, and unappreciated upon their discovery, and remained unappreciated well after their initial use.
Notes: 1) because they were aesthetically ugly. worse is better.
the world is far, far more complicated than we think, which is not a problem, except when most of us don’t know
Berra can claim to know something about randomness. He was a practitioner of uncertainty, and, as a baseball player and coach, regularly faced random outcomes, and had to face their results deep into his bones.
Notes: 1) he had fingerspitz
Like many things in life, the discovery was unplanned, serendipitous, surprising, and took a while to digest. Legend has it that Albert and Raiffa, the researchers who noticed it, were actually looking for something quite different, and more boring: how humans figure out probabilities in their decision making when uncertainty is involved (what the learned call calibrating). The researchers came out befuddled. The 2 percent error rate turned out to be close to 45 percent in the population being tested!
Notes: 1) good study showing how we think the world is more certain than it really is. We think we can predict when we really can’t. We don’t realize this because we use narratives to rationalize post hoc. The person who loses all their money rarely owns up to it because they don’t want to look stupid.
I asked the participants to take a stab at a range for the number of books in Umberto Eco’s library, which, as we know from the introduction to Part One, contains 30,000 volumes. Of the sixty attendees, not a single one made the range wide enough to include the actual number (the 2 percent error rate became 100 percent). This case may be an aberration, but the distortion is exacerbated with quantities that are out of the ordinary. Interestingly, the crowd erred on the very high and the very low sides: some set their ranges at 2,000 to 4,000; others at 300,000 to 600,000.
The errors get worse with the degree of remoteness to the event. So far, we have only considered a 2 percent error rate in the game we saw earlier, but if you look at, say, situations where the odds are one in a hundred, one in a thousand, or one in a million, then the errors become monstrous. The longer the odds, the larger the epistemic arrogance.
Notes: 1) further out in the tail our ability to estimate gets worse. The further the system is from equilibrium the worse our intuition.
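One way to make the epistemic arrogance concrete: under a rough Gaussian error model (my assumption, not the book's), a roughly 45% miss rate on nominally 98% intervals means people state ranges about a third as wide as they should be:

```python
import math

def norm_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

Z_98 = 2.326  # half-width (in standard deviations) of a true 98% interval

def miss_rate(width_scale: float) -> float:
    """Miss rate of a nominally-98% interval whose half-width is only
    `width_scale` times what a Gaussian model actually requires."""
    coverage = 2.0 * norm_cdf(width_scale * Z_98) - 1.0
    return 1.0 - coverage

for scale in (1.0, 0.5, 0.33):
    print(f"intervals {scale:.2f}x the needed width -> miss rate {miss_rate(scale):.0%}")
```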
additional knowledge of the minutiae of daily business can be useless, even actually toxic,
Show two groups of people a blurry image of a fire hydrant, blurry enough for them not to recognize what it is. For one group, increase the resolution slowly, in ten steps. For the second, do it faster, in five steps. Stop at a point where both groups have been presented an identical image and ask each of them to identify what they see. The members of the group that saw fewer intermediate steps are likely to recognize the hydrant much faster. Moral? The more information you give someone, the more hypotheses they will formulate along the way, and the worse off they will be. They see more random noise and mistake it for information.
Notes: 1) benefit to having multiple things going on is less time to pay attention to noise
once we produce a theory, we are not likely to change our minds—so those who delay developing their theories are better off. When you develop your opinions on the basis of weak evidence, you will have difficulty interpreting subsequent information that contradicts these opinions, even if this new information is obviously more accurate.
Notes: 1) why investing is best done part time
the psychologist Paul Slovic asked bookmakers to select from eighty-eight variables in past horse races those that they found useful in computing the odds. These variables included all manner of statistical information about past performances. The bookmakers were given the ten most useful variables, then asked to predict the outcome of races. Then they were given ten more and asked to predict again. The increase in the information set did not lead to an increase in their accuracy; their confidence in their choices, on the other hand, went up markedly. Information proved to be toxic.
Notes: 1) There is a dose-response curve to information. you need some but not too much.
In fields where we have ancestral traditions, such as pillaging, we are very good at predicting outcomes by gauging the balance of power. Humans and chimps can immediately sense which side has the upper hand, and make a cost-benefit analysis about whether to attack and take the goods and the mates. Once you start raiding, you put yourself into a delusional mind-set that makes you ignore additional information—it is best to avoid wavering during battle. On the other hand, unlike raids, large-scale wars are not something present in human heritage—we are new to them—so we tend to misestimate their duration and overestimate our relative power.
they are quite ashamed to say anything outlandish to their clients—and yet events, it turns out, are almost always outlandish.
Notes: 1) investors are well compensated for being willing to appear foolish.
(traders rarely hire economists for their own consumption, but rather to provide stories for their less sophisticated clients).
Notes: 1) plans are for convincing people not for actually following
we feel a little unique, unlike others, for whom we do not perceive such an asymmetry.
Notes: 1) your circle of competence is more narrow than you think
The most interesting test of how academic methods fare in the real world was run by Spyros Makridakis, who spent part of his career managing competitions between forecasters who practice a “scientific method” called econometrics—an approach that combines economic theory with statistical measurements. Simply put, he made people forecast in real life and then he judged their accuracy. This led to the series of “M-Competitions” he ran, with assistance from Michele Hibon, of which M3 was the third and most recent one, completed in 1999. Makridakis and Hibon reached the sad conclusion that “statistically sophisticated or complex methods do not necessarily provide more accurate forecasts than simpler ones.”
Notes: 1) we are gullible to false precision. this makes us precisely wrong rather than roughly right.
misconception about the nature of uncertainty. The first fallacy: variability matters. The first error lies in taking a projection too seriously, without heeding its accuracy. Yet, for planning purposes, the accuracy in your forecast matters far more than the forecast itself. I will explain it as follows. Don’t cross a river if it is four feet deep on average.
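The river line in one toy example; the depth profile is made up, but it shows how the mean hides exactly what matters:

```python
from statistics import mean

# "Don't cross a river if it is four feet deep on average": a hypothetical
# depth profile whose mean is reassuring but whose worst point is what kills you.
depths_ft = [1, 2, 2, 3, 9, 8, 3, 2, 1, 9]

print(f"Average depth: {mean(depths_ft):.1f} ft")   # 4.0 ft
print(f"Deepest point: {max(depths_ft)} ft")        # 9 ft
# The forecast (the mean) looks fine; its variability is what decides survival.
```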
The term serendipity was coined in a letter by the writer Horace Walpole, who derived it from a fairy tale, “The Three Princes of Serendip.” These princes “were always making discoveries by accident or sagacity, of things which they were not in quest of.”
At the end of the year in which Darwin and Wallace presented their papers on evolution by natural selection that changed the way we view the world, the president of the Linnean society, where the papers were presented, announced that the society saw “no striking discovery,” nothing in particular that could revolutionize science.
Notes: 1) innovations are like frogs boiling in water
Engineers tend to develop tools for the pleasure of developing tools, not to induce nature to yield its secrets.
Notes: 1) optimizing for interesting
If you know a set of basic parameters concerning the ball at rest, can compute the resistance of the table (quite elementary), and can gauge the strength of the impact, then it is rather easy to predict what would happen at the first hit. The second impact becomes more complicated, but possible; you need to be more careful about your knowledge of the initial states, and more precision is called for.
The problem is that to correctly compute the ninth impact, you need to take into account the gravitational pull of someone standing next to the table (modestly, Berry’s computations use a weight of less than 150 pounds). And to compute the fifty-sixth impact, every single elementary particle of the universe needs to be present in your assumptions! An electron at the edge of the universe, separated from us by 10 billion light-years, must figure in the calculations, since it exerts a meaningful effect on the outcome.
Now, consider the additional burden of having to incorporate predictions about where these variables will be in the future. Forecasting the motion of a billiard ball on a pool table requires knowledge of the dynamics of the entire universe, down to every single atom!
Notes: 1) good example of uncertainty
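A toy version of the error blow-up: treat each collision as multiplying the uncertainty by a constant factor. The growth factor and starting error are illustrative, not Berry's actual computation, but the short prediction horizon is the point:

```python
# Toy model of the billiard example: each collision multiplies the uncertainty
# in the ball's trajectory by a constant factor (a stand-in for the geometric
# divergence of nearby paths).

initial_error_m = 1e-15          # far smaller than an atom
amplification_per_impact = 10.0  # hypothetical growth factor per collision
table_size_m = 2.0

error = initial_error_m
impact = 0
while error < table_size_m:
    impact += 1
    error *= amplification_per_impact

print(f"After {impact} impacts the uncertainty exceeds the table itself.")
# With these toy numbers, prediction is meaningless after ~16 collisions;
# no amount of extra precision buys more than a few additional impacts.
```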
thinkers of the Austrian school, to which Hayek belonged, used the designations tacit or implicit precisely for that part of knowledge that cannot be written down, but that we should avoid repressing. They made the distinction we saw earlier between “know-how” and “know-what”—the latter being more elusive and more prone to nerdification.
Notes: 1) tacit knowledge ties in with fingerspitz
doctors rejected the practice of hand washing because it made no sense to them, despite the evidence of a meaningful decrease in hospital deaths.
Notes: 1) some things work in practice, not in theory
The very word essay conveys the tentative, the speculative, and the nondefinitive.
has been more profitable for us to bind together in the wrong direction than to be alone in the right one. Those who have followed the assertive idiot rather than the introspective wise person have passed us some of their genes.
Notes: 1) no longer true
We can get negative confirmation from history, which is invaluable, but we get plenty of illusions of knowledge along with it.
Notes: 1) learn what not to do by studying great men and women
Being a Fool in the Right Places The lesson for the small is: be human! Accept that being human involves some amount of epistemic arrogance in running your affairs. Do not be ashamed of that. Do not try to always withhold judgment—opinions are the stuff of life. Do not try to avoid predicting—yes, after this diatribe about prediction I am not urging you to stop being a fool. Just be a fool in the right places.
Notes: 1) it takes a huge amount of effort to question everything so distinguish between the big and small things and don’t sweat the small stuff
Know how to rank beliefs not according to their plausibility but by the harm they may cause.
Recall the empirics, those members of the Greek school of empirical medicine. They considered that you should be open-minded in your medical diagnoses to let luck play a role. By luck, a patient might be cured, say, by eating some food that accidentally turns out to be the cure for his disease, so that the treatment can then be used on subsequent patients. The positive accident (like hypertension medicine producing side benefits that led to Viagra) was the empirics’ central method of medical discovery. This same point can be generalized to life: maximize the serendipity around you.
Notes: 1) optimize for serendipity. is there a way to use the graham and doddsville argument here?
People are often ashamed of losses, so they engage in strategies that produce very little volatility but contain the risk of a large loss—like collecting nickels in front of steamrollers.
Notes: 1) we sweat the small stuff but expose ourselves to the big stuff
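A sketch of the steamroller trade with made-up numbers: a 99.5% win rate, a rare 500-unit loss, and a negative expected value that the smooth equity curve hides:

```python
import random

# "Collecting nickels in front of steamrollers": small, steady gains
# punctuated by a rare, ruinous loss. All numbers are hypothetical.

random.seed(2)
GAIN, LOSS, P_LOSS, PERIODS = 1.0, 500.0, 0.005, 3_000

wealth, blowups, wins = 0.0, 0, 0
for _ in range(PERIODS):
    if random.random() < P_LOSS:
        wealth -= LOSS
        blowups += 1
    else:
        wealth += GAIN
        wins += 1

print(f"Winning periods: {wins / PERIODS:.1%}")        # roughly 99.5%
print(f"Blowups: {blowups}, final wealth: {wealth:.0f}")
print(f"Expected value per period: {(1 - P_LOSS) * GAIN - P_LOSS * LOSS:+.2f}")
```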
Barbell Strategy
Notes: 1) for this chapter you need to differentiate between the big and the small
So the second lesson is more aggressive: you can actually take advantage of the problem of prediction and epistemic arrogance!
Here are the (modest) tricks. But note that the more modest they are, the more effective they will be. First, make a distinction between positive contingencies and negative ones. Learn
Notes: 1) is this convex or concave?
do not try to predict precise Black Swans—it tends to make you more vulnerable to the ones you did not predict.
Seize any opportunity, or anything that looks like opportunity.
Work hard, not in grunt work, but in chasing such opportunities and maximizing exposure to them. This makes living in big cities invaluable because you increase the odds of serendipitous encounters—you gain exposure to the envelope of serendipity.
Beware of precise plans by governments
Notes: 1) the longer-range the plan, the less precise it should be
All these recommendations have one point in common: asymmetry. Put yourself in situations where favorable consequences are much larger than unfavorable ones.
As it happens, many rare events can yield their structure to us: it is not easy to compute their probability, but it is easy to get a general idea about the possibility of their occurrence. We can turn these Black Swans into Gray Swans, so to speak, reducing their surprise effect. A person aware of the possibility of such events can come to belong to the non-sucker variety.
Notes: 1) this is antifragilizing yourself
there seem to be “basins of attraction” directing us to certain beliefs. Some ideas will prove contagious, but not others; some forms of superstitions will spread, but not others; some types of religious beliefs will dominate, but not others.
Notes: 1) Basins are strange attractors which exist as part of archetypal story structure built into our consciousness
Luck is far more egalitarian than even intelligence. If people were rewarded strictly according to their abilities, things would still be unfair—people don’t choose their abilities. Randomness has the beneficial effect of reshuffling society’s cards, knocking down the big guy.
What Anderson saw is that the Web causes something in addition to concentration. The Web enables the formation of a reservoir of proto-Googles waiting in the background. It also promotes the inverse Google, that is, it allows people with a technical specialty to find a small, stable audience.
Notes: 1) the long tail creates a reservoir of people who can survive until luck strikes. There is a richer ecosystem because of lower transaction costs.
The role of the long tail is fundamental in changing the dynamics of success, destabilizing the well-seated winner, and bringing about another winner. In a snapshot this will always be Extremistan, always ruled by the concentration of type-2 randomness; but it will be an ever-changing Extremistan.
Notes: 1) punch line is that at any one point in time it looks unequal but the existence of the long tail means that who is on top will change frequently.
But consider how the long tail could affect the future of culture, information, and political life. It could free us from the dominant political parties, from the academic system, from the clusters of the press—anything that is currently in the hands of ossified, conceited, and self-serving authority.
Notes: 1) the blockchain makes the long tail longer
The increased concentration among banks seems to have the effect of making financial crisis less likely, but when they happen they are more global in scale and hit us very hard.
Judaism, which had been polygynous, became monogamous in the Middle Ages. One can say that such a strategy has been successful—the institution of tightly monogamous marriage (with no official concubine, as in the Greco-Roman days), even when practiced the “French way,” provides social stability since there is no pool of angry, sexually deprived men at the bottom fomenting a revolution just so they can have the chance to mate.
Notes: 1) monogamy reduces volatility like democracy. it is a pressure release
Fractal Wealth Distribution with Large Inequalities
Notes: 1) We think outliers are rarer than they really are. stay alive and in the long tail and look for the fat-tail opportunities.
Eliminating Unfair Influence Let me state here that, except for the grocery-store mentality, I truly believe in the value of middleness and mediocrity—what humanist does not want to minimize the discrepancy between humans? Nothing is more repugnant than the inconsiderate ideal of the Übermensch! My true problem is epistemological. Reality is not Mediocristan, so we should learn to live with it.
Notes: 1) this needs to be addressed. not good but true.
Market moves: [assumed power-law exponent] 3 (or lower)
Notes: 1) would it be fair to say that a higher exponent is a more efficient market? don’t think so but maybe?
knew the data revealed a fractal power law, but we learned that one could not produce a precise number. But what we did know—that the distribution is scalable and fractal—was sufficient for us to operate and make decisions.
Notes: 1) better to be roughly right than precisely wrong
Distribution does tell you whether you have enough data to “build confidence” about what you are inferring. If it is a Gaussian bell curve, then a few points will suffice (the law of large numbers once again).
Notes: 1) if you only have a little data then all distributions look Gaussian. or more precisely, any distribution will appear more Gaussian than it really is.
Although we have never known a lay book to sell 200 million copies, we can consider that the possibility is not zero. It’s small, but it’s not zero. For every three Da Vinci Code–style bestsellers, there might be one superbestseller, and though one has not happened so far, we cannot rule it out. And for every fifteen Da Vinci Codes there will be one superbestseller selling, say, 500 million copies.
it makes investment in a book or a drug better than statistics on past data might suggest. But it can make stock market losses worse than what the past shows.
Notes: 1) investments should have possibilities of spiralling upwards but protected by some lower bound aka the barbell
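A sketch of that kind of extrapolation: if the tail is scalable, the ratio of superbestsellers to ordinary bestsellers depends only on the tail exponent. The exponent and the sales thresholds below are assumptions chosen to land near the fifteen-to-one figure in the quote, not numbers from the book:

```python
# A scalable (Pareto-type) tail lets you extrapolate how often a bigger outlier
# shows up relative to a smaller one: N(> hi) / N(> lo) = (hi / lo) ** -alpha.

def exceedance_ratio(lo: float, hi: float, alpha: float) -> float:
    """Expected count of observations above `hi` per observation above `lo`."""
    return (hi / lo) ** -alpha

alpha = 1.5            # assumed tail exponent for book sales
bestseller = 80e6      # assumed "Da Vinci Code"-scale sales level
super_level = 500e6    # the hypothetical superbestseller

ratio = exceedance_ratio(bestseller, super_level, alpha)
print(f"Roughly one {super_level / 1e6:.0f}M-seller per "
      f"{1 / ratio:.0f} books at the {bestseller / 1e6:.0f}M level")
```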
I have written this entire book about the Black Swan. This is not because I am in love with the Black Swan; as a humanist, I hate it. I hate most of the unfairness and damage it causes. Thus I would like to eliminate many Black Swans, or at least to mitigate their effects and be protected from them.
it is contagion that determines the fate of a theory in social science, not its validity.
Notes: 1) authoritarian high modernist theories win out because they are more aesthetically pleasing
I worry less about advertised and sensational risks, more about the more vicious hidden ones. I worry less about terrorism than about diabetes, less about matters people usually worry about because they are obvious worries, and more about matters that lie outside our consciousness and common discourse
I am very aggressive when I can gain exposure to positive Black Swans—when a failure would be of small moment—and very conservative when I am under threat from a negative Black Swan. I am very aggressive when an error in a model can benefit me, and paranoid when the error can hurt.
I am no-nonsense and practical in academic matters, and intellectual when it comes to practice.
Notes: 1) when asked philosophical questions then give practical answers. when asked practical questions give philosophical answers
I have taught myself to resist running to keep on schedule. This may seem a very small piece of advice, but it registered. In refusing to run to catch trains, I have felt the true value of elegance and aesthetics in behavior, a sense of being in control of my time, my schedule, and my life. Missing a train is only painful if you run after it!
Likewise, not matching the idea of success others expect from you is only painful if that’s what you are seeking. You stand above the rat race and the pecking order, not outside of it, if you do so by choice. Quitting a high-paying position, if it is your decision, will seem a better payoff than the utility of the money involved (this may seem crazy, but I’ve tried it and it works). This is the first step toward the stoic’s throwing a four-letter word at fate. You have far more control over your life if you decide on your criterion by yourself.
if we gave Mother Nature to economists, it would dispense with individual kidneys: since we do not need them all the time, it would be more “efficient” if we sold ours and used a central kidney on a time-share basis.
Notes: 1) padding your schedule with slack is a form of redundancy
For another example of egregious model error, take the notion of comparative advantage supposedly discovered by Ricardo and behind the wheels of globalization. The idea is that countries should focus, as a consultant would say, on “what they do best” (more exactly, on where they are missing the smallest number of opportunities); so one country should specialize in wine and the other in clothes, although one of them might be better at both. But do some perturbations and alternative scenarios: consider what would happen to the country specializing in wine if the price of wine fluctuated. Just a simple perturbation around this assumption (say, considering that the price of wine is random, and can experience Extremistan-style variations) makes one reach a conclusion the opposite of Ricardo’s. Mother Nature does not like overspecialization, as it limits evolution and weakens the animals.
Notes: 1) comparative advantage is useful but don’t commit so much that it becomes fragilizing
have several years of income in cash before taking any personal risk—exactly my barbell idea of Chapter 11, in which one keeps high cash reserves while taking more aggressive risks but with a small portion of the portfolio.
Notes: 1) barbell applied to your life
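A minimal sketch of the barbell with hypothetical allocations and outcomes: the downside is capped at the speculative slice, the upside is open:

```python
# Barbell idea: most of the portfolio in something maximally safe, a small
# slice in aggressive, positive-Black-Swan bets. Numbers are hypothetical.

SAFE_FRACTION = 0.90        # e.g., cash / short-term bills
AGGRESSIVE_FRACTION = 0.10  # e.g., a basket of long-shot bets

def barbell_outcome(portfolio: float, aggressive_multiple: float) -> float:
    """Portfolio value after the aggressive slice returns `aggressive_multiple`
    times its stake (0.0 means a total loss) and the safe slice stays flat."""
    return portfolio * (SAFE_FRACTION + AGGRESSIVE_FRACTION * aggressive_multiple)

start = 100_000
print(f"Worst case (speculative slice wiped out):   {barbell_outcome(start, 0.0):,.0f}")
print(f"Quiet decade (speculative slice flat):      {barbell_outcome(start, 1.0):,.0f}")
print(f"One positive Black Swan (20x on the slice): {barbell_outcome(start, 20.0):,.0f}")
```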
debt is dangerous if you have some overconfidence about the future and are Black Swan blind, which we all tend to be.
Mother Nature does not like anything too big. The largest land animal is the elephant, and there is a reason for that. If I went on a rampage and shot an elephant, I might be put in jail, and get yelled at by my mother, but I would hardly disturb the ecology of Mother Nature. On the other hand, my point about banks in Chapter 14—that if you shot a large bank, I would “shiver at the consequences” and that “if one falls, they all fall”—was subsequently illustrated by events: one bank failure, that of Lehman Brothers, in September 2008, brought down the entire edifice. Mother Nature does not limit the interactions between entities; it just limits the size of its units.
Notes: 1) complex is good as long as it’s not too centralized
(Hence my idea is not to stop globalization and ban the Internet; as we will see, much more stability would be achieved by stopping governments from helping companies when they become large and by giving back advantages to the small guy.)
Notes: 1) blockchain!
Myhrvold enlightened me about an additional way to interpret and prove how globalization takes us into Extremistan: the notion of species density. Simply, larger environments are more scalable than smaller ones—allowing the biggest to get even bigger, at the expense of the smallest, through the mechanism of preferential attachment we saw in Chapter 14. We have evidence that small islands have many more species per square meter than larger ones, and, of course, than continents. As we travel more on this planet, epidemics will be more acute—we will have a germ population dominated by a few numbers, and the successful killer will spread vastly more effectively. Cultural life will be dominated by fewer persons: we have fewer books per reader in English than in Italian (this includes bad books).
Companies will be more uneven in size. And fads will be more acute. So will runs on the banks, of course. Once again, I am not saying that we need to stop globalization and prevent travel. We just need to be aware of the side effects, the trade-offs—and few people are. I see the risks of a very strange acute virus spreading throughout the planet.
Notes: 1) more interconnection leads to more winner-take-all
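A small preferential-attachment simulation (a generic "rich get richer" model, my stand-in for the mechanism Taleb cites): as the environment grows, the gap between the biggest hub and the typical node widens.

```python
import random
from statistics import median

# Preferential attachment: each newcomer links to an existing node with
# probability proportional to that node's current number of links.

def grow_network(n_nodes: int, seed: int = 3) -> list:
    random.seed(seed)
    degrees = [1, 1]        # two connected starter nodes
    endpoints = [0, 1]      # node i appears in this list degrees[i] times
    for new in range(2, n_nodes):
        target = random.choice(endpoints)   # chosen proportionally to degree
        degrees.append(1)
        degrees[target] += 1
        endpoints.extend([new, target])
    return degrees

for size in (100, 10_000):
    d = grow_network(size)
    print(f"{size:>6} nodes: biggest hub has {max(d)} links, "
          f"the typical node has {int(median(d))}")
```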
For Aristotle, an object had a clear purpose set by its designer. An eye was there to see, a nose to smell. This is a rationalistic argument, another manifestation of what I call Platonicity.
Notes: 1) also authoritarian high modernism
the idea is not to correct mistakes and eliminate randomness from social and economic life through monetary policy, subsidies, and so on. The idea is simply to let human mistakes and miscalculations remain confined, and to prevent their spreading through the system, as Mother Nature does. Reducing volatility and ordinary randomness increases exposure to Black Swans—it creates an artificial quiet.
Notes: 1) we can’t prevent them but we can limit the downside
populations will experience Extremistan-style variability, hence predators will necessarily go through periods of feast and famine. That’s us, humans—we had to have been designed to experience extreme hunger and extreme abundance. So our food intake had to have been fractal.
Why am I using evolutionary arguments? Not because of the optimality of evolution, but entirely for epistemological reasons, how we should deal with a complex system with opaque causal links and complicated interactions. Mother Nature is not perfect, but has so far proven smarter than humans, certainly much smarter than biologists. So my approach is to combine evidence-based research (stripped of biological theory), with an a priori that Mother Nature has more authority than anyone.
Notes: 1) good way of saying it
we can lower 90 percent of Black Swan risks in economic life … by just eliminating speculative debt.
philistines (and Federal Reserve chairpersons) mistake periods of low volatility (caused by stabilization policies) for periods of low risk, not for switches into Extremistan.
Giving these enlightened Bildungsphilisters, commonly called idea-book readers, a real book is like giving vintage Bordeaux to drinkers of Diet Coke and listening to their comments about it. Their typical complaint is that they want diet-book-style “actionable steps” or “better forecasting tools,” satisfying the profile of the eventual Black Swan victim.
Notes: 1) true
In addition to playing into our mental biases, and telling people what they want to hear, these “idea books” often have an abhorrent definitive and investigative tone to their messages, like the reports of management consultants trying to make you believe that they told you more than they actually did.
Sticking my neck out in the real world, lining up my life with my ideas by getting involved in trading, had a therapeutic effect, even apart from the vindication; just having a trade on the books gave me strength to not care.
Notes: 1) skin in the game
When you walk the walk, whether successful or not, you feel more indifferent and robust to people’s opinion, freer, more real.
a Black Swan for the turkey is not a Black Swan for the butcher. The same applies to the crisis of 2008, certainly a Black Swan to almost all economists, journalists, and financiers on this planet (including, predictably, Robert Merton and Myron Scholes, the turkeys of Chapter 17), but certainly not to this author.
Notes: 1) a black swan is only a black swan if you don’t expect it.
Note the unsurprising, but very consequential fact that people with Asperger syndrome are highly averse to ambiguity. Research shows that academics are overrepresented in the systematizing, Black-Swan-blind category; these are the people I called “Locke’s madmen”
we consider the biggest object of any kind that we have seen in our lives as the largest possible item:
Few understand that there is generally no such thing as a reachable long run except as a mathematical construct to solve equations; to assume a long run in a complex system, you need to also assume that nothing new will emerge.
Notes: 1) Keynes' error
I am not saying “S**t happens,” I am saying “S**t happens in the Fourth Quadrant,”
Focusing on the True/False distinction, epistemology remained, with very few exceptions, prisoner of an inconsequential, and highly incomplete, 2-D framework. The third missing dimension is, of course, the consequence of the True, and the severity of the False, the expectation. In other words, the payoff from decisions, the impact and magnitude of the result of such a decision.
Indeed, I know of almost no decision that is based on notions of True/False. Once you start examining the payoff, the result of decisions, you will see clearly that the consequences of some errors may be benign, those of others may be severe. And you pretty much know which is which beforehand. You know which errors are consequential and which ones are not so much.
The more remote the event, the less we can get empirical data (assuming generously that the future will resemble the past) and the more we need to rely on theory. Consider that the frequency of rare events cannot be estimated from empirical observation for the very reason that they are rare.
Let me provide once again an illustration of Extremistan. Less than 0.25 percent of all the companies listed in the world represent around half the market capitalization, a less than minuscule percentage of novels on the planet accounts for approximately half of fiction sales, less than 0.1 percent of drugs generate a little more than half the pharmaceutical industry’s sales—and less than 0.1 percent of risky events will cause at least half the damages and losses.
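A quick simulation of this kind of concentration (my sketch, not Taleb's numbers): sample "company sizes" from a heavy-tailed Pareto distribution and check how much of the total the top 0.1 percent accounts for. The tail index of 1.1 is an assumption I picked for illustration.

```python
# My illustration, not from the book: Extremistan-style concentration from a
# heavy-tailed Pareto distribution. alpha = 1.1 is an assumed tail index.
import numpy as np

rng = np.random.default_rng(0)
alpha = 1.1                       # closer to 1 => more winner-take-most
n = 1_000_000                     # number of "companies"
sizes = rng.pareto(alpha, n) + 1  # classical Pareto samples with minimum size 1

sizes.sort()
top_share = sizes[-(n // 1000):].sum() / sizes.sum()  # largest 0.1 percent
print(f"top 0.1% of companies hold {top_share:.0%} of total size")
# Typically something on the order of half, though it swings run to run --
# that instability is itself part of the point about fat tails.
```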
This problem of confusion of the two arrows is very severe with probability, particularly with small probabilities.
Notes: 1) fat tails or black swans are the result of a lack of knowledge about a lack of knowledge.
The inverse problem is more acute when more theories, more distributions can fit a set of data, particularly in the presence of nonlinearities or nonparsimonious distributions. Under nonlinearities, the families of possible models/parametrization explode in numbers.
Notes: 1) free energy principle states our brains are parsimonious. we are wired to ignore black swans. this could go alongside zipper experiment.
most of what students of mathematical statistics do is assume a structure similar to the closed structures of games, typically with a priori known probability. Yet the problem we have is not so much making computations once you know the probabilities, but finding the true distribution for the horizon concerned.
Notes: 1) the problem is not dealing with known unknowns but unknown unknowns
You may correctly predict that a skilled person will get “rich,” but, conditional on his making it, his wealth can reach $1 million, $10 million, $1 billion, $10 billion—there is no typical number.
Notes: 1) mild to moderate success is attributable to hard work and intelligence. huge success is attributable to variance.
This absence of “typical” events in Extremistan is what makes something called prediction markets (in which people are assumed to make bets on events) ludicrous, as they consider events to be binary. “A war” is meaningless: you need to estimate its damage—and no damage is typical.
Notes: 1) against prediction markets
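To see what "no typical number" means, here is a small sketch (my illustration, assuming a Pareto tail index of 1.5): in a thin-tailed world, knowing an outcome crossed a threshold tells you roughly how big it is; in a fat-tailed world the conditional expectation keeps growing with the threshold, so a binary yes/no contract misses most of the information.

```python
# My sketch, not Taleb's code: compare E[X | X > k] under thin vs. fat tails.
import numpy as np

rng = np.random.default_rng(1)
n = 2_000_000
gauss = np.abs(rng.normal(0, 1, n))   # thin-tailed "Mediocristan" outcomes
pareto = rng.pareto(1.5, n) + 1       # fat-tailed "Extremistan" outcomes (alpha = 1.5 assumed)

for k in (2, 3, 4):
    g = gauss[gauss > k].mean()
    p = pareto[pareto > k].mean()
    print(f"given X > {k}:  Gaussian mean ~ {g:.2f}   Pareto mean ~ {p:.2f}")
# Gaussian: the conditional mean hugs the threshold (a "war" just over the line).
# Pareto: the conditional mean is ~3x the threshold at every scale -- no typical damage.
```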
Associated with the previous fallacy is the mistake of thinking that my message is that these Black Swans are necessarily more probable than assumed by conventional methods. They are mostly less probable, but have bigger effects.
Our research shows that the way a risk is framed sharply influences people’s understanding of it. If you say that, on average, investors will lose all their money every thirty years, they are more likely to invest than if you tell them they have a 3.3 percent chance of losing a certain amount every year.
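The two framings describe the same arithmetic fact, just stated in different units; a quick check (mine, not text from the book):

```python
# Quick arithmetic check (mine): "lose everything once every thirty years on
# average" and "a 3.3% chance per year" describe the same risk.
annual_p = 1 / 30
print(f"annual probability: {annual_p:.1%}")                   # 3.3%
print(f"average years between wipeouts: {1 / annual_p:.0f}")   # 30

# Over a 30-year horizon that innocuous-sounding 3.3% compounds to roughly a
# two-in-three chance of at least one total loss.
print(f"P(at least one wipeout in 30 years): {1 - (1 - annual_p) ** 30:.0%}")
```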
What Is Complexity? I will simplify here with a functional definition of complexity—among many more complete ones. A complex domain is characterized by the following: there is a great degree of interdependence between its elements, both temporal (a variable depends on its past changes), horizontal (variables depend on one another), and diagonal (variable A depends on the past history of variable B). As a result of this interdependence, mechanisms are subjected to positive, reinforcing feedback loops, which cause “fat tails.”
That is, they prevent the working of the Central Limit Theorem that, as we saw in Chapter 15, establishes Mediocristan thin tails under summation and aggregation of elements and causes “convergence to the Gaussian.” In lay terms, moves are exacerbated over time instead of being dampened by counterbalancing forces. Finally, we have nonlinearities that accentuate the fat tails.
So, complexity implies Extremistan. (The opposite is not necessarily true.)
Notes: 1) important
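A minimal sketch of the mechanism (my illustration, not Taleb's): feed the same small Gaussian shocks through two processes, one where they simply add up and one where each shock is amplified by the accumulated state, a crude stand-in for a reinforcing feedback loop.

```python
# My illustration: independent shocks summed (CLT applies, thin tails) vs. the
# same shocks compounding on the running state (reinforcing loop, fat tails).
import numpy as np

rng = np.random.default_rng(2)
paths, steps = 100_000, 250
shocks = rng.normal(0, 0.1, (paths, steps))

additive = shocks.sum(axis=1)                 # shocks just add up
compounded = np.prod(1 + shocks, axis=1) - 1  # each shock scales what is already there

def excess_kurtosis(x):
    z = (x - x.mean()) / x.std()
    return (z ** 4).mean() - 3.0              # ~0 for a Gaussian, large for fat tails

print(f"additive   excess kurtosis ~ {excess_kurtosis(additive):.1f}")    # near 0
print(f"compounded excess kurtosis ~ {excess_kurtosis(compounded):.1f}")  # far above 0
```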
risk management program to robustify portfolios against model error, error mostly stemming from the government’s error in the projection of deficits, leading to excessive borrowing and possible hyperinflation.
It is much more sound to take risks you can measure than to measure the risks you are taking.
I. The first type of decision is simple, leading to a “binary” exposure:
Notes: 1) complicated. no unforeseen second and third order consequences.
II. The second type of decision is more complex and entails more open-ended exposures.
Notes: 1) complex
To paraphrase Danny Kahneman, for psychological comfort some people would rather use a map of the Pyrénées while lost in the Alps than use nothing at all. They do not do so explicitly, but they actually do worse than that while dealing with the future and using risk measures. They would prefer a defective forecast to nothing. So providing a sucker with a probabilistic measure does a wonderful job of making him take more risks.
How do you live long? By avoiding death. Yet people do not realize that success consists mainly in avoiding losses, not in trying to derive profits.
religion saved lives by taking the patient away from the doctor. You could satisfy your illusion of control by going to the Temple of Apollo rather than seeing the doctor.
Notes: 1) sometimes I think the good thing about writing is that it is a cheap distraction that keeps me from fucking up other things that are working.
The most obvious way to exit the Fourth Quadrant is by “truncating,” cutting certain exposures by purchasing insurance, when available, putting oneself in the “barbell” situation described in Chapter 13. But if you are not able to barbell, and cannot avoid the exposure, as with, say, climate notions, exposure to epidemics, and similar items from the previous table, then we can subscribe to the following rules of “wisdom” to increase robustness.
Notes: 1) what to do when the barbell does not work.
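A toy comparison of the barbell against an open-ended "medium risk" exposure (the numbers are made up for illustration; the only point is that the barbell's worst case is known in advance):

```python
# My toy model of "truncating" exposure: 90% of capital in something that cannot
# blow up, 10% in speculative bets with a capped loss, vs. 100% in a medium-risk
# asset whose crash losses are open-ended.
import numpy as np

rng = np.random.default_rng(3)
n = 500_000

# medium-risk asset: mild returns most years, occasionally a severe crash
crash = rng.random(n) < 0.02
medium = np.where(crash, -0.5 - 0.4 * rng.random(n), rng.normal(0.06, 0.10, n))

# barbell: 90% earns a safe 2%; the 10% speculative sleeve usually loses its stake
# but occasionally pays 20x (capped downside, exposure to positive Black Swans)
payoff = np.where(rng.random(n) < 0.05, 20.0, -1.0)
barbell = 0.9 * 0.02 + 0.1 * payoff

for name, r in [("100% medium-risk", medium), ("barbell 90/10", barbell)]:
    print(f"{name:>17}: mean {r.mean():+.1%}, worst case {r.min():+.1%}")
# The barbell's worst case is fixed at about -8% no matter what happens; the
# medium-risk portfolio's worst case is whatever the crash turns out to be
# (about -90% in this toy model, open-ended in reality).
```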
the burden of proof lies on someone disturbing a complex system, not on the person protecting the status quo.
Complex systems survive thanks to slack and redundancy, not debt and optimization.