# Antifragile
**Nassim Nicholas Taleb**

---
_Stop predicting what will happen and start asking what breaks if you're wrong._
Fragile, robust, antifragile. Most thinking stops at the second category. We try to build things that survive shocks, that hold up under stress. Taleb's argument is that this isn't nearly ambitious enough. The antifragile doesn't just survive volatility; it improves from it. Wind extinguishes a candle and energises a fire. The question is which one you are.
This distinction matters more than it sounds. Resilience is a defensive posture. Antifragility is an active one. A system that gets better from disorder has a structural advantage over one that merely tolerates it, because disorder is the baseline condition.
---
**Fragility is measurable; risk is not.** You can't reliably predict rare events, but you can detect fragility before it matters. The test is asymmetry: anything with more upside than downside from random events is antifragile; the reverse is fragile. This shifts the entire frame. Instead of trying to forecast what will happen, which you can't do well in fat-tailed domains, you examine the shape of your exposure. If you're hurt badly by volatility and helped only modestly, you're fragile. That's detectable now, before the event.
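The asymmetry test can be made concrete with a toy sketch (the payoff functions and numbers here are illustrative, not from the book): perturb an exposure up and down by the same amount and compare the average outcome with the unperturbed one. A concave payoff loses on net from the symmetric shock; a convex one gains.

```python
def second_order_effect(payoff, x, delta):
    """Average gain from a symmetric shock of size delta around x.
    Negative -> concave exposure (fragile); positive -> convex (antifragile)."""
    return (payoff(x + delta) + payoff(x - delta)) / 2 - payoff(x)

fragile = lambda x: -x ** 2      # concave: hurt more by shocks than helped
antifragile = lambda x: x ** 2   # convex: helped more than hurt

print(second_order_effect(fragile, 10, 3))      # -9.0: volatility costs you
print(second_order_effect(antifragile, 10, 3))  #  9.0: volatility pays you
```

Note that the test never asks *when* the shock arrives or how likely it is, only what shape the exposure has — which is the point.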
The barbell strategy follows from this. Maximally safe at one end, maximally speculative at the other, nothing in the middle. The middle feels prudent but often isn't: moderate risk exposure means you're hurt by bad events and don't benefit much from good ones. The barbell lets volatility work for you at the speculative end whilst capping downside at the safe end.
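A stylised comparison makes the shape visible (the scenario numbers are invented for illustration): a barbell of 90% maximally safe holdings and 10% speculative bets, against 100% in a "prudent" middle-risk asset.

```python
def barbell(scenario):
    # 90% maximally safe (holds its value), 10% speculative lottery tickets
    # that either go to zero or pay 10x. Numbers are illustrative.
    spec_multiple = {"crash": 0.0, "boom": 10.0}[scenario]
    return 0.9 * 1.0 + 0.1 * spec_multiple

def moderate(scenario):
    # 100% in a middle-risk asset: down 50% in a crash, up 20% in a boom.
    return {"crash": 0.5, "boom": 1.2}[scenario]

print(round(barbell("crash"), 2), moderate("crash"))  # 0.9 vs 0.5: downside capped
print(round(barbell("boom"), 2), moderate("boom"))    # 1.9 vs 1.2: upside open
```

The barbell can never lose more than the speculative sleeve, whatever the crash looks like; the middle position has no such floor.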
---
**Eliminating volatility doesn't eliminate risk. It concentrates it.** This is the deepest practical implication in the book, and the one most routinely ignored. Stabilising a system by removing small shocks deprives it of the feedback it needs to self-correct. Problems that would have surfaced early, at manageable scale, accumulate instead until they surface catastrophically. Attempting to eliminate the business cycle creates the conditions for the mother of all business cycles. The [[Variance]] that looks like noise is often the signal.
Taleb's example of the body is memorable: stressed by a training load, the body doesn't just restore itself to baseline. It overcompensates, building more capacity than was damaged. Evolution works the same way, through individual failure and selection rather than immortal organisms gradually improving. The antifragility of the whole often depends on the fragility of the parts.
---
**Don't mistake the catalyst for the cause.** When a fragile bridge collapses, attributing it to the last truck that crossed it is unintelligent. The fragility was already there; the truck was incidental. Most post-mortems focus on the triggering event because it's specific and recent and visible. The underlying fragility, which is what actually caused the failure, is structural and accumulated and invisible. This is worth keeping in mind when something goes wrong and everyone reaches for an explanation.
**The more frequently you look at data, the more noise you get** is how Taleb frames the [[Variance]] problem. Obsessive monitoring manufactures intervention. Significant signals reach you without constant surveillance. Checking less often isn't negligence; it's noise filtration.
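A back-of-envelope model shows why (the drift and volatility figures are assumptions for illustration): for a process drifting upward at 10% a year with 20% annualised volatility, the chance that any single observation even has the right sign depends almost entirely on how often you look.

```python
from statistics import NormalDist

def prob_observation_matches_drift(mu, sigma, dt_years):
    # Over an interval dt, the change is modelled as Normal(mu*dt, sigma*sqrt(dt)),
    # so P(observation > 0) = CDF(mu * sqrt(dt) / sigma).
    return NormalDist().cdf(mu * dt_years ** 0.5 / sigma)

mu, sigma = 0.10, 0.20  # illustrative: 10%/yr drift, 20%/yr volatility
print(round(prob_observation_matches_drift(mu, sigma, 1 / 252), 3))  # daily: ~0.513
print(round(prob_observation_matches_drift(mu, sigma, 1.0), 3))      # yearly: ~0.691
```

Checked daily, the signal is a barely loaded coin flip; checked yearly, it is unmistakable. Same process, same information — the difference is how much noise you choose to ingest.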
---
**Optionality is a substitute for intelligence.** An option gives you asymmetric exposure: upside without corresponding downside. You don't need to understand the world correctly if you have enough favourable asymmetries, because the long run will work in your favour. Nature exploits this relentlessly. The rationality of optionality lies in keeping what's good and ditching the bad. The fragile has no option: it takes its exposure symmetrically, which means every bad outcome hits as hard as it looks. This is the logic behind [[Reversible decisions]]: preserve the ability to course-correct when the world turns out differently from what you expected.
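The edge from asymmetry alone can be sketched numerically (a minimal simulation; the shock distribution is an assumption for illustration): take the same random outcomes symmetrically, eating every one, versus optionally, keeping the good and ditching the bad.

```python
import random

random.seed(0)
shocks = [random.gauss(0, 1) for _ in range(100_000)]  # zero-mean world: no edge

symmetric = sum(shocks) / len(shocks)                    # every outcome hits you
optional = sum(max(s, 0) for s in shocks) / len(shocks)  # downside capped at zero

print(round(symmetric, 2))  # ~0.0: symmetric exposure earns nothing on average
print(round(optional, 2))   # ~0.4: the asymmetry alone creates the gain
```

No forecasting, no understanding of which shock comes next — the favourable shape of the exposure does all the work.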
**Respect things that have survived a long time.** This is perhaps the simplest heuristic in the book. Technologies, practices, and institutions that have been around for a long time have been tested across a wider range of conditions than anything recent. Their longevity is evidence of robustness, and probably of antifragility: anything prone to ageing is already dead, and what's still here has already survived.
---
The uncomfortable implication for people who plan for a living: the antifragile don't need forecasting in the traditional sense. The question isn't "what will happen?" but "what breaks if I'm wrong?" That shift in framing is more useful than any prediction. It orients attention toward the shape of exposure rather than the probability of events, and exposure is something you can actually do something about, whether or not you can see the future.
---