# Bounded Rationality

Most bad decisions aren't bad. They're rational responses to incomplete information, made by people who can't see the whole system. Before asking who got it wrong, ask what they couldn't see.

---

## The information boundary

Herbert Simon observed in 1955 that people don't optimise. They satisfice: choose the first option that clears a threshold, using whatever information is within reach. In any system with more moving parts than one person can track, optimisation isn't possible. The search for good-enough is the rational response.

What bounds the rationality is position. A regional sales manager cutting price to win a deal can see their territory target, the competitor's offer, and the client's deadline. They can't see that three other regional managers are making the same concession this quarter, or that the pattern will reset buyer expectations across the entire market. Their decision is rational within their boundary. The aggregate outcome isn't.

The gap between the local view and the system view is where most organisational dysfunction lives. The judgement is sound. The picture is incomplete.

---

## The diagnostic shift

The instinct when results disappoint is to look for who got it wrong. Bounded rationality says the more useful question is what they couldn't see.

This reframe changes what you do about it. If the problem is judgement, the fix is better people or more oversight. If the problem is information, the fix is cheaper and more durable: widen what people can see before they decide.

Meadows puts it cleanly in [[Thinking in Systems]]: "Bounded rationality explains most 'irrational' behaviour: people make reasonable decisions based on incomplete information about distant parts of the system. Widen the information boundaries and behaviour changes without anyone having to try harder."
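Simon's satisficing rule is mechanical enough to sketch. A minimal illustration, assuming a scored list of options (the names `offers`, `value`, and `threshold` are hypothetical, chosen for this example): the satisficer stops at the first option that clears the bar, while the optimiser must score every option before choosing.

```python
# Illustrative sketch of satisficing vs. optimising; names are invented.

def satisfice(options, score, threshold):
    """Return the first option whose score clears the threshold."""
    for option in options:
        if score(option) >= threshold:
            return option  # good enough: stop searching here
    return None  # nothing cleared the bar

def optimise(options, score):
    """Return the best option; requires examining every option."""
    return max(options, key=score)

offers = [("A", 60), ("B", 85), ("C", 97), ("D", 90)]
value = lambda offer: offer[1]

print(satisfice(offers, value, threshold=80))  # ("B", 85): first good-enough
print(optimise(offers, value))                 # ("C", 97): needed the whole list
```

The contrast is the point of the note: optimising assumes you can see (and afford to score) the whole option set, which is exactly what position inside a large system denies you.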
[[Execution trap]] traces a related failure: leaders diagnose an effort problem when the real constraint is what people can see and how work flows between them.

---