_Data only becomes information once it passes through a decision process._

---

## The missing layer

Dashboards are everywhere. ERP extracts, CRM reports, customer click-streams, cost curves, funnel metrics. Yet basic questions still stall.

Eliyahu Goldratt spotted this decades ago. Data only becomes information once it passes through a decision process. Without that layer, data is just accumulation.

Most organisations confuse the two. They build dashboards, track metrics, celebrate when numbers go up. But ask what they'll do differently if churn rises or NPS drops, and the room goes quiet. The data exists. The decision process doesn't.

This gap is expensive. Teams spend weeks building reports that get glanced at in meetings and set aside. Executives say they're "data-driven" but make calls based on intuition, politics, or whoever spoke last. The dashboards impress, but the conversation always collapses into the same question: what do we do next?

---

## Rituals aren't processes

Many management routines look systematic but aren't. A finance meeting that reviews sales variance and then asks "thoughts?" feels structured. It has a cadence, an agenda, a slide deck. But it still runs on memory and bias. There's no explicit logic connecting the data to a decision.

Compare that to a rule: if demand gap exceeds 3%, trigger a capacity review. Now the inputs, thresholds, and responses are explicit. The data flows into a decision. Someone owns what happens next.

Systematic doesn't mean rigid. Rules can flex, thresholds can adjust. But the logic is visible. You can see it, question it, improve it.

Most organisations have rituals disguised as processes. The cadence exists, the data gets reviewed, but the decision logic lives in someone's head — or nowhere at all.

---

## The discipline

The shift is to start with the decision, not the data. What choice are you trying to make? What would you do if the answer were X versus Y? If you can't answer that, the measurement is decoration.

Then work backwards. What information would actually change your action? How will you collect it? Who needs to see it? What cadence matches the decision cycle?

This forces clarity. In practice, most measurements turn out to be unnecessary. The ones that remain are sharp, actionable, and directly tied to choices someone will actually make.

Take SG&A variance. Most finance teams produce monthly reports showing spend by category — travel up 12%, contractors down 8%, software flat. These numbers sit in slide decks and get discussed in general terms. Nothing changes.

A decision-first approach starts differently. The question isn't "what did we spend?" It's "which variances require action?" So you set thresholds upfront. Variance over 15%? Investigate root cause and propose a correction. Between 10% and 15%? The department head explains the variance in writing. Under 10%? Monitor, no action needed.

Now the report has purpose. The data flows into explicit thresholds, which trigger defined actions, owned by named people. The measurement becomes information because it connects to a decision.
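To show how little machinery this takes, here is a minimal sketch in Python. It encodes the three bands above as explicit policy; the `VarianceLine` record, the `action_for` function, and the owner names are illustrative assumptions, not a real finance system's API.

```python
from dataclasses import dataclass

@dataclass
class VarianceLine:
    category: str        # spend category, e.g. "travel"
    owner: str           # the named person who owns the follow-up
    variance_pct: float  # variance vs. budget, in percent (signed)

def action_for(line: VarianceLine) -> str:
    """Map a variance onto the explicit thresholds defined above."""
    v = abs(line.variance_pct)
    if v > 15:
        return f"{line.owner}: investigate root cause, propose correction"
    if v >= 10:
        return f"{line.owner}: explain the variance in writing"
    return "monitor, no action needed"

# Illustrative monthly report (figures match the example above).
report = [
    VarianceLine("travel", "head of travel", 12.0),
    VarianceLine("contractors", "head of delivery", -8.0),
    VarianceLine("software", "head of IT", 0.3),
]

for line in report:
    print(f"{line.category:<12} {line.variance_pct:+6.1f}%  ->  {action_for(line)}")
```

A few sentences of meeting folklore become a dozen lines of visible logic. The thresholds can now be questioned, tuned, and versioned, which is what "systematic doesn't mean rigid" looks like in practice.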
---

## Why this matters

When you build this properly, three things happen. First, you collect less data — only what actually matters. Second, decisions get faster — the process is already designed. Third, you learn systematically — because you've stated what you expect and can compare it with what actually happened.

That last point is the real payoff. Decision design isn't just about making better calls today. It's about building a learning loop that sharpens your model of reality over time.

The bottleneck is rarely data volume. It's decision design. Most organisations are drowning in data while starving for information — because no one designed the layer that connects them.

---

**Related:** [[Notes/When Numbers Twitch|When Numbers Twitch]] · [[Notes/Hidden Bottleneck|Hidden Bottleneck]]
**See also:** [[Ideas/Goodhart's Law|Goodhart's Law]] · [[Ideas/McNamara Fallacy|McNamara Fallacy]]