How to think clearly about uncertain outcomes before you act on them.
A fair coin is flipped. Heads you win $150, tails you lose $100. Should you play? Most people say yes without being able to explain precisely why. Expected value is the tool that makes "should you play" a calculation rather than a feeling, and it shows up in insurance pricing, drug trials, investment decisions, and game theory.
Expected value is the probability-weighted average of all possible outcomes. For the coin flip:
E = (0.5 x $150) + (0.5 x -$100) = $75 - $50 = $25
The expected value is $25. This means that if you played this game thousands of times, your average profit per game would converge to $25. On any single game you win $150 or lose $100, but the long-run average is $25. Because that number is positive, playing is rational if you can tolerate the variance.
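You can watch that convergence happen directly. Here is a minimal simulation sketch (the function name `simulate_flips` and the seed are my own choices, not from the text) that plays the game many times and reports the average profit per game:

```python
import random

def simulate_flips(n_games, seed=0):
    """Average profit per game over n_games simulated coin flips.

    Each flip pays +$150 on heads (probability 0.5) and -$100 on tails.
    """
    rng = random.Random(seed)  # seeded so the run is reproducible
    total = 0
    for _ in range(n_games):
        total += 150 if rng.random() < 0.5 else -100
    return total / n_games
```

With `n_games=100_000` the result lands close to the theoretical $25, while any single game is still either +$150 or -$100.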
A lottery ticket costs $2. The jackpot is $10 million, hit with probability 1 in 10 million. There is also a $10 secondary prize hit with probability 1 in 100.
Since the $2 is paid on every ticket regardless of outcome, treat it as a sure cost and add the probability-weighted prizes:
E = -$2 + (1/10,000,000 x $10,000,000) + (1/100 x $10)
E = -$2 + $1.00 + $0.10
E = -$0.90
Each ticket costs you 90 cents in expectation. The lottery is a losing bet. This is always true of lotteries, by design: they must pay out less than they take in to fund operations and profit.
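The same probability-weighted sum works for any bet, so it is worth writing once as a function. This is a sketch (the helper name `expected_value` is mine); it takes a list of (probability, payoff) pairs and applies the definition directly:

```python
def expected_value(outcomes):
    """Probability-weighted average: sum of p * payoff over all outcomes."""
    return sum(p * payoff for p, payoff in outcomes)

# Lottery per-ticket EV: the $2 price is a sure cost paid up front,
# then add the probability-weighted prizes.
ticket_ev = -2 + expected_value([
    (1 / 10_000_000, 10_000_000),  # jackpot
    (1 / 100, 10),                 # secondary prize
])
```

`ticket_ev` comes out to -$0.90, matching the hand calculation above.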
Consider this offer: flip a coin. Heads, you win $10,000. Tails, you lose everything you own. If your net worth is $8,000, the expected value is positive: (0.5 x $10,000) + (0.5 x -$8,000) = +$1,000. Almost no one should take this bet.
The reason is that expected value treats all dollars as equivalent regardless of context. But losing everything you own is not the same as losing $8,000 from a large fortune. The same dollar amount can be catastrophic in one context and trivial in another. This is why economists work with expected utility rather than expected value for high-stakes decisions: utility accounts for the diminishing returns of additional wealth and the asymmetric pain of large losses.
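A quick sketch makes the gap between expected value and expected utility concrete. Using log utility u(w) = ln(w), a common textbook choice (and assuming, since ln(0) is undefined, a hypothetical $1 wealth floor on the losing branch; losing literally everything has utility negative infinity under log utility):

```python
import math

def expected_log_utility(outcomes):
    """Expected utility under u(w) = ln(w); wealth must stay positive."""
    return sum(p * math.log(w) for p, w in outcomes)

# Net worth $8,000; heads -> $18,000, tails -> $1 floor (assumption).
ev = (0.5 * 10_000) + (0.5 * -8_000)   # +$1,000: expected value says play
decline = math.log(8_000)              # utility of keeping what you have
bet = expected_log_utility([(0.5, 18_000), (0.5, 1)])
```

The expected value is positive, but the expected utility of the bet is far below the utility of declining, so the log-utility agent refuses, matching the intuition that near-total loss hurts more than the dollar amount suggests.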
Expected value is the right tool when stakes are small relative to your resources, outcomes are repeatable, and you are trying to optimize long-run performance. For one-off, high-stakes decisions, it is a starting point, not the full answer.
Expected value appears constantly in statistics under different names. The mean of a random variable is its expected value. The bias of an estimator is the difference between its expected value and the true parameter. The concept of a fair game, an unbiased estimator, and a breakeven bet all reduce to expected value equals zero. Getting comfortable with the mechanics of probability-weighted averaging is foundational to understanding what statistical formulas are doing at a conceptual level.
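The unbiasedness claim can be checked numerically. This sketch (helper names and the fair-die example are my own) treats the sample mean of n die rolls as an estimator and averages it over many repetitions; if the estimator is unbiased, that average should approach the true expected value of a fair die, 3.5:

```python
import random
from statistics import mean

def die_sample_mean(n, rng):
    """Mean of n fair-die rolls: a sample-mean estimator of the die's EV."""
    return mean(rng.randint(1, 6) for _ in range(n))

rng = random.Random(1)  # seeded for reproducibility
# Averaging the estimator over many repetitions approximates its
# expected value; for an unbiased estimator that is the true mean, 3.5.
estimates = [die_sample_mean(10, rng) for _ in range(20_000)]
```

Individual estimates bounce around, but their average settles near 3.5: the expected value of the estimator equals the parameter, which is exactly what "unbiased" means.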