Expected Value: The Long-Run Average
Expected value is the one number that tells you whether a bet, investment, or decision is worth making in the long run. Most people have never calculated it. Here's how.
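As a sketch of the calculation the teaser describes (the bet and its payoffs here are invented for illustration, not taken from the article):

```python
# Expected value of a simple bet: win $100 with probability 0.1,
# lose $20 with probability 0.9. Numbers are purely illustrative.
outcomes = [(100, 0.1), (-20, 0.9)]

# E[X] = sum over outcomes of (value * probability)
ev = sum(value * prob for value, prob in outcomes)
print(ev)  # 100*0.1 + (-20)*0.9 = 10 - 18 = -8.0
```

A negative expected value means the bet loses money on average over many repetitions, even though any single play might win.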
Random Variables: Numbers That Depend on Chance
Before you can compute expected values, variances, or probability distributions, you need a way to assign numbers to uncertain outcomes. Random variables are that bridge — a deceptively simple concept that makes all of statistical mathematics possible.
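A minimal sketch of the idea, assuming the standard coin-flip example (my choice, not the article's): a random variable is just a function from random outcomes to numbers.

```python
import random

# A random variable maps outcomes to numbers.
# Here: X = number of heads in three fair coin flips (values 0..3).
def num_heads():
    return sum(random.choice([0, 1]) for _ in range(3))

# Simulating X many times shows its distribution taking shape.
random.seed(0)
samples = [num_heads() for _ in range(10_000)]
print(sum(samples) / len(samples))  # close to the true mean of 1.5
```

Once outcomes are mapped to numbers like this, averages, variances, and distributions all become ordinary arithmetic on those numbers.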
Bayes' Theorem: Updating Beliefs with Evidence
Most people update their beliefs wrong. Bayes' theorem gives the correct formula — and the gap between human intuition and Bayesian logic explains most bad reasoning about risk and evidence.
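A small worked instance of the formula, using the classic medical-test setup with illustrative numbers (the specific rates are assumptions for the sketch):

```python
# Bayes' theorem: P(H | E) = P(E | H) * P(H) / P(E)
p_disease = 0.01              # prior: 1% base rate
p_pos_given_disease = 0.99    # test sensitivity
p_pos_given_healthy = 0.05    # false-positive rate

# Law of total probability: overall chance of a positive test
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Posterior: chance of disease given a positive result
posterior = p_pos_given_disease * p_disease / p_pos
print(round(posterior, 3))  # about 0.167, despite a "99% accurate" test
```

This is the intuition gap the teaser points at: with a rare condition, even a very accurate test leaves the posterior far below what most people guess.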
Conditional Probability: When Information Changes the Odds
Every time you learn something new, the probability of everything else shifts. Conditional probability is the math that formalizes this — and it's behind Bayesian reasoning, medical testing, and every poker hand.
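A quick sketch of the defining formula P(A | B) = P(A and B) / P(B), on a two-dice example chosen for illustration:

```python
from itertools import product

# Sample space: two fair dice, 36 equally likely outcomes.
space = list(product(range(1, 7), repeat=2))

A = {(i, j) for i, j in space if i + j == 8}   # event A: sum is 8
B = {(i, j) for i, j in space if i % 2 == 0}   # event B: first die is even

p_A = len(A) / len(space)                 # unconditional: 5/36
p_A_given_B = len(A & B) / len(B)         # conditional:   3/18 = 1/6
print(p_A, p_A_given_B)
```

Learning that the first die is even raises the probability of an 8 from 5/36 to 1/6: the new information shrinks the sample space, and the odds shift with it.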
Basic Probability Rules: And, Or, and Not
All of probability theory builds from three operations: AND (both events occur), OR (at least one occurs), and NOT (the event does not occur). These rules have precise mathematical forms, and getting them right — especially the OR rule's overlap correction — is where beginners most often go wrong.
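The overlap correction the teaser mentions can be shown in a few lines; the card-deck numbers here are an illustrative example of my own:

```python
# One card drawn from a standard 52-card deck.
p_heart = 13 / 52
p_face = 12 / 52            # J, Q, K in each of four suits
p_heart_and_face = 3 / 52   # J, Q, K of hearts (the overlap)

# OR rule: naive addition double-counts the three heart face cards.
naive = p_heart + p_face                         # 25/52, wrong
correct = p_heart + p_face - p_heart_and_face    # 22/52, right

# NOT rule: complement.
p_not_heart = 1 - p_heart                        # 39/52

print(naive, correct, p_not_heart)
```

Forgetting to subtract the overlap is exactly the beginner mistake the article flags: the three heart face cards sit in both events and get counted twice.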
What Is Probability? Quantifying Uncertainty
Pascal and Fermat invented probability to settle a gambling dispute. Three and a half centuries later, it underpins quantum mechanics, Bayesian reasoning, machine learning, and the entire statistical enterprise. What probability *is*, philosophically, remains surprisingly contested.
Probability Explained
Probability starts with simple rules about how likelihoods combine — but those rules generate everything from the central limit theorem to machine learning. This is a grounded tour of the core framework: sample spaces, conditional probability, independence, and why Bayes' theorem keeps appearing.