Definition
Concept
Expected value (EV), or expectation: the probability-weighted average of all possible values of a random variable. Represents the long-run mean outcome over repeated trials. Denoted E[X] for a random variable X.
Mathematical Formulation
EV is defined as a summation (discrete case) or an integral (continuous case) over all outcomes, each weighted by its probability.
Interpretation
Interpreted as the center of mass of the distribution: the balancing point of the probability-weighted outcomes.
Discrete Random Variables
Definition
For discrete X with values x_i with probabilities p_i: E[X] = Σ x_i p_i.
Conditions
Probabilities must sum to 1 (Σ p_i = 1); the expectation is finite only if the series Σ |x_i| p_i converges (absolute convergence).
Example
Die roll: values 1 to 6, each with p=1/6; E[X] = (1+2+3+4+5+6)/6 = 3.5.
Table: Discrete Distribution Example
| Value (x_i) | Probability (p_i) |
|---|---|
| 1 | 1/6 |
| 2 | 1/6 |
| 3 | 1/6 |
| 4 | 1/6 |
| 5 | 1/6 |
| 6 | 1/6 |
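The die-roll calculation above can be sketched directly from the discrete definition E[X] = Σ x_i p_i. This is an illustrative helper, not a standard library function; exact fractions avoid floating-point rounding.

```python
# Expected value of a discrete random variable: E[X] = sum over x_i * p_i.
# Sketch using the fair-die pmf from the table above.
from fractions import Fraction

pmf = {x: Fraction(1, 6) for x in range(1, 7)}  # values 1..6, each p = 1/6

def expected_value(pmf):
    """Probability-weighted average of the values."""
    assert sum(pmf.values()) == 1  # probabilities must sum to 1
    return sum(x * p for x, p in pmf.items())

print(expected_value(pmf))  # 7/2, i.e. 3.5
```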
Continuous Random Variables
Definition
For continuous X with probability density function (pdf) f(x): E[X] = ∫ x f(x) dx over support.
Conditions
∫ |x| f(x) dx must be finite; the pdf must integrate to 1 over its domain.
Example
Uniform distribution U(a, b): with pdf f(x) = 1/(b − a) on [a, b], E[X] = (a + b)/2.
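The uniform-distribution result can be checked numerically by approximating ∫ x f(x) dx with a midpoint Riemann sum. The function below is an illustrative sketch, not a library routine; for U(2, 5) the closed form gives (2 + 5)/2 = 3.5.

```python
# Numerical check of E[X] = integral of x * f(x) dx for U(a, b),
# where f(x) = 1 / (b - a) on [a, b]. Midpoint Riemann sum (sketch).
def uniform_ev(a, b, n=100_000):
    width = (b - a) / n
    pdf = 1.0 / (b - a)
    total = 0.0
    for i in range(n):
        x = a + (i + 0.5) * width  # midpoint of each slice
        total += x * pdf * width
    return total

print(uniform_ev(2.0, 5.0))  # ≈ 3.5 = (2 + 5) / 2
```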
E[X] = ∫_{-∞}^{∞} x f(x) dx
Properties
Existence
EV exists (is finite) if the sum or integral of |x| weighted by probability is finite.
Uniqueness
EV unique for given distribution.
Monotonicity
If X ≥ Y almost surely, then E[X] ≥ E[Y].
Boundedness
If |X| ≤ M almost surely, then |E[X]| ≤ M.
Linearity of Expectation
Definition
E[aX + bY] = aE[X] + bE[Y] for any random variables X, Y with finite expectations and any constants a, b.
Independence
Independence not required for linearity.
Extension
Extends to any finite sum, and to countable sums under suitable convergence conditions: E[Σ X_i] = Σ E[X_i].
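The point that independence is not required can be demonstrated with an exact enumeration: here Y = 7 − X is completely determined by X (the two dice faces sum to 7), yet linearity still holds. A hypothetical sketch, computed with exact fractions.

```python
# Linearity without independence: Y = 7 - X is fully dependent on X,
# yet E[2X + 3Y] = 2E[X] + 3E[Y]. Exact enumeration over the joint pmf.
from fractions import Fraction

p = Fraction(1, 6)
joint = {(x, 7 - x): p for x in range(1, 7)}  # dependent pair (X, Y)

ev_x = sum(x * q for (x, y), q in joint.items())
ev_y = sum(y * q for (x, y), q in joint.items())
ev_lin = sum((2 * x + 3 * y) * q for (x, y), q in joint.items())

print(ev_lin, 2 * ev_x + 3 * ev_y)  # both 35/2, i.e. 17.5
```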
E[aX + bY] = aE[X] + bE[Y]
Calculation Techniques
Direct Summation/Integration
Apply definition using pmf/pdf.
Using Moment Generating Functions
EV equals the first derivative of the moment generating function evaluated at zero: E[X] = M′(0).
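The MGF route can be sketched for Bernoulli(p), whose MGF is M(t) = (1 − p) + p·eᵗ; differentiating at zero recovers E[X] = p. The finite-difference derivative below is an illustrative numerical stand-in for symbolic differentiation.

```python
import math

def mgf_bernoulli(t, p):
    # M(t) = E[e^{tX}] = (1 - p) + p * e^t for Bernoulli(p)
    return (1 - p) + p * math.exp(t)

def ev_from_mgf(mgf, h=1e-6):
    # E[X] = M'(0), approximated by a central difference (sketch only)
    return (mgf(h) - mgf(-h)) / (2 * h)

print(ev_from_mgf(lambda t: mgf_bernoulli(t, 0.3)))  # ≈ 0.3 = p
```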
Law of the Unconscious Statistician
Compute E[g(X)] = Σ g(x_i) p_i (discrete) or ∫ g(x) f(x) dx (continuous) directly, without first deriving the distribution of g(X).
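LOTUS can be sketched on the fair die with g(x) = x²: we weight g over the pmf of X itself, never constructing the distribution of X². The `lotus` helper is illustrative, not a library function.

```python
# LOTUS: E[g(X)] = sum of g(x_i) * p_i, using only the pmf of X.
from fractions import Fraction

pmf = {x: Fraction(1, 6) for x in range(1, 7)}  # fair die

def lotus(g, pmf):
    return sum(g(x) * p for x, p in pmf.items())

print(lotus(lambda x: x * x, pmf))  # E[X²] = (1+4+9+16+25+36)/6 = 91/6
```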
Conditional Expectation
E[X] = E[ E[X|Y] ] (Law of total expectation).
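The law of total expectation can be sketched with a hypothetical two-stage setup: Y selects one of two coins with equal probability, and X is that coin's flip. Averaging the conditional means E[X | Y = y] over the distribution of Y gives E[X].

```python
# E[X] = E[E[X | Y]]: weight each conditional mean by P(Y = y).
# Hypothetical setup: coin A has success prob 1/5, coin B has 4/5.
from fractions import Fraction

p_y = {"A": Fraction(1, 2), "B": Fraction(1, 2)}        # distribution of Y
cond_mean = {"A": Fraction(1, 5), "B": Fraction(4, 5)}  # E[X | Y = y]

ev_x = sum(p * cond_mean[y] for y, p in p_y.items())
print(ev_x)  # 1/2
```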
Examples
Simple Discrete
Bernoulli(p): E[X] = p.
Binomial(n,p)
E[X] = np.
Geometric(p)
E[X] = 1/p (counting the number of trials up to and including the first success).
Continuous Uniform
E[X] = (a+b)/2.
Normal Distribution
E[X] = μ (mean parameter).
| Distribution | Expected Value E[X] |
|---|---|
| Bernoulli(p) | p |
| Binomial(n,p) | np |
| Geometric(p) | 1/p |
| Uniform(a,b) | (a + b)/2 |
| Normal(μ, σ²) | μ |
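Two of the closed forms in the table can be sanity-checked by Monte Carlo simulation. The samplers below are minimal sketches built on the standard library, not production generators; with 100,000 samples the empirical means land close to np and 1/p.

```python
import random

# Monte Carlo check of E[X] = np for Binomial(10, 0.3) and
# E[X] = 1/p for Geometric(0.5), counting trials to first success.
random.seed(0)  # fixed seed for reproducibility

def sample_binomial(n, p):
    return sum(random.random() < p for _ in range(n))

def sample_geometric(p):
    trials = 1
    while random.random() >= p:
        trials += 1
    return trials

N = 100_000
binom_mean = sum(sample_binomial(10, 0.3) for _ in range(N)) / N
geom_mean = sum(sample_geometric(0.5) for _ in range(N)) / N
print(binom_mean, geom_mean)  # ≈ 3.0 = 10 * 0.3 and ≈ 2.0 = 1 / 0.5
```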
Applications
Decision Theory
Optimize choices by maximizing expected utility or payoff.
Statistics
Expected value as estimator of central tendency and population mean.
Finance
Calculate expected returns, risk assessment, pricing derivatives.
Game Theory
Evaluate strategies based on expected payoffs.
Machine Learning
Loss functions and expected risk minimization.
Relation to Variance
Definition
Variance Var(X) = E[(X − E[X])²] quantifies spread around expectation.
Formula
Var(X) = E[X²] − (E[X])²
Implications
EV alone is insufficient to describe variability; variance complements expectation.
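Both forms of the variance can be sketched on the fair die: the definition E[(X − E[X])²] and the shortcut E[X²] − (E[X])² agree, giving 91/6 − (7/2)² = 35/12.

```python
# Variance two ways for a fair die, with exact fractions.
from fractions import Fraction

pmf = {x: Fraction(1, 6) for x in range(1, 7)}
ev = sum(x * p for x, p in pmf.items())         # E[X]  = 7/2
ev_sq = sum(x * x * p for x, p in pmf.items())  # E[X²] = 91/6

var_def = sum((x - ev) ** 2 * p for x, p in pmf.items())  # E[(X − E[X])²]
var_short = ev_sq - ev ** 2                               # E[X²] − (E[X])²
print(var_def, var_short)  # both 35/12
```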
Expected Value in Common Distributions
Bernoulli
E[X] = p.
Binomial
E[X] = np.
Poisson
E[X] = λ.
Uniform
E[X] = (a + b)/2.
Normal
E[X] = μ.
Limitations and Misconceptions
Non-existence
EV may fail to exist for heavy-tailed distributions (e.g. the Cauchy distribution, whose defining integral diverges).
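The Cauchy case can be illustrated by simulation: because no expectation exists, sample means do not stabilize as more data arrive. The sketch below contrasts batch means of standard Cauchy draws (generated by the inverse-CDF transform tan(π(U − 1/2))) with batch means of standard normal draws.

```python
import math
import random

# Sample means of a Cauchy do not converge: each batch mean is itself
# Cauchy-distributed, so batch means scatter widely. Normal batch means
# cluster tightly around 0 by the law of large numbers.
random.seed(0)

def sample_cauchy():
    return math.tan(math.pi * (random.random() - 0.5))

def batch_means(sampler, batches=10, size=10_000):
    return [sum(sampler() for _ in range(size)) / size for _ in range(batches)]

cauchy = batch_means(sample_cauchy)
normal = batch_means(lambda: random.gauss(0, 1))

print(max(cauchy) - min(cauchy))  # wide spread despite large batches
print(max(normal) - min(normal))  # tiny spread, near 0
```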
Not a Typical Outcome
EV may be unattainable as an outcome (a die roll never yields 3.5) or unrepresentative of typical outcomes in skewed distributions.
Misuse in Decision Making
Ignoring variance and risk leads to misleading conclusions.
Extensions and Generalizations
Conditional Expectation
Expectation conditioned on another variable; key in stochastic processes.
Vector-valued Random Variables
EV defined component-wise for random vectors.
General Measure-Theoretic Definition
Expectation as Lebesgue integral with respect to probability measure.
Higher Moments
Expectation extends to moments of all orders: E[Xⁿ].