Discrete Random Variables

Definition and Basic Concepts

Random Variable

Random variable (RV): function mapping outcomes of a random experiment to real numbers. Discrete RV: takes countable set of distinct values. Examples: number of heads in coin tosses, number of arrivals in queue.

Sample Space and Events

Sample space (Ω): set of all possible outcomes. Event: subset of Ω. Discrete RV values correspond to events with assigned probabilities.

Range and Support

Range: set of values RV can assume. Support: subset of range with positive probability. Finite or countably infinite sets.

Probability Mass Function (PMF)

Definition

PMF p_X(x): probability that discrete RV X equals value x. Formal: p_X(x) = P(X = x). Satisfies two conditions: non-negativity and sum to one.

Properties

Non-negativity: p_X(x) ≥ 0 ∀ x. Normalization: Σ p_X(x) = 1 over all x in support. Determines full distribution of discrete RV.

Examples

Bernoulli PMF: p_X(0) = 1-p, p_X(1) = p. Binomial PMF: p_X(k) = C(n,k) p^k (1-p)^(n-k).

Distribution | PMF | Support
Bernoulli(p) | p_X(x) = p^x (1-p)^{1-x}, x ∈ {0,1} | {0,1}
Binomial(n,p) | p_X(k) = C(n,k) p^k (1-p)^{n-k}, k = 0,...,n | {0,...,n}
Poisson(λ) | p_X(k) = e^{-λ} λ^k / k!, k = 0,1,... | {0,1,2,...}
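A minimal sketch of the three PMFs in the table, with a normalization check; the helper names and the parameter values are illustrative, not standard:

```python
from math import comb, exp, factorial

# Illustrative helper names; direct transcriptions of the PMF table above.
def bernoulli_pmf(x, p):
    return p**x * (1 - p)**(1 - x)              # x in {0, 1}

def binomial_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)  # k in {0, ..., n}

def poisson_pmf(k, lam):
    return exp(-lam) * lam**k / factorial(k)     # k in {0, 1, 2, ...}

# Normalization: each PMF sums to 1 over its support
# (the Poisson sum is truncated, so it is only approximately 1).
assert abs(sum(bernoulli_pmf(x, 0.3) for x in (0, 1)) - 1) < 1e-12
assert abs(sum(binomial_pmf(k, 10, 0.4) for k in range(11)) - 1) < 1e-12
assert abs(sum(poisson_pmf(k, 2.5) for k in range(100)) - 1) < 1e-9
```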

Cumulative Distribution Function (CDF)

Definition

CDF F_X(x) = P(X ≤ x). Non-decreasing, right-continuous, limits: 0 as x→-∞, 1 as x→∞.

Relation to PMF

For discrete RV: F_X(x) = Σ_{t ≤ x} p_X(t). CDF fully characterizes distribution.

Properties

Step function with jumps at points in support. Jump size equals PMF at that point.

F_X(x) = 0, x < x_1 (the smallest support point)
F_X(x) = Σ_{x_i ≤ x} p_X(x_i), otherwise
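A sketch of the step-function CDF built from a PMF, assuming a dict-based PMF representation with example probabilities:

```python
# Example PMF on {0, 1, 2, 3}; the values are illustrative.
pmf = {0: 0.1, 1: 0.4, 2: 0.3, 3: 0.2}

def cdf(x, pmf):
    # F_X(x) = sum of p_X(t) over support points t <= x
    return sum(p for t, p in pmf.items() if t <= x)

assert cdf(-1, pmf) == 0                  # below the support: F = 0
assert abs(cdf(3, pmf) - 1.0) < 1e-12     # at the top of the support: F = 1
# The jump at each support point equals the PMF there:
assert abs(cdf(2, pmf) - cdf(1.999, pmf) - pmf[2]) < 1e-12
```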

Expectation and Variance

Expectation (Mean)

Expected value E[X] = Σ x p_X(x). Measure of central tendency. Exists if sum converges absolutely.

Variance

Variance Var(X) = E[(X - E[X])²] = E[X²] - (E[X])². Measure of dispersion.

Higher Moments

n-th moment: E[X^n]. Used to quantify skewness, kurtosis, and other shape features.

E[X] = Σ x p_X(x)
Var(X) = Σ (x - E[X])² p_X(x) = E[X²] - (E[X])²
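The two formulas can be sketched on an assumed dict PMF; the probabilities below are an arbitrary example:

```python
# Example PMF on {0, 1, 2}.
pmf = {0: 0.2, 1: 0.5, 2: 0.3}

mean = sum(x * p for x, p in pmf.items())              # E[X]
second_moment = sum(x**2 * p for x, p in pmf.items())  # E[X^2]
var = second_moment - mean**2                          # E[X^2] - (E[X])^2
var_direct = sum((x - mean)**2 * p for x, p in pmf.items())

assert abs(mean - 1.1) < 1e-12
assert abs(var - var_direct) < 1e-12   # both variance formulas agree
```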

Common Discrete Distributions

Bernoulli Distribution

Single trial with success probability p. Support: {0,1}. E[X] = p, Var(X) = p(1-p).

Binomial Distribution

Number of successes in n independent Bernoulli trials. Parameters: n, p. E[X] = np, Var(X) = np(1-p).

Poisson Distribution

Models count of events in fixed interval with rate λ. Support: nonnegative integers. E[X] = Var(X) = λ.

Geometric Distribution

Number of trials until first success. Support: {1,2,...}. E[X] = 1/p, Var(X) = (1-p)/p².

Distribution | Parameters | Mean E[X] | Variance Var(X)
Bernoulli | p | p | p(1-p)
Binomial | n, p | np | np(1-p)
Poisson | λ | λ | λ
Geometric | p | 1/p | (1-p)/p²
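The Binomial row of the table can be checked by direct summation over the support; n and p below are arbitrary example values:

```python
from math import comb

n, p = 12, 0.35
pmf = {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}

mean = sum(k * q for k, q in pmf.items())
var = sum((k - mean)**2 * q for k, q in pmf.items())

assert abs(mean - n * p) < 1e-9           # E[X] = np
assert abs(var - n * p * (1 - p)) < 1e-9  # Var(X) = np(1-p)
```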

Functions of Discrete Random Variables

Definition

Function g(X): transforms RV X into new RV Y = g(X). Y remains discrete if X discrete.

Distribution of g(X)

PMF of Y: p_Y(y) = Σ_{x: g(x)=y} p_X(x). Requires summation over pre-images of y.

Examples

If X counts successes, g(X) = 1{X > 0} is a Bernoulli RV (indicator of at least one success). Transformations can simplify distributions.
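The pre-image sum above can be sketched with an assumed dict PMF for X (example probabilities) and the indicator transformation from the example:

```python
from collections import defaultdict

pmf_x = {0: 0.4, 1: 0.3, 2: 0.2, 3: 0.1}  # X = number of successes (example)
g = lambda x: int(x > 0)                   # indicator of "at least one success"

# p_Y(y) = sum of p_X(x) over the pre-image {x : g(x) = y}
pmf_y = defaultdict(float)
for x, p in pmf_x.items():
    pmf_y[g(x)] += p

assert abs(pmf_y[0] - 0.4) < 1e-12   # P(Y=0) = P(X=0)
assert abs(pmf_y[1] - 0.6) < 1e-12   # P(Y=1) = P(X>0): Y is Bernoulli(0.6)
```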

Joint Discrete Random Variables

Joint PMF

For RVs X, Y: p_{X,Y}(x,y) = P(X=x, Y=y). Defines joint distribution on product support.

Marginal PMF

Marginal for X: p_X(x) = Σ_y p_{X,Y}(x,y). Similar for Y.

Conditional PMF

p_{X|Y}(x|y) = p_{X,Y}(x,y) / p_Y(y), if p_Y(y) > 0. Describes distribution of X given Y=y.

p_X(x) = Σ_y p_{X,Y}(x,y)
p_{X|Y}(x|y) = p_{X,Y}(x,y) / p_Y(y)
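A sketch of marginalization and conditioning, assuming the joint PMF is stored as a dict keyed by (x, y) pairs with example probabilities:

```python
joint = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

def marginal_x(x):
    # p_X(x) = sum over y of p_{X,Y}(x, y)
    return sum(p for (a, b), p in joint.items() if a == x)

def marginal_y(y):
    return sum(p for (a, b), p in joint.items() if b == y)

def conditional_x_given_y(x, y):
    # p_{X|Y}(x|y) = p_{X,Y}(x, y) / p_Y(y), requires p_Y(y) > 0
    return joint[(x, y)] / marginal_y(y)

assert abs(marginal_x(0) - 0.3) < 1e-12
assert abs(conditional_x_given_y(1, 1) - 0.4 / 0.6) < 1e-12
```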

Independence of Discrete Random Variables

Definition

X and Y independent if p_{X,Y}(x,y) = p_X(x) p_Y(y) ∀ x,y.

Properties

Independence implies uncorrelatedness but converse not always true. Joint moments factorize.

Testing Independence

Compare joint PMF with product of marginals. Deviations imply dependence.
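The comparison can be sketched as follows; the joint PMF here is constructed as a product of marginals, so the test passes by design, and the tolerance parameter is an implementation choice:

```python
from itertools import product

# Example joint PMF built so that X and Y are independent.
joint = {(x, y): px * py
         for (x, px), (y, py) in product([(0, 0.3), (1, 0.7)],
                                         [(0, 0.6), (1, 0.4)])}

def is_independent(joint, tol=1e-12):
    # Compare p_{X,Y}(x, y) with p_X(x) p_Y(y) at every support point.
    xs = {x for x, _ in joint}
    ys = {y for _, y in joint}
    px = {x: sum(joint[(x, y)] for y in ys) for x in xs}
    py = {y: sum(joint[(x, y)] for x in xs) for y in ys}
    return all(abs(joint[(x, y)] - px[x] * py[y]) <= tol
               for x in xs for y in ys)

assert is_independent(joint)
```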

Moment Generating Functions (MGF)

Definition

MGF M_X(t) = E[e^{tX}] = Σ e^{tx} p_X(x). Said to exist if the sum is finite in a neighborhood of t = 0 (not guaranteed for all distributions).

Uses

Characterizes the distribution uniquely when it exists in a neighborhood of 0. Facilitates calculation of moments via derivatives at t = 0.

Properties

MGF of sum of independent RVs is product of individual MGFs. Useful in limit theorems.

M_X(t) = Σ_x e^{tx} p_X(x)
E[X^n] = M_X^{(n)}(0) = (d^n/dt^n) M_X(t) |_{t=0}
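The moment-extraction formula can be illustrated numerically using the Poisson MGF M_X(t) = exp(λ(e^t − 1)); the finite-difference step size h and the tolerances are assumptions of this sketch:

```python
from math import exp

lam = 2.0
M = lambda t: exp(lam * (exp(t) - 1))   # Poisson MGF

h = 1e-5
# Central differences approximate the derivatives of M at t = 0.
first_moment = (M(h) - M(-h)) / (2 * h)            # ≈ M'(0) = E[X]
second_moment = (M(h) - 2 * M(0) + M(-h)) / h**2   # ≈ M''(0) = E[X^2]

assert abs(first_moment - lam) < 1e-6              # E[X] = λ
assert abs(second_moment - (lam + lam**2)) < 1e-3  # E[X^2] = λ + λ²
```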

Law of Large Numbers

Statement

Sample averages of i.i.d. discrete RVs with finite mean converge to the expectation as sample size → ∞.

Types

Weak Law: convergence in probability. Strong Law: almost sure convergence.

Implications

Justifies frequency interpretation of probability. Basis for statistical estimation.
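A simulation sketch of the weak law with Bernoulli(p) draws; the seed, sample size, and closeness threshold are illustrative choices, and the final check is empirical evidence, not a proof:

```python
import random

random.seed(0)   # fixed seed for reproducibility
p, n = 0.3, 200_000
draws = [1 if random.random() < p else 0 for _ in range(n)]
sample_mean = sum(draws) / n

# For this n the standard deviation of the mean is about 0.001,
# so the sample mean should land well within 0.01 of p.
assert abs(sample_mean - p) < 0.01
```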

Limit Theorems and Convergence

Central Limit Theorem (CLT)

Normalized sum of i.i.d. RVs with finite variance converges in distribution to the normal. Applies for large n.

Convergence Modes

Almost sure, in probability, in distribution. Each with distinct implications.

Applications

Normal approximation to the binomial (large n) and to the Poisson (large λ); Poisson approximation to the binomial for large n, small p.
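The normal approximation to the binomial can be sketched with a continuity correction; n, p, and the cutoff are example values chosen so np and n(1−p) are large:

```python
from math import comb, erf, sqrt

n, p = 100, 0.5
mu, sigma = n * p, sqrt(n * p * (1 - p))   # matching normal parameters

def binom_cdf(k):
    # Exact P(X <= k) for X ~ Binomial(n, p)
    return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k + 1))

def normal_cdf(x):
    # Phi((x - mu) / sigma) via the error function
    return 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2))))

exact = binom_cdf(55)
approx = normal_cdf(55 + 0.5)   # continuity correction: P(X <= 55) ≈ Φ(55.5)

assert abs(exact - approx) < 0.01
```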

Applications in Statistics and Engineering

Statistical Modeling

Model discrete outcomes: successes, failures, count data. Basis for hypothesis testing, estimation.

Queueing Theory

Model arrivals, service counts with Poisson, geometric RVs. Analyze system performance metrics.

Reliability Engineering

Model component failures, lifetimes using discrete distributions. Calculate system reliability.

Information Theory

Discrete RVs model source symbols. Entropy and mutual information defined on PMFs.
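A minimal sketch of Shannon entropy H(X) = −Σ p log₂ p computed directly from a PMF, in bits:

```python
from math import log2

def entropy(pmf):
    # H(X) in bits; terms with p = 0 contribute nothing by convention.
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

fair_coin = {0: 0.5, 1: 0.5}
assert abs(entropy(fair_coin) - 1.0) < 1e-12   # a fair coin carries 1 bit
```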
