Definition and Basic Concept
What is Conditional Probability?
Conditional probability quantifies the likelihood of an event occurring given that another event has already occurred. It refines probability estimates by incorporating known information.
Mathematical Expression
For events A and B with P(B) > 0, the conditional probability of A given B is:
P(A|B) = P(A ∩ B) / P(B)
Interpretation
Focus: probability of A restricted to the sample space where B occurs. Effect: reduces uncertainty when partial information is available.
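The definition can be checked on a small hypothetical example: one roll of a fair six-sided die, with A = "roll is even" and B = "roll is greater than 3" (these events are assumptions for illustration).

```python
from fractions import Fraction

# Hypothetical example: one roll of a fair six-sided die.
# A = "roll is even", B = "roll is greater than 3".
outcomes = range(1, 7)
A = {n for n in outcomes if n % 2 == 0}   # {2, 4, 6}
B = {n for n in outcomes if n > 3}        # {4, 5, 6}

p_B = Fraction(len(B), 6)                 # P(B) = 3/6
p_A_and_B = Fraction(len(A & B), 6)       # P(A ∩ B) = 2/6 (outcomes 4 and 6)
p_A_given_B = p_A_and_B / p_B             # P(A|B) = (2/6) / (3/6) = 2/3

print(p_A_given_B)  # 2/3
```

Restricting the sample space to B = {4, 5, 6}, two of the three equally likely outcomes are even, matching the 2/3 result.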
Notation and Terminology
Events
Event A: the event of interest. Event B: the condition or known event.
Symbols
P(A|B): probability of A conditional on B. P(A ∩ B): joint probability of both A and B. P(B): probability of B.
Terminology
Conditioning: restricting the sample space to event B. Joint event: occurrence of both A and B simultaneously.
Multiplication Rule
Formula
The multiplication rule relates joint probability to conditional probability:
P(A ∩ B) = P(B) × P(A|B) = P(A) × P(B|A)
Usage
Calculates joint probabilities when conditional probabilities are known. Useful for sequential or dependent events.
Symmetry
Relation is symmetric: P(A ∩ B) can be expressed via conditioning on A or B.
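A sketch of the multiplication rule on an assumed sequential scenario: drawing two cards from a standard 52-card deck without replacement, with A = "first card is an ace" and B = "second card is an ace".

```python
from fractions import Fraction

# Assumed scenario: two cards drawn without replacement from a 52-card deck.
# A = "first card is an ace", B = "second card is an ace".
p_A = Fraction(4, 52)            # P(A)
p_B_given_A = Fraction(3, 51)    # P(B|A): 3 aces left among 51 cards
p_A_and_B = p_A * p_B_given_A    # multiplication rule

print(p_A_and_B)  # 1/221
```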
Independent and Dependent Events
Independent Events
Definition: events A and B are independent if occurrence of one does not affect the probability of the other.
Property
Mathematically: P(A|B) = P(A) and P(B|A) = P(B). Equivalently, P(A ∩ B) = P(A) × P(B).
Dependent Events
Definition: events where the occurrence of B affects the probability of A, or vice versa. The conditional probability differs from the marginal probability.
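The independence criterion P(A ∩ B) = P(A) × P(B) can be tested directly. The events below (a die roll, with A = "even", B = "in {1, 2}", C = "greater than 3") are assumed for illustration.

```python
from fractions import Fraction

def is_independent(p_a, p_b, p_a_and_b):
    # Independence holds exactly when P(A ∩ B) = P(A) × P(B).
    return p_a_and_b == p_a * p_b

# Fair die roll: A = "even", B = "in {1, 2}", C = "greater than 3".
p_A = Fraction(1, 2)
p_B = Fraction(1, 3)
p_A_and_B = Fraction(1, 6)   # only outcome 2
p_C = Fraction(1, 2)
p_A_and_C = Fraction(1, 3)   # outcomes 4 and 6

print(is_independent(p_A, p_B, p_A_and_B))  # True:  1/6 == 1/2 × 1/3
print(is_independent(p_A, p_C, p_A_and_C))  # False: 1/3 != 1/2 × 1/2
```

Exact rationals (`Fraction`) avoid the floating-point tolerance that an equality test on floats would require.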
Bayes’ Theorem
Statement
Bayes' theorem updates conditional probabilities based on new evidence or information.
Formula
P(A|B) = [P(B|A) × P(A)] / P(B)
Interpretation
Reverses conditioning: computes probability of A given B using likelihood P(B|A) and prior P(A).
Applications
Used in diagnostics, machine learning, decision theory, and inference.
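A minimal sketch of Bayes' theorem with assumed numbers: prior P(A) = 0.3, likelihood P(B|A) = 0.8, and P(B|¬A) = 0.1.

```python
# Assumed values: prior P(A) = 0.3, likelihood P(B|A) = 0.8, P(B|not A) = 0.1.
p_A = 0.3
p_B_given_A = 0.8
p_B_given_not_A = 0.1

# P(B) expanded over the partition {A, not A} (law of total probability).
p_B = p_B_given_A * p_A + p_B_given_not_A * (1 - p_A)
p_A_given_B = (p_B_given_A * p_A) / p_B

print(round(p_A_given_B, 4))  # 0.24 / 0.31 ≈ 0.7742
```

Observing B raises the probability of A from the prior 0.3 to roughly 0.77, illustrating how evidence updates belief.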
Law of Total Probability
Partition of Sample Space
Sample space can be partitioned into mutually exclusive and exhaustive events B1, B2, ..., Bn.
Formula
P(A) = Σ P(A|Bi) × P(Bi), i = 1 to n
Use Case
Computes the total probability of A by conditioning on each event in the partition.
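A sketch with an assumed partition: machines B1, B2, B3 produce 50%, 30%, and 20% of all items, with defect rates P(A|Bi) of 1%, 2%, and 5%.

```python
# Assumed partition: machines B1, B2, B3 produce 50%, 30%, 20% of items;
# their defect rates P(A|Bi) are 1%, 2%, 5%.
p_Bi = [0.5, 0.3, 0.2]
p_A_given_Bi = [0.01, 0.02, 0.05]

# P(A) = sum over i of P(A|Bi) * P(Bi)
p_A = sum(pa * pb for pa, pb in zip(p_A_given_Bi, p_Bi))
print(round(p_A, 3))  # 0.021
```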
Examples and Applications
Medical Testing
Probability of disease given positive test: P(Disease|Positive) using Bayes’ theorem.
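With assumed figures (prevalence 1%, sensitivity 99%, specificity 95%), Bayes' theorem gives the post-test probability:

```python
# Assumed figures: prevalence 1%, sensitivity 99%, specificity 95%.
prevalence = 0.01
sensitivity = 0.99    # P(Positive | Disease)
specificity = 0.95    # P(Negative | No Disease)

# P(Positive) via the law of total probability over {Disease, No Disease}.
p_positive = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
p_disease_given_positive = sensitivity * prevalence / p_positive

print(round(p_disease_given_positive, 3))  # ≈ 0.167
```

Despite a 99% sensitive test, the low prevalence keeps P(Disease|Positive) near 17%, a classic base-rate effect.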
Card Drawing
Probability of drawing an ace given a red card: P(Ace|Red).
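This works out directly from the definition, since a standard deck has 26 red cards, 2 of which are aces:

```python
from fractions import Fraction

# Standard 52-card deck: 26 red cards, 2 of which are aces.
p_red = Fraction(26, 52)
p_ace_and_red = Fraction(2, 52)
p_ace_given_red = p_ace_and_red / p_red

print(p_ace_given_red)  # 1/13
```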
Reliability Engineering
System failure probability given component failure.
Weather Forecasting
Rain probability given humidity levels or pressure changes.
Common Misconceptions
Confusing P(A|B) with P(B|A)
Conditional probabilities are not symmetric: in general, P(A|B) ≠ P(B|A).
Ignoring Conditioning
Failure to update probabilities when new information is available.
Assuming Independence Incorrectly
Assuming events independent without verifying can lead to errors.
Calculation Techniques and Tips
Stepwise Approach
Identify known probabilities. Define events clearly. Apply conditional probability formulas.
Use of Trees
Probability trees to visualize sequences with conditional branching.
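A probability tree can be sketched as nested branch probabilities; the two-urn setup below is a hypothetical example (urns chosen with probability 0.5 each, with different chances of drawing a red ball).

```python
# Hypothetical two-stage tree: choose an urn (0.5 each), then draw a ball.
# Urn 1: P(red) = 0.7; urn 2: P(red) = 0.2.
tree = {
    "urn1": (0.5, {"red": 0.7, "blue": 0.3}),
    "urn2": (0.5, {"red": 0.2, "blue": 0.8}),
}

# Multiply probabilities along each branch, then sum branches ending in red.
p_red = sum(p_urn * colors["red"] for p_urn, colors in tree.values())
print(round(p_red, 2))  # 0.45
```

Multiplying along a branch is the multiplication rule; summing across branches is the law of total probability.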
Tables and Matrices
Organize joint and conditional probabilities systematically.
Conditional Probability Tables
Structure
Tables show P(A|B) values for combinations of events A and B.
Example Table
| Event B | P(A|B) |
|---|---|
| B1 | 0.2 |
| B2 | 0.5 |
| B3 | 0.7 |
Interpretation
Facilitates quick reference of conditional probabilities for multiple scenarios.
Conditional Probability in Distributions
Discrete Distributions
Conditional PMF: P(X = x | Y = y) used in joint discrete variables.
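The conditional PMF is the slice of the joint PMF at Y = y, renormalized by P(Y = y). The joint distribution below is a hypothetical example.

```python
# Hypothetical joint PMF of (X, Y), stored as {(x, y): probability}.
joint = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

def conditional_pmf(joint, y):
    """Return P(X = x | Y = y) by renormalizing the slice at Y = y."""
    p_y = sum(p for (_, yy), p in joint.items() if yy == y)
    return {x: p / p_y for (x, yy), p in joint.items() if yy == y}

pmf = conditional_pmf(joint, 1)
print(pmf[0], pmf[1])  # 0.2/0.6 and 0.4/0.6
```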
Continuous Distributions
Conditional PDF: fX|Y(x|y) = fX,Y(x,y) / fY(y), wherever fY(y) > 0.
Applications
Used in Bayesian networks, Markov chains, regression analysis.
Summary and Key Points
Core Concept
Conditional probability computes the likelihood of an event given that another event has occurred.
Key Formulas
P(A|B) = P(A ∩ B) / P(B); P(A ∩ B) = P(B) × P(A|B); Bayes' theorem: P(A|B) = [P(B|A) × P(A)] / P(B)
Importance
Essential for understanding dependent events, updating beliefs, and decision making under uncertainty.