Probability Guide

Classical Probability

P(A) = (favorable outcomes) / (total equally likely outcomes). Example: P(heads) = 1/2 for a fair coin; P(rolling a 6) = 1/6 for a fair die.
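A minimal sketch of this counting definition, using exact fractions (the function name is just illustrative):

```python
from fractions import Fraction

def classical_probability(favorable, total):
    # P(A) = favorable outcomes / total equally likely outcomes
    return Fraction(favorable, total)

# Fair coin: 1 favorable outcome (heads) out of 2
p_heads = classical_probability(1, 2)  # 1/2
# Fair die: 1 favorable outcome (a 6) out of 6
p_six = classical_probability(1, 6)    # 1/6
```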

Addition Rule

P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
If mutually exclusive: P(A ∪ B) = P(A) + P(B)
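A quick check of the addition rule with a standard card-deck example (the events chosen here are my own illustration, not from the text):

```python
from fractions import Fraction

def p_union(p_a, p_b, p_both):
    # P(A or B) = P(A) + P(B) - P(A and B)
    return p_a + p_b - p_both

# Drawing one card: A = "heart" (13/52), B = "king" (4/52),
# overlap = "king of hearts" (1/52)
p = p_union(Fraction(13, 52), Fraction(4, 52), Fraction(1, 52))  # 16/52 = 4/13
```

Subtracting the overlap prevents the king of hearts from being counted twice.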

Multiplication Rule

P(A ∩ B) = P(A) × P(B|A)
If independent: P(A ∩ B) = P(A) × P(B)
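A sketch contrasting the dependent and independent cases (the card and coin scenarios are assumed examples):

```python
from fractions import Fraction

# Dependent events: drawing two aces without replacement.
# The second probability is conditional on the first draw.
p_first_ace = Fraction(4, 52)
p_second_ace_given_first = Fraction(3, 51)
p_two_aces = p_first_ace * p_second_ace_given_first  # 1/221

# Independent events: two heads on two fair coin flips.
p_two_heads = Fraction(1, 2) * Fraction(1, 2)  # 1/4
```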

Conditional Probability

P(A|B) = P(A ∩ B) / P(B), the probability of A given that B has occurred.
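A minimal sketch of the definition, with a die-roll example of my own choosing:

```python
from fractions import Fraction

def conditional(p_a_and_b, p_b):
    # P(A|B) = P(A and B) / P(B); requires P(B) > 0
    return p_a_and_b / p_b

# Fair die: A = "roll a 2", B = "roll an even number"
# P(A and B) = 1/6, P(B) = 3/6, so P(A|B) = 1/3
p = conditional(Fraction(1, 6), Fraction(3, 6))
```

Conditioning on B shrinks the sample space to {2, 4, 6}, and 2 is one of those three outcomes.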

Bayes' Theorem

P(A|B) = P(B|A) × P(A) / P(B)
Used to update belief based on new evidence. Foundation of Bayesian statistics and ML classifiers.
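A sketch of this belief update with a classic diagnostic-test setup; all the numbers (1% prevalence, 99% sensitivity, 5% false-positive rate) are illustrative assumptions, not from the text:

```python
from fractions import Fraction

def bayes(p_b_given_a, p_a, p_b):
    # P(A|B) = P(B|A) * P(A) / P(B)
    return p_b_given_a * p_a / p_b

# Assumed numbers for illustration only:
p_disease = Fraction(1, 100)             # prior: 1% prevalence
p_pos_given_disease = Fraction(99, 100)  # sensitivity
p_pos_given_healthy = Fraction(5, 100)   # false-positive rate

# Denominator via total probability: P(positive)
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

p_disease_given_pos = bayes(p_pos_given_disease, p_disease, p_pos)  # 1/6
```

Despite the accurate test, a positive result only raises the probability of disease to 1/6, because the prior was so low: exactly the kind of update Bayes' theorem formalizes.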

Expected Value

E(X) = Σ xᵢ · P(xᵢ), the weighted average of all possible outcomes. Example: E(fair die) = (1+2+3+4+5+6)/6 = 3.5
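A minimal sketch of the weighted sum, reproducing the fair-die example:

```python
from fractions import Fraction

def expected_value(outcomes):
    # E(X) = sum of x_i * P(x_i) over all outcomes
    return sum(x * p for x, p in outcomes)

# Fair die: faces 1..6, each with probability 1/6
fair_die = [(x, Fraction(1, 6)) for x in range(1, 7)]
e = expected_value(fair_die)  # 7/2 = 3.5
```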