The conditional probability that $A$ occurs given that $B$ has already occurred is denoted by $P(A \mid B)$.
It’s not too hard to reason that
$$P(A \mid B) = \frac{P(A \cap B)}{P(B)}.$$
The multiplication rule says
$$P(A \cap B) = P(A \mid B)\,P(B).$$
Let $A$ and $B$ be events. Since $P(A \cap B) = P(B \cap A)$, we get
$$P(A \mid B)\,P(B) = P(B \mid A)\,P(A).$$
This is useful because sometimes it’s easier to calculate conditional probabilities than direct probabilities.
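Quick sanity check (a toy two-dice example of my own, not part of the argument above): enumerate the outcomes and confirm that the definition of conditional probability and the multiplication rule agree.

```python
from fractions import Fraction
from itertools import product

# Toy example: roll two fair dice.
# A = "the sum is 8", B = "the first die shows at least 4".
outcomes = list(product(range(1, 7), repeat=2))

def prob(event):
    """Probability as the fraction of equally likely outcomes satisfying the event."""
    return Fraction(sum(1 for o in outcomes if event(o)), len(outcomes))

A = lambda o: o[0] + o[1] == 8
B = lambda o: o[0] >= 4

p_B = prob(B)                                # 1/2
p_A_and_B = prob(lambda o: A(o) and B(o))    # 1/12
p_A_given_B = p_A_and_B / p_B                # definition of P(A | B)

print(p_A_given_B)                           # 1/6
print(p_A_given_B * p_B == p_A_and_B)        # multiplication rule: True
```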
Bayes’ theorem states that
$$P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}.$$
By extension, if $A_1, A_2, \ldots, A_n$ are mutually exclusive events and we’d like to know which $A_i$ has occurred given $B$,
$$P(A_i \mid B) = \frac{P(B \mid A_i)\,P(A_i)}{\sum_{j=1}^{n} P(B \mid A_j)\,P(A_j)}.$$
The denominator above demonstrates the law of total probability: if $A_1, A_2, \ldots, A_n$ are mutually disjoint and cover the sample space, then
$$P(B) = \sum_{j=1}^{n} P(B \mid A_j)\,P(A_j).$$
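To see the extended form of Bayes’ theorem and the law of total probability in action, here’s a small urn-style example (the priors and likelihoods are made up purely for illustration):

```python
from fractions import Fraction

# Made-up example: urn i is picked with prior P(A_i); B = "a red ball is drawn".
priors      = [Fraction(1, 2), Fraction(1, 3), Fraction(1, 6)]   # P(A_1), P(A_2), P(A_3)
likelihoods = [Fraction(1, 4), Fraction(1, 2), Fraction(3, 4)]   # P(B | A_i)

# Law of total probability: P(B) = sum_j P(B | A_j) P(A_j)
p_B = sum(l * q for l, q in zip(likelihoods, priors))

# Extended Bayes: P(A_i | B) = P(B | A_i) P(A_i) / P(B)
posteriors = [l * q / p_B for l, q in zip(likelihoods, priors)]

print(p_B)               # 5/12
for q in posteriors:
    print(q)             # 3/10, 2/5, 3/10
print(sum(posteriors))   # 1 -- the posteriors form a proper distribution
```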
The odds of an event $A$ are $\frac{P(A)}{P(A^c)}$. After evidence $B$ has been introduced,
$$\frac{P(A \mid B)}{P(A^c \mid B)} = \frac{P(B \mid A)}{P(B \mid A^c)} \cdot \frac{P(A)}{P(A^c)}.$$
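The odds form is handy numerically. Below is a sketch with made-up diagnostic-test numbers (prevalence, sensitivity, and false-positive rate are all invented), checking that multiplying the prior odds by the likelihood ratio gives the same posterior as applying Bayes’ theorem directly:

```python
from fractions import Fraction

# Hypothetical numbers, chosen only for illustration.
p_A      = Fraction(1, 100)   # prior P(A): prevalence of a condition
p_B_A    = Fraction(9, 10)    # P(B | A): test is positive given the condition
p_B_notA = Fraction(5, 100)   # P(B | A^c): false-positive rate

prior_odds       = p_A / (1 - p_A)
likelihood_ratio = p_B_A / p_B_notA
posterior_odds   = likelihood_ratio * prior_odds   # odds form of Bayes' theorem

# Convert back to a probability and compare with Bayes' theorem directly.
p_A_given_B_from_odds = posterior_odds / (1 + posterior_odds)
p_B = p_B_A * p_A + p_B_notA * (1 - p_A)            # law of total probability
p_A_given_B_direct = p_B_A * p_A / p_B

print(p_A_given_B_from_odds == p_A_given_B_direct)  # True
print(p_A_given_B_direct)                           # 2/13, roughly 15%
```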
Two events $A$ and $B$ are independent if knowing $B$ tells us nothing about $A$, i.e. $P(A \mid B) = P(A)$; by the multiplication rule it follows that this is the same as $P(A \cap B) = P(A)\,P(B)$. Otherwise, the two events are dependent.
$n$ events $A_1, \ldots, A_n$ are independent if every subset $\{A_{i_1}, \ldots, A_{i_k}\}$ of them satisfies
$$P(A_{i_1} \cap \cdots \cap A_{i_k}) = P(A_{i_1}) \cdots P(A_{i_k}).$$
An infinite set of events is independent if every finite subset is independent.
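The "every subset" part matters: events can be pairwise independent without the whole collection being independent. A standard example (two fair coin flips; my own addition, checked by enumeration):

```python
from fractions import Fraction
from itertools import product

# Flip two fair coins. A, B, C below are pairwise independent,
# but the three together are not, so checking pairs is not enough.
outcomes = list(product("HT", repeat=2))

def prob(event):
    return Fraction(sum(1 for o in outcomes if event(o)), len(outcomes))

A = lambda o: o[0] == "H"       # first flip is heads
B = lambda o: o[1] == "H"       # second flip is heads
C = lambda o: o[0] == o[1]      # the two flips match

for X, Y in [(A, B), (A, C), (B, C)]:
    both = lambda o, X=X, Y=Y: X(o) and Y(o)
    print(prob(both) == prob(X) * prob(Y))              # True, True, True

all_three = lambda o: A(o) and B(o) and C(o)
print(prob(all_three) == prob(A) * prob(B) * prob(C))   # False: 1/4 != 1/8
```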
Example: Say Alice flips $n+1$ coins and Bob flips $n$ coins, each of which lands heads with probability $p$. What’s the probability that Alice flips more heads than Bob?
The probability that Bob gets exactly $k$ heads is
$$\binom{n}{k} p^k (1-p)^{n-k}.$$
If there are $k$ heads among Bob’s $n$ flips, the probability that Alice wins is the probability that she gets more than $k$ heads in her $n+1$ flips:
$$P(\text{Alice wins} \mid \text{Bob gets } k) = \sum_{j=k+1}^{n+1} \binom{n+1}{j} p^j (1-p)^{n+1-j}.$$
So, by the law of total probability,
$$P(\text{Alice wins}) = \sum_{k=0}^{n} P(\text{Bob gets } k)\, P(\text{Alice wins} \mid \text{Bob gets } k).$$
Altogether we get
$$P(\text{Alice wins}) = \sum_{k=0}^{n} \binom{n}{k} p^k (1-p)^{n-k} \sum_{j=k+1}^{n+1} \binom{n+1}{j} p^j (1-p)^{n+1-j}.$$
This is fucking ugly.
It does turn out that when $p = \frac{1}{2}$, the whole mess simplifies to $\frac{1}{2}$.
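Here’s a quick check of that claim (my own sketch, assuming the setup above: Alice flips $n+1$ coins, Bob flips $n$, every coin lands heads with probability $p$). Exact rational arithmetic confirms the double sum is $\frac{1}{2}$ for fair coins, and something else otherwise:

```python
from fractions import Fraction
from math import comb

def p_alice_wins(n, p):
    """The double sum above: Alice flips n+1 coins, Bob flips n, heads prob p."""
    q = 1 - p
    return sum(
        comb(n, k) * p**k * q**(n - k)
        * sum(comb(n + 1, j) * p**j * q**(n + 1 - j) for j in range(k + 1, n + 2))
        for k in range(n + 1)
    )

# With fair coins the mess collapses to 1/2, for every n tried here.
for n in range(1, 8):
    assert p_alice_wins(n, Fraction(1, 2)) == Fraction(1, 2)

# With a biased coin it does not.
print(p_alice_wins(5, Fraction(3, 10)))   # some fraction other than 1/2
```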
An ALTERNATE way of looking at it:
Let both players flip $n$ coins. At this point, either Alice is winning, Bob is winning, or it’s a tie.
It’s pretty clear that $P(\text{Alice is winning}) = P(\text{Bob is winning})$, since at this point both have flipped the same number of identical coins.
Now Alice flips her one remaining coin. If she’s already winning then she’s won. If she’s already losing then she’s lost, since one extra head can at best turn a deficit into a tie. If it’s a tie, she wins exactly when her last coin lands heads, which for a fair coin has probability $\frac{1}{2}$. So the probability she wins in total is
$$P(\text{Alice wins}) = P(\text{Alice is winning}) + \tfrac{1}{2}\,P(\text{tie}) = \frac{1 - P(\text{tie})}{2} + \frac{P(\text{tie})}{2} = \frac{1}{2}.$$
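The same numbers fall out if you just compute the three cases directly; a small sketch of mine (fair coins, an arbitrary $n$) after the first $n$ flips apiece:

```python
from fractions import Fraction
from math import comb

def binom_pmf(n, k):
    """P(k heads in n flips of a fair coin)."""
    return Fraction(comb(n, k), 2**n)

n = 5  # any n works

pairs  = [(a, b) for a in range(n + 1) for b in range(n + 1)]
ahead  = sum(binom_pmf(n, a) * binom_pmf(n, b) for a, b in pairs if a > b)
behind = sum(binom_pmf(n, a) * binom_pmf(n, b) for a, b in pairs if a < b)
tie    = sum(binom_pmf(n, a) * binom_pmf(n, b) for a, b in pairs if a == b)

print(ahead == behind)                 # symmetry: True
print(ahead + Fraction(1, 2) * tie)    # Alice wins with her extra coin: 1/2
```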
This second argument is very elegant. BUT it is also very fragile. Just about any change to the problem statement would break the logic…