Introduction to discrete probability. The first part of the class was a quiz on discrete probability.
Notes originally by @xiamx, with some formatting/language changes by @dellsystem.
1 Probability
The study of probability was initially motivated by gambling. We will consider probability w.r.t. finite (or countably infinite) probability spaces. A sample space S is the set of all possible outcomes of an experiment. For example, for a die, the sample space is S = \{1, 2, 3, 4, 5, 6\}.
An event A is a subset of the sample space S.
We associate with each outcome x \in S the probability p(x) that x is the outcome of the experiment. We say that p: 2^S \to \mathbb{R} is a valid probability distribution on S if:
- p(x) \geq 0 \; \forall x \in S
- \sum_{x \in S} p(x) = 1
- p(A) = \sum_{x \in A} p(x) for any event A \subseteq S
Sometimes, the outcomes are all equally likely, with p(x) = \frac{1}{|S|} \; \forall x \in S (for example, p(x) = \frac{1}{6} for each face x of a fair die). Usually, though, this is not the case.
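To make the definition concrete, here is a minimal Python sketch (not from the original notes) that stores a fair die's distribution as a dictionary, checks the conditions above, and computes the probability of an event; the particular event chosen is just an illustration.

```python
from fractions import Fraction

# Illustrative example: a fair six-sided die, with the distribution
# stored as a dict mapping each outcome to its probability.
S = {1, 2, 3, 4, 5, 6}
p = {x: Fraction(1, 6) for x in S}

# Check the first two conditions: non-negativity and summing to 1.
assert all(p[x] >= 0 for x in S)
assert sum(p.values()) == 1

# The probability of an event A ⊆ S is the sum of its outcomes' probabilities.
def prob(A):
    return sum(p[x] for x in A)

A = {2, 4, 6}      # event: the roll is even
print(prob(A))     # 1/2
```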
1.1 Examples
- What is the probability of winning Lotto 6/49, in which the goal is to correctly pick 6 numbers, all in \{1, \ldots, 49\}?
- Well, the size of the sample space is \binom{49}{6} = 13983816, and only one of these outcomes is the winning combination. So the probability is \frac{1}{13983816}.
- What is the probability of a full house in a game of poker?
- The size of the sample space is \binom{52}{5} = 2598960. There are 13 choices for the rank of the three-of-a-kind, \binom{4}{3} ways to choose its suits, 12 remaining choices for the rank of the pair, and \binom{4}{2} ways to choose its suits. So the number of ways to get a full house is 13 \cdot \binom{4}{3} \cdot 12 \cdot \binom{4}{2} = 3744, and the probability is \frac{3744}{2598960}. (Both counts are checked numerically in the sketch below.)
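Neither computation is hard by hand, but as a quick sanity check, here is a short Python snippet (an illustrative addition, not part of the original notes) that reproduces both counts using math.comb; the variable names are my own.

```python
from fractions import Fraction
from math import comb

# Lotto 6/49: one winning ticket out of C(49, 6) equally likely draws.
lotto_space = comb(49, 6)
print(lotto_space)                    # 13983816
print(Fraction(1, lotto_space))       # 1/13983816

# Full house: 13 ranks for the triple, C(4,3) suit choices for it,
# 12 remaining ranks for the pair, C(4,2) suit choices for it.
full_houses = 13 * comb(4, 3) * 12 * comb(4, 2)
hands = comb(52, 5)
print(full_houses, hands)             # 3744 2598960
print(Fraction(full_houses, hands))   # 6/4165 (the reduced form of 3744/2598960)
```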
1.2 Union and intersection
Lemma: For any probability distribution p: S \to \mathbb{R} and any two events A and B, we have:
p(A \cup B) = p(A) + p(B) - p(A \cap B)
The proof is straightforward using set algebra: A \cup B is the disjoint union of A and B - A, and p(B - A) = p(B) - p(A \cap B).
Furthermore:
p(A \cup B) = p(A - B) + p(B - A) + p(A \cap B)
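As a numerical sanity check (an illustrative addition, not a proof), the following Python sketch verifies both identities for a fair die with two arbitrarily chosen events A and B.

```python
from fractions import Fraction

# Fair die; A = "roll is even", B = "roll is at most 3" (arbitrary choices).
S = {1, 2, 3, 4, 5, 6}
p = {x: Fraction(1, 6) for x in S}

def prob(E):
    return sum(p[x] for x in E)

A, B = {2, 4, 6}, {1, 2, 3}

# Inclusion-exclusion: p(A ∪ B) = p(A) + p(B) - p(A ∩ B).
assert prob(A | B) == prob(A) + prob(B) - prob(A & B)

# Disjoint decomposition: p(A ∪ B) = p(A - B) + p(B - A) + p(A ∩ B).
assert prob(A | B) == prob(A - B) + prob(B - A) + prob(A & B)

print(prob(A | B))   # 5/6
```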