Thursday, February 20, 2014 CC-BY-NC
Discrete probability continued

Maintainer: admin

I wasn't here. Brief overview below.

1 The Monty Hall problem

You should switch: switching wins with probability 2/3, while staying wins with only 1/3. Intuitively, the host effectively merges the two doors you didn't pick into one. Proof by Bayes' theorem.
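A sketch of the Bayes computation, assuming (without loss of generality) that you pick door 1 and the host opens door 3. Let $C_i$ be the event that the car is behind door $i$ and $H_3$ the event that the host opens door 3; the host opens door 3 with probability $\frac{1}{2}$ if the car is behind door 1, with probability 1 if it is behind door 2, and never if it is behind door 3:

$$P(C_2 \mid H_3) = \frac{P(H_3 \mid C_2)\,P(C_2)}{\sum_{i=1}^{3} P(H_3 \mid C_i)\,P(C_i)} = \frac{1 \cdot \frac{1}{3}}{\frac{1}{2}\cdot\frac{1}{3} + 1\cdot\frac{1}{3} + 0\cdot\frac{1}{3}} = \frac{2}{3}$$

So switching (to door 2) wins with probability $\frac{2}{3}$.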

2 Discrete random variables

Probability density function (for a discrete variable, more precisely the probability mass function): $P(X = v)$. The sum over all values $v$ is of course 1: $\sum_v P(X = v) = 1$.
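For example, a fair six-sided die has $P(X = v) = \frac{1}{6}$ for each $v \in \{1, 2, 3, 4, 5, 6\}$, and these probabilities sum to 1.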

Expectation: $E(X) = \sum_v v \cdot P(X = v)$, i.e., the sum over all values $v$, each weighted by its probability.
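Continuing the die example:

$$E(X) = \frac{1}{6}(1 + 2 + 3 + 4 + 5 + 6) = 3.5$$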

Linearity of expectation: $E(aX + bY) = aE(X) + bE(Y)$, proof by definitions. Does not require independence.
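A sketch of the proof in the discrete case, summing over the joint distribution:

$$E(aX + bY) = \sum_{x, y} (ax + by)\,P(X = x, Y = y) = a \sum_x x \sum_y P(X = x, Y = y) + b \sum_y y \sum_x P(X = x, Y = y) = aE(X) + bE(Y)$$

Independence is never used: the inner sums just marginalize out the other variable.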

If $X$ and $Y$ are independent, then $E(X\cdot Y) = E(X) \cdot E(Y)$.
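A sketch of why, using independence to factor the joint probability:

$$E(X \cdot Y) = \sum_{x, y} xy\,P(X = x, Y = y) = \sum_{x, y} xy\,P(X = x)P(Y = y) = \left( \sum_x x\,P(X = x) \right)\left( \sum_y y\,P(Y = y) \right) = E(X)E(Y)$$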

2.1 Conditional expectation

$$E(X | Y = y)= \sum_{x} x \cdot P(X = x|Y=y) = \sum_{x} x \cdot \frac{P(X =x, Y=y)}{P(Y=y)}$$
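For example, if $X$ is a fair die roll and $Y = 1$ when the roll is even, then conditioned on $Y = 1$ each of 2, 4, 6 has probability $\frac{1}{3}$, so

$$E(X \mid Y = 1) = 2 \cdot \frac{1}{3} + 4 \cdot \frac{1}{3} + 6 \cdot \frac{1}{3} = 4$$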

2.2 The binomial distribution

Bernoulli trial: experiment with two outcomes (success/failure), success with probability $p$.
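For a single Bernoulli trial $X$, $P(X = 1) = p$ and $P(X = 0) = 1 - p$, so $E(X) = p$.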

Binomial distribution: counts the number of successes in $n$ independent Bernoulli trials.

$$\binom{n}{k} = \frac{n!}{k!(n-k)!} \tag{number of ways to pick $k$ elements out of $n$}$$
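Putting it together, if $X$ counts the successes, then for each $k$ from 0 to $n$:

$$P(X = k) = \binom{n}{k} p^k (1-p)^{n-k} \tag{$k$ successes and $n-k$ failures, in any of $\binom{n}{k}$ orders}$$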

$$E(X) = np \tag{by the linearity of expectation}$$
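A sketch of the derivation: write $X = X_1 + \cdots + X_n$, where $X_i = 1$ if trial $i$ succeeds and 0 otherwise. Each $E(X_i) = p$, so

$$E(X) = E(X_1) + \cdots + E(X_n) = np$$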