Engineering Probability Class 7 Thu 2018-02-08

1   Discrete Random Variables

  1. Chapter 3, p 96. Discrete random variables
    1. From now on, our random experiments will always produce numbers, at least indirectly; these numbers are called random variables.
    2. Then we can compute, e.g., a fair value to pay for a gamble. What should you pay to play roulette so that betting on red breaks even on average?
    3. Discrete is different from discreet.
    4. Random experiment \(\rightarrow\) nonnumerical outcome \(\zeta\) \(\rightarrow\)
    5. Random Variable \(X(\zeta )\). Any real number.
    6. Random vars in general: X, Y, ...
    7. particular values: x, y, ...
    8. It's the outcome that's random, not the r.v., which is a deterministic function of the outcome.
  2. Example 3.1 Coin tosses
    1. Define X to be the number of heads from 3 tosses.
    2. Here \(\zeta\) is the observed toss sequence, e.g., \(X(\text{HHT})=2\).
  3. Example 3.2 Betting game addon to 3.1
    1. Define another random var Y to be payoff: 8 if X=3, 1 if X=2, 0 else.
    2. Y is derived from X
  4. Example 3.3 add probs to 3.2, assuming fair coin. P[X=2], P[Y=8]
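The probabilities in Example 3.3 can be checked by brute-force enumeration of the 8 equally likely outcomes; a minimal Python sketch (the helper names are mine, not the textbook's):

```python
from itertools import product
from fractions import Fraction

# The 8 equally likely outcomes of 3 fair coin tosses.
outcomes = list(product("HT", repeat=3))

def X(zeta):
    """Number of heads in the toss sequence zeta (Example 3.1)."""
    return zeta.count("H")

def Y(zeta):
    """Payoff: 8 if X=3, 1 if X=2, 0 otherwise (Example 3.2)."""
    x = X(zeta)
    return 8 if x == 3 else 1 if x == 2 else 0

def prob(event):
    """P[event] = (# outcomes satisfying event) / 8 for a fair coin."""
    return Fraction(sum(event(z) for z in outcomes), len(outcomes))

print(prob(lambda z: X(z) == 2))  # 3/8
print(prob(lambda z: Y(z) == 8))  # 1/8
```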
  5. 3.1.1 Ignore since it's starred.
  6. 3.2 Discrete r.v. and Probability mass function (pmf).
    1. The pmf gives, for every real number x, the probability that the random variable X equals x.
    2. If X cannot have the value x, then the pmf is 0 at x.
    3. \(p_X(x) = P[X=x] = P[\{\zeta:X(\zeta)=x\}]\)
  7. p100: 3 properties of pmf. They're all common sense.
    1. Nonnegative.
    2. Sums to one.
    3. The probability of an event B is the sum of the probabilities of the outcomes in B.
  8. Example 3.5 probability of # heads in 3 coin tosses: \(p_X(0)=1/8\)
  9. Example 3.6 betting game \(p_Y(1)=3/8\)
  10. Fig 3.4. You can graph the pmf.
  11. There are many types of random variables, depending on the shape of the pmf. These start out the same as the various probability laws in Chapter 2. However we'll see more types (e.g., Poisson) and more properties of each type (e.g., mean, standard deviation, generating function).
  12. Example 3.7 random number generator
    1. produces integer X equally likely in range 0..M-1
    2. \(S_X=\{0, 1, ... M-1 \}\)
    3. pmf: \(p_X(k)=1/M\) for k in 0..M-1.
    4. X is a uniform random variable over that set.
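The uniform pmf can be written down and checked against the three pmf properties from p100; a small sketch (M=8 is an arbitrary illustrative choice):

```python
from fractions import Fraction

M = 8  # arbitrary illustrative range size
pmf = {k: Fraction(1, M) for k in range(M)}  # p_X(k) = 1/M

# The three pmf properties: nonnegative, sums to one, and P[B] is the
# sum of the pmf over the outcomes in B.
assert all(p >= 0 for p in pmf.values())
assert sum(pmf.values()) == 1
B = {0, 2, 4, 6}  # e.g., the event "X is even"
print(sum(pmf[k] for k in B))  # 1/2
```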
  13. Example 3.8 Bernoulli random variable
    1. indicator function \(I_A(\zeta)=1\) iff \(\zeta\in A\)
    2. pmf of \(I_A\): \(p_I(0)=1-p,\ p_I(1)=p\), where \(p=P[A]\)
    3. \(I_A\) is a Bernoulli random variable.
  14. Example 3.9 Message transmission until success
    1. \(p_X(k)=q^{k-1}p,\ k=1,2,3,\ldots\), where p is the per-transmission success probability and \(q=1-p\)
    2. Geometric random variable
    3. What about P[X is even]?
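The even-X question can be answered by summing the geometric pmf over even k; a sketch of the computation:

\[ P[X \text{ even}] = \sum_{j=1}^{\infty} p\,q^{2j-1} = pq\sum_{j=0}^{\infty}(q^2)^j = \frac{pq}{1-q^2} = \frac{q}{1+q}, \]

using \(1-q^2=(1-q)(1+q)=p(1+q)\). For instance, with \(p=q=1/2\) this gives \(1/3\).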
  15. Example 3.10 Number of transmission errors
    1. \(p_X(k) = {n \choose k} p^k q^{n-k}, k=0,1,...n\)
    2. binomial random variable
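The binomial pmf of Example 3.10 is easy to tabulate; a sketch (n=3, p=1/2 is the coin-toss setting of Example 3.5, chosen here for illustration):

```python
from math import comb

def binom_pmf(n, p):
    """Binomial pmf: p_X(k) = C(n,k) p^k q^(n-k), k = 0..n."""
    q = 1 - p
    return [comb(n, k) * p**k * q**(n - k) for k in range(n + 1)]

pmf = binom_pmf(3, 0.5)
print(pmf)       # [0.125, 0.375, 0.375, 0.125] -- matches p_X(0)=1/8
print(sum(pmf))  # 1.0
```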
  16. Fig 3.5 You can graph the relative frequencies from running an experiment repeatedly.
    1. It will approach the pmf graph (absent pathological cases, like the Cauchy distribution, that are beyond this course).
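The convergence in Fig 3.5 can be watched numerically; a simulation sketch for X = # heads in 3 fair tosses (seed and sample sizes are arbitrary):

```python
import random

random.seed(1)  # arbitrary seed, for reproducibility only

def rel_freqs(n):
    """Relative frequencies of X = # heads in 3 fair tosses over n runs."""
    counts = [0] * 4
    for _ in range(n):
        counts[sum(random.random() < 0.5 for _ in range(3))] += 1
    return [c / n for c in counts]

print(rel_freqs(100))     # still noisy
print(rel_freqs(100000))  # close to the pmf [0.125, 0.375, 0.375, 0.125]
```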
  17. 3.3 p104 Expected value and other moments
    1. This is a way to summarize a r.v., and capture important aspects.
    2. E.g., What's a fair price to pay for a lottery ticket?
    3. Mean or expected value or center of mass: \(m_X = E[X] = \sum_{x\in S_X} x p_X(x)\)
    4. Defined iff absolute convergence: \(\sum |x| p(x) < \infty\)
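The definition of E[X] translates directly to code; a sketch covering Examples 3.11–3.13 (exact arithmetic via Fraction is my choice, not the textbook's):

```python
from fractions import Fraction

def expect(pmf):
    """E[X] = sum over x of x * p_X(x), for a pmf given as {x: p}."""
    return sum(x * p for x, p in pmf.items())

# Bernoulli(p): E = p                              (Example 3.11)
p = Fraction(1, 2)
print(expect({0: 1 - p, 1: p}))                    # 1/2

# Binomial, n=3, p=1/2: expected # heads = np      (Example 3.12)
binom = {0: Fraction(1, 8), 1: Fraction(3, 8),
         2: Fraction(3, 8), 3: Fraction(1, 8)}
print(expect(binom))                               # 3/2

# Uniform on {0,...,M-1}: E = (M-1)/2              (Example 3.13)
M = 8
print(expect({k: Fraction(1, M) for k in range(M)}))  # 7/2
```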
  18. Example 3.11 Mean of Bernoulli r.v.
  19. Example 3.12 Mean of Binomial r.v. What's the expected # of heads in 3 tosses?
  20. Example 3.13 Mean of uniform discrete r.v.
  21. Run an experiment n times and observe \(x(1), x(2), ...\)
    1. \(N_k(n)\) # times \(x_k\) was seen
    2. \(f_k(n) = N_k(n)/n\) frequencies
    3. Sample mean \(\langle X\rangle_n = \sum_k x_k f_k(n)\)
    4. With lots of experiments, frequencies approach probabilities and sample mean converges to E[X]
    5. However it may take a long time, which is why stock market investors can go broke first.
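The convergence of the sample mean (and its slowness for small n) can be seen in a simulation; a sketch with X = # heads in 3 fair tosses, where E[X] = 3/2:

```python
import random

random.seed(0)  # arbitrary seed, for reproducibility only

def sample_mean(n):
    """Sample mean of X = # heads in 3 fair tosses over n experiments."""
    total = 0
    for _ in range(n):
        total += sum(random.random() < 0.5 for _ in range(3))
    return total / n

for n in (10, 1000, 100000):
    print(n, sample_mean(n))  # drifts toward E[X] = 1.5 as n grows
```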
  22. Example 3.14 p 106 Betting game
  23. Example 3.15 Mean of a geometric r.v.
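The mean of the geometric r.v. in Example 3.15 follows by differentiating the geometric series; a sketch of the steps:

\[ E[X] = \sum_{k=1}^{\infty} k\,q^{k-1}p = p\,\frac{d}{dq}\sum_{k=0}^{\infty} q^{k} = p\,\frac{d}{dq}\,\frac{1}{1-q} = \frac{p}{(1-q)^2} = \frac{1}{p}. \]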
  24. Example 3.16 St Petersburg paradox
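In the usual formulation of the St Petersburg paradox, the payoff is \(2^k\) when the first head appears on toss \(k\), which happens with probability \(2^{-k}\), so the expected payoff diverges:

\[ E[Y] = \sum_{k=1}^{\infty} 2^{k}\,2^{-k} = \sum_{k=1}^{\infty} 1 = \infty. \]

This fails the absolute-convergence condition for the mean, so no finite entry fee is "fair."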