Engineering Probability Class 6 Mon 2020-02-03

1   Probability in the real world - enrichment

The book A Million Random Digits with 100,000 Normal Deviates, discussed below under computer generation of random numbers.

2   Chapter 2 ctd

  1. Multinomial probability law

    1. There are M different possible outcomes from an experiment, e.g., which face of a die shows.

    2. Probability of particular outcome: \(p_i\)

    3. Now run the experiment n times.

    4. Probability that the i-th outcome occurs exactly \(k_i\) times, where \(\sum_{i=1}^M k_i = n\):

      \begin{equation*} P[(k_1,k_2,...,k_M)] = \frac{n!}{k_1! k_2! ... k_M!} p_1^{k_1} p_2^{k_2}...p_M^{k_M} \end{equation*}
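      Not in the text, but this formula is easy to check numerically. A minimal Python sketch (the function name multinomial_pmf and the example numbers are my own):

      from math import factorial, prod

      def multinomial_pmf(ks, ps):
          # P[(k_1,...,k_M)] = n!/(k_1! ... k_M!) * p_1^k_1 * ... * p_M^k_M
          n = sum(ks)
          coef = factorial(n)
          for k in ks:
              coef //= factorial(k)
          return coef * prod(p**k for p, k in zip(ps, ks))

      # e.g., a fair die tossed n=6 times, each face showing exactly once:
      print(multinomial_pmf([1]*6, [1/6]*6))   # 6!/6^6, about 0.0154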
  2. Example 2.41 p63 dartboard.

  3. Example 2.42 p63 random phone numbers.

  4. 2.7 Computer generation of random numbers

    1. Skip this section, except for following points.
    2. Executive summary: it's surprisingly hard to generate good random numbers. Commercial software has been known to get this wrong. By now they've (I hope) gotten it right, so just call a subroutine.
    3. Arizona lottery got it wrong in 1998.
    4. Even random electronic noise is hard to use properly. The creators of the best-selling 1955 book A Million Random Digits with 100,000 Normal Deviates had trouble generating random numbers this way: asymmetries crept into their circuits, perhaps because of component drift. For a laugh, read the reviews.
    5. Pseudo-random number generator: The subroutine returns numbers according to some algorithm (e.g., it doesn't use cosmic rays), but for your purposes, they're random.
    6. Computer random number routines usually return the same sequence of numbers each time you run your program, so you can reproduce your results.
    7. You can override this by seeding the generator with a genuinely random number, e.g., from Linux's /dev/random, as sketched below.
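      A minimal Python sketch of both behaviors (the seed value and variable names are arbitrary):

      import os
      import random

      random.seed(42)          # fixed seed: the same sequence every run (reproducible)
      print(random.random())

      # os.urandom draws on the kernel's entropy pool (the same source behind
      # /dev/random on modern Linux), so the sequence now differs between runs:
      random.seed(int.from_bytes(os.urandom(8), "big"))
      print(random.random())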
  5. 2.8 and 2.9 p70 Fine points: Skip.

  6. Review Bayes' theorem, since it is important. Here is a fictitious (because none of these probabilities have any justification) SETI example.

    1. A priori probability of extraterrestrial life = P[L] = \(10^{-8}\).
    2. For ease of typing, let L' be the complement of L.
    3. Run a SETI experiment. R (for Radio) is the event that it has a positive result.
    4. P[R|L] = \(10^{-5}\), P[R|L'] = \(10^{-10}\).
    5. What is P[L|R]?
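      Applying Bayes' theorem with these made-up numbers:

      \begin{equation*} P[L|R] = \frac{P[R|L]\,P[L]}{P[R|L]\,P[L] + P[R|L']\,P[L']} = \frac{10^{-5}\cdot 10^{-8}}{10^{-5}\cdot 10^{-8} + 10^{-10}(1-10^{-8})} \approx 10^{-3} \end{equation*}

      Even a positive result leaves life quite improbable, because the prior is so small.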
  7. Some specific probability laws

    1. In all of these, successive events are independent of each other.
    2. A Bernoulli trial is one toss of a coin where p is probability of head.
    3. We saw binomial and multinomial probabilities in class 4.
    4. The binomial law gives the probability of exactly k heads in n tosses of an unfair coin.
    5. The multinomial law gives the probability of exactly \(k_i\) occurrences of the i-th face in n tosses of a die.

3   Review questions

  1. You have a coin where the probability of a head is p=2/3. If you toss it twice, what's the probability that you will see one head and one tail?
    1. 1/2
    2. 1/3
    3. 2/9
    4. 5/9
    5. 4/9
  2. Imagine that the coin you toss might land on its edge (and stay there). P(head)=.5, P(tail)=.4, P(edge)=.1. You toss it 3 times. What's the probability that it lands heads twice and on edge once?
    1. .025
    2. .05
    3. .081
    4. .1
    5. .333
  3. Now you toss the coin repeatedly until it lands on edge. What's the probability that this happens for the first time on the 3rd toss?
    1. .025
    2. .05
    3. .081
    4. .1
    5. .333

4   Chapter 2 ctd

  1. 2.6.4 p63 Geometric probability law

    1. Repeat Bernoulli experiment until 1st success.
    2. Define outcome to be # trials until that happens.
    3. Define q=(1-p).
    4. \(p(m) = (1-p)^{m-1}p = q^{m-1}p\) (p has 2 different uses here).
    5. \(\sum_{m=1}^\infty p(m) =1\)
    6. Probability that more than K trials are required = \(q^K\) (i.e., the first K trials all fail).
  2. Example: probability that more than 10 tosses of a die are required to get a 6 = \(\left(\frac{5}{6}\right)^{10} \approx 0.16\)
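    A quick Monte Carlo check of that number (my sketch; the trial count is arbitrary):

    import random

    trials = 100_000
    more_than_10 = sum(
        all(random.randint(1, 6) != 6 for _ in range(10))   # no 6 in the first 10 tosses
        for _ in range(trials)
    )
    print(more_than_10 / trials)   # should be near (5/6)**10, about 0.1615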

  3. Review: You have a coin where the probability of a head is p=2/3. What's the probability that the 1st head occurs on the 2nd toss?

    1. 1/2
    2. 1/3
    3. 2/9
    4. 5/9
    5. 4/9
  4. Example 2.43 p64: error control by retransmission. A message sent over a noisy channel is checksummed so the receiver can tell whether it got mangled, and if so ask for retransmission. TCP/IP does this.

    Aside: This works better when the round-trip time is reasonable. Using this when talking to Mars, where a round trip takes many minutes, is challenging.

  5. 2.6.5 p64 Sequences (chains) of dependent experiments.

    1. This is an important topic, but mostly beyond this course.
    2. In many areas there is a sequence of observations, and the probability of each observation depends on what was observed before.
    3. It relates to Markov chains.
    4. Motivation: speech and language recognition, translation, compression
    5. E.g., in English text, the probability of a u is higher if the previous char was q.
    6. The probability of a b may be higher if the previous char was u (than if it was x), but is lower if the previous two chars are qu.
    7. We need to look at probabilities of sequences, char by char; see the bigram sketch after this list.
    8. Same idea in speech recognition: phonemes follow phonemes...
    9. Same in language understanding: verb follows noun...
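      As an illustration (mine, not the book's), a tiny Python bigram model that estimates P[next char | previous char] from sample text:

      from collections import Counter

      text = "the quick brown fox quietly quit the queue"   # toy corpus (made up)
      pairs = Counter(zip(text, text[1:]))   # counts of adjacent character pairs
      singles = Counter(text[:-1])           # counts of each char as a predecessor

      def p_next(prev, nxt):
          # estimated P[next char = nxt | previous char = prev]
          return pairs[(prev, nxt)] / singles[prev]

      print(p_next('q', 'u'))   # 1.0: in this toy corpus, q is always followed by u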
  6. Example 2.44 p64. Example 2.45 p66.

5   Discrete Random Variables

  1. Chapter 3, p 96. Discrete random variables
    1. From now on our random experiments will always produce numbers, called random variables, at least indirectly.
    2. Then we can compute, e.g., a fair value to pay for a gamble. What should you pay to play roulette so that betting on red breaks even on average?
    3. Discrete is different from discreet.
    4. Random experiment \(\rightarrow\) nonnumerical outcome \(\zeta\) \(\rightarrow\)
    5. Random variable \(X(\zeta)\), which can be any real number.
    6. Random vars in general: X, Y, ...
    7. Particular values: x, y, ...
    8. It's the outcome that's random, not the r.v., which is a deterministic function of the outcome.
  2. Example 3.1 p97 Coin tosses
    1. Define X to be the number of heads from 3 tosses.
    2. The outcome \(\zeta\) is the sequence of tosses; e.g., \(X(HHT)=2\).
  3. Example 3.2 Betting game addon to 3.1
    1. Define another random var Y to be payoff: 8 if X=3, 1 if X=2, 0 else.
    2. Y is derived from X
  4. Example 3.3: add probabilities to 3.2, assuming a fair coin. What are P[X=2] and P[Y=8]?
  5. 3.1.1 Ignore since it's starred.
  6. 3.2 Discrete r.v. and Probability mass function (pmf).
    1. The pmf gives the probability of each possible value of the random variable X; it is defined for every real number.
    2. If X cannot have the value x, then the pmf is 0 at x.
    3. \(p_X(x) = P[X=x] = P[\{\zeta:X(\zeta)=x\}]\)
  7. p100: 3 properties of pmf. They're all common sense.
    1. Nonnegative.
    2. Sums to one.
    3. The probability of an event B is the sum of the probabilities of the outcomes in B.
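      For concreteness (my sketch, not the book's), a pmf can be represented as a Python dict and the three properties checked directly; here, the head-count pmf from Example 3.5 below:

      # pmf of X = number of heads in 3 fair-coin tosses
      pmf = {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}

      assert all(p >= 0 for p in pmf.values())          # (1) nonnegative
      assert abs(sum(pmf.values()) - 1) < 1e-12         # (2) sums to one

      # (3) P[B] = sum over the outcomes in B, e.g., B = {X >= 2}:
      print(sum(p for x, p in pmf.items() if x >= 2))   # 0.5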
  8. Example 3.5 p101 probability of # heads in 3 coin tosses: \(p_X(0)=1/8\)
  9. Example 3.6 betting game \(p_Y(1)=3/8\)
  10. Fig 3.4. You can graph the pmf.
  11. There are many types of random variables, depending on the shape of the pmf. These start out the same as the various probability laws in Chapter 2. However we'll see more types (e.g., Poisson) and more properties of each type (e.g., mean, standard deviation, generating function).
  12. Example 3.7 random number generator
    1. produces integer X equally likely in range 0..M-1
    2. \(S_X=\{0, 1, ... M-1 \}\)
    3. pmf: \(p_X(k)=1/M\) for k in 0..M-1.
    4. X is a uniform random variable over that set.
  13. Example 3.8 Bernoulli random variable
    1. indicator function \(I_A(\zeta)=1\) iff \(\zeta\in A\)
    2. pmf of \(I_A\): \(p_I(0)=1-p, p_I(1)=p\)
    3. \(I_A\) is a Bernoulli random variable.
  14. Example 3.9 Message transmission until success
    1. \(p_X(k)=q^{k-1}p, k=1,2,3,...\)
    2. Geometric random variable
    3. What about P[X is even]?
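      A worked answer (not in the text): sum the geometric pmf over even k, using the geometric series:

      \begin{equation*} P[X \text{ even}] = \sum_{j=1}^\infty p\,q^{2j-1} = pq\sum_{j=0}^\infty q^{2j} = \frac{pq}{1-q^2} = \frac{q}{1+q} \end{equation*}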
  15. Example 3.10 Number of transmission errors
    1. \(p_X(k) = {n \choose k} p^k q^{n-k}, k=0,1,...n\)
    2. binomial random variable
  16. Fig 3.5 You can graph the relative frequencies from running an experiment repeatedly.
    1. It will approach the pmf graph (absent pathological cases, like the Cauchy distribution, that are beyond this course).
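      A minimal simulation sketch of that convergence (the experiment count is arbitrary), comparing empirical frequencies of the 3-toss head count against its binomial pmf:

      import random
      from collections import Counter
      from math import comb

      n_exp = 100_000
      counts = Counter(sum(random.random() < 0.5 for _ in range(3)) for _ in range(n_exp))

      for k in range(4):
          print(k, counts[k] / n_exp, comb(3, k) / 8)   # relative frequency vs pmf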
  17. 3.3 p104 Expected value and other moments
    1. This is a way to summarize a r.v., and capture important aspects.
    2. E.g., What's a fair price to pay for a lottery ticket?
    3. Mean or expected value or center of mass: \(m_X = E[X] = \sum_{x\in S_X} x p_X(x)\)
    4. Defined iff absolute convergence: \(\sum |x| p(x) < \infty\)
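      For example (my sketch), the fair price for the Example 3.2 betting game is its expected payoff, computed directly from the pmf of Y:

      # pmf of the payoff Y (fair coin): 8 if X=3, 1 if X=2, 0 otherwise
      pmf_Y = {0: 4/8, 1: 3/8, 8: 1/8}

      mean_Y = sum(y * p for y, p in pmf_Y.items())
      print(mean_Y)   # 1.375 = 11/8, the break-even price to play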
  18. Example 3.11 Mean of Bernoulli r.v.
  19. Example 3.12 Mean of Binomial r.v. What's the expected # of heads in 3 tosses?
  20. Example 3.13 Mean of uniform discrete r.v.
  21. Run an experiment n times and observe \(x(1), x(2), ...\)
    1. \(N_k(n)\) = # times the value \(x_k\) was seen
    2. \(f_k(n) = N_k(n)/n\), the relative frequencies
    3. Sample mean \(\langle X\rangle_n = \sum_k x_k f_k(n)\)
    4. With lots of experiments, frequencies approach probabilities and sample mean converges to E[X]
    5. However, it may take a long time, which is why stock-market investors can go broke first.
  22. Example 3.14 p 106 Betting game
  23. Example 3.15 Mean of a geometric r.v.
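    For reference, the standard derivation (a sketch; the book may organize it differently) differentiates the geometric series:

      \begin{equation*} E[X] = \sum_{m=1}^\infty m q^{m-1} p = p\,\frac{d}{dq}\sum_{m=0}^\infty q^m = \frac{p}{(1-q)^2} = \frac{1}{p} \end{equation*}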
  24. Example 3.16 p107 St Petersburg paradox
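    In the usual formulation of the paradox (you win \(2^k\) if the first head occurs on toss k of a fair coin), the mean diverges, so no finite entry fee breaks even:

      \begin{equation*} E[X] = \sum_{k=1}^\infty 2^k \left(\frac{1}{2}\right)^k = \sum_{k=1}^\infty 1 = \infty \end{equation*}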