
Engineering Probability Class 7 Thu 2020-02-06

1   Piazza

Remember that we have a piazza site for posting questions and answers.

3   Review questions

  1. Imagine that the coin you toss might land on its edge (and stay there): P(head)=.5, P(tail)=.4, P(edge)=.1. You toss it 3 times. What's the probability that it lands heads twice and on edge once?
    1. .025
    2. .05
    3. .075
    4. .081
    5. .1
  2. Now you toss the coin repeatedly until it lands on edge. What's the probability that this happens for the first time on the 3rd toss?
    1. .025
    2. .05
    3. .081
    4. .1
    5. .333
  3. review: You have a coin where the probability of a head is p=2/3. What's the probability that the 1st head occurs on the 2nd toss?
    1. 1/2
    2. 1/3
    3. 2/9
    4. 5/9
    5. 4/9
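These three answers can be checked with a few lines of arithmetic (a sketch: Q1 is a multinomial count, Q2 and Q3 are geometric "first success on trial k" probabilities):

```python
from math import comb, isclose

# Q1: 2 heads and 1 edge in 3 tosses; comb(3, 2) counts which tosses are heads
p_q1 = comb(3, 2) * 0.5**2 * 0.1
print(p_q1)  # ~0.075

# Q2: first edge on the 3rd toss: two non-edges, then an edge
p_q2 = 0.9**2 * 0.1
print(p_q2)  # ~0.081

# Q3: first head on the 2nd toss with p = 2/3: one tail, then a head
p_q3 = (1/3) * (2/3)
print(p_q3)  # 2/9
```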

4   Wikipedia

Wikipedia's articles on technical subjects can be excellent. In fact, they often have more detail than you want. Here are some that are relevant to this course. Read at least the first few paragraphs.

  1. https://en.wikipedia.org/wiki/Outcome_(probability)
  2. https://en.wikipedia.org/wiki/Random_variable
  3. https://en.wikipedia.org/wiki/Indicator_function
  4. https://en.wikipedia.org/wiki/Gambler%27s_fallacy
  5. https://en.wikipedia.org/wiki/Fat-tailed_distribution
  6. https://en.wikipedia.org/wiki/St._Petersburg_paradox

5   Two types of testing errors

  1. There's an event A, with probability P[A]=p.
  2. There's a dependent event, perhaps a test or a transmission, B.
  3. You know P[B|A] and P[B|A'].
  4. Wikipedia:
    1. https://en.wikipedia.org/wiki/Type_I_and_type_II_errors
    2. https://en.wikipedia.org/wiki/Sensitivity_and_specificity
  5. Terminology:
    1. Type I error: false positive.
    2. Type II error: false negative.
    3. Sensitivity: true positive proportion.
    4. Specificity: true negative proportion.
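As a sketch of how these quantities combine via the theorem on total probability and Bayes' rule (all numbers here are made-up, purely for illustration):

```python
# Hypothetical numbers: A is "has the condition", B is "test is positive".
p_A = 0.01             # P[A], prevalence
p_B_given_A = 0.95     # P[B|A], sensitivity
p_B_given_notA = 0.02  # P[B|A'], false-positive rate = 1 - specificity

# Total probability: P[B] = P[B|A] P[A] + P[B|A'] P[A']
p_B = p_B_given_A * p_A + p_B_given_notA * (1 - p_A)

# Bayes' rule: P[A|B] = P[B|A] P[A] / P[B]
p_A_given_B = p_B_given_A * p_A / p_B
print(p_A_given_B)  # ~0.32: most positives are false when prevalence is low
```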

6   Chapter 3 ctd

  1. This chapter covers discrete (finite or countably infinite) r.v.s, in contrast to continuous r.v.s, which are covered later.
  2. Discrete r.v.s we've seen so far:
    1. uniform: M outcomes 0...M-1, each with equal probability 1/M
    2. Bernoulli: 0 w.p. q=(1-p) or 1 w.p. p
    3. binomial: # of heads in n independent Bernoulli trials
    4. geometric: # of trials until the first success, where each trial succeeds with probability p
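The four PMFs can be written out directly (a sketch; the parameter values in the final check are arbitrary):

```python
from math import comb

def uniform_pmf(k, M):          # k in 0..M-1
    return 1 / M

def bernoulli_pmf(k, p):        # k in {0, 1}
    return p if k == 1 else 1 - p

def binomial_pmf(k, n, p):      # k heads in n Bernoulli trials
    return comb(n, k) * p**k * (1 - p)**(n - k)

def geometric_pmf(k, p):        # first success on trial k, k = 1, 2, ...
    return (1 - p)**(k - 1) * p

# Sanity check: a PMF sums to 1 over its support
assert abs(sum(binomial_pmf(k, 10, 0.3) for k in range(11)) - 1) < 1e-12
```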
  3. 3.1.1 p107 Expected value of a function of a r.v.
    1. Z=g(X)
    2. E[Z] = E[g(X)] = \(\sum_k g(x_k) p_X(x_k)\)
  4. Example 3.17 p107 square law device
  5. \(E[a g(X)+b h(X)+c] = a E[g(X)] + b E[h(X)] + c\)
  6. Example 3.18 Square law device continued
  7. Example 3.19 Multiplexor discards packets
  8. Compute mean of a binomial distribution.
  9. Compute mean of a geometric distribution.
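Both means can be checked numerically against the known closed forms, E[binomial] = np and E[geometric] = 1/p (a sketch; n and p are arbitrary, and the infinite geometric sum is truncated):

```python
from math import comb

n, p = 10, 0.3

# Binomial mean: sum of k * P(X = k) over k = 0..n equals n*p
mean_binom = sum(k * comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1))
print(mean_binom)  # ~3.0 = n*p

# Geometric mean: sum of k * (1-p)^(k-1) * p equals 1/p (truncated sum)
mean_geom = sum(k * (1 - p)**(k - 1) * p for k in range(1, 2000))
print(mean_geom)   # ~3.33 = 1/p
```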
  10. 3.3.1, page 107: Operations on means: sums, scaling, functions
  11. review: From a deck of cards, I draw a card, look at it, put it back and reshuffle. Then I do it again. What's the probability that exactly one of the 2 cards is a heart?
    • A: 2/13
    • B: 3/16
    • C: 1/4
    • D: 3/8
    • E: 1/2
  12. review: From a deck of cards, I draw a card, look at it, put it back and reshuffle. I keep repeating this. What's the probability that the 2nd card is the 1st time I see hearts?
    • A: 2/13
    • B: 3/16
    • C: 1/4
    • D: 3/8
    • E: 1/2
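Both card answers follow from p = 1/4 for hearts on each draw, since replacing and reshuffling makes the draws independent:

```python
p, q = 1/4, 3/4  # P(heart), P(not heart) per draw, with replacement

# Exactly one heart in 2 draws: binomial(2, 1/4) at k = 1
print(2 * p * q)  # 0.375 = 3/8

# First heart on the 2nd draw: geometric, one miss then a hit
print(q * p)      # 0.1875 = 3/16
```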
  13. 3.3.2 page 109 Variance of an r.v.
    1. That is: how wide is its distribution?
    2. Example: compare the performance of stocks vs bonds from year to year. The expected values (means) of the returns may not be so different. (This is debated and depends, e.g., on what period you look at). However, stocks' returns have a much larger variance than bonds.
    3. \(\sigma^2_X = VAR[X] = E[(X-m_X)^2] = \sum_x (x-m_X)^2 p_X(x)\)
    4. standard deviation \(\sigma_X = \sqrt{VAR[X]}\)
    5. \(VAR[X] = E[X^2] - m_X^2\)
    6. 2nd moment: \(E[X^2]\)
    7. also 3rd, 4th... moments, like a Taylor series for probability
    8. shifting the distribution: VAR[X+c] = VAR[X]
    9. scaling: \(VAR[cX] = c^2 VAR[X]\)
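The identities above (VAR[X] = E[X²] − m_X², shifting, and scaling) can be verified on a small made-up PMF (a sketch; the values of xs and ps are arbitrary):

```python
xs = [0, 1, 2]
ps = [0.2, 0.5, 0.3]  # arbitrary example PMF

def mean(xs, ps):
    return sum(x * p for x, p in zip(xs, ps))

def var(xs, ps):
    m = mean(xs, ps)
    return sum((x - m)**2 * p for x, p in zip(xs, ps))

m = mean(xs, ps)
# VAR[X] = E[X^2] - m_X^2
assert abs(var(xs, ps) - (sum(x * x * p for x, p in zip(xs, ps)) - m**2)) < 1e-12
# Shifting: VAR[X + c] = VAR[X]
assert abs(var([x + 5 for x in xs], ps) - var(xs, ps)) < 1e-12
# Scaling: VAR[cX] = c^2 VAR[X]
assert abs(var([3 * x for x in xs], ps) - 9 * var(xs, ps)) < 1e-12
```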
  14. Derive variance for Bernoulli.
  15. Example 3.20 3 coin tosses
    1. general rule for binomial: VAR[X]=npq
    2. Derive it.
    3. Note that the variances add because the trials are independent.
    4. Note that the relative spread \(\sigma_X/m_X = \sqrt{npq}/(np) = \sqrt{q/(np)}\) shrinks as n grows.
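A quick numerical check of VAR[X] = npq for the binomial, and of the shrinking relative spread (a sketch with p = 1/2; the values of n are arbitrary):

```python
from math import comb, sqrt

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

p, q = 0.5, 0.5
rel_spread = {}
for n in (3, 30, 300):
    mean = sum(k * binom_pmf(k, n, p) for k in range(n + 1))
    var = sum((k - mean)**2 * binom_pmf(k, n, p) for k in range(n + 1))
    assert abs(var - n * p * q) < 1e-6   # VAR = n p q
    rel_spread[n] = sqrt(var) / mean
print(rel_spread)  # sigma/mean falls roughly like 1/sqrt(n)
```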
  16. review: The experiment is drawing a card from a deck, seeing if it's hearts, putting it back, shuffling, and repeating for a total of 100 times. The random variable is the total # of hearts seen, from 0 to 100. What's the mean of this r.v.?
    • A: 1/4
    • B: 25
    • C: 1/2
    • D: 50
    • E: 1
  17. The experiment is drawing a card from a deck, seeing if it's hearts, putting it back, shuffling, and repeating for a total of 100 times. The random variable is the # of hearts seen, from 0 to 100. What's the variance of this r.v.?
    • A: 3/16
    • B: 1
    • C: 25/4
    • D: 75/4
    • E: 100
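Both 100-draw answers are the binomial mean np and variance npq with n = 100 and p = 1/4:

```python
# Number of hearts in 100 draws with replacement: binomial(n=100, p=1/4)
n, p, q = 100, 1/4, 3/4
print(n * p)      # 25.0  (mean)
print(n * p * q)  # 18.75 (variance, = 75/4)
```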

7   Chapter 3 ctd

  1. Geometric distribution: review mean and variance.
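A numerical check of the geometric mean 1/p and variance (1-p)/p² by truncated sums (a sketch; p = 0.4 is arbitrary):

```python
p = 0.4
# Truncate the infinite sum; the neglected tail is negligibly small
pmf = [(1 - p)**(k - 1) * p for k in range(1, 3000)]
mean = sum(k * pk for k, pk in enumerate(pmf, start=1))
var = sum(k * k * pk for k, pk in enumerate(pmf, start=1)) - mean**2
print(mean)  # ~2.5  = 1/p
print(var)   # ~3.75 = (1-p)/p^2
```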

  2. Suppose that you have just sold your internet startup for $10M. You have retired and now you are trying to climb Mt Everest. You intend to keep trying until you make it. Assume that:

    1. Each attempt has a 1/3 chance of success.
    2. The attempts are independent; failure on one does not affect future attempts.
    3. Each attempt costs $70K.

    Review: What is your expected cost of a successful climb?

    1. $70K.
    2. $140K.
    3. $210K.
    4. $280K.
    5. $700K.
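The number of attempts until success is geometric with p = 1/3, so the expected number of attempts is 1/p = 3, giving answer 3 ($210K):

```python
p = 1/3                  # chance of success per attempt
cost_per_attempt = 70_000
expected_attempts = 1 / p           # geometric mean: 1/p = 3
expected_cost = cost_per_attempt * expected_attempts
print(expected_cost)     # ~210000, i.e. $210K
```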