Engineering Probability Class 10 Thurs 2019-02-14

1   Homework 4

1.1   Due date

I said Thurs Feb 16 when I meant Thurs Feb 14. We'll compromise at Tues Feb 19.

Since you're studying for the test, the next homework will be due on 2019-02-21.

1.2   About question 1

When you have ambiguous evidence, you have to decide which way you're going to lean. There's no perfect answer. If you realize that, in fact, you are making a decision here about how to decide, then you're one step ahead.

Another example: imagine that on 1960-10-05, your radar that is watching Greenland for incoming Russian bombers sees a reflection. What is it? If you guess wrong one way, you let the USSR clobber the USA. If you guess wrong the other way, you accidentally start WWIII. Quick! You have to decide now!! (Really, it was the moon.)

This is getting beyond this course, but you might next decide to pay more money to get better evidence, or whatever. However it will never be perfect.

2   Poisson vs Binomial vs Normal distributions

The binomial distribution is the exact formula for the probability of k successes in n independent trials (sampling with replacement), each with success probability p.
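For reference, that standard formula (letting X count the successes) is

\(P(X=k) = \binom{n}{k} p^k (1-p)^{n-k}, \quad k = 0, 1, \dots, n.\)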

When n is large but p is small (so that \(\lambda = np\) is moderate), the Poisson distribution is a good approximation to the binomial. Roughly, \(n>10\) and \(k<5\).

When n is large and p is not too small or too large, then the normal distribution, which we haven't seen yet, is an excellent approximation. Roughly, \(n>10\) and \(|n-k| > 2\sqrt{n}\).

For big n, you cannot compute the binomial directly, and for really big n, you cannot even use the Poisson. Imagine that your experiment is to measure the number of atoms decaying in this uranium ore. How would you compute \(\left(10^{23}\right)!\)?

OTOH, for small n, you can compute binomial by hand. Poisson and normal probably require a calculator.
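To see the three distributions side by side, here is a minimal sketch (not from the class; it uses only the Python standard library) comparing the exact binomial pmf with its Poisson and normal approximations for one illustrative choice of n, k, and p:

```python
import math

def binom_pmf(n, k, p):
    """Exact binomial probability of k successes in n trials."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    """Poisson probability of k events, with rate lam = n*p."""
    return math.exp(-lam) * lam**k / math.factorial(k)

def normal_approx(n, k, p):
    """Normal approximation with continuity correction:
    P(X=k) ~= Phi((k+0.5-mu)/sigma) - Phi((k-0.5-mu)/sigma)."""
    mu = n * p
    sigma = math.sqrt(n * p * (1 - p))
    phi = lambda z: 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return phi((k + 0.5 - mu) / sigma) - phi((k - 0.5 - mu) / sigma)

# n large, p small: both approximations land near the exact value.
n, k, p = 100, 3, 0.03
print(binom_pmf(n, k, p))      # exact
print(poisson_pmf(k, n * p))   # Poisson approximation
print(normal_approx(n, k, p))  # normal approximation
```

The point of the factorial trap above: `poisson_pmf` only needs \(k!\), never \(n!\), which is why it survives huge n where the binomial coefficient is hopeless.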

3   Homework solutions

are online under the Files tab at the top of the page.

5   Iclicker questions

Which discrete probability distribution best describes each of the following cases?

  1. Your car has five tires (including the spare), which may each independently be flat. The event is that not more than one tire is flat.
    1. Bernoulli
    2. binomial
    3. geometric
    4. Poisson
    5. uniform
  2. 1,000,000 widgets are made this year, of which 1,000 are bad. You buy 5 at random. The event is that not more than one widget is bad.
    1. Bernoulli
    2. binomial
    3. geometric
    4. Poisson
    5. uniform
  3. You toss a weighted coin, which lands heads 3/4 of the time.
    1. Bernoulli
    2. binomial
    3. geometric
    4. Poisson
    5. uniform
  4. You toss a fair 12-sided die.
    1. Bernoulli
    2. binomial
    3. geometric
    4. Poisson
    5. uniform
  5. You're learning to drive a car, and trying to pass the test. The event of interest is the number of times you have to take the test to pass. Assume that the tests are independent of each other and have equal probability.
    1. Bernoulli
    2. binomial
    3. geometric
    4. Poisson
    5. uniform
  6. It's Nov 17 and you're outside in a dark place watching the Leonid meteor shower. The event of interest is the number of meteors per hour that you see.
    1. Bernoulli
    2. binomial
    3. geometric
    4. Poisson
    5. uniform
  7. It's Nov 17.... The new event of interest is the number of seconds until you see the next meteor.
    1. Bernoulli
    2. binomial
    3. geometric
    4. Poisson
    5. uniform

6   Exam 1

  1. Closed book, but a calculator and one 2-sided letter-paper-size note sheet are allowed.
  2. Material is from chapters 1-3.
  3. Questions will be based on book, class, and homework examples and exercises.
  4. The hard part for you may be deciding what formula to use.
  5. Any calculations will (IMHO) be easy.
  6. Speed should not be a problem; most people should finish in half the allotted time.

7   Chapter 3 exercises

We'll try these exercises from the text in class.

  1. 3.51a on page 135.
  2. 3.88 on page 139.
  3. 3.91.

8   Chapter 4

  1. I will try to ignore most of the theory at the start of the chapter.
  2. Now we will see continuous random variables.
    1. The probability of the r.v. being any exact value is infinitesimal,
    2. so we talk about the probability that it's in a range.
  3. Sometimes there are mixed discrete and continuous r.v.
    1. Let X be the time to get a taxi at the airport.
    2. 80% of the time a taxi is already there, so \(P(X=0)=0.8\).
    3. Otherwise we wait a uniform time from 0 to 20 minutes, so \(P(a<X<b)=0.01(b-a)\), for \(0<a<b<20\).
  4. Remember that for discrete r.v. we have a probability mass function (pmf).
  5. For continuous r.v. we now have a probability density function (pdf), \(f_X(x)\).
  6. \(P(a<X<a+da) = f_X(a)\,da\).
  7. For any r.v., we have a cumulative distribution function (cdf) \(F_X(x)\).
  8. The subscript is interesting only when we are using more than one cdf and need to tell them apart.
  9. Definition: \(F_X(x) = P(X \le x)\).
  10. The distinction between \(<\) and \(\le\) matters only for discrete r.v.
  11. As usual, Wikipedia's article Cumulative_distribution_function isn't bad, and goes deeper than we need here.
  12. We compute means and other moments by the obvious integrals, e.g., \(E[X] = \int_{-\infty}^{\infty} x\,f_X(x)\,dx\).
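The taxi example above can also be checked numerically. This is a minimal sketch (not from the class), assuming exactly the model stated above: a point mass of 0.8 at zero plus a Uniform(0, 20) wait the remaining 20% of the time. The analytic mean is \(0.8\cdot 0 + 0.2\cdot 10 = 2\) minutes.

```python
import random

def taxi_wait():
    """Sample the mixed taxi-wait r.v.: P(X=0)=0.8, else Uniform(0,20) minutes."""
    if random.random() < 0.8:
        return 0.0
    return random.uniform(0, 20)

def taxi_cdf(x):
    """CDF F(x) = P(X <= x): jumps to 0.8 at x=0, then rises linearly to 1 at x=20."""
    if x < 0:
        return 0.0
    if x >= 20:
        return 1.0
    return 0.8 + 0.2 * (x / 20)

# Monte Carlo estimate of the mean; should land near the analytic value of 2.
random.seed(42)
n = 100_000
est = sum(taxi_wait() for _ in range(n)) / n
print(round(est, 2))
```

Note the jump of the cdf at 0: that discontinuity is exactly the discrete part of this mixed distribution.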