
Engineering Probability Class 8 Mon 2020-02-10

1   Probability in the real world - enrichment

Oct. 5, 1960: The moon tricks a radar.

Where would YOU make the tradeoff between type I and type II errors?

2   Chapter 3 ctd

  1. 3.4 page 111 Conditional pmf

  2. Example 3.24 Residual waiting time

    1. X, the time to transmit a message, is uniform on 1, ..., L.
    2. If X exceeds m, what's the probability that the remaining time is j?
    3. \(p_X(m+j|X>m) = \frac{P[X =m+j]}{P[X>m]} = \frac{1/L}{(L-m)/L} = 1/(L-m)\)
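A quick simulation sketch of the residual-waiting-time result; the choices L=10 and m=4 are arbitrary, not from the text.

```python
import random

# Sketch: check p_X(m+j | X > m) = 1/(L-m) by simulation.
# L and m are arbitrary illustrative choices.
L, m = 10, 4
trials = 100_000
random.seed(1)

counts = {}
kept = 0
for _ in range(trials):
    x = random.randint(1, L)          # X uniform on 1..L
    if x > m:                          # condition on X > m
        kept += 1
        counts[x - m] = counts.get(x - m, 0) + 1

# Each remaining time j = 1..L-m should occur with probability 1/(L-m).
for j in sorted(counts):
    print(j, counts[j] / kept)         # each near 1/6 ≈ 0.167
```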
  3. \(p_X(x) = \sum p_X(x|B_i) P[B_i]\)

  4. Example 3.25 p 113 device lifetimes

    1. 2 classes of devices, geometric lifetimes.
    2. Type 1, probability \(\alpha\), parameter r. Type 2 parameter s.
    3. What's the pmf of the lifetime of a device drawn at random from the total set?
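A sketch of the total pmf in this example, using the geometric convention \(p(k)=r(1-r)^{k-1}\); the numeric values for \(\alpha\), r, and s are illustrative assumptions, not from the text.

```python
# Sketch of Example 3.25: pmf of the lifetime of a device drawn at random
# from a mix of two geometric types. alpha, r, s are illustrative values.
alpha, r, s = 0.5, 0.2, 0.05

def pmf_total(k):
    """p(k) = p(k | type 1) P[type 1] + p(k | type 2) P[type 2]."""
    p1 = r * (1 - r) ** (k - 1)   # geometric, parameter r
    p2 = s * (1 - s) ** (k - 1)   # geometric, parameter s
    return alpha * p1 + (1 - alpha) * p2

# A pmf must sum to 1 over its support.
total = sum(pmf_total(k) for k in range(1, 2000))
print(total)   # close to 1
```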
  5. Example 3.26, p114.

  6. 3.5 p115 More important discrete r.v.s

  7. Table 3.1: We haven't seen \(G_X(z)\) yet.

  8. 3.5.1 p 117 The Bernoulli Random Variable

    We'll do mean and variance.
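A minimal check of both, by direct enumeration over the two outcomes; the value of p is an arbitrary choice.

```python
# Bernoulli r.v.: X = 1 with probability p, else 0. p is illustrative.
p = 0.3
mean = 1 * p + 0 * (1 - p)                            # E[X] = p
var = (1 - mean) ** 2 * p + (0 - mean) ** 2 * (1 - p)  # E[(X-m)^2] = p(1-p)
print(mean, var)   # p and p(1-p): 0.3 and 0.21, up to float rounding
```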

  9. Example 3.28 p119 Variance of a Binomial Random Variable

  10. Example 3.29 Redundant Systems

  11. 3.5.3 p119 The Geometric Random Variable

    It models the time between two consecutive occurrences in a sequence of independent random events. E.g., the length of a run of white bits in a scanned image (if the bits are independent).
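A sketch of the run-length interpretation: if bits are independent and white with probability p, the run of white bits before the first black bit is geometric. The value of p here is an illustrative assumption.

```python
import random

# Length of a run of white bits before the first black bit,
# bits independent. p_white is an illustrative assumption.
p_white = 0.9
random.seed(2)

def run_length():
    n = 0
    while random.random() < p_white:   # another white bit
        n += 1
    return n                           # geometric, starting at 0

runs = [run_length() for _ in range(100_000)]
avg = sum(runs) / len(runs)
print(avg)   # near p/(1-p) = 9 for this convention
```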

  12. 3.5.4 Poisson r.v.

    1. The experiment is observing how many of a large number of rare events happen in, say, 1 minute.

    2. E.g., how many cosmic particles hit your DRAM, how many people call a call center.

    3. The individual events are independent. (In the real world this might be false. If a black hole occurs, you're going to get a lot of cosmic particles. If the ATM network crashes, there will be a lot of calls.)

    4. The r.v. is the number that happen in that period.

    5. There is one parameter, \(\alpha\). Often this is called \(\lambda\).

      \begin{equation*} p(k) = \frac{\alpha^k}{k!}e^{-\alpha} \end{equation*}
    6. Mean and variance are both \(\alpha\) (so the std dev is \(\sqrt{\alpha}\)).

    7. In the real world, events might be dependent.

  13. Example 3.32 p123 Errors in Optical Transmission

  14. 3.5.5 p124 The Uniform Random Variable

3   Poisson vs Binomial vs Normal distributions

The binomial distribution is the exact formula for the probability of k successes from n trials (with replacement).

When n is large and p is small (so that \(\alpha = np\) is moderate), the Poisson distribution is a good approximation to the binomial. Roughly, n>10, k<5.

When n is large and p is neither too small nor too large, then the normal distribution, which we haven't seen yet, is an excellent approximation. Roughly, n>10 and \(|n-k| > 2\sqrt{n}\).

For big n, you cannot compute the binomial directly, and for really big n, you cannot even use the Poisson formula. Imagine that your experiment is to measure the number of atoms decaying in this uranium ore. How would you compute \(\left(10^{23}\right)!\)?

OTOH, for small n, you can compute binomial by hand. Poisson and normal probably require a calculator.
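A sketch comparing the three side by side; n and p are arbitrary illustrative values chosen so p is small, where the Poisson approximation should do well.

```python
from math import comb, exp, factorial, sqrt, pi

# Compare the exact binomial with the Poisson and normal approximations.
# n and p are arbitrary illustrative values.
n, p = 50, 0.05
alpha = n * p                              # Poisson parameter
mu, sigma = n * p, sqrt(n * p * (1 - p))   # normal parameters

for k in [0, 1, 2, 3, 5]:
    b = comb(n, k) * p ** k * (1 - p) ** (n - k)   # exact binomial
    po = alpha ** k / factorial(k) * exp(-alpha)   # Poisson approx
    no = exp(-(k - mu) ** 2 / (2 * sigma ** 2)) / (sigma * sqrt(2 * pi))
    print(k, round(b, 4), round(po, 4), round(no, 4))
```

For this small p, the Poisson column tracks the exact binomial more closely than the normal column does.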

4   Chapter 4

  1. I will try to ignore most of the theory at the start of the chapter.
  2. Now we will see continuous random variables.
    1. The probability of the r.v. being any exact value is infinitesimal,
    2. so we talk about the probability that it's in a range.
  3. Sometimes there are mixed discrete and continuous r.v.
    1. Let X be the time to get a taxi at the airport.
    2. 80% of the time a taxi is already there, so P(X=0)=.8.
    3. Otherwise we wait a uniform time from 0 to 20 minutes, so P(a<X<b)=.01(b-a), for 0<a<b<20.
  4. Remember that for discrete r.v. we have a probability mass function (pmf).
  5. For continuous r.v. we now have a probability density function (pdf), \(f_X(x)\).
  6. P(a < X < a+da) = f_X(a) da
  7. For any r.v., we have a cumulative distribution function (cdf) \(F_X(x)\).
  8. The subscript is interesting only when we are using more than one cdf and need to tell them apart.
  9. Definition: F(x) = P(X<=x).
  10. The <= is relevant only for discrete r.v.
  11. As usual, Wikipedia isn't bad and goes deeper than we need here: Cumulative_distribution_function.
  12. We compute means and other moments by the obvious integrals.
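For instance, the taxi-time r.v. above has cdf F(x) = 0.8 + 0.01x on 0 <= x < 20, and its mean combines the point mass with an integral: E[X] = 0(.8) + \(\int_0^{20} x\,(0.01)\,dx\) = 2 minutes. A numerical sketch, with a midpoint sum standing in for the integral:

```python
# Mixed taxi-time r.v.: P(X=0) = 0.8, plus density 0.01 on (0, 20).

def F(x):
    """cdf: point mass at 0 plus the uniform part."""
    if x < 0:
        return 0.0
    if x >= 20:
        return 1.0
    return 0.8 + 0.01 * x

# Mean: E[X] = 0 * 0.8 + integral of x * 0.01 dx over (0, 20) = 2.
steps = 100_000
dx = 20 / steps
mean = 0 * 0.8 + sum((i + 0.5) * dx * 0.01 * dx for i in range(steps))
print(F(0), F(10), mean)   # 0.8, 0.9, and about 2.0 (up to rounding)
```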