Engineering Probability Class 21 Mon 2019-04-01

1   Iclicker questions

  1. What is $$\int_{-\infty}^\infty e^{-x^2/2}\, dx$$?
    1. 1/2
    2. 1
    3. $2\pi$
    4. $\sqrt{2\pi}$
    5. $1/\sqrt{2\pi}$
  2. What is the largest possible value for a correlation coefficient?
    1. 1/2
    2. 1
    3. $2\pi$
    4. $\sqrt{2\pi}$
    5. $1/\sqrt{2\pi}$
  3. The most reasonable probability distribution for the number of defects on an integrated circuit caused by dust particles, cosmic rays, etc., is:
    1. Exponential
    2. Poisson
    3. Normal
    4. Uniform
    5. Binomial
  4. The most reasonable probability distribution for the time until the next request hits your web server is:
    1. Exponential
    2. Poisson
    3. Normal
    4. Uniform
    5. Binomial
  5. If you add two independent normal random variables, each with variance 10, what is the variance of the sum?
    1. 1
    2. $\sqrt2$
    3. 10
    4. $10\sqrt2$
    5. 20
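
A quick numerical check of questions 1 and 5, as a minimal sketch using numpy and scipy (the sample size of $10^6$ is arbitrary):

```python
# Minimal sketch: numerically checking Iclicker questions 1 and 5.
import numpy as np
from scipy.integrate import quad

# Question 1: the Gaussian integral.
val, _ = quad(lambda x: np.exp(-x**2 / 2), -np.inf, np.inf)
print(val, np.sqrt(2 * np.pi))          # both are about 2.5066

# Question 5: variance of the sum of two independent normals, each with variance 10.
rng = np.random.default_rng(0)
x = rng.normal(0.0, np.sqrt(10), 1_000_000)
y = rng.normal(0.0, np.sqrt(10), 1_000_000)
print(np.var(x + y))                    # about 20
```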

2   Material from text

2.1   6.1.2 Joint Distribution Functions, ctd.

  1. Example 6.7 Multiplicative Sequence, p 308.

2.2   6.1.3 Independence, p 309

  1. Definition 6.16.

  2. Example 6.8 Independence.

  3. Example 6.9 Maximum and Minimum of n Random Variables

    Apply this to i.i.d. uniform r.v.s; a worked version follows this list.

  4. Example 6.10 Merging of Independent Poisson Arrivals, p 310; a small simulation sketch follows this list.

  5. Example 6.11 Reliability of Redundant Systems

  6. Reminder for exponential r.v.:

    1. $f(x) = \lambda e^{-\lambda x}$
    2. $F(x) = 1-e^{-\lambda x}$
    3. $\mu = 1/\lambda$
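
A worked version of Example 6.9 for the uniform case (a minimal sketch, assuming $X_1,\dots,X_n$ are i.i.d. uniform on $(0,1)$; the interval is only for concreteness). Since the $X_i$ are independent,

$$P\Big(\max_i X_i \le x\Big) = \prod_{i=1}^n P(X_i \le x) = x^n, \qquad P\Big(\min_i X_i \le x\Big) = 1 - \prod_{i=1}^n P(X_i > x) = 1 - (1-x)^n, \qquad 0 \le x \le 1.$$

Differentiating gives $f_{\max}(x) = n x^{n-1}$ and $f_{\min}(x) = n (1-x)^{n-1}$.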

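For Example 6.10, a small simulation sketch (the rates of 2 and 3 arrivals per unit time are made-up numbers): adding the counts from two independent Poisson streams gives a merged count that behaves like a Poisson r.v. with rate $\lambda_1 + \lambda_2$, so its mean and variance both come out near 5.

```python
# Minimal sketch: merging two independent Poisson arrival streams.
import numpy as np

rng = np.random.default_rng(0)
lam1, lam2, trials = 2.0, 3.0, 100_000

n1 = rng.poisson(lam1, trials)   # arrivals from stream 1 in one unit-time window
n2 = rng.poisson(lam2, trials)   # arrivals from stream 2 in the same window
merged = n1 + n2                 # arrivals seen by the merged stream

# For a Poisson r.v., mean = variance, so both should be about lam1 + lam2 = 5.
print(merged.mean(), merged.var())
```
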
2.3   6.2.2 Transformations of Random Vectors

  1. Let A be a 1 km cube in the atmosphere. Your coordinates are in km.
  2. Pick a point uniformly in it: $f_X(\vec{x}) = 1$ (in units of km$^{-3}$) inside the cube, 0 outside.
  3. Now transform to use m, not km: $\vec{Z} = 1000\,\vec{X}$.
  4. $f_Z(\vec{z}) = \frac{1}{1000^3}\, f_X(\vec{z}/1000) = 10^{-9}$ (m$^{-3}$) inside the cube in meter coordinates.
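
The factor of $\frac{1}{1000^3}$ comes from differentiating the joint CDF; a minimal derivation (the inequality is componentwise):

$$F_Z(\vec{z}) = P(1000\,\vec{X} \le \vec{z}) = F_X(\vec{z}/1000), \qquad f_Z(\vec{z}) = \frac{\partial^3 F_Z(\vec{z})}{\partial z_1\, \partial z_2\, \partial z_3} = \frac{1}{1000^3}\, f_X(\vec{z}/1000).$$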

2.4   6.2.3 pdf of General Transformations

We skip Section 6.2.3. However, a historical note about Student's t distribution:

Student was the pseudonym of William Sealy Gosset, a statistician working for the Guinness brewery in Ireland. He developed several statistical techniques for sampling beer to assure its quality. Guinness did not let him publish under his real name because the techniques were considered trade secrets.

2.5   6.3 Expected values of vector random variables, p 318

  1. Section 6.3, page 316, extends the covariance to a matrix. Even with N variables, note that we're comparing only pairs of variables. A complicated three-variable dependency can exist (and did, in a much earlier example) even though all the pairwise covariances are 0, so the covariance matrix would not reveal it.
  2. Note the sequence (sketched in code after this list).
    1. First, the correlation matrix has the expectations of the products.
    2. Then the covariance matrix corrects for the means not being 0.
    3. Finally, the correlation coefficients (not shown here) correct for the variances not being 1.
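
A minimal numpy sketch of this sequence, using sample averages in place of expectations (the mean vector and covariance matrix below are made-up numbers):

```python
# Minimal sketch: correlation matrix -> covariance matrix -> correlation coefficients.
import numpy as np

rng = np.random.default_rng(0)
X = rng.multivariate_normal([1.0, 2.0, 3.0],           # made-up means
                            [[2.0, 0.8, 0.0],          # made-up covariance
                             [0.8, 1.0, 0.3],
                             [0.0, 0.3, 1.5]],
                            size=100_000)              # one sample per row

R = X.T @ X / len(X)            # correlation matrix: E[X X^T]
m = X.mean(axis=0)
K = R - np.outer(m, m)          # covariance matrix: corrects for the means not being 0
s = np.sqrt(np.diag(K))
rho = K / np.outer(s, s)        # correlation coefficients: correct for the variances not being 1

print(np.round(rho, 2))         # diagonal is 1; off-diagonal entries lie in [-1, 1]
```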