Engineering Probability Class 25 Thu 20180419
Table of contents
1 Grades
 I'll try to upload a guaranteed minimum grade by the end of tomorrow. That will assume that all the grades that I don't yet have are zero.
 There will be eleven homeworks.
2 Handwritten notes and homework solutions
I added buttons to the page headers that go directly there.
3 Iclicker questions
 What is $$\int_{-\infty}^\infty e^{-x^2/2}\, dx$$?
 1/2
 1
 $2\pi$
 $\sqrt{2\pi}$
 $1/\sqrt{2\pi}$
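As a quick numerical check (not from the text), the Gaussian integral $\int_{-\infty}^\infty e^{-x^2/2}\,dx$ can be approximated with a simple midpoint Riemann sum; the truncation to $[-10, 10]$ is an assumption that works because the tails are negligible.

```python
import math

# Approximate the integral of exp(-x^2/2) over the real line with a
# midpoint Riemann sum on [-10, 10]; the tails beyond that are negligible.
def gaussian_integral(a=-10.0, b=10.0, n=200_000):
    h = (b - a) / n
    total = 0.0
    for i in range(n):
        x = a + (i + 0.5) * h
        total += math.exp(-x * x / 2.0)
    return total * h

print(gaussian_integral(), math.sqrt(2 * math.pi))  # both ≈ 2.5066
```

This matches the $\sqrt{2\pi}$ answer choice.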
 What is the largest possible value for a correlation coefficient?
 1/2
 1
 $2\pi$
 $\sqrt{2\pi}$
 $1/\sqrt{2\pi}$
 The most reasonable probability distribution for the number of defects on an integrated circuit caused by dust particles, cosmic rays, etc., is:
 Exponential
 Poisson
 Normal
 Uniform
 Binomial
 The most reasonable probability distribution for the time until the next request hits your web server is:
 Exponential
 Poisson
 Normal
 Uniform
 Binomial
 If you add two independent normal random variables, each with variance 10, what is the variance of the sum?
 1
 $\sqrt2$
 10
 $10\sqrt2$
 20
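A simulation sketch of the last question: for independent random variables, variances add, so two independent normals with variance 10 each should give a sum with variance close to 20. (The seed and sample size here are arbitrary choices, not from the notes.)

```python
import random

# Var(X + Y) = Var(X) + Var(Y) for independent X, Y.
# Each normal below has variance 10, so the sum should have variance ≈ 20.
random.seed(1)
n = 200_000
sigma = 10 ** 0.5                      # standard deviation for variance 10
sums = [random.gauss(0, sigma) + random.gauss(0, sigma) for _ in range(n)]
mean = sum(sums) / n
var = sum((s - mean) ** 2 for s in sums) / (n - 1)
print(round(var, 1))  # close to 20
```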
4 Material from text
4.1 6.1.2 Joint Distribution Functions, ctd.
 joint cumulative distribution function, p 305.
 marginal cdf’s
 joint probability mass function
 conditional pmf’s
 jointly continuous random variables
 joint probability density function.
 marginal pdf’s
 conditional pdf’s
 Example 6.7 Multiplicative Sequence, p 308.
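A small numeric sketch of the definitions above, using a made-up joint pmf (the table values are my own illustration, not from the text): the marginal pmf of $X$ sums the joint pmf over $y$, and the conditional pmf of $Y$ given $X=x$ divides the joint pmf by the marginal.

```python
# Made-up joint pmf for X in {0,1}, Y in {0,1,2}; probabilities sum to 1.
joint = {
    (0, 0): 0.10, (0, 1): 0.20, (0, 2): 0.10,
    (1, 0): 0.05, (1, 1): 0.25, (1, 2): 0.30,
}

# Marginal pmf of X: sum the joint pmf over all y.
p_x = {x: sum(p for (xx, y), p in joint.items() if xx == x) for x in (0, 1)}

# Conditional pmf of Y given X = 1: joint divided by the marginal.
p_y_given_1 = {y: joint[(1, y)] / p_x[1] for y in (0, 1, 2)}

print({x: round(p, 2) for x, p in p_x.items()})         # {0: 0.4, 1: 0.6}
print({y: round(p, 3) for y, p in p_y_given_1.items()})
```

Note the conditional pmf sums to 1, as any pmf must.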
4.2 6.1.3 Independence
 Example 6.8 Independence.
4.3 6.2 Functions of several random variables
4.3.1 6.2.1 One Function of Several Random Variables

Example 6.9 Maximum and Minimum of n Random Variables
Apply this to uniform r.v.'s.
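For $n$ i.i.d. Uniform(0,1) variables, the max/min result gives $F_{\max}(x) = x^n$ and $F_{\min}(x) = 1-(1-x)^n$ on $[0,1]$. A Monte Carlo sketch (the choice of $n=3$, $x=0.8$, and the seed are my own):

```python
import random

# Check F_max(x) = x^n and F_min(x) = 1 - (1-x)^n for n i.i.d.
# Uniform(0,1) variables, at x = 0.8 with n = 3.
random.seed(2)
n, x, trials = 3, 0.8, 100_000
max_hits = min_hits = 0
for _ in range(trials):
    sample = [random.random() for _ in range(n)]
    if max(sample) <= x:
        max_hits += 1
    if min(sample) <= x:
        min_hits += 1

print(max_hits / trials, x ** n)             # both near 0.8**3 = 0.512
print(min_hits / trials, 1 - (1 - x) ** n)   # both near 0.992
```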

Example 6.11 Reliability of Redundant Systems
Reminder for exponential r.v.:
 $f(x) = \lambda e^{-\lambda x}$
 $F(x) = 1 - e^{-\lambda x}$
 $\mu = 1/\lambda$
I may extend this example to find pdf and mean.
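A simulation sketch of the redundant-system idea: with two independent components, each with an Exponential($\lambda$) lifetime, the system fails only when both fail, so its lifetime is the maximum, with mean $1/\lambda + 1/\lambda - 1/(2\lambda) = 3/(2\lambda)$. (The rate $\lambda = 0.5$ below is an assumed value for illustration.)

```python
import random

# Parallel (redundant) system of two independent Exponential(lam)
# components: lifetime = max of the two.  Expected lifetime is
# 1/lam + 1/lam - 1/(2*lam) = 3/(2*lam).
random.seed(3)
lam = 0.5                      # assumed failure rate; each mean lifetime = 2
trials = 200_000
lifetimes = [max(random.expovariate(lam), random.expovariate(lam))
             for _ in range(trials)]
est_mean = sum(lifetimes) / trials
print(round(est_mean, 2), 3 / (2 * lam))   # estimate near 3.0
```

So redundancy raises the mean lifetime from 2 to 3 here, but does not double it, since both components are aging at once.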
4.3.3 6.2.3 pdf of General Transformations
We skip Section 6.2.3. However, a historical note about Student's t distribution:
Student was the pseudonym of William Sealy Gosset, a mathematician working for Guinness in Ireland. He developed several statistical techniques for sampling beer to assure its quality. Guinness didn't let him publish under his real name because these were trade secrets.
4.4 6.3 Expected values of vector random variables
 Section 6.3, page 316, extends the covariance to a matrix. Even with N variables, note that we're comparing only pairs of variables. If there were a complicated three-variable dependency, which can happen (and did in a much earlier example), all the pairwise covariances could still be 0.
 Note the sequence.
 First, the correlation matrix has the expectations of the products.
 Then the covariance matrix corrects for the means not being 0.
 Finally the correlation coefficients (not shown here) correct for the variances not being 1.
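The classic small example behind the pairwise-covariance remark (a standard illustration, not necessarily the one from earlier in the course): let $X$ and $Y$ be independent $\pm 1$ coin flips and $Z = XY$. Every pairwise covariance is 0, yet $Z$ is completely determined by $X$ and $Y$ together.

```python
from itertools import product

# X, Y independent +/-1 flips, Z = X*Y.  All three pairwise covariances
# are 0, but (X, Y, Z) has a three-way dependency: Z = X*Y exactly.
outcomes = [(x, y, x * y) for x, y in product((-1, 1), repeat=2)]  # each prob 1/4

def cov(i, j):
    e_i = sum(o[i] for o in outcomes) / 4
    e_j = sum(o[j] for o in outcomes) / 4
    e_ij = sum(o[i] * o[j] for o in outcomes) / 4
    return e_ij - e_i * e_j

print([cov(0, 1), cov(0, 2), cov(1, 2)])   # [0.0, 0.0, 0.0]
```

So a zero covariance matrix (off the diagonal) does not imply the variables are independent.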