Engineering Probability Class 17 Mon 2019-03-18
Table of contents
1 Test 2
You may bring two 2-sided note sheets, each 8.5x11". They may be handwritten or printed or both. You may collaborate with others on the sheets. You may form a business to sell note sheets. You may study the textbook before the test. You may bring a calculator. You may not communicate during the exam except with the TAs or me. Since it is impossible for me to police all possible communication technologies, I have to trust you on this.
2 Sect 5.5 Independence, page 254
- Example 5.19 on page 255.
- Independence: Example 5.22 on page 256. Are two jointly normal r.v. independent for different values of $\rho$?
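A quick numerical sketch of the independence question (the test point and $\rho$ values below are arbitrary, not from the book): for zero-mean, unit-variance jointly Gaussian r.v., the joint pdf factors into the product of the marginals exactly when $\rho=0$.

```python
import math

def f_xy(x, y, rho):
    # zero-mean, unit-variance bivariate Gaussian pdf
    c = 1.0 / (2 * math.pi * math.sqrt(1 - rho**2))
    return c * math.exp(-(x*x - 2*rho*x*y + y*y) / (2 * (1 - rho**2)))

def f_marginal(t):
    # standard normal pdf
    return math.exp(-t*t / 2) / math.sqrt(2 * math.pi)

x, y = 0.7, -0.3
print(f_xy(x, y, 0.0) - f_marginal(x) * f_marginal(y))  # ~0: factors, so independent
print(f_xy(x, y, 0.5) - f_marginal(x) * f_marginal(y))  # nonzero: dependent
```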
3 Sect 5.6 Joint moments, p 257
- Example 5.24, sum.
- Example 5.25, product.
- Example 5.26 page 259. Covariance of independent variables.
- Correlation coefficient.
- Example 5.27, page 260: uncorrelated but dependent.
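A minimal worked instance of "uncorrelated but dependent" (my own toy example, not the book's): let X be uniform on {-1, 0, 1} and Y = X². Then Cov(X,Y) = E[X³] = 0, yet Y is a function of X.

```python
from fractions import Fraction

xs = [-1, 0, 1]                      # X uniform on {-1, 0, 1}
p = Fraction(1, 3)
ex  = sum(p * x for x in xs)         # E[X] = 0
ey  = sum(p * x**2 for x in xs)      # Y = X^2, so E[Y] = 2/3
exy = sum(p * x * x**2 for x in xs)  # E[XY] = E[X^3] = 0
cov = exy - ex * ey
print(cov)  # 0: uncorrelated
# ...but dependent: P[Y=1 | X=1] = 1, while P[Y=1] = 2/3
```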
4 Sect 5.7 Conditional page 261
- Example 5.29: loaded dice.
- Example 5.30.
- Example 5.31 on page 264. This is a noisy comm channel, now with Gaussian (normal) noise. The problems are:
- what input signal to infer from each output, and
- how accurate is this?
- 5.6.2 Joint moments, etc.
- Work out for 2 3-sided dice.
- Work out for tossing dart onto triangular board.
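For the dice exercise, here is one version worked out in code, assuming two *independent, fair* 3-sided dice (the fairness assumption is mine; the in-class version may use loaded dice): the joint pmf is uniform, p(x,y) = 1/9, and independence makes the covariance come out zero.

```python
from fractions import Fraction

faces = [1, 2, 3]
p = Fraction(1, 9)                                  # uniform joint pmf
ex  = sum(p * x for x in faces for _y in faces)     # E[X] = 2
exy = sum(p * x * y for x in faces for y in faces)  # E[XY] = E[X]E[Y] = 4
cov = exy - ex * ex                                 # E[Y] = E[X] by symmetry
print(ex, exy, cov)  # 2 4 0
```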
- Example 5.27: correlation measures ''linear dependence''. If the dependence is more complicated, the variables may be dependent but not correlated.
- Covariance, correlation coefficient.
- Section 5.7, page 261. Conditional pdf. There is nothing majorly new here; it's a straightforward extension of the one-variable case.
- Discrete: Work out an example with a pair of 3-sided loaded dice.
- Continuous: a triangular dart board. There is one small trick: since X is continuous, P[X=x]=0, so how can we compute P[Y=y|X=x] = P[Y=y & X=x]/P[X=x]? The answer is that we take the limiting probability P[x<X<x+dx], etc., as dx shrinks, which nets out to using f(x), etc.
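Written out, the limiting argument from the bullet above nets out to the usual ratio of densities:

$$f_Y(y\mid x) = \lim_{dx \to 0} \frac{P[y < Y \le y+dy,\; x < X \le x+dx]}{dy\, P[x < X \le x+dx]} = \lim_{dx \to 0} \frac{f_{XY}(x,y)\, dy\, dx}{dy\, f_X(x)\, dx} = \frac{f_{XY}(x,y)}{f_X(x)}$$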
- Example 5.31 on page 264. This is a noisy comm channel, now with Gaussian (normal) noise. This is a more realistic version of the earlier example with uniform noise. The application problems are:
- what input signal to infer from each output,
- how accurate is this, and
- what cutoff minimizes this?
In the real world there are several ways you could reduce that error:
- Increase the transmitted signal,
- Reduce the noise,
- Retransmit several times and vote.
- Handshake: Include a checksum and ask for retransmission if it fails.
- Instead of just deciding X=+1 or X=-1 depending on Y, have a 3rd decision, i.e., uncertain if $|Y|<0.5$, and ask for retransmission in that case.
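The decision rules above can be sketched numerically with the Gaussian tail function $Q(z)$. The noise level $\sigma=0.5$ and the erasure threshold $0.5$ are assumed values for illustration, not the book's:

```python
import math

def Q(z):
    # Gaussian tail probability P[N > z] for N ~ N(0,1), via erfc
    return 0.5 * math.erfc(z / math.sqrt(2))

sigma = 0.5                    # assumed noise standard deviation
# Simple rule: decide X = sign(Y), where Y = X + N, X = +/-1, N ~ N(0, sigma^2).
p_err = Q(1 / sigma)           # error iff the noise flips the sign
# Three-way rule: declare "uncertain" when |Y| < 0.5 and ask for retransmission.
# Given X = +1: an erasure is -1.5 < N < -0.5; an outright error is N < -1.5.
p_erase = Q(0.5 / sigma) - Q(1.5 / sigma)
p_err3  = Q(1.5 / sigma)       # much smaller than p_err
print(p_err, p_erase, p_err3)
```

The third decision trades some retransmissions (the erasures) for a much lower probability of an undetected error.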
- Section 5.8, page 271: Functions of two random variables.
- We already saw how to compute the pdf of the sum and max of 2 r.v.
- What's the point of transforming variables in engineering? E.g., in video, (R,G,B) might be transformed to (Y,I,Q) with a 3x3 matrix multiply. Y is brightness (mostly the green component); I and Q approximately carry the red and blue information. Since we perceive brightness more accurately than color hue, we want to transmit Y with greater precision. So we want to be able to do probability computations on transformed variables.
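The (R,G,B) to (Y,I,Q) transform is just a matrix multiply; a sketch using the standard (rounded) NTSC coefficients:

```python
# Approximate NTSC RGB -> YIQ matrix (standard published values, rounded)
M = [[0.299,  0.587,  0.114],   # Y: brightness, weighted toward green
     [0.596, -0.274, -0.322],   # I: roughly an orange-blue axis
     [0.211, -0.523,  0.312]]   # Q: roughly a purple-green axis

def rgb_to_yiq(r, g, b):
    return [row[0]*r + row[1]*g + row[2]*b for row in M]

print(rgb_to_yiq(1.0, 1.0, 1.0))  # pure white: Y = 1, I ~ 0, Q ~ 0
```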
- Functions of two random variables:
- This is an important topic.
- Example 5.44, page 275. Transform two independent Gaussian r.v. from $(X,Y)$ to $(R, \Theta)$.
- Linear transformation of two Gaussian r.v.
- Sum and difference of 2 Gaussian r.v. are independent.
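A quick sampling check of that last fact (my own sketch, assuming i.i.d. standard normals): analytically, $\mathrm{Cov}(X+Y, X-Y) = \mathrm{Var}(X) - \mathrm{Var}(Y) = 0$, and since sum and difference are jointly Gaussian, zero covariance implies independence.

```python
import random

random.seed(0)
n = 100_000
s, d = [], []
for _ in range(n):
    x, y = random.gauss(0, 1), random.gauss(0, 1)  # i.i.d. N(0,1)
    s.append(x + y)
    d.append(x - y)

ms, md = sum(s) / n, sum(d) / n
cov = sum((a - ms) * (b - md) for a, b in zip(s, d)) / n
print(cov)  # near 0, as the algebra predicts
```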
- Section 5.9, page 278: pairs of jointly Gaussian r.v.
- I will simplify formula 5.61a by assuming that $\mu_X=\mu_Y=0$ and $\sigma_X=\sigma_Y=1$:
$$f_{XY}(x,y)= \frac{1}{2\pi \sqrt{1-\rho^2}} \exp\left( \frac{-\left( x^2-2\rho x y + y^2\right)}{2(1-\rho^2)} \right)$$
- The r.v. are in general dependent; $\rho$ says how much.
- The formula degenerates if $|\rho|=1$, since $1-\rho^2=0$ appears in both the normalizing coefficient and the exponent. The distribution is still perfectly valid; all the probability mass just collapses onto the line $y=\pm x$, so there is no longer a 2-D density.
- The lines of equal probability density are ellipses.
- The marginal pdf is a 1-variable Gaussian.
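That marginal fact can be checked numerically (my own sketch; $\rho=0.6$ and $x=0.8$ are arbitrary): integrating the joint pdf over $y$ should recover the standard normal pdf at $x$, for any $\rho$.

```python
import math

rho = 0.6                      # arbitrary correlation coefficient
def f_xy(x, y):
    # zero-mean, unit-variance bivariate Gaussian pdf
    c = 1 / (2 * math.pi * math.sqrt(1 - rho**2))
    return c * math.exp(-(x*x - 2*rho*x*y + y*y) / (2 * (1 - rho**2)))

x, dy = 0.8, 0.001
# crude Riemann sum over y in [-8, 8]; the tails beyond are negligible
marginal = sum(f_xy(x, -8 + k * dy) * dy for k in range(int(16 / dy)))
std_normal = math.exp(-x*x / 2) / math.sqrt(2 * math.pi)
print(marginal, std_normal)    # the two values agree
```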
- Example 5.47, page 282: Estimation of signal in noise.
- This is our perennial example of signal and noise. However, here the signal is not just $\pm1$ but is normal. Our job is to find the ''most likely'' input signal for a given output.
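A sketch of this setup under assumed parameters (the variances below are mine, not the book's): with $Y = X + N$, $X \sim N(0,\sigma_X^2)$ and independent noise $N \sim N(0,\sigma_N^2)$, the posterior of $X$ given $Y=y$ is Gaussian, so the most likely input is its mean, which is linear in $y$.

```python
sx2, sn2 = 4.0, 1.0            # assumed signal and noise variances
gain = sx2 / (sx2 + sn2)       # shrinkage factor sigma_X^2 / (sigma_X^2 + sigma_N^2)

def estimate(y):
    # posterior mean (= posterior mode) of X given Y = y
    return gain * y

print(estimate(2.5))  # 2.0: the noisy observation is shrunk toward 0
```

Note the estimate always shrinks the observation toward the prior mean 0, more aggressively when the noise variance is large relative to the signal variance.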
- Next time: We've seen 1 r.v., we've seen 2 r.v. Now we'll see several r.v.