Engineering Probability Class 20 Mon 2018-04-02

1   Final exam conflicts

  1. If you have 3 exams on that day, or another exam at the same time, please email me this week. Tell me the other courses.
  2. The RPI rule is that the lower-numbered course has precedence. If one of your other classes has a number higher than 2500, then that course gives the conflict exam. If all the other courses with exams that day are numbered lower than 2500, then I do.

2   Normal distribution table

For your convenience, here is a table of the standard normal pdf f(x), the cdf F(x), and the complementary cdf Q(x) = 1 - F(x). I computed it with Matlab:

x          f(x)      F(x)      Q(x)
-3.0000    0.0044    0.0013    0.9987
-2.9000    0.0060    0.0019    0.9981
-2.8000    0.0079    0.0026    0.9974
-2.7000    0.0104    0.0035    0.9965
-2.6000    0.0136    0.0047    0.9953
-2.5000    0.0175    0.0062    0.9938
-2.4000    0.0224    0.0082    0.9918
-2.3000    0.0283    0.0107    0.9893
-2.2000    0.0355    0.0139    0.9861
-2.1000    0.0440    0.0179    0.9821
-2.0000    0.0540    0.0228    0.9772
-1.9000    0.0656    0.0287    0.9713
-1.8000    0.0790    0.0359    0.9641
-1.7000    0.0940    0.0446    0.9554
-1.6000    0.1109    0.0548    0.9452
-1.5000    0.1295    0.0668    0.9332
-1.4000    0.1497    0.0808    0.9192
-1.3000    0.1714    0.0968    0.9032
-1.2000    0.1942    0.1151    0.8849
-1.1000    0.2179    0.1357    0.8643
-1.0000    0.2420    0.1587    0.8413
-0.9000    0.2661    0.1841    0.8159
-0.8000    0.2897    0.2119    0.7881
-0.7000    0.3123    0.2420    0.7580
-0.6000    0.3332    0.2743    0.7257
-0.5000    0.3521    0.3085    0.6915
-0.4000    0.3683    0.3446    0.6554
-0.3000    0.3814    0.3821    0.6179
-0.2000    0.3910    0.4207    0.5793
-0.1000    0.3970    0.4602    0.5398
      0    0.3989    0.5000    0.5000
 0.1000    0.3970    0.5398    0.4602
 0.2000    0.3910    0.5793    0.4207
 0.3000    0.3814    0.6179    0.3821
 0.4000    0.3683    0.6554    0.3446
 0.5000    0.3521    0.6915    0.3085
 0.6000    0.3332    0.7257    0.2743
 0.7000    0.3123    0.7580    0.2420
 0.8000    0.2897    0.7881    0.2119
 0.9000    0.2661    0.8159    0.1841
 1.0000    0.2420    0.8413    0.1587
 1.1000    0.2179    0.8643    0.1357
 1.2000    0.1942    0.8849    0.1151
 1.3000    0.1714    0.9032    0.0968
 1.4000    0.1497    0.9192    0.0808
 1.5000    0.1295    0.9332    0.0668
 1.6000    0.1109    0.9452    0.0548
 1.7000    0.0940    0.9554    0.0446
 1.8000    0.0790    0.9641    0.0359
 1.9000    0.0656    0.9713    0.0287
 2.0000    0.0540    0.9772    0.0228
 2.1000    0.0440    0.9821    0.0179
 2.2000    0.0355    0.9861    0.0139
 2.3000    0.0283    0.9893    0.0107
 2.4000    0.0224    0.9918    0.0082
 2.5000    0.0175    0.9938    0.0062
 2.6000    0.0136    0.9953    0.0047
 2.7000    0.0104    0.9965    0.0035
 2.8000    0.0079    0.9974    0.0026
 2.9000    0.0060    0.9981    0.0019
 3.0000    0.0044    0.9987    0.0013
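
Something along these lines regenerates the table in base Matlab (using erfc, so no toolbox is needed; a sketch, not necessarily the exact script the table came from):

  x = (-3:0.1:3)';                   % the x column
  f = exp(-x.^2/2) / sqrt(2*pi);     % standard normal pdf f(x)
  F = 0.5*erfc(-x/sqrt(2));          % cdf F(x)
  Q = 0.5*erfc( x/sqrt(2));          % complementary cdf Q(x) = 1 - F(x)
  disp([x f F Q])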

3   Enrichment (not in the text): the large effect of a small bias

Consider tossing $n=10^6$ coins.

  1. If all the coins are fair, P[more heads than tails] is essentially 0.5. (By symmetry it is exactly (1 - P[tie])/2, and P[tie] is only about 0.0008 here.)

  2. Now assume that each coin has chance of being heads $p=0.5005$.

    What's P[more heads than tails]?

  3. Now assume that 999,000 of the coins are fair, but 1,000 will always be heads.

    What's P[more heads than tails]? (A normal-approximation sketch covering this case and the previous one follows this list.)
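
One way to attack cases 2 and 3 is the central limit theorem: approximate the binomial count of heads by a normal r.v. and read off the Q function (compare the table above). A minimal Matlab sketch, with a continuity correction of 0.5:

  n = 1e6;

  % Case 2: every coin has p = 0.5005.
  p = 0.5005;
  mu = n*p;  sigma = sqrt(n*p*(1-p));    % mean 500500, std dev about 500
  z = (n/2 + 0.5 - mu) / sigma;          % threshold for more heads than tails
  P2 = 0.5*erfc(z/sqrt(2))               % Q(z), about Q(-1.0) = 0.8413

  % Case 3: 999,000 fair coins plus 1,000 that are always heads.
  m = 999000;
  mu = 1000 + m/2;  sigma = sqrt(m)/2;   % heads = 1000 + Binomial(m, 1/2)
  z = (n/2 + 0.5 - mu) / sigma;
  P3 = 0.5*erfc(z/sqrt(2))               % again about 0.84

So a per-coin bias of only 0.0005 (or 1,000 fixed coins out of a million) moves the answer from about 0.5 to about 0.84.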

4   Material from text

  1. Example 5.17 on page 253. P[X+Y<=1]

  2. Example 5.18 on page 253. Joint Gaussian.

  3. Example 5.19 on page 255. Independence.

  4. Example 5.20 on page 255. Independence of Q and R in the block transmission example.

  5. Independence: Example 5.22 on page 256. Are 2 normal r.v. independent for different values of $\rho$?

  6. 5.6.2 Joint moments etc.

    1. Work out for 2 3-sided dice; one possible version is sketched after this list.
    2. Work out for tossing a dart onto a triangular board.
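
    For instance, here is one possible version of the dice calculation (my choices: both dice fair, X = the first die, Y = the sum of the two; the in-class version may differ):

       v = 1:3;                            % faces of a fair 3-sided die
       [d1, d2] = meshgrid(v, v);          % all 9 equally likely outcomes
       p = ones(3)/9;                      % joint pmf of the two dice
       X = d1;  Y = d1 + d2;
       EX = sum(sum(X.*p));  EY = sum(sum(Y.*p));
       EXY = sum(sum(X.*Y.*p))             % joint moment E[XY] = 26/3
       covXY = EXY - EX*EY                 % covariance = 2/3
       rho = covXY / sqrt((sum(sum(X.^2.*p))-EX^2)*(sum(sum(Y.^2.*p))-EY^2))
                                           % correlation coefficient, about 0.707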
  7. Example 5.27: correlation measures ''linear dependence''. If the dependence is more complicated, the variables may be dependent but uncorrelated; a standard illustration is sketched below.
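
    A standard illustration (not necessarily the text's exact one): take X symmetric about 0 and Y = X^2. Y is completely determined by X, yet the two are uncorrelated:

       X = randn(1e5, 1);      % symmetric about zero
       Y = X.^2;               % totally dependent on X
       corrcoef(X, Y)          % off-diagonal entries near 0: uncorrelated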

  8. Covariance, correlation coefficient.

  9. Section 5.7, page 261. Conditional pdf. There is nothing majorly new here; it's an obvious extension of the 1-variable case.

    1. Discrete: Work out an example with a pair of 3-sided loaded dice.
    2. Continuous: a triangular dart board. There is one little trick: since X is continuous, P[X=x]=0, so how can we compute P[Y=y|X=x] = P[Y=y & X=x]/P[X=x]? The answer is that we take the limiting probability P[x<X<x+dx] etc. as dx shrinks, which nets out to using f(x) etc., as spelled out below.
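
    In symbols, the limiting argument nets out to

      $$P[y<Y\le y+dy \mid x<X\le x+dx] \approx \frac{f_{XY}(x,y)\,dx\,dy}{f_X(x)\,dx}, \qquad\text{so}\qquad f_{Y|X}(y|x)=\frac{f_{XY}(x,y)}{f_X(x)}$$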
  10. Example 5.31 on page 264. This is a noisy comm channel, now with Gaussian (normal) noise: a more realistic version of the earlier example with uniform noise. (A simulation sketch follows this item.) The application problems are:

    1. what input signal to infer from each output,
    2. how accurate is this, and
    3. what cutoff minimizes the error probability?

    In the real world there are several ways you could reduce that error:

    1. Increase the transmitted signal,
    2. Reduce the noise,
    3. Retransmit several times and vote.
    4. Handshake: Include a checksum and ask for retransmission if it fails.
    5. Instead of just deciding X=+1 or X=-1 depending on Y, add a 3rd decision, ''uncertain'', when $|Y|<0.5$, and ask for retransmission in that case.
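
    Here is a quick simulation of the basic setup (my assumed parameters: X = +1 or -1 equally likely, noise std dev 0.5, and the decision rule ''take the sign of Y''; the text's numbers may differ):

       N = 1e5;  sigma = 0.5;
       X = 2*(rand(N,1) < 0.5) - 1;     % transmit +1 or -1, equally likely
       Y = X + sigma*randn(N,1);        % add Gaussian noise
       Xhat = sign(Y);                  % infer the input from the output
       errRate = mean(Xhat ~= X)        % near Q(1/sigma) = Q(2) = 0.0228
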
  11. Section 5.8 page 271: Functions of two random variables.

    1. We already saw how to compute the pdf of the sum and the max of 2 r.v.; a quick Monte Carlo check is sketched below.
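
    A Monte Carlo reminder, using two independent Uniform(0,1) r.v. as a made-up example (for independent r.v., the cdf of the max is the product of the cdfs; the pdf of the sum is a convolution):

       N = 1e5;
       X = rand(N,1);  Y = rand(N,1);
       mean(max(X,Y) <= 0.7)     % P[max <= z] = F(z)^2 = 0.49 here
       mean(X + Y <= 0.7)        % P[X+Y <= z] = z^2/2 = 0.245 for z <= 1
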
  12. What's the point of transforming variables in engineering? E.g. in video, (R,G,B) might be transformed to (Y,I,Q) with a 3x3 matrix multiply. Y is brightness (mostly the green component). I and Q are approximately the red and blue. Since we see brightness more accurately than color hue, we want to transmit Y with greater precision. So, we want to do probabilities on all this.
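
    For instance (the matrix below holds the common NTSC coefficients, approximately; exact values vary by standard):

       A = [0.299  0.587  0.114;        % Y: brightness
            0.596 -0.274 -0.322;        % I
            0.211 -0.523  0.312];       % Q
       rgb = [1; 0.5; 0.25];            % a made-up pixel, components in [0,1]
       yiq = A * rgb                    % yiq(1) is the brightness Y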

  13. Functions of 2 random variables

    1. This is an important topic.
    2. Example 5.44, page 275. Transform two independent Gaussian r.v. from (X,Y) to (R, $\theta$).
    3. Linear transformation of two Gaussian r.v.
    4. The sum and difference of 2 i.i.d. Gaussian r.v. are independent; see the sketch below.
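
    A simulation of both facts (for i.i.d. standard Gaussians, R is Rayleigh, $\theta$ is uniform, and the sum and difference are uncorrelated, hence, being jointly Gaussian, independent):

       N = 1e5;
       X = randn(N,1);  Y = randn(N,1);   % i.i.d. standard Gaussians
       R = hypot(X, Y);                   % Rayleigh distributed
       TH = atan2(Y, X);                  % uniform on (-pi, pi]
       corrcoef(X+Y, X-Y)                 % off-diagonal near 0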
  14. Section 5.9, page 278: pairs of jointly Gaussian r.v.

    1. I will simplify formula 5.61a by assuming that $\mu_X=\mu_Y=0$ and $\sigma_X=\sigma_Y=1$.

      $$f_{XY}(x,y)= \frac{1}{2\pi \sqrt{1-\rho^2}} e^{ \frac{-\left( x^2-2\rho x y + y^2\right)}{2(1-\rho^2)} } $$

    2. The r.v. are in general dependent; $\rho$ says how much (they are independent iff $\rho=0$).

    3. The formula degenerates if $|\rho|=1$, since $1-\rho^2=0$ appears in both the prefactor and the exponent's denominator. The pair of r.v. is still perfectly valid, but in the limit all the probability mass concentrates on the line $y=\pm x$, so there is no ordinary joint pdf there.

    4. The lines of equal probability density are ellipses; see the plot sketch after this list.

    5. The marginal pdf is a 1-variable Gaussian.
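
    To see the ellipses, plot equal-density contours of the formula above ($\rho=0.8$ is just an example value):

       rho = 0.8;
       [x, y] = meshgrid(-3:0.05:3);
       f = exp(-(x.^2 - 2*rho*x.*y + y.^2)/(2*(1-rho^2))) / (2*pi*sqrt(1-rho^2));
       contour(x, y, f);  axis equal      % tilted ellipses; circles when rho = 0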

  15. Example 5.47, page 282: Estimation of signal in noise

    1. This is our perennial example of signal and noise. However, here the signal is not just $\pm1$ but is normal. Our job is to find the ''most likely'' input signal for a given output; a sketch follows.
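
    For jointly Gaussian signal and noise the answer is a simple shrinkage of the observation. A sketch, assuming $X \sim N(0,\sigma_X^2)$, independent noise $N \sim N(0,\sigma_N^2)$, and $Y=X+N$ (then the most likely $x$ given $y$, which is also $E[X|Y=y]$, is $y\,\sigma_X^2/(\sigma_X^2+\sigma_N^2)$):

       sx = 1;  sn = 0.5;                    % made-up signal and noise std devs
       xhat = @(y) (sx^2/(sx^2+sn^2)) * y;   % shrink the observation toward 0
       xhat(2)                               % observe y = 2, infer x about 1.6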
  16. Next time: We've seen 1 r.v., we've seen 2 r.v. Now we'll see several r.v.