Engineering Probability Class 23 Thu 2018-04-12

1   Material from text

  1. Example 5.35 Maximum A Posteriori Receiver on page 268.

  2. Example 5.37, page 270.

  3. Remember equations 5.49a,b on pages 269-70: total probability applied to the conditional expectation of Y given X, i.e., the law of total expectation, $E[Y] = \int_{-\infty}^{\infty} E[Y \mid X=x]\, f_X(x)\, dx$ in the continuous case.

  4. Section 5.8 page 271: Functions of two random variables.

    1. This is an important topic.
    2. Linear transformation of two Gaussian r.v.
    3. The sum and difference of two independent Gaussian r.v. with equal variances (e.g., i.i.d.) are independent. (A Monte Carlo check appears in the sketches after this list.)
  5. What's the point of transforming variables in engineering? E.g., in video, (R,G,B) might be transformed to (Y,I,Q) with a 3x3 matrix multiply. Y is brightness (mostly the green component); I and Q approximately carry the red and blue information. Since we perceive brightness more accurately than color hue, we want to transmit or compress Y with greater precision. So we want to be able to do probability computations on transformed variables. (See the matrix sketch after this list.)

  6. Example 5.39 Sum of Two Random Variables, page 271.

  7. Example 5.40 Sum of Nonindependent Gaussian Random Variables, page 272.

    I'll do an easier case: the sum of two independent N(0,1) r.v. That sum is Gaussian with mean 0 and variance 2, i.e., standard deviation $\sqrt{2}$. (The first sketch after this list checks this numerically.)

  8. Example 5.44, page 275. Transform two independent Gaussian r.v. from (X,Y) to (R, $\theta$). (A simulation sketch follows this list.)

  9. Section 5.9, page 278: pairs of jointly Gaussian r.v.

    1. I will simplify formula 5.61a by assuming that both means are 0 and both variances are 1, i.e., $\mu_X=\mu_Y=0$ and $\sigma_X=\sigma_Y=1$:

      $$f_{XY}(x,y)= \frac{1}{2\pi \sqrt{1-\rho^2}} e^{ \frac{-\left( x^2-2\rho x y + y^2\right)}{2(1-\rho^2)} }$$

    2. The r.v. are dependent unless $\rho=0$; $\rho$ says how much.

    3. The formula degenerates if $|\rho|=1$: the factor $1-\rho^2$ vanishes, so on the line $y=\rho x$ the exponent's numerator and denominator are both zero. The joint distribution is still valid, but it is degenerate: all the probability mass lies on that line, so there is no ordinary two-variable pdf. You could analyze the limiting behavior with l'Hôpital's rule.

    4. The lines of equal probability density are ellipses.

    5. Each marginal pdf is a one-variable Gaussian; with the simplification above, both marginals are N(0,1). (The last sketch after this list checks this.)
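
Here are a few Python sketches of mine for the items above; they are illustrations, not the text's examples. First, a Monte Carlo check of items 4 and 7: for two independent N(0,1) r.v., the sum has variance 2, and the sum and difference are uncorrelated, hence independent because they are jointly Gaussian.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.standard_normal(n)      # X ~ N(0,1)
y = rng.standard_normal(n)      # Y ~ N(0,1), independent of X
s, d = x + y, x - y

print(np.std(s))                # ~ 1.414 = sqrt(2): X+Y has variance 2
print(np.corrcoef(s, d)[0, 1])  # ~ 0: S and D are uncorrelated; since (S,D)
                                # is jointly Gaussian, this means independent
```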
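
Second, the (R,G,B) to (Y,I,Q) transform from item 5. The rounded coefficients below are my assumption of the standard published NTSC values, not from the text; the point is that the green column dominates Y.

```python
import numpy as np

# Approximate NTSC RGB -> YIQ matrix (rounded coefficients; exact values
# here are an assumption, not from the text).
RGB_TO_YIQ = np.array([
    [0.299,  0.587,  0.114],   # Y: brightness, dominated by green
    [0.596, -0.274, -0.322],   # I: roughly an orange-blue chrominance axis
    [0.211, -0.523,  0.312],   # Q: roughly a purple-green chrominance axis
])

rgb = np.array([0.5, 0.8, 0.2])  # one pixel, components in [0, 1]
yiq = RGB_TO_YIQ @ rgb
print(yiq)   # Y can now be transmitted or compressed more precisely than I, Q
```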
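
Third, Example 5.44 (item 8): transforming two independent N(0,1) r.v. from Cartesian (X,Y) to polar (R, $\theta$). The example's result is that R is Rayleigh, $\theta$ is uniform on $(-\pi,\pi]$, and they are independent; this sketch spot-checks that.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
x = rng.standard_normal(n)
y = rng.standard_normal(n)

r = np.hypot(x, y)            # R = sqrt(X^2 + Y^2)
theta = np.arctan2(y, x)      # Theta in (-pi, pi]

print(np.mean(r))                   # ~ 1.2533 = sqrt(pi/2), the Rayleigh mean
print(np.mean(theta < 0))           # ~ 0.5, consistent with uniform Theta
print(np.corrcoef(r, theta)[0, 1])  # ~ 0, consistent with independence
```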
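
Last, the jointly Gaussian pair of item 9, built by the standard construction $X=U$, $Y=\rho U + \sqrt{1-\rho^2}\,V$ with $U,V$ i.i.d. N(0,1); this pair has exactly the simplified pdf above. The sketch checks the correlation (item 9.2) and that the marginal of Y is N(0,1) (item 9.5).

```python
import numpy as np

rng = np.random.default_rng(2)
rho = 0.8                     # any value in (-1, 1)
n = 1_000_000

u = rng.standard_normal(n)
v = rng.standard_normal(n)
x = u
y = rho * u + np.sqrt(1 - rho**2) * v   # (X,Y) has the joint pdf above

print(np.corrcoef(x, y)[0, 1])    # ~ rho
print(np.mean(y), np.std(y))      # ~ 0 and ~ 1: the Y marginal is N(0,1)
```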

Engineering Probability Class 22 Mon 2018-04-09

3   Exam 2 and estimated final grade

Exam 2 will be returned in class on Thursday. Shortly after, I will compute an estimated letter grade: the grade you would get if you didn't write exam 3 (the final exam). Later it will be refined into a guaranteed letter grade, still assuming you do no more homework; that will require working in the latest iClicker scores, etc.

4   Final exam

Six students need a conflict exam (assuming everyone writes the final). Next week, I'll set up an online poll for those students to pick a good time.

5   Material from text

  1. Example 5.31 on page 264 in detail. This is the noisy comm channel, now with Gaussian (normal) noise.
  2. Remember equation 5.45 for the conditional pdf on page 265: $f_Y(y \mid x) = f_{X,Y}(x,y)/f_X(x)$.
  3. Remember equation 5.48 for total probability on page 266: $P[Y \in A] = \int_{-\infty}^{\infty} P[Y \in A \mid X=x]\, f_X(x)\, dx$.
  4. Example 5.33 on page 267.

Engineering Probability Homework 9 due Mon 2018-04-16 2359 EST

How to submit

Submit to LMS; see details in syllabus.

Questions

All questions are from the text, starting on page 290.

Each part of a question is worth 5 points.

  1. (20 pts) Problem 5.17. (c has 2 parts.)
  2. (15 pts) Problem 5.18.
  3. (15 pts) Problem 5.25.
  4. (30 pts) Problem 5.26. (d has 3 parts.)
  5. (15 pts) Problem 5.35.
  6. (5 pts) Problem 5.37.
  7. (15 pts) Problem 5.56.

Total: 115 points.

Engineering Probability Class 21 Thu 2018-04-05

1   Parallel computer access

As I mentioned Monday, parallel.ecse.rpi.edu is available. It has a dual 14-core (56 hyperthread) 2GHz Intel Xeon CPU, Intel Xeon Phi coprocessor with 60 cores running 240 threads, Nvidia GeForce GTX 1080 GPU with 2560 CUDA cores, and 256GB main memory.

Parallel SW includes CUDA, Thrust, OpenMP, TBB.

Using it for private business is against RPI policy.

It is slower than an IBM Blue Gene, but it cost only $10K total.

I manage it; ask me for an account.

Many datasets that are considered to be big data will fit into main memory; MPI etc. are not required.

2   Final exam

The final exam will be 80 minutes, like the first two exams.

3   Material from text

  1. Example 4.33, page 177.

  2. Example 4.36, page 180, Amplitude Samples of a Sinusoidal Waveform.

  3. Equation 5.32 on page 259.

  4. Example 5.27 on page 260. Uncorrelated but Dependent Random Variables.

  5. Example 5.29 on page 263. Loaded Dice.

  6. Example 5.30 on page 263. Number of Defects in a Region; Random Splitting of Poisson Counts.

  7. Example 5.31 on page 264. This is a noisy comm channel, now with Gaussian (normal) noise. This is a more realistic version of the earlier example with uniform noise. The application problems are:

    1. which input signal to infer from each received output,
    2. how accurate that inference is (the probability of error), and
    3. what decision threshold (cutoff) minimizes that error probability?

    In the real world there are several ways you could reduce that error:

    1. Increase the transmitted signal,
    2. Reduce the noise,
    3. Retransmit several times and vote.
    4. Handshake: Include a checksum and ask for retransmission if it fails.
    5. Instead of just deciding X=+1 or X=-1 depending on Y, have a 3rd decision, i.e., uncertain if $|Y|<0.5$, and ask for retransmission in that case.

    This is relevant to solving Problem 5.3, page 288. A simulation sketch of this channel follows.
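
Here is a minimal simulation sketch of this channel (my construction, not the text's): equiprobable inputs $X=\pm 1$, Gaussian noise, and a threshold at 0, which is the MAP receiver for equiprobable inputs. The noise level $\sigma=0.5$ and the uncertainty cutoff 0.5 are assumed values for illustration.

```python
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(3)
sigma = 0.5                    # assumed noise standard deviation
n = 1_000_000

x = rng.choice([-1.0, 1.0], n)            # equiprobable inputs
y = x + sigma * rng.standard_normal(n)    # Y = X + N, N ~ N(0, sigma^2)

# MAP receiver for equiprobable inputs: decide by the sign of Y.
xhat = np.where(y > 0, 1.0, -1.0)
print(np.mean(xhat != x))                  # simulated error probability
print(0.5 * erfc((1 / sigma) / sqrt(2)))   # theory: Q(1/sigma)

# Variant 5 from the list above: declare "uncertain" when |Y| < 0.5
# and ask for retransmission; the error rate on decided symbols drops.
decided = np.abs(y) >= 0.5
print(np.mean(decided))                      # fraction actually decided
print(np.mean(xhat[decided] != x[decided]))  # much smaller error rate
```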