Engineering Probability Class 8 Mon 2018-02-12

1   Office hours and TAs

  1. There are 2 grad TAs:

    1. Yi Fan fany4ATrpi.edu
    2. Amelia Peterson petera7ATrpi.edu

    and one undergrad:

    1. Jieyu Chen chenj35ATrpi.edu

    replace AT with (you figure it out).

  2. Jieyu has taken this course, although from a different prof. She will prepare homework solutions.

  3. Yi and Amelia will grade.

  4. All three will hold office hours:

    1. Jieyu Chen: Fri 2:30 in the ECSE Flip Flop lounge, JEC6037.
    2. Amelia Peterson: Wed 12-2 in the Flip Flop lounge.
    3. Yi Fan: Tue 3-5, on the third floor of the library, in the open area near the window.

    Any changes will be pre-announced.

    Go toward the start of the office hour; otherwise they may leave.

    OTOH, if you need more time, or another time, write them. The goal is that every student who wants personal time will get it. (Want to talk with me after class over coffee? That's fine.)

  5. Between the lectures, where I stay after class to answer questions, and their office hours, you have someone available 5 days a week.

2   Wikipedia

Wikipedia's articles on technical subjects can be excellent. In fact, they often have more detail than you want. Here are some that are relevant to this course. Read at least the first few paragraphs.

  1. https://en.wikipedia.org/wiki/Outcome_(probability)
  2. https://en.wikipedia.org/wiki/Random_variable
  3. https://en.wikipedia.org/wiki/Indicator_function
  4. https://en.wikipedia.org/wiki/Gambler%27s_fallacy
  5. https://en.wikipedia.org/wiki/Fat-tailed_distribution

Nevertheless, Wikipedia can have errors, or at least definitions that differ from our text. E.g., https://en.wikipedia.org/wiki/Outcome_(probability) says that a discrete probability distribution has a finite sample space. Wrong; it could be countably infinite.

3   Independence of 3 random variables

There was a question last week: if 3 events satisfy the triple product rule P(A and B and C) = P(A) P(B) P(C), does that mean that pairs of them are independent? The answer is no. See this Venn diagram:

/images/triple-indep.png
  1. Let R,G,B mean red, green, blue.
  2. P(R) = P(B) = P(G) = 1/2
  3. P(R and G and B) = 1/8 = P(R) P(B) P(G), so the triple product rule holds.
  4. However, P(R and G) != 1/4 = P(R) P(G), so the pair is not independent. (Note that P(R and G) >= P(R and G and B) = 1/8 > 0, since the triple intersection is contained in the pair intersection.)
  5. You have to test every subset for independence.
  6. Wikipedia also talks about this.
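To make the counterexample concrete, here is one consistent probability assignment for the Venn-diagram regions (these particular weights are my illustration, not necessarily the exact areas in the lecture's diagram) where the triple product rule holds but no pair is independent:

```python
from fractions import Fraction as F

# One consistent assignment of probabilities to the Venn-diagram regions:
#   in all three events: 1/8; in exactly two: 3/16 each;
#   in exactly one: 0; in none: 5/16.
prob = {
    frozenset("RGB"): F(1, 8),
    frozenset("RG"):  F(3, 16),
    frozenset("RB"):  F(3, 16),
    frozenset("GB"):  F(3, 16),
    frozenset():      F(5, 16),
}

def P(*events):
    """Probability that the outcome lies in every listed event."""
    return sum(p for region, p in prob.items() if all(e in region for e in events))

assert sum(prob.values()) == 1
assert P("R") == P("G") == P("B") == F(1, 2)
# The triple product rule holds ...
assert P("R", "G", "B") == P("R") * P("G") * P("B") == F(1, 8)
# ... but no pair is independent:
assert P("R", "G") == F(5, 16) != P("R") * P("G")
```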

4   Two types of testing errors

  1. There's an event A, with probability P[A]=p.
  2. There's a dependent event, perhaps a test or a transmission, B.
  3. You know P[B|A] and P[B|A'].
  4. Wikipedia:
    1. https://en.wikipedia.org/wiki/Type_I_and_type_II_errors
    2. https://en.wikipedia.org/wiki/Sensitivity_and_specificity
  5. Terminology:
    1. Type I error: false positive (rejecting a true null hypothesis).
    2. Type II error: false negative (missing a real effect).
    3. Sensitivity: true positive proportion.
    4. Specificity: true negative proportion.
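The two conditional probabilities combine with the prior via total probability and Bayes' rule. A minimal sketch with hypothetical numbers (the prior, sensitivity, and specificity below are made up for illustration):

```python
# Hypothetical numbers: prior P[A] = 0.01, sensitivity P[B|A] = 0.95,
# specificity P[B'|A'] = 0.90.
p_A = 0.01
sens = 0.95                  # P[B|A]: true-positive rate
spec = 0.90                  # P[B'|A']: true-negative rate
p_B_given_notA = 1 - spec    # Type I error rate (false positive)
p_notB_given_A = 1 - sens    # Type II error rate (false negative)

# Theorem on total probability, then Bayes' rule:
p_B = sens * p_A + p_B_given_notA * (1 - p_A)
p_A_given_B = sens * p_A / p_B
print(round(p_A_given_B, 4))   # → 0.0876: a positive test still leaves A unlikely
```

The surprise here is how much the small prior dominates: even a fairly good test gives mostly false positives when the condition is rare.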

5   Chapter 3 ctd

  1. This chapter covers discrete (finite or countably infinite) r.v.s. This contrasts with continuous r.v.s, to be covered later.

  2. Discrete r.v.s we've seen so far:

    1. uniform: M events 0...M-1 with equal probs
    2. bernoulli: events: 0 w.p. q=(1-p) or 1 w.p. p
    3. binomial: # heads in n bernoulli events
    4. geometric: # trials until the first success, where each trial succeeds with probability p.
  3. 3.1.1 Expected value of a function of a r.v.

    1. Z=g(X)
    2. \(E[Z] = E[g(X)] = \sum_k g(x_k) p_X(x_k)\)
  4. Example 3.17 square law device

  5. \(E[a g(X)+b h(X)+c] = a E[g(X)] + b E[h(X)] + c\)

  6. Example 3.18 Square law device continued

  7. Example 3.19 Multiplexor discards packets

  8. Compute mean of a binomial distribution (started last Fri).

  9. Compute mean of a geometric distribution (started last Fri).
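Both means can be checked numerically against the closed forms E[X] = np (binomial) and E[X] = 1/p (geometric); the parameters below are arbitrary examples:

```python
from math import comb

n, p = 10, 0.25   # arbitrary example parameters

# Binomial: E[X] = sum of k * C(n,k) p^k (1-p)^(n-k) over k = 0..n, which is n*p.
binom_mean = sum(k * comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1))
assert abs(binom_mean - n * p) < 1e-12

# Geometric (trials until first success): E[X] = sum of k * p (1-p)^(k-1) = 1/p.
# The infinite sum is truncated; the tail is negligible here.
geom_mean = sum(k * p * (1 - p)**(k - 1) for k in range(1, 2000))
assert abs(geom_mean - 1 / p) < 1e-9
```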

  10. 3.3.1, page 107: Operations on means: sums, scaling, functions

  11. iclicker: From a deck of cards, I draw a card, look at it, put it back and reshuffle. Then I do it again. What's the probability that exactly one of the 2 cards is a heart?

    • A: 2/13
    • B: 3/16
    • C: 1/4
    • D: 3/8
    • E: 1/2
  12. iclicker: From a deck of cards, I draw a card, look at it, put it back and reshuffle. I keep repeating this. What's the probability that the 2nd draw is the first time I see a heart?

    • A: 2/13
    • B: 3/16
    • C: 1/4
    • D: 3/8
    • E: 1/2
  13. 3.3.2 page 109 Variance of an r.v.

    1. That means, how wide is its distribution?
    2. Example: compare the performance of stocks vs bonds from year to year. The expected values (means) of the returns may not be so different. (This is debated and depends, e.g., on what period you look at). However, stocks' returns have a much larger variance than bonds.
    3. \(\sigma^2_X = VAR[X] = E[(X-m_X)^2] = \sum (x-m_x)^2 p_X(x)\)
    4. standard deviation \(\sigma_X = \sqrt{VAR[X]}\)
    5. \(VAR[X] = E[X^2] - m_X^2\)
    6. 2nd moment: \(E[X^2]\)
    7. also 3rd, 4th... moments, like a Taylor series for probability
    8. shifting the distribution: \(VAR[X+c] = VAR[X]\)
    9. scaling: \(VAR[cX] = c^2 VAR[X]\)
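These variance rules are easy to sanity-check numerically on a small made-up pmf:

```python
# A small made-up pmf to check the rules against direct computation.
xs = [0, 1, 3]
ps = [0.2, 0.5, 0.3]

def mean(vals):
    return sum(v * p for v, p in zip(vals, ps))

def var(vals):
    m = mean(vals)
    return sum((v - m)**2 * p for v, p in zip(vals, ps))

m = mean(xs)
assert abs(var(xs) - (mean([x * x for x in xs]) - m * m)) < 1e-12  # VAR = E[X^2] - m^2
assert abs(var([x + 5 for x in xs]) - var(xs)) < 1e-12             # VAR[X+c] = VAR[X]
assert abs(var([2 * x for x in xs]) - 4 * var(xs)) < 1e-12         # VAR[cX] = c^2 VAR[X]
```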
  14. Derive variance for Bernoulli.

  15. Example 3.20 3 coin tosses

    1. general rule for binomial: VAR[X]=npq
    2. Derive it.
    3. Note that it sums since the events are independent.
    4. Note that the spread relative to the mean, \(\sigma_X/m_X = \sqrt{q/(np)}\), shrinks as n grows.
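A quick numerical check of VAR[X] = npq (the parameters are arbitrary):

```python
from math import comb, sqrt

n, p = 20, 0.3   # arbitrary example parameters
q = 1 - p
pmf = [comb(n, k) * p**k * q**(n - k) for k in range(n + 1)]

m = sum(k * pk for k, pk in enumerate(pmf))
v = sum((k - m)**2 * pk for k, pk in enumerate(pmf))
assert abs(m - n * p) < 1e-9       # mean = np
assert abs(v - n * p * q) < 1e-9   # variance = npq
# Relative spread sigma/mean = sqrt(q/(n*p)) shrinks as n grows.
assert abs(sqrt(v) / m - sqrt(q / (n * p))) < 1e-9
```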
  16. iclicker: The experiment is drawing a card from a deck, seeing if it's hearts, putting it back, shuffling, and repeating for a total of 100 times. The random variable is the total # of hearts seen, from 0 to 100. What's the mean of this r.v.?

    • A: 1/4
    • B: 25
    • C: 1/2
    • D: 50
    • E: 1
  17. iclicker: The experiment is drawing a card from a deck, seeing if it's hearts, putting it back, shuffling, and repeating for a total of 100 times. The random variable is the # of hearts seen, from 0 to 100. What's the variance of this r.v.?

    • A: 3/16
    • B: 1
    • C: 25/4
    • D: 75/4
    • E: 100
  18. 3.4 page 111 Conditional pmf

  19. Example 3.24 Residual waiting time

    1. X, time to xmit message, is uniform in 1...L.
    2. If X is over m, what's probability that remaining time is j?
    3. \(p_X(m+j|X>m) = \frac{P[X =m+j]}{P[X>m]} = \frac{1/L}{(L-m)/L} = 1/(L-m)\)
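A quick check of the residual-waiting-time result with concrete (hypothetical) numbers: conditioning the uniform pmf on X > m leaves a uniform pmf on the remaining values, matching 1/(L-m):

```python
from fractions import Fraction as F

L, m = 10, 4   # hypothetical values: X uniform on 1..L, conditioned on X > m
pmf = {x: F(1, L) for x in range(1, L + 1)}

p_tail = sum(p for x, p in pmf.items() if x > m)          # P[X > m] = (L-m)/L
cond = {x: p / p_tail for x, p in pmf.items() if x > m}   # conditional pmf

assert p_tail == F(L - m, L)
# Remaining time is again uniform: p_X(m+j | X > m) = 1/(L-m) for j = 1..L-m
assert all(p == F(1, L - m) for p in cond.values())
```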
  20. \(p_X(x) = \sum p_X(x|B_i) P[B_i]\)

  21. Example 3.25 p 113 device lifetimes

    1. 2 classes of devices, geometric lifetimes.
    2. Type 1 occurs with probability \(\alpha\) and has parameter r; type 2 occurs with probability \(1-\alpha\) and has parameter s.
    3. What's pmf of the total set of devices?
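By the theorem on total probability, the pmf of a randomly chosen device is a mixture of the two geometric pmfs. A sketch with hypothetical parameter values, assuming the geometric pmf p(k) = (1-r)^(k-1) r for k = 1, 2, ...:

```python
# Hypothetical parameters: fraction alpha of devices are type 1.
alpha, r, s = 0.6, 0.2, 0.05

def geom(k, p):
    """Geometric pmf: P[lifetime = k] = (1-p)^(k-1) * p, for k = 1, 2, ..."""
    return (1 - p)**(k - 1) * p

def mixture(k):
    """pmf of a device drawn at random: total probability over the two types."""
    return alpha * geom(k, r) + (1 - alpha) * geom(k, s)

# Sanity check: the mixture pmf still sums to 1 (up to truncation of the tail).
total = sum(mixture(k) for k in range(1, 5000))
assert abs(total - 1) < 1e-9
```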
  22. Example 3.26.

  23. 3.5 More important discrete r.v

  24. Table 3.1: We haven't seen \(G_X(z)\) yet.

  25. 3.5.4 Poisson r.v.

    1. The experiment is observing how many of a large number of rare events happen in, say, 1 minute.

    2. E.g., how many cosmic particles hit your DRAM, or how many people call a call center.

    3. The individual events are independent.

    4. The r.v. is the number that happen in that period.

    5. There is one parameter, \(\alpha\). Often this is called \(\lambda\).

      \begin{equation*} p(k) = \frac{\alpha^k}{k!}e^{-\alpha} \end{equation*}
    6. Mean and variance are both \(\alpha\); the standard deviation is \(\sqrt{\alpha}\).

    7. In the real world, events might be dependent.
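The Poisson pmf above can be checked numerically: it sums to 1, and its mean and variance both equal \(\alpha\) (the rate value below is an arbitrary example):

```python
from math import exp, factorial

alpha = 3.0   # arbitrary example rate

def pois(k):
    """Poisson pmf: p(k) = alpha^k / k! * e^(-alpha)."""
    return alpha**k / factorial(k) * exp(-alpha)

ks = range(60)   # the tail beyond k = 60 is negligible for alpha = 3
assert abs(sum(pois(k) for k in ks) - 1) < 1e-12
mean = sum(k * pois(k) for k in ks)
var = sum((k - mean)**2 * pois(k) for k in ks)
assert abs(mean - alpha) < 1e-9
assert abs(var - alpha) < 1e-9   # variance = alpha, so std dev = sqrt(alpha)
```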