Engineering Probability Class 3 Thu 2019-01-17

3   Probability in the real world

  1. How Did Economists Get It So Wrong? is an article by Paul Krugman (2008 Nobel Memorial Prize in Economic Sciences). It says, "the economics profession went astray because economists, as a group, mistook beauty, clad in impressive-looking mathematics, for truth." You might see a certain relevance to this course. You have to get the model right before trying to solve it.

    Though I don't know much about it, I'll cheerfully try to answer any questions about econometrics.

    Another relevance to this course, in an enrichment sense, is that some people believe that the law of large numbers does not apply to certain variables, like stock prices. They think that sample frequencies from larger and larger samples do not converge to a probability, because the variance of the underlying distribution is infinite. See Do financial returns have finite or infinite variance? A paradox and an explanation. This too is beyond this course.

  2. More articles on MIT students gaming the Mass lottery:

    1. How a Group of MIT Students Gamed the Massachusetts State Lottery
    2. A Calculated Approach to Winning the Lottery

4   Chapter 2 ctd

  1. Prove de Morgan's law (page 28)

  2. Corollary 5 (page 33): \(P[A\cup B] = P[A] + P[B] - P[A\cap B]\)

    1. Example: Queens and hearts. P[Q]=4/52, P[H]=13/52, P[Q \(\cup\) H]=16/52, P[Q \(\cap\) H]=1/52.
    2. \(P[A\cup B] \le P[A] + P[B]\)
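    A minimal Python sketch (not from the text) checking Corollary 5 on the queens-and-hearts example, by enumerating the deck directly:

    ```python
    from fractions import Fraction

    # Model the 52-card deck as (rank, suit) pairs.
    ranks = ['A', '2', '3', '4', '5', '6', '7', '8', '9', '10', 'J', 'Q', 'K']
    suits = ['hearts', 'diamonds', 'clubs', 'spades']
    deck = [(r, s) for r in ranks for s in suits]

    Q = {c for c in deck if c[0] == 'Q'}       # queens
    H = {c for c in deck if c[1] == 'hearts'}  # hearts

    def P(event):
        # Equally likely outcomes: P[event] = |event| / |S|.
        return Fraction(len(event), len(deck))

    # Corollary 5: P[Q ∪ H] = P[Q] + P[H] - P[Q ∩ H]
    assert P(Q | H) == P(Q) + P(H) - P(Q & H)
    assert P(Q | H) == Fraction(16, 52)
    ```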
  3. Corollary 6 (inclusion-exclusion):

    \(P\left[\bigcup_{i=1}^n A_i\right] = \sum_{i=1}^n P[A_i] - \sum_{i<j} P[A_i\cap A_j] + \sum_{i<j<k} P[A_i\cap A_j\cap A_k] - \cdots + (-1)^{n+1} P\left[\bigcap_{i=1}^n A_i\right]\)

    1. Example Q=queen card, H=heart, F= face card.
      1. P[Q]=4/52, P[H]=13/52, P[F]=12/52,
      2. P[Q \(\cap\) H]=1/52, P[Q \(\cap\) F] = ''you tell me''
      3. P[H \(\cap\) F]= ''you tell me''
      4. P[Q \(\cap\) H \(\cap\) F] = ''you tell me''
      5. So P[Q \(\cup\) H \(\cup\) F] = ?
    2. Example from Roulette:
      1. R=red, B=black, E=even, A=1-12
      2. P[R] = P[B] = P[E] = 18/38. P[A] = 12/38
      3. \(P[R\cup E \cup A]\) = ?
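    A Python sketch (not from the text) of the three-event inclusion-exclusion on the queen/heart/face-card example; it fills in the ''you tell me'' intersections by brute-force enumeration:

    ```python
    from fractions import Fraction

    ranks = ['A', '2', '3', '4', '5', '6', '7', '8', '9', '10', 'J', 'Q', 'K']
    suits = ['hearts', 'diamonds', 'clubs', 'spades']
    deck = [(r, s) for r in ranks for s in suits]

    Q = {c for c in deck if c[0] == 'Q'}               # queens
    H = {c for c in deck if c[1] == 'hearts'}          # hearts
    F = {c for c in deck if c[0] in ('J', 'Q', 'K')}   # face cards

    def P(e):
        return Fraction(len(e), len(deck))

    # P[Q ∩ F] = 4/52 (every queen is a face card), P[H ∩ F] = 3/52,
    # P[Q ∩ H ∩ F] = 1/52 (the queen of hearts).
    incl_excl = (P(Q) + P(H) + P(F)
                 - P(Q & H) - P(Q & F) - P(H & F)
                 + P(Q & H & F))
    assert incl_excl == P(Q | H | F) == Fraction(22, 52)
    ```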
  4. Corollary 7: if \(A\subset B\) then \(P[A] \le P[B]\)

    Example: Probability of a repeated coin toss having its first head in the 2nd-4th toss (1/4+1/8+1/16 = 7/16) \(\ge\) Probability of it happening on the 3rd toss (1/8).
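    A quick Python sketch (not from the text) of this monotonicity: for a fair coin, P[first head on toss k] = \((1/2)^k\), and "first head on toss 3" is a subset of "first head on toss 2, 3, or 4":

    ```python
    from fractions import Fraction

    # P[first head exactly on toss k] for a fair coin.
    def p(k):
        return Fraction(1, 2) ** k

    subset = p(3)                   # first head on toss 3
    superset = p(2) + p(3) + p(4)   # first head on toss 2, 3, or 4

    assert subset == Fraction(1, 8)
    assert superset == Fraction(7, 16)
    assert subset <= superset       # Corollary 7: A ⊂ B implies P[A] <= P[B]
    ```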

  5. 2.2.1 Discrete sample space

    1. If sample space is finite, probabilities of all the outcomes tell you everything.
    2. sometimes they're all equal.
    3. Then \(P[\text{event}] = \frac{\text{\# outcomes in event}}{\text{total \# outcomes}}\)
    4. For countably infinite sample space, probabilities of all the outcomes also tell you everything.
    5. E.g., fair die: P[even] = 1/2
    6. E.g. example 2.9. Try numbers from random.org.
    7. What probabilities to assign to outcomes is a good question.
    8. Example 2.10. Toss coin 3 times.
      1. Choice 1: outcomes are TTT ... HHH, each with probability 1/8
      2. Choice 2: outcomes are # heads: 0...3, each with probability 1/4.
      3. Incompatible. What are probabilities of # heads for choice 1?
      4. Which is correct?
      5. Both might be mathematically ok.
      6. It depends on what physical system you are modeling.
      7. You might try doing the experiment and observing.
      8. You might add a new assumption: The coin is fair and the tosses independent.
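    A Python sketch (not from the text) comparing the two choices: under Choice 1 (the eight strings TTT...HHH equally likely, i.e. a fair coin with independent tosses), the implied distribution of the number of heads is not the uniform 1/4-each of Choice 2:

    ```python
    from collections import Counter
    from fractions import Fraction
    from itertools import product

    # All 8 equally likely strings of H/T of length 3 (Choice 1).
    outcomes = list(product('HT', repeat=3))
    counts = Counter(seq.count('H') for seq in outcomes)

    # Implied probabilities of 0, 1, 2, 3 heads under Choice 1:
    p_heads = {k: Fraction(counts[k], len(outcomes)) for k in range(4)}
    # 0 heads -> 1/8, 1 -> 3/8, 2 -> 3/8, 3 -> 1/8: incompatible with 1/4 each.
    assert p_heads[1] == Fraction(3, 8)
    assert sum(p_heads.values()) == 1
    ```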
  6. Example 2.11: countably infinite sample space.

    1. Toss fair coin, outcome is # tosses until 1st head.
    2. What are reasonable probabilities?
    3. Do they sum to 1?
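    A small Python check (not from the text): the reasonable assignment is P[first head on toss k] = \((1/2)^k\); the partial sums are \(1 - (1/2)^n\), which approach 1, so the countably many probabilities do sum to 1:

    ```python
    from fractions import Fraction

    # Partial sum of (1/2)^k for k = 1..50.
    partial = sum(Fraction(1, 2) ** k for k in range(1, 51))

    # Geometric series: partial sum is exactly 1 - (1/2)^50, tending to 1.
    assert partial == 1 - Fraction(1, 2) ** 50
    ```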
  7. 2.2.2 Continuous sample spaces

    1. Usually we can't assign nonzero probabilities to individual points on the real line. (It just doesn't work out mathematically.)
    2. Work with a set of intervals, and Boolean operations (unions, intersections, complements) on them.
    3. The combinations may be finite or countably infinite.
    4. Each such event is a ''Borel set''; the collection of all of them is the Borel field.
    5. Notation:
      1. [a,b] closed. includes both. a<=x<=b
      2. (a,b) open. includes neither. a<x<b
      3. [a,b) includes a but not b, a<=x<b
      4. (a,b] includes b but not a, a<x<=b
    6. Assign probabilities to intervals (open or closed).
    7. E.g., uniform distribution on [0,1]: \(P[a\le x\le b] = b-a\) for \(0\le a\le b\le 1\)
    8. Nonuniform distributions are common.
    9. Even with a continuous sample space, a few specific points might have probabilities. The following is mathematically a valid probability distribution. However, I can't immediately think of a physical system that it models.
      1. \(S = \{ x | 0\le x\le 1 \}\)
      2. \(p(x=1) = 1/2\)
      3. For \(0\le x_0 \le 1, p(x<x_0) = x_0/2\)
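    A Python sketch (not from the text) of this mixed distribution: an atom of mass 1/2 at \(x=1\), plus mass spread uniformly over \([0,1)\). The function name `cdf` is mine:

    ```python
    from fractions import Fraction

    def cdf(x0):
        """P[x <= x0]: continuous part x0/2, plus an atom of 1/2 at x = 1."""
        atom = Fraction(1, 2) if x0 >= 1 else Fraction(0)
        return Fraction(x0) / 2 + atom

    assert cdf(Fraction(1, 2)) == Fraction(1, 4)  # continuous part only
    assert cdf(1) == 1                            # 1/2 (continuous) + 1/2 (atom)
    ```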
  8. For fun: Heads you win, tails... you win. You can beat the toss of a coin and here's how....

  9. Example 2.13, page 39, nonuniform distribution: chip lifetime.

    1. Propose that P[(t, \(\infty\) )] = \(e^{-at}\) for t>0.
    2. Does this satisfy the axioms?
    3. Axiom I: yes, \(e^{-at} \ge 0\)
    4. Axiom II: yes, P[S] = \(e^0\) = 1
    5. Axiom III here is more like a definition for the probability of a finite interval:
    6. P[(r,s)] = P[(r, \(\infty\) )] - P[(s, \(\infty\) )] = \(e^{-ar} - e^{-as}\)
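    A Python sketch (not from the text) of this lifetime model; the rate `a = 0.5` is an arbitrary choice for illustration, any \(a>0\) works:

    ```python
    import math

    a = 0.5  # hypothetical rate, chosen only for illustration

    def p_tail(t):
        """P[(t, infinity)] = e^{-a t}: probability the chip outlives t."""
        return math.exp(-a * t)

    def p_interval(r, s):
        """P[(r, s)] = e^{-a r} - e^{-a s}: failure inside the interval."""
        return p_tail(r) - p_tail(s)

    assert abs(p_tail(0) - 1.0) < 1e-12   # Axiom II: P[S] = e^0 = 1
    assert abs(p_interval(1, 2) - (math.exp(-0.5) - math.exp(-1.0))) < 1e-12
    ```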
  10. Probability of a precise value occurring is 0, but it still can occur, since SOME value has to occur.

  11. Example 2.14: picking 2 numbers randomly in a unit square.

    1. Assume that the probability of a point falling in a particular region is proportional to the area of that region.
    2. E.g. P[x>1/2 and y<1/10] = 1/20
    3. P[x>y] = 1/2
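    A quick Monte Carlo sketch (not from the text) of Example 2.14: if probability is proportional to area, then P[x>1/2 and y<1/10] = (1/2)(1/10) = 1/20 and P[x>y] = 1/2. The seed and sample size are arbitrary:

    ```python
    import random

    random.seed(0)
    n = 100_000
    pts = [(random.random(), random.random()) for _ in range(n)]

    # Empirical frequencies should approximate the two areas.
    p_xy = sum(x > y for x, y in pts) / n              # area below y = x: 1/2
    p_box = sum(x > 0.5 and y < 0.1 for x, y in pts) / n   # 0.5 * 0.1 = 1/20

    assert abs(p_xy - 0.5) < 0.01
    assert abs(p_box - 0.05) < 0.01
    ```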
  12. Recap:

    1. Problem statement defines a random experiment
    2. with an experimental procedure and set of measurements and observations
    3. that determine the possible outcomes and sample space
    4. Make an initial probability assignment
    5. based on experience or whatever
    6. that satisfies the axioms.

5   Iclicker questions

  1. Answer this question (won't be graded): 2+2=?

    1. 1
    2. 2
    3. 3.9999
    4. 4
    5. Whatever you want it to be (you're an accountant).

    Continuous probability:

    1. S is the real interval [0,1].
    2. P([a,b]) = b-a if 0<=a<=b<=1.
    3. Event A = [.2,.6].
    4. Event B = [.4,1].

    Questions:

  2. What is P[A]?

    1. .2
    2. .4
    3. .6
    4. .8
  3. What is P[B]?

    1. .2
    2. .4
    3. .6
    4. .8
  4. What is P[A \(\cup\) B]?

    1. .2
    2. .4
    3. .6
    4. .8
  5. What is P[A \(\cap\) B]?

    1. .2
    2. .4
    3. .6
    4. .8
  6. What is P[A \(\cup\) B \(^c\) ]?

    1. .2
    2. .4
    3. .6
    4. .8
  7. Retransmitting a noisy bit 3 times: Set e=0.1. What is probability of no error in 3 bits:

    1. 0.1
    2. 0.3
    3. 0.001
    4. 0.729
    5. 0.9
  8. Flipping a fair coin until we get heads: How many times will it take until the probability of seeing a head is >=.8?

    1. 1
    2. 2
    3. 3
    4. 4
    5. 5
  9. This time, the coin is weighted so that p[H]=.6. How many times will it take until the probability of seeing a head is >=.8?

    1. 1
    2. 2
    3. 3
    4. 4
    5. 5