Engineering Probability Class 7 Thu 2018-02-08

1   Discrete Random Variables

  1. Chapter 3, p 96. Discrete random variables
    1. From now on, our random experiments will always produce numbers, at least indirectly; these are called random variables.
    2. Then we can compute, e.g., a fair value to pay for a gamble. What should you pay to play roulette so that betting on red breaks even on average?
    3. Discrete is different from discreet.
    4. Random experiment \(\rightarrow\) nonnumerical outcome \(\zeta\) \(\rightarrow\)
    5. Random Variable \(X(\zeta )\). Any real number.
    6. Random vars in general: X, Y, ...
    7. particular values: x, y, ...
    8. It's the outcome that's random, not the r.v., which is a deterministic function of the outcome.
  2. Example 3.1 Coin tosses
    1. Define X to be the number of heads from 3 tosses.
    2. \(\zeta\) is the underlying outcome: one of the 8 possible sequences of 3 tosses, e.g., HHT.
  3. Example 3.2 Betting game addon to 3.1
    1. Define another random var Y to be payoff: 8 if X=3, 1 if X=2, 0 else.
    2. Y is derived from X
  4. Example 3.3 adds probs to 3.2, assuming a fair coin. P[X=2], P[Y=8]
  5. 3.1.1 Ignore since it's starred.
  6. 3.2 Discrete r.v. and Probability mass function (pmf).
    1. The pmf gives the probability of each possible value of the random variable X; it is defined for every real number x.
    2. If X cannot have the value x, then the pmf is 0 at x.
    3. \(p_X(x) = P[X=x] = P[\{\zeta:X(\zeta)=x\}]\)
  7. p100: 3 properties of pmf. They're all common sense.
    1. Nonnegative.
    2. Sums to one.
    3. The probability of an event B is the sum of the probabilities of the outcomes in B.
  8. Example 3.5 probability of # heads in 3 coin tosses: \(p_X(0)=1/8\)
  9. Example 3.6 betting game \(p_Y(1)=3/8\)
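    Aside: a minimal Python sketch (all names are mine) that enumerates the 8 equally likely outcomes of Example 3.1, builds the pmfs of X and Y, and checks the pmf properties above:

      from itertools import product
      from collections import Counter

      # The 8 equally likely outcomes zeta of 3 fair coin tosses.
      outcomes = list(product('HT', repeat=3))

      def payoff(x):
          """Y from Example 3.2: 8 if X=3, 1 if X=2, 0 otherwise."""
          return 8 if x == 3 else 1 if x == 2 else 0

      # X(zeta) = number of heads; each outcome has probability 1/8.
      pmf_X = {x: n / 8 for x, n in Counter(z.count('H') for z in outcomes).items()}
      pmf_Y = {y: n / 8 for y, n in Counter(payoff(z.count('H')) for z in outcomes).items()}

      print(pmf_X)   # p_X(0)=p_X(3)=1/8, p_X(1)=p_X(2)=3/8
      print(pmf_Y)   # p_Y(0)=1/2, p_Y(1)=3/8, p_Y(8)=1/8
      assert all(p >= 0 for p in pmf_X.values())    # nonnegative
      assert abs(sum(pmf_X.values()) - 1) < 1e-12   # sums to one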
  10. Fig 3.4. You can graph the pmf.
  11. There are many types of random variables, depending on the shape of the pmf. These start out the same as the various probability laws in Chapter 2. However we'll see more types (e.g., Poisson) and more properties of each type (e.g., mean, standard deviation, generating function).
  12. Example 3.7 random number generator
    1. produces integer X equally likely in range 0..M-1
    2. \(S_X=\{0, 1, ... M-1 \}\)
    3. pmf: \(p_X(k)=1/M\) for k in 0..M-1.
    4. X is a uniform random variable over that set.
  13. Example 3.8 Bernoulli random variable
    1. indicator function \(I_A(\zeta)=1\) iff \(\zeta\in A\)
    2. pmf of \(I_A\): \(p_I(0)=1-p, p_I(1)=p\), where \(p=P[A]\)
    3. \(I_A\) is a Bernoulli random variable.
  14. Example 3.9 Message transmission until success
    1. \(p_X(k)=q^{k-1}p, k=1,2,3,...\)
    2. Geometric random variable
    3. What about P[X is even]?
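    Aside: to answer that, sum the pmf over even k: \(P[X \text{ even}] = \sum_{j\ge 1} q^{2j-1}p = \frac{pq}{1-q^2} = \frac{q}{1+q}\). A quick numerical check (p = 0.3 is an arbitrary choice):

      # Geometric r.v.: p_X(k) = q**(k-1) * p for k = 1, 2, 3, ...
      p = 0.3
      q = 1 - p
      # Truncated sum over even k; q**k decays fast, so 1000 terms is plenty.
      s = sum(q ** (k - 1) * p for k in range(2, 1000, 2))
      print(s, q / (1 + q))   # both approximately 0.4118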
  15. Example 3.10 Number of transmission errors
    1. \(p_X(k) = {n \choose k} p^k q^{n-k}, k=0,1,...n\)
    2. binomial random variable
  16. Fig 3.5 You can graph the relative frequencies from running an experiment repeatedly.
    1. It will approach the pmf graph (absent pathological cases like the Cauchy distribution, which are beyond this course).
  17. 3.3 p104 Expected value and other moments
    1. This is a way to summarize a r.v., and capture important aspects.
    2. E.g., What's a fair price to pay for a lottery ticket?
    3. Mean or expected value or center of mass: \(m_X = E[X] = \sum_{x\in S_X} x p_X(x)\)
    4. Defined iff absolute convergence: \(\sum |x| p(x) < \infty\)
  18. Example 3.11 Mean of Bernoulli r.v.
  19. Example 3.12 Mean of Binomial r.v. What's the expected # of heads in 3 tosses?
  20. Example 3.13 Mean of uniform discrete r.v.
  21. Run an experiment n times and observe \(x(1), x(2), ...\)
    1. \(N_k(n)\) # times \(x_k\) was seen
    2. \(f_k(n) = N_k(n)/n\) frequencies
    3. Sample mean \(\langle X\rangle_n = \sum_k x_k f_k(n)\)
    4. With lots of experiments, frequencies approach probabilities and sample mean converges to E[X]
    5. However it may take a long time, which is why stock market investors can go broke first.
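    Aside: a small simulation of this convergence (my own sketch), using X = # heads in 3 fair tosses, so E[X] = 1.5:

      import random

      def sample_x():
          """One draw of X = number of heads in 3 fair coin tosses."""
          return sum(random.random() < 0.5 for _ in range(3))

      for n in (10, 100, 10_000, 1_000_000):
          xs = [sample_x() for _ in range(n)]
          print(n, sum(xs) / n)   # sample mean approaches E[X] = 1.5 as n grows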
  22. Example 3.14 p 106 Betting game
  23. Example 3.15 Mean of a geometric r.v.
  24. Example 3.16 St Petersburg paradox
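    Aside on Example 3.16: the expected payoff diverges, so no finite entry fee breaks even. Assuming the usual statement of the paradox (payoff \(2^k\) if the first head appears on toss k), a simulation shows the sample mean drifting upward (roughly like \(\log_2 n\)) instead of converging:

      import random

      def st_petersburg():
          """Payoff 2**k, where k is the toss on which the first head appears."""
          k = 1
          while random.random() < 0.5:   # tails w.p. 1/2: keep tossing
              k += 1
          return 2 ** k

      for n in (100, 10_000, 1_000_000):
          print(n, sum(st_petersburg() for _ in range(n)) / n)   # no convergence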

Engineering Probability Class 6 Mon 2018-02-05

1   Iclicker review of Bayes theorem

  1. Event A is that a random person has a lycanthropy gene. Assume P(A) = .01.

    Genes-R-Us has a DNA test for this. B is the event of a positive test. There are false positives and false negatives each w.p. (with probability) 0.1. That is, P(B|A') = P(B' | A) = 0.1

    What's P(A')?

    1. 0.09
    2. .099
    3. .189
    4. .48
    5. .99
  2. What's P(A and B)?

    1. 0.09
    2. .099
    3. .189
    4. .48
    5. .99
  3. What's P(A' and B)?

    1. 0.09
    2. .099
    3. .189
    4. .48
    5. .99
  4. What's P(B)?

    1. 0.09
    2. .099
    3. .189
    4. .48
    5. .99
  5. You test positive. What's the probability you're really positive, P(A|B)?

    1. 0.09
    2. .099
    3. .189
    4. .48
    5. .99
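    Aside: a sketch that grinds through all five questions with the total probability theorem and Bayes' rule:

      p_A = 0.01            # P[A]: has the gene
      p_B_given_Ac = 0.1    # false positive rate, P[B|A']
      p_Bc_given_A = 0.1    # false negative rate, P[B'|A]

      p_Ac = 1 - p_A                          # P[A']
      p_A_and_B = p_A * (1 - p_Bc_given_A)    # P[A and B] = P[B|A] P[A]
      p_Ac_and_B = p_Ac * p_B_given_Ac        # P[A' and B]
      p_B = p_A_and_B + p_Ac_and_B            # total probability theorem
      p_A_given_B = p_A_and_B / p_B           # Bayes' rule
      print(p_Ac, p_A_and_B, p_Ac_and_B, p_B, p_A_given_B)
      # 0.99, 0.009, 0.099, 0.108, 0.0833...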

2   Chapter 2 ctd

  1. Wikipedia on Bayes theorem.

    We'll do the second example in class.

  2. Look at example 2.30 chip quality control: For example 2.28, how long do we have to burn in chips so that the survivors have a 99% probability of being good? p=0.1, a=1/20000. I may do it on Thurs.

  3. 2.5 Independent events

    1. \(P[A\cap B] = P[A] P[B]\)
    2. P[A|B] = P[A], P[B|A] = P[B]
  4. A,B independent means that knowing A doesn't help you with B.

  5. Mutually exclusive events w.p.>0 must be dependent.

  6. Example 2.33. Last condition above is required.

    /images/fig214.jpg
  7. More than 2 events:

    1. N events are independent iff knowing that any combination of them occurred does not change the probability of any other event.
    2. Each pair is independent.
    3. Also need \(P[A\cap B\cap C] = P[A] P[B] P[C]\)
    4. This is not intuitive: A, B, and C might be pairwise independent but, as a group of 3, dependent.
    5. See example 2.32, page 55. A: x>1/2. B: y>1/2. C: x>y
  8. Common application: independence of experiments in a sequence.

  9. Example 2.34: coin tosses are assumed to be independent of each other.

    P[HHT] = P[1st coin is H] P[2nd is H] P[3rd is T].

  10. Example 2.35 System reliability

    1. Controller and 3 peripherals.
    2. System is up iff controller and at least 2 peripherals are up.
    3. Add a 2nd controller.
  11. 2.6 p59 Sequential experiments: maybe independent

  12. 2.6.1 Sequences of independent experiments

    1. Example 2.36
  13. 2.6.2 Binomial probability

    1. Bernoulli trial: flip a possibly unfair coin once; p is the probability of a head.
    2. (Bernoulli did stats, econ, physics, ... in 18th century.)
  14. Example 2.37

    1. P[TTH] = \((1-p)^2 p\)
    2. P[1 head] = \(3 (1-p)^2 p\)
  15. Probability of exactly k successes = \(p_n(k) = {n \choose k} p^k (1-p)^{n-k}\)

  16. \(\sum_{k=0}^n p_n(k) = 1\)

  17. Example 2.38

  18. Can avoid computing n! by computing \(p_n(k)\) recursively, or by using approximation. Also, in C++, using double instead of float helps. (Almost always you should use double instead of float. It's the same speed.)
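    Aside: the recursion mentioned above is \(p_n(k+1) = \frac{n-k}{k+1}\,\frac{p}{1-p}\,p_n(k)\), starting from \(p_n(0)=(1-p)^n\), so n! is never formed at all. A sketch:

      def binomial_pmf(n, p):
          """p_n(0..n) via the ratio recursion; no factorials, no overflow."""
          pk = (1 - p) ** n              # p_n(0)
          pmf = [pk]
          for k in range(n):
              pk *= (n - k) / (k + 1) * p / (1 - p)
              pmf.append(pk)
          return pmf

      pmf = binomial_pmf(10, 0.3)
      print(sum(pmf))    # ~1.0 (sanity check)
      print(pmf[3])      # C(10,3) 0.3^3 0.7^7 = 0.2668...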

  19. Example 2.39

  20. Example 2.40 Error correction coding

  21. Multinomial probability law

    1. There are M different possible outcomes from an experiment, e.g., faces of a die showing.

    2. Probability of particular outcome: \(p_i\)

    3. Now run the experiment n times.

    4. Probability that i-th outcome occurred \(k_i\) times, \(\sum_{i=1}^M k_i = n\)

      \begin{equation*} P[(k_1,k_2,...,k_M)] = \frac{n!}{k_1! k_2! ... k_M!} p_1^{k_1} p_2^{k_2}...p_M^{k_M} \end{equation*}
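    A direct implementation of this law (a sketch; the die example at the end is my own):

      from math import factorial

      def multinomial(ks, ps):
          """n!/(k_1!...k_M!) * p_1^k_1 ... p_M^k_M, with n = sum(ks)."""
          coef = factorial(sum(ks))
          for k in ks:
              coef //= factorial(k)    # exact: each partial quotient is an integer
          prob = float(coef)
          for k, p in zip(ks, ps):
              prob *= p ** k
          return prob

      # Fair die tossed n=12 times: P[each face appears exactly twice].
      print(multinomial([2] * 6, [1 / 6] * 6))   # about 0.0034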
  22. Example 2.41 dartboard.

  23. Example 2.42 random phone numbers.

  24. 2.7 Computer generation of random numbers

    1. Skip this section, except for following points.
    2. Executive summary: it's surprisingly hard to generate good random numbers. Commercial SW has been known to get this wrong. By now, they've gotten it right (I hope), so just call a subroutine.
    3. Arizona lottery got it wrong in 1998.
    4. Even random electronic noise is hard to use properly. The best selling 1955 book A Million Random Digits with 100,000 Normal Deviates had trouble generating random numbers this way. Asymmetries crept into their circuits perhaps because of component drift. For a laugh, read the reviews.
    5. Pseudo-random number generator: The subroutine returns numbers according to some algorithm (e.g., it doesn't use cosmic rays), but for your purposes, they're random.
    6. Computer random number routines usually return the same sequence of numbers each time you run your program, so you can reproduce your results.
    7. You can override this by seeding the generator with a genuine random number from linux /dev/random.
  25. 2.8 and 2.9 p70 Fine points: Skip.

  26. Review Bayes theorem, since it is important. Here is a fictitious (because none of these probabilities have any justification) SETI example.

    1. A priori probability of extraterrestrial life = P[L] = \(10^{-8}\).
    2. For ease of typing, let L' be the complement of L.
    3. Run a SETI experiment. R (for Radio) is the event that it has a positive result.
    4. P[R|L] = \(10^{-5}\), P[R|L'] = \(10^{-10}\).
    5. What is P[L|R] ?
  27. Some specific probability laws

    1. In all of these, successive events are independent of each other.
    2. A Bernoulli trial is one toss of a coin where p is probability of head.
    3. We saw binomial and multinomial probabilities on Tues.
    4. The binomial law gives the probability of exactly k heads in n tosses of an unfair coin.
    5. The multinomial law gives the probability of exactly \(k_i\) occurrences of the i-th face in n tosses of a die.
  28. iClicker: You have a coin where the probability of a head is p=2/3. If you toss it twice, what's the probability that you will see one head and one tail?

    1. 1/2
    2. 1/3
    3. 2/9
    4. 5/9
    5. 4/9
  29. 2.6.4 p63 Geometric probability law

    1. Repeat Bernoulli experiment until 1st success.
    2. Define outcome to be # trials until that happens.
    3. Define q=(1-p).
    4. \(p(m) = (1-p)^{m-1}p = q^{m-1}p\) (p has 2 different uses here).
    5. \(\sum_{m=1}^\infty p(m) =1\)
    6. Probability that more than K trials are required = \(q^K\).
  30. Example: probability that more than 10 tosses of a die are required to get a 6 = \(\left(\frac{5}{6}\right)^{10} = 0.16\)
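    Aside: a quick simulation check of that number (my own sketch):

      import random

      trials = 200_000
      count = 0
      for _ in range(trials):
          n = 1
          while random.randint(1, 6) != 6:   # roll until the first 6
              n += 1
          if n > 10:                         # more than 10 tosses were needed
              count += 1
      print(count / trials, (5 / 6) ** 10)   # both about 0.1615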

  31. iClicker: You have a coin where the probability of a head is p=2/3. What's the probability that the 1st head occurs on the 2nd toss?

    1. 1/2
    2. 1/3
    3. 2/9
    4. 5/9
    5. 4/9
  32. Example 2.43: error control by retransmission. A message sent over a noisy channel is checksummed so the receiver can tell if it got mangled, and then ask for retransmission. TCP/IP does this.

    Aside: This works better when the roundtrip time is reasonable. Using this when talking to Mars is challenging.

  33. 2.6.5 p64 Sequences, chains, of dependent experiments.

    1. This is an important topic, but mostly beyond this course.
    2. In many areas, there are a sequence of observations, and the probability of each observation depends on what you observed before.
    3. It relates to Markov chains.
    4. Motivation: speech and language recognition, translation, compression
    5. E.g., in English text, the probability of a u is higher if the previous char was q.
    6. The probability of a b may be higher if the previous char was u (than if it was x), but is lower if the previous two chars are qu.
    7. Need to look at probabilities of sequences, char by char.
    8. Same idea in speech recognition: phonemes follow phonemes...
    9. Same in language understanding: verb follows noun...
  34. Example 2.44, p64

    1. In this example, you repeatedly choose an urn to draw a ball from, depending on what the previous ball was.

Engineering Probability Homework 3 due Mon 2018-02-12 2359 EST

How to submit

Submit to LMS; see details in syllabus.

Questions

  1. (5 points) Assume that it is known that one person in a group of 100 committed a crime. You're in the group, so there's a prior probability of 1/100 that you are it. There is a pretty good forensic test. It makes errors (either way) only 0.1% of the time. You are given the test; the result is positive. Using this positive test, what's the probability now that you are the criminal? (Use Bayes.)
  2. (5 pts) Do exercise 2.59, page 87. However, make it 21 students and 3 on each day of the week. Assume that there is no relation between birthday and day of the week.
  3. (5 pts) Do exercise 2.62 on page 88 of the text.
  4. (5 pts) Do 2.69, but use the interval [-2,2].
  5. (5 pts) Do 2.72.
  6. (5 pts) Do 2.76.
  7. (5 pts) Do 2.82.
  8. (5 pts) Do 2.97.
  9. (5 pts) Do 2.102.
  10. (5 pts) Do 2.106.

Total: 50 pts.

Engineering Probability Class 5 Thu 2018-02-01

1   Iclicker review

  1. Follow-on to the meal choice iclicker question. My friend and I wish to visit a hospital, chosen from: Memorial, AMC, Samaritan. We might visit different hospitals.
    1. If we don't care whether we visit the same hospital or not, in how many ways can we do this?
      1. 1
      2. 2
      3. 3
      4. 6
      5. 9
    2. We wish to visit different hospitals, to later write a Poly review. In how many ways can we visit different hospitals, where we care which hospital each of us visits?
      1. 1
      2. 2
      3. 3
      4. 6
      5. 9
    3. Modify the above, to say that we care only about the set of hospitals we two visit.
      1. 1
      2. 2
      3. 3
      4. 6
      5. 9
    4. We realize that Samaritan and Memorial are both owned by St Peters and we want to visit two different hospital chains to write our reviews. In how many ways can we pick hospitals so that we pick different chains?
      1. 1
      2. 2
      3. 3
      4. 4
      5. 5
    5. We each pick between Memorial and AMC with 50% probability, independently. What is the probability that each hospital is picked exactly once (in contrast to picking one twice and the other not at all)?
      1. 0
      2. 1/4
      3. 1/2
      4. 3/4
      5. 1

2   Chapter 2 ctd

  1. New stuff, pp. 47-66:

    1. Conditional probability - If you know that event A has occurred, does that change the probability that event B has occurred?
    2. Independence of events - If no, then A and B are independent.
    3. Sequential experiments - Find the probability of a sequence of experiments from the probabilities of the separate steps.
    4. Binomial probabilities - tossing a sequence of unfair coins.
    5. Multinomial probabilities - tossing a sequence of unfair dice.
    6. Geometric probabilities - toss a coin until you see the 1st head.
    7. Sequences of dependent experiments - What you see in step 1 influences what you do in step 2.
  2. 2.4 Conditional probability, page 47.

    1. big topic
    2. E.g., if it snows today, is it more likely to snow tomorrow? next week? in 6 months?
    3. E.g., what is the probability of the stock market rising tomorrow given that (it went up today, the deficit went down, an oil pipeline was blown up, ...)?
    4. What's the probability that a CF bulb is alive after 1000 hours given that I bought it at Walmart?
    5. definition \(P[A|B] = \frac{P[A\cap B]}{P[B]}\)
  3. E.g., if DARPA had been allowed to run its Futures Markets Applied to Prediction (FutureMAP) would the future probability of King Zog I being assassinated be dependent on the amount of money bet on that assassination occurring?

    1. Is that good or bad?
    2. Would knowing that the real Zog survived over 55 assassination attempts change the probability of a future assassination?
  4. Consider a fictional university that has both undergrads and grads. It also has both Engineers and others:

    /images/venn-stu.png
  5. iClicker: What's the probability that a student is an Engineer?

    1. 1/7
    2. 4/7
    3. 5/7
    4. 3/4
    5. 3/5
  6. iClicker: What's the probability that a student is an Engineer, given that s/he is an undergrad?

    1. 1/7
    2. 4/7
    3. 5/7
    4. 3/4
    5. 3/5
  7. \(P[A\cap B] = P[A|B]P[B] = P[B|A]P[A]\)

  8. Example 2.26 Binary communication. Source transmits 0 with probability (1-p) and 1 with probability p. Receiver errs with probability e. What are probabilities of 4 events?

  9. Total probability theorem

    1. \(B_i\) mutually exclusive events whose union is S
    2. \(P[A] = P[A \cap B_1] + P[A \cap B_2] + \cdots\)
    3. \(P[A] = P[A|B_1]P[B_1]\) \(+ P[A|B_2]P[B_2] + ...\)
    /images/totalprob.png

    What's the probability that a student is an undergrad, given ... (Numbers are fictitious.)

  10. Example 2.28. Chip quality control.

    1. Each chip is either good or bad.
    2. P[good]=(1-p), P[bad]=p.
    3. If the chip is good: P[still alive at t] = \(e^{-at}\)
    4. If the chip is bad: P[still alive at t] = \(e^{-1000at}\)
    5. What's the probability that a random chip is still alive at t?
  11. 2.4.1, p52. Bayes' rule. This lets you invert the conditional probabilities.

    1. \(B_j\) partition S. That means that
      1. If \(i\ne j\) then \(B_i\cap B_j=\emptyset\) and
      2. \(\bigcup_i B_i = S\)
    2. \(P[B_j|A] = \frac{P[B_j\cap A]}{P[A]}\) \(= \frac{P[A|B_j] P[B_j]}{\sum_k P[A|B_k] P[B_k]}\)
    3. application:
      1. We have a priori probs \(P[B_j]\)
      2. Event A occurs. Knowing that A has happened gives us info that changes the probs.
      3. Compute a posteriori probs \(P[B_j|A]\)
  12. In the above diagram, what's the probability that an undergrad is an engineer?

  13. Example 2.29 comm channel: If receiver sees 1, which input was more probable? (You hope the answer is 1.)

  14. Example 2.30 chip quality control: For example 2.28, how long do we have to burn in chips so that the survivors have a 99% probability of being good? p=0.1, a=1/20000.
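    Aside: the algebra reduces to \(\frac{(1-p)e^{-at}}{(1-p)e^{-at}+p\,e^{-1000at}} = 0.99\), i.e., \(e^{999at} = \frac{0.99\,p}{0.01\,(1-p)} = 11\), so \(t = \ln 11/(999a) \approx 48\) (in the time units of a). A numerical check:

      from math import exp, log

      p, a = 0.1, 1 / 20000

      def p_good_given_alive(t):
          """Bayes: P[chip is good | still alive at time t]."""
          alive_good = (1 - p) * exp(-a * t)
          alive_bad = p * exp(-1000 * a * t)
          return alive_good / (alive_good + alive_bad)

      t = log(11) / (999 * a)            # closed-form burn-in time
      print(t, p_good_given_alive(t))    # about 48.0, 0.99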

  15. Example: False positives in a medical test

    1. T = test for disease was positive; T' = .. negative
    2. D = you have disease; D' = .. don't ..
    3. P[T|D] = .99, P[T' | D'] = .95, P[D] = 0.001
    4. P[D' | T] (false positive) = 0.98 !!!
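    Aside: the arithmetic behind that startling last line, as a sketch:

      p_D = 0.001             # P[D]: prior probability of having the disease
      p_T_given_D = 0.99      # sensitivity
      p_Tc_given_Dc = 0.95    # specificity

      p_T = p_T_given_D * p_D + (1 - p_Tc_given_Dc) * (1 - p_D)   # total prob.
      p_D_given_T = p_T_given_D * p_D / p_T                       # Bayes
      print(p_D_given_T, 1 - p_D_given_T)   # 0.019..., 0.980...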

Engineering Probability Class 4 Mon 2018-01-29

1   Chapter 2 ctd

  1. Today: counting methods, Leon-Garcia section 2.3, page 41.

    1. We have an urn with n balls.
    2. Maybe the balls are all different, maybe not.
    3. W/o looking, we take k balls out and look at them.
    4. Maybe we put each ball back after looking at it, maybe not.
    5. Suppose we took out one white and one green ball. Maybe we care about their order, so that's a different case from green then white, maybe not.
  2. Applications:

    1. How many ways can we divide a class of 12 students into 2 groups of 6?
    2. How many ways can we pick 4 teams of 6 students from a class of 88 students (leaving 64 students behind)?
    3. We pick 5 cards from a deck. What's the probability that they're all the same suit?
    4. We're picking teams of 12 students, but now the order matters since they're playing baseball and that's the batting order.
    5. We have 100 widgets; 10 are bad. We pick 5 widgets. What's the probability that none are bad? Exactly 1? More than 3?
    6. In the approval voting scheme, you mark as many candidates as you please. The candidate with the most votes wins. How many different ways can you mark the ballot?
    7. In preferential voting, you mark as many candidates as you please, but rank them 1,2,3,... How many different ways can you mark the ballot?
  3. Leon-Garcia 2.3: Counting methods, pp 41-46.

    1. finite sample space
    2. each outcome equally probable
    3. get some useful formulae
    4. warmup: consider a multiple choice exam where 1st answer has 3 choices, 2nd answer has 5 choices and 3rd answer has 6 choices.
      1. Q: How many ways can a student answer the exam?
      2. A: 3x5x6
    5. If there are k questions, and the i-th question has \(n_i\) answers then the number of possible combinations of answers is \(n_1n_2 .. n_k\)
  4. 2.3.1 Sampling WITH replacement and WITH ordering

    1. Consider an urn with n different colored balls.
    2. Repeat k times:
      1. Draw a ball.
      2. Write down its color.
      3. Put it back.
    3. Number of distinct ordered k-tuples = \(n^k\)
  5. Example 2.15. How many distinct ordered pairs for 2 balls from 5? 5*5.

  6. iClicker. Suppose I want to eat at one of the following 4 places, for tonight and again tomorrow, and don't care if I eat at the same place both times: Commons, Sage, Union, Knotty Pine. How many choices do I have of where to eat?

    1. 16
    2. 12
    3. 8
    4. 4
    5. something else
  7. 2.3.2 Sampling WITHOUT replacement and WITH ordering

    1. Consider an urn with n different colored balls.
    2. Repeat k times:
      1. Draw a ball.
      2. Write down its color.
      3. Don't put it back.
    3. Number of distinct ordered k-tuples = n(n-1)(n-2)...(n-k+1)
  8. iClicker. Suppose I want to visit two of the following four cities: Buffalo, Miami, Boston, New York. I don't want to visit one city twice, and the order matters. How many choices do I have?

    1. 16
    2. 12
    3. 8
    4. 4
    5. something else
  9. Example 2.16: Draw 2 balls from 5 w/o replacement.

    1. 5 choices for 1st ball, 4 for 2nd. 20 outcomes.
    2. Probability that 1st ball is larger?
    3. List the 20 outcomes. 10 have 1st ball larger. P=1/2.
  10. Example 2.17: Draw 3 balls from 5 with replacement. What's the probability they're all different?

    1. P = \(\small \frac{\text{# cases where they're different}}{\text{# cases where I don't care}}\)
    2. P = \(\small \frac{\text{# cases w/o replacement}}{\text{# cases w replacement}}\)
    3. P = \(\frac{5*4*3}{5*5*5}\)
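    Aside: a brute-force check of Example 2.17 (itertools.product enumerates sampling with replacement and ordering; itertools.permutations, without replacement):

      from itertools import product, permutations

      balls = range(5)
      with_repl = list(product(balls, repeat=3))         # 5^3 = 125 triples
      all_diff = [t for t in with_repl if len(set(t)) == 3]
      print(len(all_diff), len(with_repl))               # 60, 125
      print(len(all_diff) / len(with_repl))              # 0.48
      print(len(list(permutations(balls, 3))))           # 60 = 5*4*3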
  11. 2.3.3 Permutations of n distinct objects

    1. Distinct means that you can tell the objects apart.

    2. This is sampling w/o replacement for k=n

    3. \(1\cdot 2\cdot 3\cdots n = n!\)

    4. It grows fast. 1!=1, 2!=2, 3!=6, 4!=24, 5!=120, 6!=720, 7!=5040

    5. Stirling approx:

      \begin{equation*} n! \approx \sqrt{2\pi n} \left(\frac{n}{e}\right)^n\left(1+\frac{1}{12n}+...\right) \end{equation*}
    6. Therefore if you ignore the last term, the relative error is about 1/(12n).

  12. Example 2.18. # permutations of 3 objects: 3! = 6.

  13. Example 2.1.9. 12 airplane crashes last year. Assume independent, uniform, etc, etc. What's probability of exactly one in each month?

    1. For each crash, let the outcome be its month.
    2. Number of outcomes for all 12 crashes = \(12^{12}\)
    3. Number of outcomes with the 12 crashes in 12 different months = 12!
    4. Probability = \(12!/(12^{12}) = 0.000054\)
    5. Random does not mean evenly spaced.
  14. 2.3.4 Sampling w/o replacement and w/o ordering

    1. We care what objects we pick but not the order

    2. E.g., drawing a hand of cards.

    3. term: Combinations of k objects selected from n. Binomial coefficient.

      \begin{equation*} C^n_k = {n \choose k} = \frac{n!}{k! (n-k)!} \end{equation*}
    4. Permutations is when order matters.

  15. Example 2.20. Select 2 from 5 w/o order. \(5\choose 2\)

  16. Example 2.21 # permutations of k black and n-k white balls. This is choosing k from n.

  17. Example 2.22. 10 of 50 items are bad. What's probability 5 of 10 selected randomly are bad?

    1. # ways to select 10 items from 50 is \(50\choose 10\)
    2. # ways to have exactly 5 bad is the # ways to select 5 good from 40 times the # ways to select 5 bad from 10 = \({40\choose5} {10\choose5}\)
    3. Probability is ratio.
  18. Multinomial coefficient: Partition n items into sets of size \(k_1, k_2, ... k_j, \sum k_i=n\)

    \begin{equation*} \frac{n!}{k_1! k_2! ... k_j!} \end{equation*}
  19. 2.3.5. skip

  20. More state lottery incompetence: Statistician Cracks Code For Lottery Tickets

    Finding these stories is just too easy.

Reading: 2.4 Conditional probability, page 47-

Engineering Probability Class 3 Thu 2018-01-25

1   Probability in the real world

  1. How Did Economists Get It So Wrong? is an article by Paul Krugman (2008 Nobel Memorial Prize in Economic Science). It says, "the economics profession went astray because economists, as a group, mistook beauty, clad in impressive-looking mathematics, for truth." You might see a certain relevance to this course. You have to get the model right before trying to solve it.

    Though I don't know much about it, I'll cheerfully try to answer any questions about econometrics.

    Another relevance to this course, in an enrichment sense, is that some people believe that the law of large numbers does not apply to certain variables, like stock prices. They think that larger and larger sample frequencies do not converge to a probability, because the variance of the underlying distribution is infinite. See Do financial returns have finite or infinite variance? A paradox and an explanation . This also is beyond this course.

  2. More articles on MIT students gaming the Mass lottery:

    1. How a Group of MIT Students Gamed the Massachusetts State Lottery
    2. A Calculated Approach to Winning the Lottery

2   Iclicker questions

  1. Answer this question (won't be graded): 2+2=?

    1. 1
    2. 2
    3. 3.9999
    4. 4
    5. Whatever you want it to be (you're an accountant).

    Continuous probability:

    1. S is the real interval [0,1].
    2. P([a,b]) = b-a if 0<=a<=b<=1.
    3. Event A = [.2,.6].
    4. Event B = [.4,1].

    Questions:

  2. What is P[A]?

    1. .2
    2. .4
    3. .6
    4. .8
  3. What is P[B]?

    1. .2
    2. .4
    3. .6
    4. .8
  4. What is P[A \(\cup\) B]?

    1. .2
    2. .4
    3. .6
    4. .8
  5. What is P[A \(\cap\) B]?

    1. .2
    2. .4
    3. .6
    4. .8
  6. What is P[A \(\cup\) B \(^c\) ]?

    1. .2
    2. .4
    3. .6
    4. .8
  7. Retransmitting a noisy bit 3 times: Set e=0.1. What is probability of no error in 3 bits:

    1. 0.1
    2. 0.3
    3. 0.001
    4. 0.729
    5. 0.9
  8. Flipping a fair coin until we get heads: How many times will it take until the probability of seeing a head is >=.8?

    1. 1
    2. 2
    3. 3
    4. 4
    5. 5
  9. This time, the coin is weighted so that p[H]=.6. How many times will it take until the probability of seeing a head is >=.8?

    1. 1
    2. 2
    3. 3
    4. 4
    5. 5
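    Aside: the last two questions are the tail of the geometric law: P[at least one head in n tosses] = \(1-(1-p)^n\). A sketch finding the smallest such n:

      def tosses_needed(p, target=0.8):
          """Smallest n with P[at least one head in n tosses] >= target."""
          n, p_all_tails = 0, 1.0
          while 1 - p_all_tails < target:
              n += 1
              p_all_tails *= 1 - p    # probability of n straight tails
          return n

      print(tosses_needed(0.5))   # 3: 1 - 0.5**3 = 0.875 >= 0.8
      print(tosses_needed(0.6))   # 2: 1 - 0.4**2 = 0.84  >= 0.8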

3   Chapter 2 ctd

  1. Corollary 5: \(P[A\cup B] = P[A] + P[B] - P[A\cap B]\)

    1. Proof: decompose \(A\cup B\) into the disjoint events \(A\cap B^c\), \(A\cap B\), and \(A^c\cap B\).
    2. Example: Queens and hearts. P[Q]=4/52, P[H]=13/52, P[Q \(\cup\) H]=16/52, P[Q \(\cap\) H]=1/52.
    3. \(P[A\cup B] \le P[A] + P[B]\)
  2. Corollary 6:

    \(\begin{array}{c} P\left[\cup_{i=1}^n A_i\right] = \\ \sum_{i=1}^n P[A_i] \\ - \sum_{i<j} P[A_i\cap A_j] \\ + \sum_{i<j<k} P[A_i\cap A_j\cap A_k] \cdots \\ + (-1)^{n+1} P[\cap_{i=1}^n A_i] \end{array}\)

    1. Example Q=queen card, H=heart, F= face card.
      1. P[Q]=4/52, P[H]=13/52, P[F]=12/52,
      2. P[Q \(\cap\) H]=1/52, P[Q \(\cap\) F] = ''you tell me''
      3. P[H \(\cap\) F]= ''you tell me''
      4. P[Q \(\cap\) H \(\cap\) F] = ''you tell me''
      5. So P[Q \(\cup\) H \(\cup\) F] = ?
    2. Example from Roulette:
      1. R=red, B=black, E=even, A=1-12
      2. P[R] = P[B] = P[E] = 18/38. P[A]=12/38
      3. \(P[R\cup E \cup A]\) = ?
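    Aside: a brute-force check of the card example above (a sketch; it also fills in the ''you tell me'' blanks):

      from itertools import product

      ranks = ['A', '2', '3', '4', '5', '6', '7', '8', '9', '10', 'J', 'Q', 'K']
      suits = ['C', 'D', 'H', 'S']
      deck = list(product(ranks, suits))                  # 52 cards

      Q = {c for c in deck if c[0] == 'Q'}                # queens
      H = {c for c in deck if c[1] == 'H'}                # hearts
      F = {c for c in deck if c[0] in ('J', 'Q', 'K')}    # face cards

      incl_excl = (len(Q) + len(H) + len(F) - len(Q & H) - len(Q & F)
                   - len(H & F) + len(Q & H & F)) / 52
      print(len(Q | H | F) / 52, incl_excl)               # both 22/52, about 0.423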
  3. Corollary 7: if \(A\subset B\) then P[A] <= P[B]

    Example: Probability of a repeated coin toss having its first head in the first three tosses (1/2+1/4+1/8) \(\ge\) probability of it happening on the 2nd toss (1/4).

  4. 2.2.1 Discrete sample space

    1. If sample space is finite, probabilities of all the outcomes tell you everything.
    2. sometimes they're all equal.
    3. Then \(P[\text{event}] = \frac{\text{# outcomes in event}}{\text{total # outcomes}}\)
    4. For countably infinite sample space, probabilities of all the outcomes also tell you everything.
    5. E.g. fair die. P[even] = 1/2
    6. E.g. example 2.9. Try numbers from random.org.
    7. What probabilities to assign to outcomes is a good question.
    8. Example 2.10. Toss coin 3 times.
      1. Choice 1: outcomes are TTT ... HHH, each with probability 1/8
      2. Choice 2: outcomes are # heads: 0...3, each with probability 1/4.
      3. Incompatible. What are probabilities of # heads for choice 1?
      4. Which is correct?
      5. Both might be mathematically ok.
      6. It depends on what physical system you are modeling.
      7. You might try doing the experiment and observing.
      8. You might add a new assumption: The coin is fair and the tosses independent.
  5. Example 2.11: countably infinite sample space.

    1. Toss fair coin, outcome is # tosses until 1st head.
    2. What are reasonable probabilities?
    3. Do they sum to 1?
  6. 2.2.2 Continuous sample spaces

    1. Usually we can't assign probabilities to points on real line. (It just doesn't work out mathematically.)
    2. Work with set of intervals, and Boolean operations on them.
    3. Set may be finite or countable.
    4. This class of events is a ''Borel field''.
    5. Notation:
      1. [a,b] closed. includes both. a<=x<=b
      2. (a,b) open. includes neither. a<x<b
      3. [a,b) includes a but not b, a<=x<b
      4. (a,b] includes b but not a, a<x<=b
    6. Assign probabilities to intervals (open or closed).
    7. E.g., uniform distribution on [0,1]: \(P[a\le x\le b] = b-a\)
    8. Nonuniform distributions are common.
    9. Even with a continuous sample space, a few specific points might have probabilities. The following is mathematically a valid probability distribution. However I can't immediately think of a physical system that it models.
      1. \(S = \{ x | 0\le x\le 1 \}\)
      2. \(p(x=1) = 1/2\)
      3. For \(0\le x_0 \le 1, p(x<x_0) = x_0/2\)
  7. For fun: Heads you win, tails... you win. You can beat the toss of a coin and here's how....

  8. Example 2.13, page 39, nonuniform distribution: chip lifetime.

    1. Propose that P[(t, \(\infty\) )] = \(e^{-at}\) for t>0.
    2. Does this satisfy the axioms?
    3. I: yes >0
    4. II: yes, P[S] = \(e^0\) = 1
    5. III here is more like a definition for the probability of a finite interval
    6. P[(r,s)] = P[(r, \(\infty\) )] - P[(s, \(\infty\) )] = \(e^{-ar} - e^{-as}\)
  9. Probability of a precise value occurring is 0, but it still can occur, since SOME value has to occur.

  10. Example 2.14: picking 2 numbers randomly in a unit square.

    1. Assume that the probability of a point falling in a particular region is proportional to the area of that region.
    2. E.g. P[x>1/2 and y<1/10] = 1/20
    3. P[x>y] = 1/2
  11. Recap:

    1. Problem statement defines a random experiment
    2. with an experimental procedure and set of measurements and observations
    3. that determine the possible outcomes and sample space
    4. Make an initial probability assignment
    5. based on experience or whatever
    6. that satisfies the axioms.

Engineering Probability Class 2 Mon 2018-01-22

1   Misc

  1. Why not use LMS more?
    1. This static CMS is much easier to use. I can create and save material more easily.
    2. You don't need to login to see the content. You can deep-link into the material. People outside the course can see, and save, the material.
    3. A few years ago, LMS sued an open source competitor. (They lost!)
  2. How to find this wiki?
    1. Google **RPI WRF**.
    2. Go to my home page.
    3. Links to my current courses are listed at the top.

2   Chapter 1 ctd

  1. Rossman-Chance coin toss applet demonstrates how the observed frequencies converge (slowly) to the theoretical probability.
  2. Example of unreliable channel
    1. Want to transmit a bit: 0, 1
    2. It arrives wrong with probability e, say 0.001
    3. Idea: transmit each bit 3 times and vote.
      1. 000 -> 0
      2. 001 -> 0
      3. 011 -> 1
    4. 3 bits arrive correct with probability \((1-e)^3\) = 0.997002999
    5. 1 error with probability \(3(1-e)^2e\) = 0.002994
    6. 2 errors with probability \(3(1-e)e^2\) = 0.000002997
    7. 3 errors with probability \(e^3\) = 0.000000001
    8. corrected bit is correct if 0 or 1 errors, with probability \((1-e)^3+3(1-e)^2e\) = 0.999996999
    9. We reduced the probability of error by a factor of about 300 (from \(10^{-3}\) to about \(3\times10^{-6}\)).
    10. Cost: triple the transmission plus a little logic HW.
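    Aside: that arithmetic as a checkable sketch (e = 0.001):

      e = 0.001
      p0 = (1 - e) ** 3            # no errors
      p1 = 3 * (1 - e) ** 2 * e    # exactly one error
      p2 = 3 * (1 - e) * e ** 2    # exactly two errors
      p3 = e ** 3                  # three errors
      assert abs((p0 + p1 + p2 + p3) - 1) < 1e-15
      p_ok = p0 + p1               # majority vote is right iff <= 1 error
      print(p_ok)                  # 0.999996999...
      print(e / (1 - p_ok))        # improvement factor, about 333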
  3. Example of text compression
    1. Simple way: Use 5 bits for each letter: A=00000, B=00001
    2. In English, 'E' common, 'Q' rare
    3. Use fewer bits for E than Q.
    4. Morse code did this 170 years ago.
      1. E = .
      2. Q = _ _ . _
    5. Aside: An expert Morse coder is faster than texting.
    6. English can be compressed to about 1 bit per letter (with difficulty); 2 bits is easy.
    7. Aside: there is so much structure in English text, that if you add the bit strings for 2 different texts bit-by-bit, they can usually mostly be reconstructed.
    8. That's how cryptanalysis works.
  4. Example of reliable system design
    1. Nuclear power plant fails if
      1. water leaks
      2. and operator asleep (a surprising number of disasters happen in the graveyard shift).
      3. and backup pump fails
      4. or was turned off for maintenance
    2. What's the probability of failure? This depends on the probabilities of the various failure modes. Those might be impossible to determine accurately.
    3. Design a better system? Coal mining kills.
    4. The backup procedures themselves can cause problems (and are almost impossible to test). A failure with the recovery procedure was part of the reason for a Skype outage.

3   Chapter 2

  1. A random experiment has 2 parts:
    1. experimental procedure
    2. set of measurements
  2. Random experiment may have subexperiments and sequences of experiments.
  3. Outcome or sample point \(\zeta\): a non-decomposable observation.
  4. Sample space S: set of all outcomes
  5. \(|S|\):
    1. finite, e.g. {H,T}, or
    2. discrete = countable, e.g., 1,2,3,4,... Sometimes discrete includes finite. or
    3. uncountable, e.g., \(\Re\), aka continuous.
  6. Types of infinity:
    1. Some sets have finite size, e.g., 2 or 6.
    2. Other sets have infinite size.
    3. Those are either countable or uncountable.
    4. A countably infinite set can be arranged in order so that its elements can be numbered 1,2,3,...
    5. The set of natural numbers is obviously countable.
    6. The set of positive rational numbers between 0 and 1 is also countable. You can order it thus: \(\frac{1}{1}, \frac{1}{2}, \frac{1}{3}, \frac{2}{3}, \frac{1}{4}, \ \frac{3}{4}, \frac{1}{5}, \frac{2}{5}, \frac{3}{5}, \ \cdots\)
    7. The set of real numbers is not countable (aka uncountable). Proving this is beyond this course. (It uses something called diagonalization.)
    8. Uncountably infinite is a bigger infinity than countably infinite, but that's beyond this course.
    9. Georg Cantor, who formulated this, was hospitalized in a mental health facility several times.
  7. Why is this relevant to probability?
    1. We can assign probabilities to discrete outcomes, but not to individual continuous outcomes.
    2. We can assign probabilities to some events, or sets of continuous outcomes.
  8. E.g. Consider this experiment to watch an atom of sodium-26.
    1. Its half-life is 1 second (Applet: Nuclear Isotope Half-lifes)
    2. Define the outcomes to be the number of complete seconds before it decays: \(S=\{0, 1, 2, 3, \cdots \}\)
    3. \(|S|\) is countably infinite, i.e., discrete.
    4. \(p(0)=\frac{1}{2}, p(1)=\frac{1}{4}, \cdots\) \(p(k)=2^{-(k+1)}\)
    5. \(\sum_{k=0}^\infty p(k) = 1\)
    6. We can define events like these:
      1. The atom decays within the 1st second. p=.5.
      2. The atom decays within the first 3 seconds. p=.875.
      3. The atom's lifetime is an even number of seconds. \(p = \frac{1}{2} + \frac{1}{8} + \frac{1}{32} + \cdots = \frac{2}{3}\)
  9. Now consider another experiment: Watch another atom of Na-26
    1. But this time the outcome is defined to be the real number, x, that is the time until it decays.
    2. \(S = \{ x | x\ge0 \}\)
    3. \(|S|\) is uncountably infinite.
    4. We cannot talk about the probability that x=1.23 exactly. (It just doesn't work out.)
    5. However, we can define the event that \(1.23 < x < 1.24\), and talk about its probability.
    6. \(P[x>x_0] = 2^{-x_0}\)
    7. \(P[1.23 < x < 1.24]\) \(= 2^{-1.23} - 2^{-1.24} = 0.003\)
  10. Event
    1. collection of outcomes, subset of S
    2. what we're interested in.
    3. e.g., outcome is voltage, event is V>5.
    4. certain event: S
    5. null event: \(\emptyset\)
    6. elementary event: one discrete outcome
  11. Set theory
    1. Sets: S, A, B, ...
    2. Universal set: U
    3. elements or points: a, b, c
    4. \(a\in S, a\notin S\), \(A\subset B\)
    5. Venn diagram
    6. empty set: {} or \(\emptyset\)
    7. operations on sets: equality, union, intersection, complement, relative complement
    8. properties (axioms): commutative, associative, distributive
    9. theorems: de Morgan
  12. Prove de Morgan 2 different ways.
    1. Use the fact that A equals B iff A is a subset of B and B is a subset of A.
    2. Look at the Venn diagram; there are only 4 cases.
  13. 2.1.4 Event classes
    1. Remember: an event is a set of outcomes of an experiment, e.g., voltage.
    2. In a continuous sample space, we're interested only in some possible events.
    3. We're interested in events that we can measure.
    4. E.g., we're not interested in the event that the voltage is exactly an irrational number.
    5. Events that we're interested in are intervals, like [.5,.6] and [.7,.8].
    6. Also unions and complements of intervals.
    7. This matches the real world. You can't measure a voltage as 3.14159265...; you measure it in the range [3.14,3.15].
    8. Define \(\cal F\) to be the class of events of interest: those sets of intervals.
    9. We assign probabilities only to events in \(\cal F\).
  14. 2.2 Axioms of probability
    1. An axiom system is a general set of rules. The probability axioms apply to all probabilities.
    2. Axioms start with common sense rules, but get less obvious.
    3. I: 0<=P[A]
    4. II: P[S]=1
    5. III: \(A\cap B=\emptyset \rightarrow\) \(P[A\cup B] = P[A]+P[B]\)
    6. III': For \(A_1, A_2, ....\) if \(\forall_{i\ne j} A_i \cap A_j = \emptyset\) then \(P[\bigcup_{i=1}^\infty A_i]\) \(= \sum_{i=1}^\infty P[A_i]\)
  15. Example: cards. Q=event that card is queen, H=event that card is heart. These events are not disjoint. Probabilities do not sum.
    1. \(Q\cap H \ne\emptyset\)
    2. P[Q] = 1/13=4/52, P[H] = 1/4=13/52, P[Q \(\cup\) H] = 16/52, not 17/52.
  16. Example C=event that card is clubs. H and C are disjoint. Probabilities do sum.
    1. \(C\cap H =\emptyset\)
    2. P[C] = 13/52, P[H] = 1/4=13/52, P[C \(\cup\) H] = 26/52.
  17. Example. Flip a fair coin. \(A_i\) is the event that the first time you see heads is the i-th time, for \(i\ge1\).
    1. We can assign probabilities to these countably infinite number of events.
    2. \(P[A_i] = 1/2^i\)
    3. They are disjoint, so probabilities sum.
    4. Probability that the first head occurs in the 10th or later toss = \(\sum_{i=10}^\infty 1/2^i\)
  18. Corollary 1
    1. \(P[A^c] = 1-P[A]\)
    2. E.g., P[heart] = 1/4, so P[not heart] = 3/4
  19. Corollary 2: P[A] <=1
  20. Corollary 3: P[\(\emptyset\)] = 0
  21. Corollary 4:
    1. For \(A_1, A_2, .... A_n\) if \(\forall_{i\ne j} A_i \cap A_j = \emptyset\) then \(P\left[\bigcup_{i=1}^n A_i\right] = \sum_{i=1}^n P[A_i]\)
    2. Proof by induction from axiom III.

4   Iclicker questions

  1. What is your major?
    1. CSYS
    2. ELEC
    3. CSCI
    4. Other engineering
    5. Other
  2. What is your class?
    1. 2018
    2. 2019
    3. 2020
    4. 2021
    5. Other
  3. What one mathematical tool should I use in class?
    1. Matlab
    2. SciPy
    3. Mathematica
    4. Maple
    5. Other

Engineering Probability Class 1 Thu 2018-01-18

1   Topics

  1. Syllabus and Intro.

  2. Why probability is useful

    1. AT&T installed bandwidth to provide a reasonable level of iphone service (not all users want to use it simultaneously).
    2. also web servers, roads, cashiers, ...
    3. fair price for financial CDO that reduces risk
  3. To model something

    1. Real thing too expensive, dangerous, time-consuming (aircraft design).
    2. Capture the relevant, ignore the rest.
    3. Coin flip: relevant: it's fair? not relevant: copper, tin, zinc, ...
    4. Validate model if possible.
  4. Computer simulation model

    1. For systems too complicated for a simple math equation (i.e., most systems outside school)
    2. Often a graph of components linked together, e.g., with
      1. Matlab Simulink
      2. PSPICE
    3. many examples, e.g. antilock brake, US economy
    4. Can do experiments on it.
  5. To make public policy: "Compas (Correctional Offender Management Profiling for Alternative Sanctions), is used throughout the U.S. to weigh up whether defendants awaiting trial or sentencing are at too much risk of reoffending to be released on bail." Slashdot.

  6. Deterministic model

    1. Resistor: V=IR
    2. Limitations: perhaps not if I=1000000 amps. Why?
    3. Limitations: perhaps not if I=0.00000000001 amps. Why?
  7. Probability model

    1. Roulette wheel: \(p_i=\frac{1}{38}\) (ignoring http://www.amazon.com/Eudaemonic-Pie-Thomas-Bass/dp/0595142362 )
  8. Terms

    1. Random experiment: different outcomes each time it's run.
    2. Outcome: one possible result of a random experiment.
    3. Sample space: set of possible outcomes.
      1. Discrete, or
      2. Continuous.
    4. Tree diagram of successive discrete experiments.
    5. Event: subset of sample space.
    6. Venn diagram: graphically shows relations.
  9. Statistical regularity

    1. \(\lim_{n\rightarrow\infty}f_k(n) = p_k\)
    2. law of large numbers
    3. weird distributions (e.g., Cauchy) violate this, but that's probably beyond this course.
  10. Properties of relative frequency

    1. the frequencies of all the possibilities sum to 1.
    2. if an event is composed of several outcomes that are disjoint, the event's probability is the sum of the outcomes' probabilities.
    3. E.g., If the event is your passing this course and the relevant outcomes are grades A, B, C, D, with probabilities .3, .3, .2, .1, then \(p_{pass}=0.9\) . (These numbers are fictitious.)
  11. Axiomatic approach

    1. Probability is between 0 and 1.
    2. Probs sum to 1.
    3. If the events are disjoint, then the probs add.
  12. Building a model

    1. Want to model telephone conversations where speaker talks 1/3 of time.
    2. Could use an urn with 2 black, 1 white ball.
    3. Computer random number generator easier.
  13. Detailed example in more detail - phone system

    1. Design telephone system for 48 simultaneous users.

    2. Transmit packet of voice every 10msecs.

    3. Only 1/3 users are active.

    4. 48 channels wasteful.

    5. Alloc only M<48 channels.

    6. In the next 10msec block, A people talked.

    7. If A>M, discard A-M packets.

    8. How good is this?

    9. n trials

    10. \(N_k(n)\) trials have k packets

    11. frequency \(f_k(n)=N_k(n)/n\)

    12. \(f_k(n)\rightarrow p_k\) probability

    13. We'll see the exact formula (Poisson) later.

    14. average number of packets in one interval:

      \(\frac{\sum_{k=1}^{48} kN_k(n)}{n} \rightarrow \sum_{k=1}^{48} kp_k = E[A]\)

    15. That is the expected value of A.
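    A simulation sketch of this system (M = 22 is a made-up allocation for illustration; the other numbers are from the example):

      import random

      users, p_active, M = 48, 1 / 3, 22
      n = 100_000
      overflow = active_total = 0
      for _ in range(n):
          a = sum(random.random() < p_active for _ in range(users))
          active_total += a
          if a > M:                # A - M packets get discarded
              overflow += 1
      print(active_total / n)      # about E[A] = 48/3 = 16
      print(overflow / n)          # estimate of P[A > M]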

  14. Probability application: unreliable communication channel.

    1. Transmitter transmits 0 or 1.
    2. Receiver receives 0 or 1.
    3. However, a transmitted 0 is received as a 0 only 90% of the time, and
    4. a transmitted 1 is received as a 1 only 80% of the time, so
    5. if you receive a 0 what's the probability that a 0 was transmitted?
    6. ditto 1.
    7. (You don't have enough info to answer this; you need to know also the probability that a 0 was transmitted. Perhaps the transmitter always sends a 0.)
  15. Another application: stocking spare parts:

    1. There are 10 identical lights in the classroom ceiling.
    2. The lifetime of each bulb follows a certain distribution. Perhaps it dies uniformly anytime between 1000 and 3000 hours.
    3. As soon as a light dies, the janitor replaces it with a new one.
    4. How many lights should the janitor stock so that there's a 90% chance that s/he won't run out within 5000 hours?
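    A simulation sketch of this (assuming the uniform [1000, 3000] hour lifetimes above, and counting every bulb used, including the initial ten):

      import random

      def bulbs_for_one_socket(horizon=5000):
          """Number of bulbs one socket burns through in `horizon` hours."""
          t, n = 0.0, 0
          while t < horizon:
              t += random.uniform(1000, 3000)   # this bulb's lifetime
              n += 1
          return n

      trials = 20_000
      totals = sorted(sum(bulbs_for_one_socket() for _ in range(10))
                      for _ in range(trials))
      print(totals[int(0.9 * trials)])   # stock covering 90% of runs (low 30s)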

2   Reading

Leon-Garcia, chapter 1.

3   Material added after class

  1. My handwritten tablet notes.
  2. If you're satisfied with your first two exam grades, then yes, you may skip the final.
  3. Probability for blackjack: Beat the Dealer: A Winning Strategy for the Game of Twenty-One.
  4. How MIT Students Won $8 Million in the Massachusetts Lottery.
  5. Computer glitch leads Arizona Lottery to issue new Pick 3 tickets. It wasn't ever picking 8s or 9s in certain positions. I heard of another similar story for Arizona years ago but can't find the cite. They seem to have some serious competency problems.

Engineering Probability Homework 1 due Thurs 2018-01-25

How to submit

Submit to LMS; see details in syllabus.

Questions

  1. (6 pts) One of the hardest problems is forming an appropriate probability model. E.g., suppose you're working for Verizon deciding how much data capacity your network will need once it starts selling the iphone. Suppose that you know that each customer will use 5GB/month. Since a month has about 2.5M seconds, does that mean that your network will need to provide only 2KB/s (correction) per customer? What might be wrong with this model? How might you make it better? (This is an open-ended question; any reasonable answer that shows creativity gets full points.)
  2. (3 pts) One hard problem with statistics is how they should be interpreted. For example, the more iphones that are sold, the higher the national debt gets. Does this mean that the US should pay Verizon not to sell the iphone next month?
  3. (6 pts) Do exercise 1.3 in the text on page 19.
  4. (6 pts) Do exercise 1.6 on page 19.
  5. (6 pts) Do exercise 1.11 on page 20.

Total: 27 pts.

Engineering Probability Homework 2 due Mon 2018-02-05 2359 EST

How to submit

Submit to LMS; see details in syllabus.

Questions

  1. (6 pts) Do exercise 2.2, page 81 of Leon-Garcia.

  2. (6 pts) Do exercise 2.4, page 81.

  3. (6 pts) Do exercise 2.6, page 82.

  4. (6 pts) Do exercise 2.21, page 84.

  5. (6 pts) Do exercise 2.25, page 84.

  6. (6 pts) Do exercise 2.35(a), page 85. Assume the "half as frequently" means that for a subinterval of length d, the probability is half as much when the subinterval is in [0,2] as when in [-1,0).

  7. (6 pts) Do exercise 2.39, page 86. Ignore any mechanical limitations of combo locks. Good RPI students should know what those limitations are.

    (Aside: A long time ago, RPI rekeyed the whole campus with a more secure lock. Shortly thereafter a memo was distributed that I would summarize as, "OK, you can, but don't you dare!")

  8. (6 pts) Do exercise 2.59, page 87. However, make it 35 students and 3 on each day of the week. Assume that there is no relation between birthday and day of the week.

  9. (6 pts) Find a current policy issue where you think that probabilities are being misused, and say why, in 100 words. Full points will be awarded for a logical argument. I don't care what the issue is, or which side you take. Try not to pick something too too inflammatory; follow the Page 1 rule that an NSF lawyer taught me when I was there. (Would you be willing to see your answer on page 1 of tomorrow's paper?)

Total: 54 pts.