

ECSE-2500, Engineering Probability, Spring 2010, Rensselaer Polytechnic Institute

Lecture 2

  1. iClicker
    1. Confession: this is the first time I've used this. Please correct my mistakes.
    2. Answer this question (won't be graded): 2+2=? A: 1, B: 2, C: 3.9999, D: 4, E: whatever you want it to be.
    3. Register your iClicker, using your RCS ID. E.g., I would register as W Randolph Franklin frankwr
  2. Statistical regularity
    1. {$ \lim_{n\rightarrow\infty} f_k(n) = p_k $}
    2. law of large numbers
    3. Weird distributions (e.g., the Cauchy distribution, which has no mean) violate the law of large numbers for sample averages, but that's probably beyond this course.
  3. Properties of relative frequency
    1. the frequencies of all the possibilities sum to 1.
    2. if an event is composed of several outcomes that are disjoint, the event's probability is the sum of the outcomes' probabilities.
    3. E.g., if the event is your passing this course and the relevant outcomes are grades A, B, C, D, with probabilities 0.3, 0.3, 0.2, 0.1, then {$ p_{pass} = 0.9 $}. (These numbers are fictitious.)
  4. Axiomatic approach
    1. Prob is between 0 and 1.
    2. Probs sum to 1.
    3. If disjoint, probs add.
  5. Building a model
    1. Want to model telephone conversations where a speaker talks 1/3 of the time.
    2. Could use an urn with 2 black balls and 1 white ball.
    3. A computer random number generator is easier (see the simulation sketch after this outline).
  6. Detailed example - phone system
    1. Design telephone system for 48 simultaneous users.
    2. Xmit a packet of voice every 10 msec.
    3. Only about 1/3 of the users are active at any time.
    4. Providing 48 channels would be wasteful.
    5. Allocate only {$M < 48$} channels.
    6. In the next 10 msec block, {$A$} people talked.
    7. If {$A > M$}, discard {$A - M$} packets.
    8. How good is this?
    9. n trials
    10. {$N_k(n)$} of the trials have {$k$} packets
    11. frequency {$f_k(n) = N_k(n)/n$}
    12. {$f_k(n) \rightarrow p_k$}, the probability
    13. We'll see the exact formula (the binomial distribution) later.
    14. average number of packets in one interval: {$$\frac{\sum_{k=1}^{48} kN_k(n)}{n} \rightarrow \sum_{k=1}^{48} kp_k = E[A] $$}
    15. That is the expected value of A.
  7. Example of unreliable channel
    1. Want to xmit a bit: 0, 1
    2. It arrives wrong with prob. (w.p.) e, say 0.001
    3. Idea: xmit each bit 3 times and vote.
      1. 000 -> 0
      2. 001 -> 0
      3. 011 -> 1
    4. All 3 bits arrive correct w.p. {$(1-e)^3 = 0.997002999$}
    5. 1 error w.p. {$3(1-e)^2 e = 0.002994003$}
    6. 2 errors w.p. {$3(1-e)e^2 = 0.000002997$}
    7. 3 errors w.p. {$e^3 = 0.000000001$}
    8. The corrected bit is correct if there are 0 or 1 errors, w.p. {$(1-e)^3 + 3(1-e)^2 e = 0.999997002$}
    9. We reduced the prob of error by a factor of roughly 300, from {$10^{-3}$} to about {$3\times 10^{-6}$} (see the voting sketch after this outline).
    10. Cost: triple the transmission plus a little logic HW.
  8. iClicker exercise based on that (not graded)
    1. {$e = 0.1$}
    2. What is the prob of no error in 3 bits? A: 0.1, B: 0.3, C: 0.001, D: 0.729, E: 0.9
  9. Example of text compression
    1. Simple way: Use 5 bits for each letter: A=00000, B=00001
    2. In English, 'E' common, 'Q' rare
    3. Use fewer bits for E than for Q (see the coding sketch after this outline).
    4. Morse code did this 170 years ago.
      1. E = .
      2. Q = _ _ . _
    5. Aside: An expert Morse coder is faster than texting.
    6. English can be compressed to about 1 bit per letter (with difficulty); 2 bits is easy.
    7. Aside: there is so much structure in English text that if you add the bit strings of 2 different texts together bit-by-bit (XOR), each text can usually be mostly reconstructed from the result.
    8. That's how cryptanalysis works.
  10. Example of reliable system design
    1. Nuclear power plant fails if
      1. water leaks
      2. and the operator is asleep (a surprising number of disasters happen during the graveyard shift).
      3. and backup pump fails
      4. or was turned off for maintenance
      5. .....
    2. What's the probability of failure?
    3. Design a better system? Coal mining kills.
    4. The backups themselves can cause problems (and are almost impossible to test).
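
The simulation sketch referenced in item 5 above: a minimal Python illustration (my own, not from the textbook) of the phone-system model in items 5 and 6. It draws 48 users, each independently active with probability 1/3 in a 10 msec block, and estimates E[A], the relative frequency of blocks with A > M, and the fraction of packets discarded. The channel count M = 20 is an assumed value chosen only for demonstration.

import random

USERS = 48
P_ACTIVE = 1 / 3        # each user talks about 1/3 of the time
M = 20                  # assumed channel allocation, M < 48
TRIALS = 100_000        # n trials

random.seed(1)
total_packets = 0
discarded = 0
overload_blocks = 0     # number of trials with A > M

for _ in range(TRIALS):
    # A = number of active users in this 10 msec block
    a = sum(random.random() < P_ACTIVE for _ in range(USERS))
    total_packets += a
    if a > M:
        overload_blocks += 1
        discarded += a - M

print("estimated E[A]            :", total_packets / TRIALS)    # near 48/3 = 16
print("relative frequency of A>M :", overload_blocks / TRIALS)
print("fraction of packets lost  :", discarded / total_packets)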
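
The voting sketch referenced in item 7 above: a short Python check (again my own illustration; the function name corrected_error_prob is invented) of the triple-transmission arithmetic. It evaluates the exact formulas for both e = 0.001 and the iClicker value e = 0.1, then estimates the corrected error rate by simulating majority voting at e = 0.1.

import random

def corrected_error_prob(e):
    # The voted bit is wrong when 2 or 3 of the 3 copies are flipped.
    return 3 * (1 - e) * e**2 + e**3

for e in (0.001, 0.1):
    print("e =", e)
    print("  prob all 3 bits correct:", (1 - e)**3)        # 0.729 when e = 0.1
    print("  corrected bit wrong    :", corrected_error_prob(e))

# Simulation check at e = 0.1 (e = 0.001 is too rare an event to estimate
# reliably with only 100,000 trials).
random.seed(2)
e, trials = 0.1, 100_000
wrong = sum(sum(random.random() < e for _ in range(3)) >= 2 for _ in range(trials))
print("simulated corrected error rate at e = 0.1:", wrong / trials)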
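
The coding sketch referenced in item 9 above: a toy Python illustration (my own, with a small made-up frequency table) of why a variable-length code spends fewer bits on E than on Q. Since only a handful of letters are included, the fixed-length code here needs 3 bits rather than the 5 bits required for the full alphabet; the entropy of the frequencies bounds the achievable average bits per letter.

import math

# Approximate relative frequencies of a few English letters, renormalized.
freq = {'E': 0.127, 'T': 0.091, 'A': 0.082, 'O': 0.075, 'N': 0.067,
        'Q': 0.001, 'Z': 0.001}
total = sum(freq.values())
p = {c: f / total for c, f in freq.items()}

# A fixed-length code spends ceil(log2(#symbols)) bits on every letter.
fixed_bits = math.ceil(math.log2(len(p)))

# An ideal variable-length code spends about -log2(p) bits on a letter of
# probability p; the entropy is the resulting average bits per letter.
entropy = -sum(pi * math.log2(pi) for pi in p.values())

for c in ('E', 'Q'):
    print(f"ideal codeword length for {c}: about {-math.log2(p[c]):.1f} bits")
print("fixed-length code  :", fixed_bits, "bits/letter")
print("entropy lower bound:", round(entropy, 2), "bits/letter")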

Chapter 2

  1. Specifying random expt
    1. experimental procedure
    2. set of measurements
  2. Random expt may have subexpts and sequences of expts
  3. Outcome or sample point {$\xi$}: like an atom
    1. not decomposable
  4. Sample space S: set of outcomes
  5. {$|S|$}, the size of the sample space, can be:
    1. finite, e.g. {H,T}
    2. discrete = countable, e.g., 1,2,3,4,...
    3. uncountable, e.g., {$\Re$}
  6. Event
    1. collection of outcomes, subset of S
    2. what we're interested in.
    3. e.g., outcome is voltage, event is V>5.
    4. certain event: S
    5. null event: {$\emptyset$}
    6. elementary event: one discrete outcome
  7. Set theory
    1. Sets: S, A, B, ...
    2. Universal set: U
    3. elements or points: a, b, c
    4. {$ a\in S, a\notin S $}, {$A\subset B$}
    5. Venn diagram
    6. empty set: {} or {$\emptyset$}
    7. operations on sets: equality, union, intersection, complement, relative complement
    8. properties (axioms): commutative, associative, distributive
    9. theorems: de Morgan's laws (see the set-operations sketch below)
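
A small set-operations sketch in Python, using the built-in set type; the particular sets U, A, B, C are made up for illustration.

# Universal set and two example subsets.
U = set(range(10))
A = {1, 2, 3, 4}
B = {3, 4, 5, 6}

union        = A | B      # A union B
intersection = A & B      # A intersect B
complement_A = U - A      # complement of A relative to U
rel_comp     = A - B      # relative complement A \ B

# De Morgan's laws: the complement of a union is the intersection of the
# complements, and the complement of an intersection is the union of the
# complements.
assert U - (A | B) == (U - A) & (U - B)
assert U - (A & B) == (U - A) | (U - B)

# Distributive law: A & (B | C) == (A & B) | (A & C)
C = {0, 4, 8}
assert A & (B | C) == (A & B) | (A & C)

print("union:", union, " intersection:", intersection)
print("complement of A:", complement_A, " A minus B:", rel_comp)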