ECSE-2500, Engineering Probability, Spring 2010, Rensselaer Polytechnic Institute


# Lecture 6

1. Quickest way to get to the course website: google rpi prob
2. Next homework is really due next Fri.
3. Let's do exercise 2.128 on p95.
4. iClicker: You have a coin where the probability of a head is p=2/3. If you toss it twice, what's the probability that you will see one head and one tail?
• A 1/2
• B 1/3
• C 2/9
• D 5/9
• E 4/9
5. iClicker: You have a coin where the probability of a head is p=2/3. What's the probability that the 1st head occurs on the 2nd toss?
• A 1/2
• B 1/3
• C 2/9
• D 5/9
• E 4/9
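Both clicker answers can be checked with a short calculation (a sketch using Python's exact fractions; p=2/3 as stated in the questions):

```python
from fractions import Fraction

p = Fraction(2, 3)          # P(head)
q = 1 - p                   # P(tail)

# One head and one tail in two tosses: HT or TH
p_one_each = p * q + q * p
print(p_one_each)           # 4/9

# First head on the 2nd toss: tail then head
p_first_head_2nd = q * p
print(p_first_head_2nd)     # 2/9
```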
6. Example 2.44, p64
1. Intro to Markov chains
2. Motivation: speech and language recognition, translation, compression
3. E.g., in English text, the probability of a u is higher if the previous char was q.
4. The probability of a b may be higher if the previous char was u (than if it was x), but is lower if the previous two chars are qu.
5. Need to look at probs of sequences, char by char.
6. Same idea in speech recognition: phonemes follow phonemes...
7. Same in language understanding: verb follows noun...
8. In this example, you repeatedly choose an urn to draw a ball from, depending on what the previous ball was.
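A minimal sketch of the character-by-character idea (the states and transition probabilities below are invented purely for illustration, not taken from any real corpus):

```python
import random

# Toy first-order Markov chain over characters. Probabilities are made up
# for illustration; note 'u' is made very likely after 'q', as in English.
transitions = {
    'q': {'u': 0.95, 'a': 0.05},
    'u': {'b': 0.30, 'a': 0.70},
    'a': {'q': 0.50, 'b': 0.50},
    'b': {'a': 1.00},
}

def next_char(c):
    """Draw the next character given only the current one."""
    chars, probs = zip(*transitions[c].items())
    return random.choices(chars, weights=probs)[0]

# Generate a short sequence starting from 'q'.
c, seq = 'q', ['q']
for _ in range(10):
    c = next_char(c)
    seq.append(c)
print(''.join(seq))
```

The urn example works the same way: the current ball (state) determines the distribution you draw the next ball from.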
7. Chapter 3, p 96. Discrete random variables
1. Discrete is different from discreet.
2. Random experiment → nonnumerical outcome {$\xi$} →
3. Random Variable {$X(\xi)$}: a real number assigned to each outcome.
4. Random vars in general: X, Y, ...
5. particular values: x, y, ...
6. It's the outcome that's random, not the r.v., which is a deterministic function of the outcome.
8. Example 3.1 Coin tosses
1. Define X to be the number of heads from 3 tosses.
2. {$\xi$}=HHH, X(HHH) =3, {$S_X=\{0,1,2,3\}$}
9. Example 3.2 Betting game addon to 3.1
1. Define another random var Y to be payoff: 8 if X=3, 1 if X=2, 0 else.
2. Y is derived from X
10. Example 3.3 add probs to 3.2, assuming fair coin. P[X=2], P[Y=8]
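Examples 3.1–3.3 can be verified by enumerating the 8 equally likely outcomes (a quick check, assuming the fair coin of Example 3.3):

```python
from fractions import Fraction
from itertools import product

# The 8 equally likely outcomes of 3 fair-coin tosses.
outcomes = list(product('HT', repeat=3))

def X(outcome):                 # number of heads (Example 3.1)
    return outcome.count('H')

def Y(outcome):                 # payoff (Example 3.2): 8 if X=3, 1 if X=2, 0 else
    x = X(outcome)
    return 8 if x == 3 else 1 if x == 2 else 0

p = Fraction(1, 8)              # each outcome has probability 1/8
print(sum(p for o in outcomes if X(o) == 2))   # P[X=2] = 3/8
print(sum(p for o in outcomes if Y(o) == 8))   # P[Y=8] = 1/8
```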
11. 3.1.1 Ignore since it's starred.
12. 3.2 Discrete r.v. and Probability mass function (pmf)
1. {$p_X(x) = P[X=x] = P[\{\xi:X(\xi)=x\}]$}
13. 3 properties of pmf
14. Example 3.5 prob of # heads in 3 coin tosses: {$p_X(0)=1/8$}
15. Example 3.6 betting game {$p_Y(1)=3/8$}
16. Fig 3.4. You can graph the pmf.
17. Example 3.7 random number generator
1. produces integer X equally likely in range 0..M-1
2. {$S_X=\{0, 1, ... M-1 \}$}
3. pmf: {$p_X(k)=1/M$} for k in 0..M-1.
4. X is a uniform random variable over that set.
18. Example 3.8 Bernoulli random variable
1. indicator function {$I_A(\xi)=1$} iff {$\xi\in A$}
2. pmf of {$I_A$}: {$p_I(0)=1-p, p_I(1)=p$}
3. {$I_A$} is a Bernoulli random variable.
19. Example 3.9 Message transmission until success
1. {$p_X(k)=q^{k-1}p, k=1,2,3,...$}
2. Geometric random variable
3. What about P[X is even]?
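For P[X even], sum the pmf over even k: {$\sum_{j\ge 1} q^{2j-1}p = pq/(1-q^2) = q/(1+q)$}. A numeric sanity check of that closed form (taking p=2/3 from the clicker questions as an example value):

```python
p, q = 2/3, 1/3   # example values; any 0 < p < 1 works

# Closed form: P[X even] = p*q / (1 - q^2) = q / (1 + q)
closed_form = q / (1 + q)

# Partial sum of the series over even k as a cross-check
partial = sum(q**(k - 1) * p for k in range(2, 200, 2))
print(closed_form, partial)   # both ~0.25 for these values
```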
20. Example 3.10 Number of transmission errors
1. {$p_X(k) = {n \choose k} p^k q^{n-k}, k=0,1,...n$}
2. binomial random variable
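A sketch of the binomial pmf using `math.comb`; with n=3 fair tosses it reproduces the values of Example 3.5:

```python
from fractions import Fraction
from math import comb

def binomial_pmf(k, n, p):
    """p_X(k) = C(n,k) p^k q^(n-k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 3, Fraction(1, 2)         # 3 fair-coin tosses
pmf = [binomial_pmf(k, n, p) for k in range(n + 1)]
print(pmf)                       # [1/8, 3/8, 3/8, 1/8]
print(sum(pmf))                  # 1 -- a pmf must sum to 1
```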
21. Fig 3.5 You can graph the relative frequencies from running an experiment repeatedly.
1. It will approach the pmf graph (absent pathological cases that are beyond this course).
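A small simulation illustrating the point of Fig 3.5 (the seed and trial count are arbitrary choices):

```python
import random
from collections import Counter
from math import comb

random.seed(0)

n_tosses, p = 3, 0.5
trials = 100_000

# Repeat the 3-toss experiment many times, tallying the number of heads.
counts = Counter(sum(random.random() < p for _ in range(n_tosses))
                 for _ in range(trials))

# Relative frequencies sit close to the binomial pmf values.
for k in range(n_tosses + 1):
    rel_freq = counts[k] / trials
    pmf_k = comb(n_tosses, k) * p**k * (1 - p)**(n_tosses - k)
    print(k, round(rel_freq, 3), pmf_k)
```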
22. 3.3 p104 Expected value and other moments
1. This is a way to summarize a r.v., and capture important aspects.
2. Mean or expected value or center of mass: {$m_X = E[X] = \sum_{x\in S_X} x p_X(x)$}
3. Defined iff absolute convergence: {$\sum |x| p(x) < \infty$}
23. Example 3.11 Mean of Bernoulli r.v.
24. Example 3.12 Mean of Binomial r.v. What's the expected # of heads in 3 tosses?
25. Example 3.13 Mean of uniform discrete r.v.
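The three means of Examples 3.11–3.13 (p for Bernoulli, np for binomial, (M−1)/2 for uniform) can be checked directly from the definition; `mean` below is a hypothetical helper implementing {$E[X]=\sum_x x\,p_X(x)$}:

```python
from fractions import Fraction
from math import comb

def mean(pmf):
    """E[X] = sum over x of x * p_X(x), for a pmf given as {x: p}."""
    return sum(x * px for x, px in pmf.items())

p = Fraction(2, 3)

# Bernoulli: E[I_A] = p
bernoulli = {0: 1 - p, 1: p}
print(mean(bernoulli))                       # 2/3

# Binomial (n=3): E[X] = n*p
n = 3
binomial = {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}
print(mean(binomial))                        # 2 = 3 * (2/3)

# Uniform on {0, ..., M-1}: E[X] = (M-1)/2
M = 10
uniform = {k: Fraction(1, M) for k in range(M)}
print(mean(uniform))                         # 9/2
```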
26. Run an experiment n times and observe {$x(1), x(2), ...$}
1. {$N_k(n)$} # times {$x_k$} was seen
2. {$f_k(n) = N_k(n)/n$} frequencies
3. Sample mean {$\langle X\rangle_n = \sum_k x_k f_k(n)$}
4. With lots of experiments, frequencies approach probabilities and sample mean converges to E[X]
5. However it may take a long time, which is why stock market investors can go broke.
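A quick illustration of that slow (~{$1/\sqrt{n}$}) convergence, using a Bernoulli variable with p=2/3 as an example (seed arbitrary):

```python
import random

random.seed(1)

p = 2/3                       # Bernoulli parameter; E[X] = p

def sample_mean(n):
    """Average of n simulated Bernoulli(p) draws."""
    return sum(random.random() < p for _ in range(n)) / n

for n in (10, 1_000, 100_000):
    print(n, sample_mean(n))  # drifts toward 2/3, but only slowly
```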
27. Example 3.14 p 106 Betting game
28. Example 3.15 Mean of a geometric r.v.
29. Example 3.16 St Petersburg paradox
30. 3.1.1 Expected value of a function of a r.v.
1. Z=g(X)
2. E[Z] = E[g(X)] = {$\sum_k g(x_k) p_X(x_k)$}
31. Example 3.17 square law device
32. {$E[a g(X)+b h(X)+c] = a E[g(X)] + b E[h(X)] + c$}
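This linearity property can be spot-checked against the pmf of Example 3.5; the constants a, b, c below are arbitrary:

```python
from fractions import Fraction
from math import comb

# pmf of X = number of heads in 3 fair tosses (Example 3.5)
p = Fraction(1, 2)
pmf = {k: comb(3, k) * p**k * (1 - p)**(3 - k) for k in range(4)}

def E(g):
    """E[g(X)] = sum over k of g(k) * p_X(k)."""
    return sum(g(k) * pk for k, pk in pmf.items())

a, b, c = 2, -3, 5
lhs = E(lambda x: a * x**2 + b * x + c)
rhs = a * E(lambda x: x**2) + b * E(lambda x: x) + c
print(lhs, rhs, lhs == rhs)   # the two sides agree exactly
```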
33. Example 3.18 Square law device continued
34. Example 3.19 Multiplexor discards packets