ECSE-2500, Engineering Probability, Spring 2010, Rensselaer Polytechnic Institute
Lecture 6
- Quickest way to get to the course website: google rpi prob
- Next homework is really due next Fri.
- Let's do exercise 2.128 on p95.
- iClicker: You have a coin where the probability of a head is p=2/3
If you toss it twice, what's the probability that you will see one head
and one tail?
- A 1/2
- B 1/3
- C 2/9
- D 5/9
- E 4/9
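A quick check of this clicker question (a sketch, not part of the original notes): one head and one tail in two tosses can happen as HT or TH, two disjoint orderings, each with probability p(1-p).

```python
from fractions import Fraction

p = Fraction(2, 3)          # P[head]
q = 1 - p                   # P[tail]

# HT or TH: two disjoint orderings, each with probability p*q
prob_one_each = 2 * p * q
print(prob_one_each)        # 4/9, answer E
```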
- iClicker: You have a coin where the probability of a head is p=2/3
What's the probability that the 1st head occurs on the 2nd toss?
- A 1/2
- B 1/3
- C 2/9
- D 5/9
- E 4/9
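Checking the second question the same way: the first head occurs on the second toss exactly when the first toss is a tail and the second is a head.

```python
from fractions import Fraction

p = Fraction(2, 3)          # P[head]
q = 1 - p                   # P[tail]

# First head on toss 2 means: tail, then head
prob = q * p
print(prob)                 # 2/9, answer C
```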
- Example 2.44, p64
- Intro to Markov chains
- Motivation: speech and language recognition, translation, compression
- E.g., in English text, the probability of a u is higher if the previous char was q.
- The probability of a b may be higher if the previous char was u (than if it was x), but is lower if the previous two chars are qu.
- Need to look at probs of sequences, char by char.
- Same idea in speech recognition: phonemes follow phonemes...
- Same in language understanding: verb follows noun...
- In this example, you repeatedly choose an urn to draw a ball from, depending on what the previous ball was.
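The urn example can be sketched as a two-state Markov chain: which urn you draw from next depends only on the color of the previous ball. The urn probabilities below are illustrative, not the numbers from the textbook example.

```python
import random

random.seed(1)

# Hypothetical two-urn setup (numbers are illustrative, not from the text):
# after drawing a black ball we use urn 0, after a white ball urn 1.
# P[black | current urn]:
p_black = {0: 0.9, 1: 0.4}

def draw_sequence(n, start_urn=0):
    """Draw n balls; the urn used at each step depends only on the last color."""
    urn, seq = start_urn, []
    for _ in range(n):
        color = "black" if random.random() < p_black[urn] else "white"
        seq.append(color)
        urn = 0 if color == "black" else 1   # next urn depends only on last draw
    return seq

print(draw_sequence(10))
```

This is the same "memory of one step back" structure as the qu/b letter statistics: the distribution of the next symbol is conditioned on the previous one.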
- Chapter 3, p 96. Discrete random variables
- Discrete is different from discreet.
- Random experiment → nonnumerical outcome {$ \xi $} →
- Random variable {$ X(\xi ) $}, whose value can be any real number.
- Random vars in general: X, Y, ...
- particular values: x, y, ...
- It's the outcome that's random, not the r.v., which is a deterministic function of the outcome.
- Example 3.1 Coin tosses
- Define X to be the number of heads from 3 tosses.
- {$ \xi $}=HHH, X(HHH) =3, {$ S_X=\{0,1,2,3\} $}
- Example 3.2 Betting game addon to 3.1
- Define another random var Y to be payoff: 8 if X=3, 1 if X=2, 0 else.
- Y is derived from X
- Example 3.3 add probs to 3.2, assuming fair coin. P[X=2], P[Y=8]
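Examples 3.1–3.3 can be checked by brute-force enumeration of the 8 equally likely outcomes of 3 fair-coin tosses (a sketch using exact fractions):

```python
from fractions import Fraction
from itertools import product

# Fair coin: each of the 8 outcomes has probability 1/8
outcomes = list(product("HT", repeat=3))

def X(xi):                      # number of heads (Example 3.1)
    return xi.count("H")

def Y(xi):                      # betting payoff derived from X (Example 3.2)
    x = X(xi)
    return 8 if x == 3 else (1 if x == 2 else 0)

P = Fraction(1, 8)
p_X2 = sum(P for xi in outcomes if X(xi) == 2)
p_Y8 = sum(P for xi in outcomes if Y(xi) == 8)
print(p_X2, p_Y8)               # 3/8 1/8
```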
- 3.1.1 Ignore since it's starred.
- 3.2 Discrete r.v. and Probability mass function (pmf)
- {$ p_X(x) = P[X=x] = P[\{\xi:X(\xi)=x\}] $}
- 3 properties of pmf
- Example 3.5 prob of # heads in 3 coin tosses: {$ p_X(0)=1/8 $}
- Example 3.6 betting game {$ p_Y(1)=3/8 $}
- Fig 3.4. You can graph the pmf.
- Example 3.7 random number generator
- produces integer X equally likely in range 0..M-1
- {$ S_X=\{0, 1, ... M-1 \} $}
- pmf: {$ p_X(k)=1/M $} for k in 0..M-1.
- X is a uniform random variable over that set.
- Example 3.8 Bernoulli random variable
- indicator function {$ I_A(\xi)=1 $} iff {$ \xi\in A $}
- pmf of {$ I_A $}: {$ p_I(0)=1-p, p_I(1)=p $}, where {$ p=P[A] $}
- {$ I_A $} is a Bernoulli random variable.
- Example 3.9 Message transmission until success
- {$ p_X(k)=q^{k-1}p, k=1,2,3,... $}
- Geometric random variable
- What about P[X is even]?
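The "P[X even]" question has a closed form: summing {$ p_X(k)=q^{k-1}p $} over even k is a geometric series with ratio {$ q^2 $}, giving {$ pq/(1-q^2)=q/(1+q) $}. A numerical spot-check (the value of p here is arbitrary):

```python
p = 0.3
q = 1 - p

# Partial sum of p_X(k) = q**(k-1) * p over even k = 2, 4, 6, ...
partial = sum(q**(k - 1) * p for k in range(2, 2001, 2))
closed_form = q / (1 + q)
print(abs(partial - closed_form) < 1e-12)   # True
```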
- Example 3.10 Number of transmission errors
- {$ p_X(k) = {n \choose k} p^k q^{n-k}, k=0,1,...n $}
- Binomial random variable
- Fig 3.5 You can graph the relative frequencies from running an experiment repeatedly.
- It will approach the pmf graph (absent pathological cases that are beyond this course.)
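The Fig 3.5 idea can be reproduced in a few lines: simulate many runs of n Bernoulli trials and compare the relative frequency of each count to the binomial pmf (n, p, and the number of trials below are arbitrary choices):

```python
import random
from math import comb

random.seed(0)

n, p, trials = 8, 0.5, 20000

# Simulate the number of successes in n Bernoulli trials, many times
counts = [0] * (n + 1)
for _ in range(trials):
    k = sum(random.random() < p for _ in range(n))
    counts[k] += 1

# Relative frequency vs. binomial pmf, side by side
for k in range(n + 1):
    pmf = comb(n, k) * p**k * (1 - p)**(n - k)
    print(k, counts[k] / trials, round(pmf, 4))
```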
- 3.3 p104 Expected value and other moments
- This is a way to summarize a r.v., and capture important aspects.
- Mean or expected value or center of mass: {$ m_X = E[X] = \sum_{x\in S_X} x p_X(x) $}
- Defined iff absolute convergence: {$ \sum |x| p(x) < \infty $}
- Example 3.11 Mean of Bernoulli r.v.
- Example 3.12 Mean of Binomial r.v. What's the expected # of heads in 3 tosses?
- Example 3.13 Mean of uniform discrete r.v.
- Run an experiment n times and observe {$ x(1), x(2), ... $}
- {$ N_k(n) $} # times {$ x_k $} was seen
- {$ f_k(n) = N_k(n)/n $} frequencies
- Sample mean {$ <X>_n = \sum x_kf_k(n) $}
- With lots of experiments, frequencies approach probabilities and sample mean converges to E[X]
- However it may take a long time, which is why stock market investors can go broke.
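A small simulation makes the (slow) convergence of the sample mean visible; here X is Bernoulli with p = 2/3, so E[X] = 2/3:

```python
import random

random.seed(0)

p = 2 / 3                       # Bernoulli(p); E[X] = p

def sample_mean(n):
    """Average of n simulated Bernoulli(p) draws."""
    return sum(random.random() < p for _ in range(n)) / n

for n in (10, 100, 10000):
    print(n, sample_mean(n))    # drifts toward 2/3, but slowly
```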
- Example 3.14 p 106 Betting game
- Example 3.15 Mean of a geometric r.v.
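For the geometric pmf {$ p_X(k)=q^{k-1}p $}, the mean works out to 1/p; a quick numerical check by truncating the series (the value of p is arbitrary):

```python
p = 0.25
q = 1 - p

# E[X] = sum over k of k * q**(k-1) * p, which equals 1/p
approx = sum(k * q**(k - 1) * p for k in range(1, 2000))
print(abs(approx - 1 / p) < 1e-9)   # True
```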
- Example 3.16 St Petersburg paradox
- 3.3.1 Expected value of a function of a r.v.
- Z=g(X)
- E[Z] = E[g(X)] = {$ \sum_k g(x_k) p_X(x_k) $}
- Example 3.17 square law device
- {$ E[a g(X)+b h(X)+c] = a E[g(X)] + b E[h(X)] + c $}
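The linearity rule above can be spot-checked on any discrete pmf; the pmf, constants, and choices g(x) = x² and h(x) = x below are made up for illustration:

```python
from fractions import Fraction as F

# Arbitrary pmf for illustration
pmf = {0: F(1, 4), 1: F(1, 2), 2: F(1, 4)}

def E(fn):
    """Expected value of fn(X) under the pmf."""
    return sum(fn(x) * px for x, px in pmf.items())

a, b, c = 3, -2, 5
g = lambda x: x * x             # square-law device, as in Example 3.17
h = lambda x: x

lhs = E(lambda x: a * g(x) + b * h(x) + c)
rhs = a * E(g) + b * E(h) + c
print(lhs == rhs)               # True
```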
- Example 3.18 Square law device continued
- Example 3.19 Multiplexor discards packets