Engineering Probability Class 13 Mon 2019-02-25
Table of contents
1 This year's exam 1 online
with and w/o answers. See here.
We gave full points even if you didn't finish the arithmetic to compute a number. In the real world, you have computers. However, in the real world, accurate analysis and computation matter. In 1954, physicists made an eensy teensy error designing Castle Bravo.
2 Homework 5
online, due after break.
3 Notation
How to parse $F_X(x)$
- Uppercase F means that this is a cdf. Different letters may indicate different distributions.
- The subscript X is the name of the random variable.
- The x is an argument, i.e., an input.
- $F_X(x)$ returns the probability that the random variable $X$ is less than or equal to the value $x$, i.e., $P[X\le x]$.
4 Matlab
- Matlab, Mathematica, and Maple will all help you do problems too big to do by hand. Sometime I'll demo Matlab since IMO more of the class knows it.
- Matlab's major distribution functions:

```matlab
cdf(dist,X,A,...)
pdf(dist,X,A,...)
```
- Common cases of dist (there are many others): 'Binomial', 'Exponential', 'Poisson', 'Normal', 'Geometric', 'Uniform', 'Discrete Uniform'.
- Examples:

```matlab
pdf('Normal',-2:2,0,1)
cdf('Normal',-2:2,0,1)
p=0.2
n=10
k=0:10
bp=pdf('Binomial',k,n,p)
bar(k,bp)
grid on
bc=cdf('Binomial',k,n,p)
bar(k,bc)
grid on
x=-3:.2:3
np=pdf('Normal',x,0,1)
plot(x,np)
```
- Interactive GUI to explore distributions: disttool
- Random numbers:

```matlab
rand(3)
rand(1,5)
randn(1,10)
randn(1,10)*100+500
randi(100,4)
```
- Interactive GUI to explore random numbers: randtool
- Plotting two things at once:

```matlab
x=-3:.2:3
n1=pdf('Normal',x,0,1)
n2=pdf('Normal',x,0,2)
plot(x,n1,n2)              % wrong: x must be repeated for each curve
plot(x,n1,x,n2)
plot(x,n1,'--r',x,n2,'.g')
```
- Use Matlab to compute a geometric pdf w/o using the builtin function.
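One possible sketch (assuming Matlab's convention for 'Geometric': $k$ counts the failures before the first success, so $p_k = p(1-p)^k$ for $k=0,1,2,\dots$):

```matlab
% Geometric pmf computed directly, compared against the builtin.
p = 0.2;
k = 0:10;
gp = p*(1-p).^k;             % pmf by hand
gp2 = pdf('Geometric',k,p);  % builtin, for comparison
max(abs(gp-gp2))             % should be ~0
bar(k,gp); grid on
```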
- Iclicker. Which of the following do you prefer to use?
- Matlab
- Maple
- Mathematica
- Paper. It was good enough for Bernoulli and Gauss; it's good enough for me.
- Something else (please email me about it after the class).
5 My opinion
This is my opinion of Matlab.
- Advantages
- Excellent quality numerical routines.
- Free at RPI.
- Many toolkits available.
- Uses parallel computers and GPUs.
- Interactive - you type commands and immediately see results.
- No need to compile programs.
- Disadvantages
- Very expensive outside RPI.
- Once you start using Matlab, you can't easily move away when their prices rise.
- You must force your data structures to look like arrays.
- Long programs must still be developed offline.
- Hard to write in Matlab's style.
- Programs are hard to read.
- Alternatives
- Free clones like Octave are not very good.
- The excellent math routines in Matlab are also available free in C++ libraries.
- With C++ libraries using template metaprogramming, your code looks like Matlab.
- They compile slowly.
- Error messages are inscrutable.
- Executables run very quickly.
6 Chapter 4 ctd
- Taxi example: sometimes a r.v. is mixed, partly discrete and partly continuous.
- Let X be the time to get a taxi at the airport.
- 80% of the time a taxi is already there, so P(X=0)=.8.
- Otherwise we wait a uniform time from 0 to 20 minutes, so P(a<X<b)=.01(b-a), for 0<a<b<20.
- Iclicker. For the taxi example, what is F(0)?
- 0
- .2
- .8
- .81
- 1
- Iclicker. For the taxi example, what is F(1)?
- 0
- .8
- .81
- .9
- 1
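One way to check: piece the mixed cdf together (a sketch; the anonymous function below is just one way to write it):

```matlab
% Mixed cdf of the taxi waiting time:
% F(x) = 0 for x<0; .8 + .01*x for 0<=x<=20; 1 for x>20.
F = @(x) (x>=0).*min(.8 + .01*x, 1);
F(0)                     % the jump at 0: F(0) = .8
F(1)                     % .8 + .01 = .81
x = -5:.1:25;
plot(x,F(x)); grid on    % note the jump at 0 and the corner at 20
```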
- Text 4.2 p 148: the pdf.
- Simple continuous r.v. examples: uniform, exponential.
- The exponential distribution complements the Poisson distribution: the Poisson describes the number of arrivals per unit time; the exponential describes the distribution of the times between consecutive arrivals.
- Ex 4.7 p 150: exponential r.v.
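A quick simulation of that relationship (a sketch; the rate $\lambda=2$ is arbitrary): accumulate exponential interarrival gaps into arrival times, then count arrivals per unit interval.

```matlab
% Exponential gaps between arrivals <-> Poisson counts per unit time.
lambda = 2;
gaps = -log(rand(1e5,1))/lambda;  % exponential samples via the inverse cdf
mean(gaps)                        % ~ 1/lambda
t = cumsum(gaps);                 % arrival times
n = accumarray(floor(t)+1, 1);    % number of arrivals in each unit interval
mean(n), var(n)                   % both ~ lambda, as for a Poisson r.v.
```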
- Properties:
- Memoryless.
- $f(x) = \lambda e^{-\lambda x}$ if $x\ge0$, 0 otherwise.
- Example: time for a radioactive atom to decay.
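The memoryless property is a one-line consequence of the exponential's tail $P[X>x]=e^{-\lambda x}$ for $x\ge0$:

$$P[X>s+t \mid X>s] = \frac{P[X>s+t]}{P[X>s]} = \frac{e^{-\lambda(s+t)}}{e^{-\lambda s}} = e^{-\lambda t} = P[X>t]$$

i.e., having already waited $s$ does not change the distribution of the remaining wait.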
- Skip 4.2.1 for now.
- The most common continuous distribution is the normal distribution.
- 4.2.2 p 152. Conditional probabilities work the same with continuous distributions as with discrete distributions.
- p 154. Gaussian r.v.
- $$f(x) = \frac{1}{\sqrt{2\pi} \cdot \sigma} e^{\frac{-(x-\mu)^2}{2\sigma^2}}$$
- cdf often called $\Psi(x)$
- cdf complement:
- $$Q(x)=1-\Psi(x) = \int_x^\infty \frac{1}{\sqrt{2\pi} \cdot \sigma} e^{\frac{-(t-\mu)^2}{2\sigma^2}} dt$$
- E.g., if $\mu=500, \sigma=100$:
- P[X>400]=0.84
- P[X>500]=0.5
- P[X>600]=0.16
- P[X>700]=0.02
- P[X>800]=0.001
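These tail probabilities can be checked with the cdf function from the Matlab section above:

```matlab
% Q(x) = 1 - F(x) for a Normal(500,100) r.v.
1 - cdf('Normal',[400 500 600 700 800],500,100)
% approximately: .84  .50  .16  .023  .0013
```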
- Text 4.3 p 156: Expected value.
- Skip the other distributions (for now?).