Okay, good afternoon, everyone. Quick question: can you hear me? Thank you, Nicolas. The reason I ask is that I haven't figured out a way to get audio feedback of what you hear without getting a delay or something.

Okay, so this is Probability. We're at class 12, March 8th. We're going to do a little review of the end of Chapter 3 — I've got the book scanned up here — and then I'll get into Chapter 4 on Thursday. I'll do lots of examples and present new material. I've also got a problem at the moment: my iPad is not mirroring to my laptop properly; I don't know what's going on.

Okay, so let's do a little review of what we were doing at the end of Chapter 3.

Chapter 3: discrete random variables. We saw an important one at the end called the Poisson random variable, and I'd like to hit it again, just to remind you. I mentioned it before, but it's so important I want to mention it again quickly. The setting is that we have a large number of events that might happen, they're independent, and the probability of any specific event happening is really, really small — but because there are so many possibilities, on average a few things are going to happen.

An example might be radioactive decay. Suppose I've got some radioactive element, and suppose I've got a mole of it, so that's 6×10^23 atoms, and suppose on average one atom decays every second. So there are 6×10^23 atoms, but in a given second the probability of any specific atom decaying is 1 divided by Avogadro's number, and what that multiplies out to is an expected value of one atom decaying every second. But some seconds zero atoms will decay, some seconds it will be 2 or 3 or maybe 5. So the random variable which is the number that decay in a second follows the Poisson distribution, and I have the formula right here from the textbook.

It has real examples: you've got a call center; you've got packets — there are a lot of packets that might be transmitted, each particular packet is very unlikely, but there are a lot of potential packets, so the number of packets hitting your switch in a second would be Poisson distributed. People calling a call center, assuming they're independent — same random variable. So we saw that it has one parameter, and that one parameter is the mean.
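Here is a minimal sketch of that decay example in Python (rather than the Matlab the course uses), purely as an illustration; the λ = 1 value is the mean from the example and scipy is assumed available:

```python
# Poisson pmf for the decay example: on average lambda = 1 decay per second.
from math import exp, factorial
from scipy.stats import poisson

lam = 1.0                      # mean number of decays per second
for k in range(5):
    by_hand = exp(-lam) * lam**k / factorial(k)   # P(K = k) = e^-lam * lam^k / k!
    print(k, by_hand, poisson.pmf(k, lam))
```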
The parameter is a real number. The number of arrivals in a second is an integer, but the parameter — the mean — is a real number. It could be, say, three quarters, and if it's three quarters, the most common case is zero arrivals in the second. Okay. Questions?

Somebody is unmuted... okay. If you want to ask a question, unmute yourself again — it was just causing feedback before.

Okay, so far the Poisson. I mentioned queries at a call center, arrivals at a packet multiplexer, and so on. Errors, again: a photon may go bad because it got absorbed, and there are a lot of photons that might go bad but the probability of any one going bad is very small, so the random variable for the number of photons that went bad is Poisson distributed.

You could use the binomial. The exact probability distribution for the number of, say, bad photons — errors — is a binomial, which is true. But in the case where n is really large, like this example here with 10^9, and p is very small, like 10^-9, then the Poisson is an excellent approximation to the binomial probability distribution, so you'll use the Poisson in this case.

Discrete uniform — we've seen it before; nothing interesting there. This is also showing you can compute things: the discrete uniform just assumes n values with equal probability, and you can compute the mean and variance and so on.

Okay, the Zipf distribution — I've mentioned I'm somewhat enamored of it. Some people are into long tails; you see this discussed with financial crashes and so on. The long tail means that unlikely events are more likely than you might expect — that's why it's called a long tail, or a fat tail, and so on. Rule of thumb: a small portion of people have higher grades, more wealth, whatever.

Okay, random variable generation I'm going to skip — you can read it if you want. My repeated summary on random variable generation is that it's harder than you would think. There are packages to do it, and I can tell you how to do it if you want. I did mention last time that there are some known good methods, and I listed one of them before.
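A quick sketch of the Poisson-approximates-binomial point (the n and p below are illustrative numbers chosen so that λ = np = 2, not the 10^9 case from the slide):

```python
# Poisson with lambda = n*p approximates the binomial when n is large, p small.
from scipy.stats import binom, poisson

n, p = 10**6, 2e-6
lam = n * p                     # = 2
for k in range(6):
    print(k, binom.pmf(k, n, p), poisson.pmf(k, lam))
# The two columns agree to several decimal places.
```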
In summary, what we saw in Chapter 3 is the discrete random variable. We saw how to calculate the mean and variance and so on. We saw the probability mass function, which is just the probability of each of the possible discrete outcomes occurring, and we saw the conditional probability mass function also.

Okay. I may do some of these examples on Thursday, but — just to confuse you a little — I'm going to give you new material from Chapter 4 first, and then I'm going to jump back and do some examples from Chapter 3, some of the problems from the end.

So, Chapter 4 is still about one random variable. In Chapter 3 it was only discrete things — a finite number of outcomes, or a countably infinite number of outcomes — and we could assign a probability to each outcome. Now Chapter 4 is going to include continuous random variables. The outcome is like a real number, and the probability of any exact real number is 0, so we have to talk about the probability of some interval.

And there's a new idea here — new chapter, got to have new ideas; you're paying to come here and learn new ideas. The first idea here is the cumulative distribution function. It's the probability that our outcome — which is a number in this case; the random variable is always a number. We may have outcomes, if we toss a coin, of heads or tails, or maybe landing on edge; those are not numbers, so we convert the outcome to a random variable by assigning a number to it — heads would be 1 and tails would be 0. We do that so we can do things with it: we can find means, for example, which we can't do with symbols like heads and tails. What's the mean of a head and a tail? But if it's a number, the mean is 0.5. Okay.

So the first new idea in this chapter is the cumulative distribution function, and that's the probability that the random variable is less than or equal to a specific value. Look at the notation right here: the capital letter, like X, is the name of the random variable; the lowercase x is a particular value for the random variable; and what we have here is the probability that the random variable is less than or equal to that specific number. That's the cumulative distribution function, F_X(x) = P(X ≤ x).

Now, why we use this idea: it works both with discrete random variables and also with continuous random variables. So this is a unifying idea here.
So we don't have to consider continuous and discrete as two separate cases — both of them have a cumulative distribution function.

Okay. Let me give an example here, Example 4.1 — the figure here, lower right. Our random experiment is that we toss a fair coin 3 times, and the random variable is the number of heads that we see. So there's a 1/8 chance we see 0 heads or 3 heads, and a 3/8 chance that we see 1 head or 2 heads, and we plot the probability mass function at the bottom right here, Figure 4.1(b) — a bar chart, basically. This Figure 4.1(b) has a value only for 0, 1, 2, or 3; that's the probability mass function.

The cumulative distribution function is the thing on the left, and it has a value for any real number at all. Now, the notation here: each horizontal line has a filled dot at its left end but no filled dot at its right end. What that means is that the value at exactly an integer is on the line that has the heavy dot. So if we look at the case x = 1, what is the CDF at x = 1? They didn't label the y-axis here, but it will be one half. The probability that the random variable is less than or equal to 1 is one half — the probability that we see less than or equal to 1 head is 1/8 + 3/8 = 1/2.

We could also use fractions. Look where I've got the cursor, say x = 1.5: the probability that the number of heads is less than or equal to one and a half is still one half. So with the CDF we can ask questions about the probability that the number of heads is less than or equal to any real number. What's the probability that the number of heads is less than or equal to −3? Zero. What's the probability that the number of heads is less than or equal to 17? One.

Okay, so the PMF assigns nonzero values only at specific values. The CDF — the cumulative — has a value for every real number: at the far left it's 0, at the far right it becomes 1, and it has a jump at every value for which the random variable has nonzero probability.

Okay, so for the CDF you get a value by just doing partial sums of the probability mass function, and there's a formula for it. We have a new term here, the unit step function u(x): it's 0 if x is negative and 1 if x is 0 or positive. That's called the unit step function, and then we can write the CDF of a discrete random variable as a sum of steps: F_X(x) = Σ_k p_X(x_k) u(x − x_k).
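A small sketch of Example 4.1 (the number of heads in 3 fair tosses is binomial, so scipy's binomial CDF is used here just as a shortcut):

```python
# X = number of heads in 3 fair coin tosses; F_X(x) = P(X <= x) for any real x.
from scipy.stats import binom

X = binom(3, 0.5)
for x in (-3, 0, 1, 1.5, 2, 17):
    print(x, X.cdf(x))
# F(-3) = 0, F(1) = F(1.5) = 0.5, F(17) = 1, matching the staircase plot.
```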
So that's one example. Let me give you another example of a cumulative distribution function, but this one is a continuous random variable, and it will be a uniform one. We're going to spin an arrow: our random experiment is to spin a spinner and observe the angle, which will be from 0 to 2π, using radians. So the random variable is that angle, and it's uniform somewhere between 0 and 2π, and 0 outside that.

Okay. We cannot talk about the probability that the angle is one specific value, because it's continuous, but we can talk about the cumulative distribution function, which is the probability that the angle is less than or equal to a given value, and that's this down here. We can write it any one of these ways right here.

Oh — somebody just unmuted. Have a question? Let's see... okay.

So, the notation here — and I've actually gone a bit further down — a capital letter, capital F, means the CDF; a lowercase f would mean a density function or a mass function. The subscript capital X is the name of the random variable. So the cumulative function here is just x — well, what they've done is they've normalized it so x goes from 0 to 1, and then the CDF is F(x) = x on that interval, and we could plot it as something like this here.

This is for a general uniform random variable from a to b. If the random variable is uniform from a to b, the density function — the probability that it's within some small Δx, divided by Δx — is 1/(b − a); that's what the density function is. This is also somewhat new; we haven't formally seen it yet. But then the cumulative distribution — the probability that X is less than or equal to a certain value — is a ramp: it's 0 if x is less than a, because that's the lower bound, it's 1 if x is bigger than b, because that's the upper bound, and it ramps up linearly in between.

Okay, now — I mentioned this example before; this makes things a little more complicated. We can have a mixed random variable: maybe it's part discrete and part continuous, and this talks about that here. I've mentioned this before.
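A quick sketch of the spinner's CDF, F(x) = x/(2π) on [0, 2π] (scipy's generic uniform distribution is used for comparison):

```python
# Spinner example: Theta uniform on [0, 2*pi]; F(x) = 0 below 0, x/(2*pi) inside, 1 above.
import numpy as np
from scipy.stats import uniform

two_pi = 2 * np.pi
Theta = uniform(loc=0, scale=two_pi)     # uniform on [0, 2*pi]
for x in (-1.0, np.pi / 2, np.pi, two_pi, 10.0):
    print(round(x, 3), Theta.cdf(x))
```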
So, the cumulative distribution function has some properties. It's monotonically nondecreasing — it never decreases; it can stay constant or it can increase. It's between 0 and 1: in the limit as x goes to minus infinity it's 0, and for very large values it's 1. So it starts at 0 on the left, it goes to 1 on the right, and it never goes down — it's flat or it goes up. So the CDF is what's called a nondecreasing function.

Okay, now what can we do with this? The first thing is that for a continuous random variable we can find the probability of an interval by just subtracting two values of the cumulative distribution function: P(a < X ≤ b) = F_X(b) − F_X(a). If it's a discrete thing, we can find probabilities of specific values, like here. And of course, since the CDF is the probability that X is less than or equal to a given point, if we take one minus that we've got the probability that X is greater. These are common-sense sorts of things; this property here is really useful. Okay.

We've got some examples here; I'll skip through this one. Example 4.5 is more interesting. We have a uniform random variable, and if we need to find the probability that X is between, say, −0.5 and +0.25, the way we do that is we subtract the two cumulative values: we take the value for x = 0.25 minus the value for x = −0.5, and likewise for the other two examples. So we find the probability that X is in a certain range by subtracting the values of the cumulative distribution function at the ends of the range.

Okay, I'll skip the proof here. Again, I've hit this so many times I think you know it: you've got discrete random variables, you've got continuous, and you've got the mixed. Okay, fine points — I'm going to tend to skip these, unless you have a question.

Okay, so for a continuous random variable we have the density function. Well, we also have one for discrete random variables, but if it's a continuous random variable the CDF is smooth and has derivatives, mostly.
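A numeric illustration of P(a < X ≤ b) = F(b) − F(a); the support [−1, 1] below is only an assumed choice to make the transcript's −0.5 and 0.25 concrete, not necessarily the textbook's:

```python
# Interval probability from CDF differences, for an assumed uniform on [-1, 1].
from scipy.stats import uniform

X = uniform(loc=-1, scale=2)            # uniform on [-1, 1]
p = X.cdf(0.25) - X.cdf(-0.5)           # P(-0.5 < X <= 0.25)
print(p)                                # 0.375 for this assumed support
```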
And if we take the derivative of the distribution function, we get what's called the density function. For a discrete random variable the analogous thing is the probability mass function; for the continuous case it's called the probability density function, PDF. We represent it with a lowercase f — just like a lowercase variable f, g, h, whatever, means the density function — and again the subscript uppercase X is the name of the random variable, and the argument is a specific value of the random variable.

Okay, and it gives a relative density. The probability of any specific, precise value is 0, but you can talk about relative densities — like the relative probability of a very small interval — and in the limit that is the density function. I talk about it down here: the probability that the random variable is between some value x and some infinitesimally greater value x + h is the density times h.

So the density is not itself a probability. It's always greater than or equal to 0, and it could be arbitrarily large, but it has to integrate out to 1, because it integrates out to the cumulative function, which is 1 at plus infinity. So for our continuous random variables we're going to see the density function used a lot.

Okay, so it's always nonnegative, and we got the density function from a limit over a very small interval. If we take the density function and integrate it over an interval, we get the probability of the random variable being in that interval. And the cumulative function is the integral of the density from minus infinity up to x — so we integrate the density from minus infinity to x to get the CDF. They're complementary to each other; sometimes one is more useful, sometimes the other.

And here's how to picture it: we have a very small interval here, from x to x + dx — the last page called it h, here it says dx — and the area of that strip divided by dx is the density.

So now to some real examples here. If I have a uniform random variable on an interval from a to b, then the density function is 1/(b − a) if we're in the interval a to b, and zero outside. That's the simplest possible density function, the simplest PDF. The CDF — the cumulative thing — is the integral of that.
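A tiny check of the small-interval interpretation, P(x < X ≤ x + h) ≈ f(x)·h; the exponential distribution (which comes up next) is used here purely as an illustration:

```python
# F(x + h) - F(x) is approximately f(x) * h for small h.
from scipy.stats import expon

X = expon(scale=1.0)        # exponential with rate lambda = 1
x, h = 0.5, 1e-4
exact  = X.cdf(x + h) - X.cdf(x)
approx = X.pdf(x) * h
print(exact, approx)        # the two agree to about h**2
```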
So for this uniform random variable, the cumulative function is going to be 0 up until we get to a; from a to b it increases linearly; and when x gets up to b the function has gotten up to 1, and it stays 1 forever after. We've seen that before — let me scroll back here. That's it: on the left here is the cumulative function, and on the right is the density function. Okay.

Okay, getting to a slightly more complicated continuous distribution: the exponential random variable. A typical example would be the transmission time for messages or something, and this is one definition of it here. Now, the transmission time is non-negative — it cannot be negative. So one way to define it is: what's the probability that the transmission time is greater than some value x? We're given this definition here, P(X > x) = e^(−λx), where x is anything from 0 to infinity. Now if we plug x = 0 in here we get e to the minus 0, which is 1 — so good, the thing is consistent; this is a valid definition, because the probability that X is greater than 0 is 1. Okay, so that's the definition.

Now, how do we get, say, the CDF from that? Well, what we're given is the probability that X is greater than x, and the CDF is the complement of that. So the CDF is going to be 1 minus that; that's what we've got down here. And the CDF is going to be 0 if x is less than 0, because this is defined only for positive x, and then it's 1 − e^(−λx) for anything up to x = infinity, where e to the minus infinity becomes 0, so the CDF becomes 1. So we're happy — the thing is a valid definition. That's the cumulative.

Now, the density function is the derivative of that, and if you take the derivative we get the thing at the bottom: f(x) = λe^(−λx) if x is greater than or equal to 0, and if x is less than 0 the density is 0.

Okay, so the density starts off reasonably high and tails off to 0 as x gets larger. So the transmission time could be anything from 0 on up, but it's more likely that it's small. That's the exponential random variable, and we'll see this again in the future.

It's also a complement to the Poisson distribution. If we have random events — atoms decaying, say — the number of atoms that decay in a second is Poisson, and the time between two consecutive decays follows an exponential distribution. The same goes if you've got independent events happening — requests coming into a server, for example.
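A simulation sketch of that Poisson/exponential connection (the rate of 3 events per second is an arbitrary illustration value):

```python
# If the gaps between events are exponential with rate lam, the count of events
# in any one-second slot is Poisson(lam).
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(0)
lam = 3.0                                          # average events per second
gaps = rng.exponential(scale=1 / lam, size=200_000)
times = np.cumsum(gaps)                            # event times (go well past t = 10_000)
counts = np.histogram(times, bins=np.arange(0, 10_001))[0]   # events per 1-s slot
for k in range(6):
    print(k, (counts == k).mean(), poisson.pmf(k, lam))      # empirical vs. Poisson pmf
```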
You're running a web server, and packets from all over are coming in — these are requests for service. The number of requests for service you get per second would be a Poisson distribution if they're independent, and the time between two consecutive requests would follow an exponential distribution. Now, why this is relevant: suppose it takes you a second to serve up a web page. If an extra request happens less than a second later, it will arrive while you're still serving the previous web page, and it will have to queue up and wait.

So, we've seen the uniform continuous random variable, we've seen the Poisson, we've seen the exponential. Now we're going to see another continuous one, called the Laplacian — just the definition here. They might use it to describe how speech waveforms decay, or something like that. Here it's defined by giving the density function, and we integrate that to get the cumulative function, which we have down here.

Okay, and you'll see this happen often: sometimes the density function is defined with a constant c — just a normalization constant — and it's not given, so you'd have to integrate the thing and find out what c would have to be.

Okay. So that's three continuous things.

For discrete random variables, we saw the PMF drawn as arrows, and we saw the step-function CDF. You might try to take sort of the derivative of the step function and get a delta function — I'm abusing what "derivative" means, but you can get something called a delta function; it's a useful thing, a spike that just goes straight up. Let's skip that; it's not as interesting right now.

Okay. In this course we condition probabilities: we have the straight probability, and we have the conditional probability, conditioned on something else being true. So we can have a conditional cumulative distribution function, and that is just defined as the conditional distribution function of X given some condition C: it's the probability that X ≤ x and C is true, normalized by the probability of C. The probability of C has to be greater than 0, because if we condition on an event which can't occur, it's sort of meaningless. In any case, we can talk about conditional probabilities here — conditional cumulative distribution functions.

And this is a chance for lots of nice examples. Here might be one: we've got a machine, and its lifetime is a random variable X.
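Going back to the normalization-constant remark a moment ago, here is a small numeric sketch: for a density of the form c·e^(−α|x|), integrating over the line and setting the result to 1 forces c = α/2 (the Laplacian form); the α = 2 below is arbitrary:

```python
# Solve for the normalization constant of c * exp(-alpha * |x|) numerically.
import numpy as np
from scipy.integrate import quad

alpha = 2.0
unnormalized = lambda x: np.exp(-alpha * abs(x))
area, _ = quad(unnormalized, -np.inf, np.inf)   # = 2 / alpha
c = 1.0 / area
print(c, alpha / 2)                             # both 1.0 for alpha = 2
```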
And it's got some cumulative distribution function, F_X(x). We're interested in the case where the machine is still working at time t — it hasn't failed yet. What is the conditional CDF, and the conditional PDF, for the lifetime of the machine, given that it hasn't failed yet — that it's still alive at t? That sort of conditional thing.

So we can write it here: the conditional probability that the lifetime X is less than or equal to a little x, given the event that X is greater than t. That's the definition here, and then using the definition on the previous page we can find the conditional CDF.

Now, look at the thing in the numerator: it is 0 unless x is greater than t, because P(X ≤ x and X > t) is greater than 0 only if the little x is greater than t. So that's what we have down here. And if little x is greater than t, then these two events collapse to just the event that t < X ≤ x, and we can work that out: the probability that X is between t and x. So it works out to this thing down here, and we've included the fact that the probability is 0 unless little x is bigger than little t. Okay, do a subtraction in there and we get the conditional distribution function:

F_X(x | X > t) = (F_X(x) − F_X(t)) / (1 − F_X(t)) for x > t, and 0 otherwise.

We take the derivative of that and we've got the conditional density function. And that makes sense if you think about it: this is the probability density that the machine dies at time x, given that it survived at least to time t. And that's just the conditional probability — the unconditional density that it died at time x, normalized by the probability that it got to time t. When I say "probability" there I'm speaking loosely; it's a density. Okay, so this is showing us examples of the conditional probability distribution, and we get a total probability theorem and so on. Okay.

Example 4.11 here is a big, complicated example. I'll describe it today, and I'll write stuff down and walk you through it on Thursday — by Thursday my iPad may actually be connecting to my laptop properly. Leon-Garcia takes this Example 4.11 and uses it again and again.

Okay, so what's happening here: we're transmitting a bit. It's either 0 or 1. A 0 is transmitted as a voltage, and it's a symmetric voltage, so it's −v for a 0 or +v for a 1.
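A sketch of that conditional-lifetime formula, checked by simulation; the lifetime distribution (uniform on [0, 10]) is only an assumption to make it concrete:

```python
# F(x | X > t) = (F(x) - F(t)) / (1 - F(t)), verified by Monte Carlo.
import numpy as np

t, x = 4.0, 7.0
F = lambda z: np.clip(z / 10.0, 0.0, 1.0)     # CDF of the assumed uniform(0, 10) lifetime
formula = (F(x) - F(t)) / (1 - F(t))          # = 0.5 here

rng = np.random.default_rng(1)
samples = rng.uniform(0.0, 10.0, size=1_000_000)
survivors = samples[samples > t]              # machines still alive at time t
print(formula, np.mean(survivors <= x))       # formula vs. empirical fraction
```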
And then the transmission line adds some noise, and the noise is Gaussian noise. Now, there's some forward-looking here, because we haven't told you yet what Gaussian noise means. It's just a continuous random variable; it's centered at 0, and the farther you get from 0 the smaller it gets — sort of a bell-shaped curve. We'll spend a lot of time on that random variable later, but for the moment just take it as a given.

In any case, you've got X, which is your transmitted signal — it's plus or minus v — and then some random noise gets added to it to make a new random variable Y; you can combine random variables like this. So X is discrete, and we know its probability mass function. It's not uniform — we're going to make it a little messier: the probability of a 1 is p, and the probability of a 0 is 1 − p. So it's like an unfair coin. So X is discrete, but N is continuous, and Y will be continuous.

So we want to know: what's the density function for Y? And then a later version of this example will be: given a particular value of Y, what's the most likely value for X? And again, it gets a little tricky because the values of X are not equally probable. That makes life interesting, and it's true in the real world: if I did a scan of this page, I've got white bits and black bits, and the white bits are more common, so they're not equally probable.

Okay, so using this as an example to get through some of this stuff, we want the cumulative distribution function for Y. We're going to do two cases: we're going to find the distribution function for Y given that X was transmitted as −v — that would be the event B0, a 0 was transmitted — or as +v, which would be the event B1, a 1 was transmitted.

So we can use what's like the total probability theorem, but for the cumulative function: the CDF of Y is the CDF of Y conditioned on a 0 being transmitted, times the probability a 0 was transmitted — that's 1 − p — plus the CDF of Y conditioned on a 1 being transmitted, times the probability of a 1 being transmitted, which is p. Now we work this out.
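Here is a numeric sketch of that total-probability decomposition for Y = X + N; the values of p, v, and σ below are arbitrary illustration numbers, and the conditional CDFs F_N(y ± v) are derived next:

```python
# F_Y(y) = (1 - p) * F_N(y + v) + p * F_N(y - v), checked by Monte Carlo.
import numpy as np
from scipy.stats import norm

p, v, sigma = 0.3, 1.0, 0.5
N = norm(loc=0, scale=sigma)

def F_Y(y):
    return (1 - p) * N.cdf(y + v) + p * N.cdf(y - v)

rng = np.random.default_rng(2)
X = np.where(rng.random(1_000_000) < p, v, -v)       # +v with prob p, -v otherwise
Y = X + rng.normal(0, sigma, size=X.size)
for y in (-1.0, 0.0, 1.0):
    print(y, F_Y(y), np.mean(Y <= y))
```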
So, the conditional cumulative function given B0 is the probability that Y ≤ y — that's the definition of the cumulative — conditioned on B0, a 0 being transmitted, which means X is −v; and that gets weighted by the probability of B0, which is 1 − p. Then the cumulative function given B1 is the probability that Y ≤ y conditioned on B1, the event that a 1 was transmitted — so the voltage +v was transmitted — weighted by the probability of that, which is p. Okay.

Now, the next complication here: remember that our Y was X plus some noise. So right here, the probability that Y ≤ y given that X was −v nets out to the probability that −v plus the noise is less than or equal to y, because Y is X plus N in this case. And that nets out to the probability that N ≤ y + v. We work a few things out — I'm waving my hands a little — and we get that the cumulative distribution function for Y, if a 0 was transmitted, is the CDF of the noise random variable evaluated at y + v; and if a 1 was transmitted, it's the noise CDF at y − v. That's the right thing, and then we can — I'll do this in more detail on Thursday; I'm giving you a teaser of what's happening — take the two conditional cumulative functions, weight them by the probabilities, and we get the combined, unconditional CDF for the output signal Y.

And it also shows us the combination of a discrete random variable and a continuous one. That was the cumulative function; the density function is the derivative of that with respect to y, and we get that.

Okay, so, anticipating a little: the Gaussian random variable. This is the first time you've seen it in the course — an odd way to get introduced to it. This is its formal definition here. It's the most important continuous random variable, and this is what it looks like: it's the famous bell-shaped curve. It peaks, and its tails go down to 0 at both ends, and they go to 0 very quickly, because if you look at the definition there's an e to the minus x squared in it. So its tails go down to zero very fast, and if you integrate it from minus infinity to infinity you get 1. It's got a parameter sigma — sigma is the standard deviation, and it expresses, effectively, the width of the curve. That's your Gaussian probability distribution, the continuous Gaussian random variable.

Now, it's also called the normal — Gaussian and normal mean the same thing. It's called Gaussian after the mathematician Gauss, who first wrote about it.
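A minimal sketch of the Gaussian density written out, with a numeric check that it integrates to 1 (the μ and σ values are arbitrary; the zero-mean, width-σ form is what the noise N above uses):

```python
# Gaussian density and a check that it integrates to 1.
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

mu, sigma = 0.0, 0.5
f = lambda x: np.exp(-((x - mu) ** 2) / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
area, _ = quad(f, -np.inf, np.inf)
print(area)                                # 1.0
print(f(0.3), norm(mu, sigma).pdf(0.3))    # matches the library's pdf
```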
It's called normal because it's got an interesting property, which is this: almost all other continuous random variables, as n gets bigger — as you get more and more observations — start looking like the Gaussian. Even discrete ones: if you take the binomial distribution with n being big and k being big, it starts looking like a Gaussian. So if you have to evaluate a binomial, use the Gaussian approximation — and it gets good really quickly. The Poisson distribution, once the average lambda is somewhat bigger than 0, starts looking like a Gaussian — for lambda more than about 5 or so; it happens quite quickly. I've got a blurb here; I'll talk more about that.

Okay, so we've got the Gaussian random variable. So what's happening in the example is that we take 1 − p times the noise density shifted down by v, plus p times the noise density shifted up by v. So we have the two conditional density functions, weighted by the two probabilities, and then we can get the complete density function of the received signal. It's fairly complicated — things are getting more complicated here. Okay.

Now, the Gaussian random variable — you sort of have to use a computer to work with it much, to evaluate it. You can use tables of numbers if you're not allowed to use a computer; for the Gaussian on a closed-book exam, we'd hand out a table of values for it. But you can't just do the partial integrals in closed form by hand.

Okay, so. A random variable in Chapter 3 was discrete; now it's becoming continuous also, or even mixed. Last chapter we saw computing what are called statistics of the random variable, such as the mean and the variance. The mean is also called the expected value — it's the same thing. We saw the formula for a discrete random variable, which was down here. You can also have it for a continuous random variable, which is Equation 4.27 down here — now it's an integral.

Now, there is a complication — it's getting a little beyond the course — but there are some probability distributions, some continuous distributions, that do not have means. That's what they're mentioning here, and this is enrichment material; I'm not going to examine you on it. But I may give you an example or two: there are perfectly good random variables that don't have means — if you try to evaluate the formula for the mean, the integral diverges, so the concept of a mean does not exist for that random variable.
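A sketch of the "everything starts looking Gaussian" remark: compare a Poisson pmf (λ = 5, the rule-of-thumb value above) and a binomial pmf (n = 100, p = 0.3, an arbitrary choice) against matching normal densities:

```python
# Normal approximation to Poisson(5) and Binomial(100, 0.3), term by term.
import numpy as np
from scipy.stats import poisson, binom, norm

lam = 5
for k in range(2, 9):
    print(k, poisson.pmf(k, lam), norm(lam, np.sqrt(lam)).pdf(k))

n, p = 100, 0.3
for k in (25, 30, 35):
    print(k, binom.pmf(k, n, p), norm(n * p, np.sqrt(n * p * (1 - p))).pdf(k))
```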
Nor a variance, and so on. Okay.

I'll skip the thing about the delta. The significance of a random variable not having a mean and variance: some people argue this is the case for financial variables, like stock market prices and so on.

Okay. In any case, the uniform random variable does have a mean: take the formula, do the integral, and you get the mean at the midpoint. That's nice. The Gaussian — you can integrate for the mean of a Gaussian random variable, and it comes out to be the point in the middle.

We've seen a lot of the exponential random variable now. The PDF is λe^(−λt). You want the expectation of T, and you integrate from 0 to infinity, because the PDF is 0 when t is negative. Work it out and you get 1/λ.

And again, in a sense it's a complement: if λ is the expected number of customers per second, the expected time between customers is 1/λ.

And we can find expected values of functions of random variables — the same sort of idea. I'm just giving you previews right now; I'll hit this in more detail in the following lectures, so I'll skip through some of this at the moment. Okay.

We can find the variance of a random variable. It's the same formula as for a discrete random variable, and there are two ways of doing it: you can take the expected value of the random variable minus the mean, squared — the square is inside the brackets, the expectation outside — or you take the expected value of X squared minus the square of its expected value. The same definition works for continuous random variables. And there's a new definition here: the standard deviation is the square root of the variance.

The standard deviation is useful because of dimensional analysis — look at the units. If the random variable is, let's say, measuring a distance in meters, then those are the units for the random variable; the expectation has units of meters; the variance has units of meters squared; and the standard deviation goes back to having units of meters. That's nice.

For the uniform random variable, we can take the formula here: the mean is (a + b)/2, and the variance is the integral — the expectation of (X − (a + b)/2)² — which is an easy enough elementary integral, and we get (b − a)²/12.

The Gaussian, again — the Gaussian is also called the normal distribution. It's the most common continuous distribution, because almost all other distributions approach it in the limit as you have more and more observations.
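A numeric check of the two integrals just quoted, E[X] = 1/λ for the exponential and Var = (b − a)²/12 for the uniform (parameter values are arbitrary):

```python
# Mean of the exponential and variance of the uniform by numerical integration.
import numpy as np
from scipy.integrate import quad

lam = 2.5
mean_exp, _ = quad(lambda t: t * lam * np.exp(-lam * t), 0, np.inf)
print(mean_exp, 1 / lam)

a, b = 1.0, 4.0
mu = (a + b) / 2
var_unif, _ = quad(lambda x: (x - mu) ** 2 / (b - a), a, b)
print(var_unif, (b - a) ** 2 / 12)
```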
I'll walk you through this by hand later; there's a trick you can use to actually find the closed-form integral for the variance, and the variance is σ².

And this here is just rehashing some properties of the variance. The variance of a constant is 0. If you shift the random variable by a constant c, you do not change the variance — the spread of it. If you scale up the random variable by c, the variance scales up by c². So.

And this here shows the Gaussian — the bell-shaped curve. If σ = 1, the standard deviation is 1, and basically there's about a two-thirds chance that the random variable is within the mean plus or minus σ. If σ is one half, then it's compressed to half the width, and it tails off to 0 that much faster. So.

You can have higher-order moments; I'm going to skip them — they're not as useful. There are things like moment generating functions. As you'll see in some other courses, if you have all of the moments of a distribution, then that actually fully defines the distribution — just like, if you have a well-behaved function and you do a Taylor expansion, having all of the derivatives defines the function. If you've got all of the moments of all orders, that defines the distribution, for well-behaved distributions. We'll skip that; they state them all here. Okay.

So, just to hit you with some important continuous ones. The uniform we've seen. The exponential we've seen — the time between occurrences. Again, if there are independent events hitting you, like requests on your web server, and the possible customers are independent of each other, and there are a lot of them, and any one customer is unlikely, but in the aggregate you're going to get hits on your web server — then the time between consecutive hits is a random variable with an exponential distribution.

This table here on page 164 you might want to make a note of, because we'll be coming back to it. The Gaussian is important. The gamma is another one we may see later; it pops up in different things. It's got two parameters, so picking different values of the parameters creates a lot of interesting special cases. The Erlang one happens in communication theory — the Erlang random variable: you've got multiple requests on a server and you're looking at a group of them. Chi-squared pops up as well. The Rayleigh I'm going to skip somewhat.
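A quick numeric illustration of the variance properties above (shifting leaves the variance alone, scaling by c multiplies it by c²) and of the "about two-thirds within one sigma" remark for the Gaussian:

```python
# Variance shift/scale properties via sampling, plus P(|X - mu| <= sigma) for a Gaussian.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
x = rng.normal(10.0, 2.0, size=1_000_000)
c = 3.0
print(x.var(), (x + 5).var(), (c * x).var() / c**2)   # all about 4

print(norm.cdf(1) - norm.cdf(-1))   # P(mu - sigma <= X <= mu + sigma) ~ 0.683
```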
The Cauchy is cool, because this one does not have moments: if you try to do the integral for the mean, it diverges. Yet it's a perfectly well-defined random variable — integrate the density and you get 1; you just can't find an expected value from that density. Yeah, okay. And some less important ones I'll skip.

Okay, and remember this property for the exponential distribution. If the exponential random variable is the time until the next random atom decays, or the time until the next hit on your web server, then the time you have to wait for the next hit does not depend on how long you've been waiting. That's the memoryless property. Okay.
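A one-liner sketch of that memoryless property, P(X > s + t | X > s) = P(X > t), for an exponential (λ, s, t arbitrary):

```python
# Memoryless property of the exponential.
import numpy as np

lam, s, t = 1.5, 2.0, 0.7
surv = lambda x: np.exp(-lam * x)            # P(X > x) for the exponential
print(surv(s + t) / surv(s), surv(t))        # identical
```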
The Gaussian we'll hit more later. Let me go back over here a little, to what I've typed on the blog. The binomial is the probability of k heads out of n tosses, where we don't care which of the n tosses came up heads. The Poisson is for when you've got a large number of possibilities and each one is very unlikely; it has its mean and its moments, and it's an approximation. So there are times when all three are relevant: if it's a binomial and n is very large, you want to use an approximation like the normal; but if n is very large and p is very small, so that n times p is a modest number, then the Poisson approximates the distribution. And I mention that here: for really large n you get an n factorial in the equation, and it gets hard to evaluate n factorial — use the approximation and you don't have to do that.

And then my summary here of the points for the start of Chapter 4. For discrete random variables we had the mass function; for continuous we have the density function, and the relevance is that the probability that the random variable is in a certain small interval is the density function times the size of that interval. And for any random variable we've got the cumulative distribution function, the probability that X is up to a certain point — the definition is here.

And you can always go look at Wikipedia. I have fun mentioning Wikipedia because people like to score points against it — perfectly valid points, it's politically quite biased, et cetera — but I'm not using it for politics, I'm using it for math, and there it goes even deeper than we need in this course.

There's the notation that I mentioned here, and — okay, so, tools, tools, tools. Computers are occasionally useful; I guess I'm joking. There's Matlab. Some students don't like Matlab. Tough — Matlab's important if you're going to become a competent engineer. I'm sorry — well, I'm not sorry: you've got to get some familiarity with Matlab, and you should be able to pick it up on your own, so we won't spend a lot of time on it in the course. It's nice for matrices, and it has some nice stuff for probabilities, and I'll show you some examples of this. So you can do the same things with automatic computation, and Matlab has got a lot of packages for signal processing, control theory, and so on — it's a really big tool. My only objection is that in the real world it costs money, a lot of money. But it's very powerful, and its algorithms for doing stuff — inverting matrices, whatever — are the best, because the company that sells Matlab goes and hires the best mathematicians in the world and pays them to develop routines for it. So Matlab has got the best, state-of-the-art routines in it.

I'll show some of this. There are other packages here that I've mentioned: Mathematica and Maple are packages that work with algebra — you can work with equations and formulas, where Matlab is just numbers. Mathematica is the more common one; we used to license Maple because it was cheaper. I'd recommend Mathematica, and we've got a license for Mathematica too. I'll show you some examples of that. Again, there's a learning curve, but it does beautiful plotting and works with equations — you can tell it to integrate a formula and it will do it.

My unsolicited opinion: Matlab — excellent-quality numerics, lots of toolkits. It also works on parallel computers. I mean, all computers are multi-core now — phones are multi-core — and the laptop I'm lecturing on has a good GPU in it, a couple of thousand CUDA cores. So when you run Matlab, it can use your parallel computer to some extent. It's interactive. Disadvantages: it's expensive, and it works with arrays, so it's got a crazy programming style if you want to write efficient code. That's the thing — it starts off easy and gets harder as you go deeper. The alternatives to Matlab are free clones that are no good.

I like C++ personally; I program in C++. The algorithms that are in the Matlab routines generally also exist as C++ routines that will do the same thing. So you could write a C++ program, but you have to compile it and debug it, and debugging this sort of stuff in C++ is horrible — the error messages are awful. One little error in your source code may cause hundreds of error messages, none of which are helpful, because you do something wrong in your source and it triggers an error deep inside one of the routines you're using, and you don't have the source code for that routine, so the error is totally meaningless.
Those C++ libraries use a facility of C++ called template metaprogramming, which is really powerful: you can write code that looks like Matlab — very compact code. It compiles slowly, but if you can get it to work, it runs really fast. That's why I like C++. So, okay.

This is a reasonable point to stop, so I'll stop. Here I have somewhat what I was just telling you about; I'll hit some of this again on Thursday.

So, what we did today, just to review. I did a quick review of the Poisson discrete distribution, because it's important. Then we got into Chapter 4, talking about general random variables, which could be continuous. We saw the idea of a cumulative distribution function — it's like the integral of the probability from the left up to a certain point. The CDF is important because you can use it both for continuous and for discrete random variables. And then we saw some important continuous random variables: the uniform, the exponential, and the normal — the big one; I gave you a teaser introduction to the normal, and we'll see more of that on Thursday.

So, to sum up, today was presenting stuff in the textbook. Thursday I'll try to run some Matlab stuff, maybe — an introduction to Matlab for people who don't know it — and also write down and do some hand examples, reviewing Chapter 3 with some examples and picking examples from Chapter 4.

I'll stop a couple of minutes early today — a natural point to stop. If you have questions, the chat window is open. And once in a while we're having a problem with somebody unmuted. Yeah, if anyone wants to, you can unmute yourself now if you'd like. Or you can type questions. Or you can go home and head off to your next class, or get supper.

No questions? Okay, well, have a good week. Enjoy the sunny weather — I guess it's warming up a little. And, virtually speaking, see you on —

Oh, a question about when the exam will be back. Thank you — it's a valid point. Let me get it back to you in the next day or two, then. Other questions? Good point, that.

You're welcome — no, that's a valid criticism; I should have gotten it out on the weekend. I'm reading all the comments, and the wildcards... The multiple-choice questions are graded, and I'm processing the wildcards and so on.
Other than that — okay. Well — oh, I can't read it; can I scroll up? My Internet just went out for a moment. Can I scroll up in the chat window? Okay, well, there's nothing much in the chat window, if that's what you mean. This here is just the textbook, if you mean the blog.

Did you miss anything in the last five minutes? I summarized what I talked about today, just for a couple of minutes, and then I asked for questions, and there was a comment that I need to get moving and get the exam back to you. That was all of the last five minutes.

That's okay. I'll see you then — Thursday.