Professor: Okay, hello everyone. The IonQ talk just ended — I don't know if anyone had a chance to attend it. It was an excellent talk, an excellent introduction to quantum computing, so a chance to be introduced to that part of parallel computing. Can you hear me?

Student: Yes, we hear you.

Professor: Good, thank you. Okay. Because that was an optional part of the class, this is going to be really short right now. I think we've got two talks left over from Thursday — from Monday, rather — and that's basically it. So let's see who is on, then... Whoops, just getting the parallel thing on... okay... give me a second here... okay, and let me just share this. Okay — so, is it sharing? The slides may become available also. Now, first — the other thing is, Ben and Connor, if you're online, you can talk to us. This first comment about computing is: we've got all these different computer systems, and they don't talk to each other. So, if you joined late, or if you're not in everything — if you're not watching this lecture live, you're not seeing this, unless you're reading the blog — but in any case, if you're not in Gradescope, or not getting this invitation, then let me know. The LMS you get added to automatically; that's just for final grades. And we use Webex Teams for messages, which I'm not checking during the class, and so on. Okay, any questions about that? So, we have Ben and Connor. If you want to talk, let me just stop this. Ben, I see you're online — if you would like, tell us about your two supercomputers.

Ben: Sure, yeah. I don't have a PowerPoint, but I have a little visual while I'm talking about the two computers. Let me see if I can get that started. All right — so this is the first one, this is NVIDIA's Selene. Let me see if I can alt-tab and go back to my notes. Can you still see the visual?

Professor: Yes.

Ben: Okay, great. So the first supercomputer I'm going to be talking about is Selene, which is the number 5 supercomputer on the TOP500 list. It was created by NVIDIA.
It's currently located at NVIDIA's headquarters. It's the fastest supercomputer used for commercial applications, and it's used by a variety of companies for its computing power. Customers of Selene — of NVIDIA — include Continental, Lockheed Martin, and Microsoft. It's also used in-house by NVIDIA for its own workloads. And it's a pretty big deal for NVIDIA commercially, because it uses their DGX SuperPOD system, which is kind of a platform for setting up supercomputers, and there are other research institutes and other companies that are interested in using that platform — the University of Florida was one of them, setting one up soon. It's very easy to set up compared to previous solutions, which was really important during the pandemic, when you couldn't have a ton of people coming in and building stuff because of health concerns. So it was set up by two engineers over the course of three weeks, which is very fast. It achieved 63.4 petaflops (floating-point operations per second). A fun fact about Selene is that it has a robot that tends to it. It has 555,520 computing cores and 1.12 million gigabytes of memory, or 1.12 petabytes. It uses AMD EPYC processors for its CPUs. It's the second most energy-efficient supercomputer on the TOP500 list, and it runs Ubuntu 20.04.1 LTS. It's relatively new compared to all of the other ones on the TOP500 list, so there wasn't a ton of information I could find, and a lot of what there was came from press releases — but still interesting, some interesting stuff.

The next computer, number 6, that I'll be talking about is the Tianhe-2A. This was created by China's National University of Defense Technology. It's located in the National Supercomputer Center in Guangzhou, China. It is the sixth-fastest supercomputer in the world, as I said, and it utilizes Xeon E5 processors. For a while it was the fastest supercomputer in the world — the TOP500 list had it as its fastest supercomputer for its June 2013, November 2013, June 2014, and November 2014 lists, and also the June 2015 and November 2015 lists. It achieved 61.4 petaflops, and it had planned to double its computing power, which would have made it the current third most powerful computer. But the US government rejected Intel's application to export processors for the computer — they cited nuclear research — so Intel was unable to export
the processors for the computer. So, in response to that, the National University of Defense Technology created the Sunway TaihuLight supercomputer using only domestically developed processors, which I believe is the fourth computer on the TOP500 list. But the Tianhe-2A — it has 4.9 million computing cores and 2.27 petabytes of memory. It was reportedly used for simulation, analysis, and government security applications, but since it's used by the government and the National University of Defense Technology, that's not exactly known — it's more of a military application, so I don't know exactly what it was all used for. Something interesting about it is that it was reportedly difficult to use, and running code for it was challenging for those without experience with the machine — it was very specific to the Tianhe-2A. Yeah, I think that's all I have, if anyone has any questions.

Professor: Yes, thank you very much — interesting. So I'll start with a question: this looks to be a bigger part of NVIDIA's business plan?

Ben: Yeah. So they've been working with other companies to use the computing power of Selene remotely, and to use their framework for building supercomputers in other places besides just NVIDIA's headquarters. It's kind of interesting, the contrast of these two computers: one is very commercial, with a lot of press releases, and the other one is more of a research or military product — very different stories. That was an interesting contrast between the two while researching them.

Professor: Yeah, thank you. I have a problem with some of NVIDIA's press releases — they're so finely polished that I have trouble extracting the actual information.

Ben: Yeah, a lot of it was "look, we have a robot that tends to it" and not a lot of hard information.

Professor: Right. I have a problem even finding which of their GPUs are faster than the others — sometimes it's all adjectives. Okay, which is the newer one, the higher-performing one? Anyone else have any questions?

Student: I was curious — I don't know for sure about it — for the simulations they were running, was it actually proven that they were doing, like, nuclear stuff, or was it just skepticism?

Ben: Well, we don't really know. From what I read,
the National University of Defense Technology said they were using it for simulation, analysis, and government security applications, which is kind of vague. And I think the US Department of Defense said it was potentially used for nuclear research, but I guess there's no way for us, individually, to know for sure. It could have just been, you know, skepticism about them having the most powerful computer again, and wanting to maintain US supercomputer technology priority. So who knows.

Professor: Okay, great. Connor, are you online? ... Would you like to talk? ... Connor, you're logged in — are you there? ... Not there at the moment. Okay, well, if not, then you can talk Monday. Since we had the IonQ talk — the slides will be available online somewhere, and I'll talk about it in more detail when I get to quantum computing later on — that's it, unless anyone has anything, or would like to ask another question here. No, I don't see one; I'm just checking. Unless anyone would like to talk about anything, that's class for today, and I'll see you on Monday. What we'll do Monday, I think, is finish off OpenMP, perhaps show lots of little programming examples, and then move on to another parallel topic. I've also got to —

Connor: Yes — sorry, did you want me to share my quote-unquote presentation? My laptop shut down, so that's why I left momentarily. I apologize for that.

Professor: Well, you are on the schedule for today — I thought I'd screwed up the schedule.

Connor: No, I am on the schedule.

Professor: I mean, if you've got hardware problems, we can do you Monday.

Connor: No, I'm about to reboot — I just joined on my phone, so I can talk. I just wanted to apologize.

Professor: Yeah, but if you have any video at all, then maybe wait until you can show the video. Have you got any? Or if you want to, just talk.

Connor: Yeah, I can just talk — no video, just like last Monday. So, my topic was the applications of parallel computing in machine learning. Obviously, applying parallel computing to machine learning processes and algorithms is a topic with a lot of ongoing research. As the applications of machine learning, as well as the size of the data sets that the algorithms employ, have grown, applications have become increasingly complex and time consuming, and this has led researchers and developers to want to more quickly and efficiently perform the necessary computations. This is especially true in
areas of the commercial development space where real-time processing and speed are a customer focus — think of things like autonomous driving, smart assistants, things like that. Up to this point, machine learning, because it is a newer technology, has often taken a single-threaded, or serial, approach to computing and processing. And while sometimes this is necessary, due to certain computations and steps needing to be performed before others, in a specific order, there are also areas where parallel computing could easily be applied, and there are some instances in ML that cause the sequential processing model to be incredibly time consuming. Three of these areas are: one, you don't actually know in advance which algorithm is going to give you the best result, or the best fit to your data, so you have to try a lot of them, and if you're doing that sequentially, obviously it will take a lot of time. Another is that the learning process of a model is often controlled through things called hyperparameters, which are something the user has to set, and there's no formula to really calculate them, so they are fine-tuned over the course of iterations, and it often takes hundreds of iterations for the model to find the hyperparameters that fit the design of the problem. And then, ML models are by design super flexible, which puts them at risk of overfitting the data — this is where your model not only learns the data itself but also learns the noise in the data. This can be prevented by using a method called cross-validation, which is actually an area that lends itself very well to parallel computing; cross-validation is essentially just training and evaluating the model on different subsets of the data, multiple times. So, because of the iterative nature of these tests and how often they need to be repeated, this opens up a whole world of opportunity for parallelism.
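To make that concrete, here is a minimal sketch — not from the talk — of the kind of embarrassingly parallel loop being described: a hyperparameter grid search in which every candidate configuration is scored independently, parallelized with OpenMP. The Params struct, the grid, and the evaluate() function are hypothetical placeholders standing in for "train the model with these settings and return its cross-validated error."

```cpp
// Minimal sketch: hyperparameter grid search parallelized with OpenMP.
// evaluate() is a placeholder objective, not a real library call.
#include <omp.h>
#include <cmath>
#include <cstdio>
#include <vector>

struct Params { double learning_rate; int depth; };

// Placeholder scoring function: pretend lower is better.
static double evaluate(const Params& p) {
    return std::fabs(p.learning_rate - 0.1) + std::fabs(p.depth - 6.0) * 0.01;
}

int main() {
    // Build the candidate grid (the "hundreds of iterations" mentioned above).
    std::vector<Params> grid;
    for (double lr = 0.01; lr <= 0.5; lr += 0.01)
        for (int depth = 2; depth <= 12; ++depth)
            grid.push_back({lr, depth});

    std::vector<double> score(grid.size());

    // Each candidate is independent of the others, so the loop
    // parallelizes trivially across however many threads are available.
    #pragma omp parallel for schedule(dynamic)
    for (long i = 0; i < (long)grid.size(); ++i)
        score[i] = evaluate(grid[i]);

    // Sequential pass to pick the best configuration.
    size_t best = 0;
    for (size_t i = 1; i < grid.size(); ++i)
        if (score[i] < score[best]) best = i;

    std::printf("best: lr=%.2f depth=%d (score %.4f) using up to %d threads\n",
                grid[best].learning_rate, grid[best].depth,
                score[best], omp_get_max_threads());
    return 0;
}
```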
175 00:38:22.315 --> 00:38:27.894 Is paralyzing the common tasks that can be shared among a lot of numerous algorithms, 176 00:38:27.925 --> 00:38:32.695 such as matrix multiplication and matrix multiplication as, 177 00:38:32.784 --> 00:38:33.085 you know, 178 00:38:33.085 --> 00:38:37.074 it's well used for implementing things like linear regression algorithms, 179 00:38:37.074 --> 00:38:43.074 which are incredibly popular and we can also use parallelization for things like distance calculation. 180 00:38:43.380 --> 00:38:48.960 Where, and distance is a common metric and machine learning. 181 00:38:48.960 --> 00:39:03.869 Used in machine learning that requires iterative calculations across numerous algorithms to determine and because these calculations are entirely independent about other calculations. The same iteration. 182 00:39:03.869 --> 00:39:07.110 You know, we can clearly use parallelization pretty well. 183 00:39:07.110 --> 00:39:11.039 Like, I had mentioned before 1 of the areas that. 184 00:39:11.039 --> 00:39:20.519 Can really benefit from parallelization is key fold cross validation. That's the data validation. Technique used. 185 00:39:20.519 --> 00:39:30.449 To avoid overfitting the data it's pretty much a common evaluation technique that's regularly employed and an model. 186 00:39:30.449 --> 00:39:34.590 To validate it, which involves just an intense processing of. 187 00:39:34.590 --> 00:39:45.780 Processing dataset segments to determine and algorithms error rate and it's a model. It's a. 188 00:39:45.780 --> 00:39:54.780 While I won't get into the technical details of how it works, or, you know, what happens in it for the sake of this discussion. We can pretty much just say that it's. 189 00:39:54.780 --> 00:39:58.829 A highly iterative and repetitive series of competition. 190 00:39:58.829 --> 00:40:07.530 Results of all of these computations then being combined and averaged into a single integrated model or predicted error rate. 191 00:40:07.530 --> 00:40:12.840 So, when performs sequentially, this can be relatively time consuming, especially when. 192 00:40:12.840 --> 00:40:20.670 Each bold or segment is paired with a copy tech funnel a complex algorithm, such as, you know. 193 00:40:20.670 --> 00:40:24.570 Linear regression. Matrix multiplication. 194 00:40:24.570 --> 00:40:30.989 So, performing these computations in parallel could drastically speed up error approximation, which is a big. 195 00:40:30.989 --> 00:40:34.619 A big component of fitting data to an model. 196 00:40:34.619 --> 00:40:43.619 And this is a standard method, so if we could figure out a way and paralyze these as well, it would speed up processing. 197 00:40:43.619 --> 00:40:57.059 Quite a bit, but, like, I had mentioned, not every process can be effectively paralyzed. Some of them do have to be sequentially performed. So there is an extent how we can parallelize these. 198 00:40:57.059 --> 00:41:07.889 But for actual applications are 2 of the most likely tools for applying core parallelization. 199 00:41:07.889 --> 00:41:12.989 In the space, and we did talk about food a little bit and I'm not sure everyone. 200 00:41:12.989 --> 00:41:18.570 5 thoughts, but from a capability standpoint, there's really the big 2. 201 00:41:18.570 --> 00:41:24.750 That's pretty much all I have. Okay. Thank you. Connor. 202 00:41:24.750 --> 00:41:28.380 Any questions. 203 00:41:28.380 --> 00:41:33.780 Questions if I have a comment to see. 
Professor: Okay, thank you, Connor. Any questions? ... No questions? Then I have a comment — let's see. On parallel applications: if you go to the NVIDIA website — let me see if I can pull up a page here, I'll just share it. Great. So this is just a page I pulled up... just a second. Yeah, you should be able to see it. So again, tying into the previous talk: this is a big part of their business now. In fact, they're adding stuff to their hardware — we mentioned matrix multiplication — and they have been adding new hardware features for several years now to do machine learning faster. One of these is lower-precision, 16-bit floating point: where single precision is 32 bits including the exponent, the 16-bit format has about 7 bits plus the exponent, because that's adequate for the matrix multiplications you mentioned, and it can be done faster, since it's less data and there are data-speed limitations. That's just one thing, tying it in. And you can see, you can browse around here, and so on. Okay. Other questions, and so on? If not, well, have a good weekend and see you Monday.

Student: Actually, one quick question. I'm just briefly looking at your blog, and I saw that for homework 3 we're using parallel. Could you just quickly go over again how to use parallel?

Professor: Sure, yeah. I went through it Monday, but just to connect to it, let me show you how I might do it. Just give me a minute here... let's see... okay, let me see if I can share a screen... okay, and let me just get this arranged here. Okay, so you're seeing a window on my laptop, and I ssh in — now, I'm off campus, so I have the VPN running. I have set up a key pair; if you don't have a key pair set up, then when you do this it will ask for your password, and your password is your RIN — the 66-something, 9-digit number — and you'd want to go and change it to something else. And if we look at slash, the root, here, I've got a link called class2021, and under users are your home directories — you don't see each other's files. Oh, and this thing is running Linux, of course, as do all the supercomputers. And just to show you what the machine is — htop — it's got 256 gig, a quarter terabyte, of RAM, of DRAM, and it's dual 14-core Xeons, so that's 56 hyper-threads — a fair number of threads.
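For anyone trying the machine, a small stand-alone check — not one of the course files — of how many hardware threads the OS and OpenMP report, assuming g++ with -fopenmp as in the demo:

```cpp
// Minimal check of how many threads OpenMP will use by default.
// Compile with: g++ -fopenmp threads.cc -o threads
#include <omp.h>
#include <cstdio>
#include <thread>

int main() {
    std::printf("hardware threads reported by the OS: %u\n",
                std::thread::hardware_concurrency());
    std::printf("OpenMP default max threads:          %d\n",
                omp_get_max_threads());
    // On a dual 14-core Xeon with hyper-threading, both numbers should be 56,
    // unless something like OMP_NUM_THREADS has restricted them.
    return 0;
}
```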
And what it's got that's interesting is the GPUs — the RTX 8000, which has 48 gigabytes of memory on the GPU and 4,600 CUDA cores. And so here, there's a directory that talks about the machine, about OpenMP, and these are some programs — the pi ones, various versions of pi. Now, you cannot write into this directory, I hope, but you can copy the files to somewhere else and then compile them — they all use make. So you can copy — I'll use the directory DS, let's say — and copy... yes, and copy the Makefile also. Then I can go to DS, and there are my files here, and I could say something like "make hello". It's just using g++ by default; later on I'll switch to a different compiler. And then we can run it — and every time we run it, it's different, because all the threads are stomping on each other's feet. And this is the first lesson. So that's what I have in this directory — the stuff that I showed on Monday — and the "last year" one shows stuff from last year, which I still have to fix, but you can go in and look at the old stuff, and I'll show more of that next Monday. Is that a start?

Student: Yes, thank you very much.

Professor: Sure. Any more questions, or other questions? Now, if you're not from a Linux background, you're going to have to learn a little Linux, but I'm not apologizing, because all the big computers use some variant of Linux. I haven't found any that are using Microsoft — at one point there was one machine on the list using Microsoft, and on the new list they were all Linux. Other questions?

Student: Yes, just really quick. I just logged into the server — I was able to get in fine, but it's asking me about creating a dot file.

Professor: Oh, that's just for the shell; you can say no. The shell gives you lots of options — I'm using zsh, so you can change to another shell if you want.

Student: Okay — it's the first time I've used that shell.

Professor: Yeah. I'm introducing you to various tools that may make your life easier, but you don't have to use them. Monday's blog will have more information on this and so on; you can look ahead to that. Other questions? ... Okay, have a good weekend. I may be out skiing or something, so I'll see you Monday.
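The "hello" program compiled in the demo is one of the course example files; a minimal stand-alone program in the same spirit — a reconstruction, not the actual course file — that shows why the output differs on every run is:

```cpp
// hello.cc -- minimal OpenMP illustration of why the output changes on
// every run: all threads write to stdout with no ordering between them.
// Compile: g++ -fopenmp hello.cc -o hello
#include <omp.h>
#include <cstdio>

int main() {
    #pragma omp parallel
    {
        int id = omp_get_thread_num();
        int n  = omp_get_num_threads();
        // No synchronization here, so lines from different threads
        // appear in a different order each run.
        std::printf("Hello from thread %d of %d\n", id, n);
    }
    return 0;
}
```

Running it a few times on a many-threaded machine makes the first lesson obvious: with no synchronization, the thread IDs come out in a different order every time.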