What I want to do today is chat with you about career advice in AI. In previous years, I used to do most of this lecture by myself. But what I thought I'd do today is share just a few thoughts and then hand it over to my good friend Laurence Moroney, whom I invited to speak here and who kindly agreed to come all the way to San Francisco (he lives in Seattle) to share with us a very broad landscape of what he's seeing in the job market, as well as tips for growing a career in AI.

But there are just two slides, and then one more thought, that I want to share with you before I hand it over to Laurence, which is that it really feels like the best opportunity, the best time ever, to be building with AI and to be building a career in AI. A few months ago I noticed, in social media and traditional media, a few questions about whether AI is slowing down. People were saying, well, is GPT-5 that good? I think it's actually pretty good. But there were questions about whether AI progress is slowing down.
I think part of the reason the question was even raised is that if a benchmark for AI treats 100% as perfect answers, then once you make rapid progress, at some point you cannot get above 100% accuracy. But one of the studies that most influenced my thinking was work done by the organization METR, which studied how, as time passes, the complexity of the tasks AI can do grows, as measured by how long it takes a human to do that task. A few years ago, maybe GPT-2 could do tasks that a human could do in a couple of seconds. Then models could do tasks that took a human four seconds, then eight seconds, then a minute, two minutes, four minutes, and so on. The study estimates that the length of task AI can do is doubling every seven months. On this metric, I feel optimistic that AI will continue making progress, meaning the complexity of tasks, as measured by how long a human takes to do them, is doubling rapidly. The same study, with a smaller data set, argued that for AI coding the doubling time is even shorter, maybe 70 days.
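A quick back-of-the-envelope sketch of that exponential claim: the seven-month and roughly 70-day doubling periods are the figures quoted above, while the 10-minute starting horizon and the 28-month span are made-up numbers purely for illustration.

```python
def projected_horizon(start_minutes, months, doubling_period_months=7.0):
    """Task length (in human-minutes) AI is projected to handle after
    `months`, if the horizon doubles every `doubling_period_months`."""
    return start_minutes * 2 ** (months / doubling_period_months)

# With a hypothetical 10-minute horizon today and a 7-month doubling,
# 28 months is exactly four doublings: 10 * 2**4 = 160 minutes.
print(projected_horizon(10, 28))

# With the ~70-day doubling suggested for AI coding (about 70/30 months),
# the same 28 months would be twelve doublings instead of four.
print(projected_horizon(10, 28, doubling_period_months=70 / 30))
```

Nothing here is a prediction; it just shows how different doubling periods compound over the same stretch of calendar time.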
So code that used to take me, I don't know, 10 minutes to write, then 20 minutes, then 40 minutes: AI can do more and more of that. The reasons I think this is a golden age for building, the best time we've ever seen, come down to maybe two themes: more powerful and faster. All of you in this room can now write software that is more powerful than what anyone on the planet could have built a year ago, by using AI building blocks. AI building blocks include large language models, agentic workflows, voice AI, and of course deep learning. It turns out that a lot of LLMs have a decent, at least basic, understanding of deep learning. So if you prompt one of the frontier models to implement a cutting-edge neural network for you (try prompting it to implement a transformer network), it's actually not bad at helping you use these building blocks to build software quickly. And so we have very powerful building blocks that were very difficult to use, or did not exist, a year or two ago.
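As a concrete taste of the "ask a frontier model to implement a transformer" suggestion, here is a minimal, framework-free sketch of scaled dot-product attention, the core operation inside a transformer. This is textbook attention written for illustration, not code from the lecture.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(queries, keys, values):
    """Scaled dot-product attention: for each query, weight the value
    vectors by softmax(q . k / sqrt(d)) taken over all keys."""
    d = len(keys[0])
    outputs = []
    for q in queries:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in keys]
        weights = softmax(scores)
        outputs.append([sum(w * v[j] for w, v in zip(weights, values))
                        for j in range(len(values[0]))])
    return outputs
```

A query that strongly matches the first key pulls the output toward the first value vector; stacking this operation with learned projections and feed-forward layers is what a real transformer builds on.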
And so you can now build software that does things that no one else on the planet, even the most advanced teams on the planet, could have done. Also, with AI coding, the speed with which you can get software written is much faster than ever before. I've personally found it important to stay on the frontier of tools, because the tools for AI coding change, I don't know, really rapidly. Since several months ago, my personal number one favorite tool has been Claude Code, moving on from some earlier generations. Then, since the release of GPT-5, I think OpenAI Codex has actually made tremendous progress. And this morning, Gemini 3 was released; I haven't had time to play with it yet, since it came out just this morning, but it seems like another huge leap forward. So if you ask me every three months what my personal favorite coding tool is, it probably changes: definitely every six months, quite possibly every three months. And I find that being half a generation behind on these tools means being, frankly, quite a bit less productive.
I know everyone says AI is moving so fast, everything's changing so fast. Of all the sectors in AI, many things maybe don't move as fast as the hype says, but AI coding tools are one sector where I see a tremendous pace of progress. And staying on the latest generation of tools, rather than half a generation behind, makes you more productive.

With our ability to build more powerful software, and to build it much faster than ever before, one piece of advice that I give now, much more strongly than even a year or two ago, is: just go and build stuff. Take classes at Stanford. Take online courses. And additionally, your opportunity to build things, and, as I think Laurence is going to talk about, to show them to others, is greater than ever before.
But there's one weird implication of this that more and more people are appreciating, though it's still not widely known, which is the product management bottleneck. When it is increasingly easy to go from a clearly written software spec to a piece of code, the bottleneck increasingly becomes deciding what to build, or writing that clear spec for what you actually want to build. When I'm building software, I often think of going through a loop: I'll write some code, show it to users to get feedback (I think of this as PM, or product management, work), and then, based on that feedback, revise my view of what users like and what they don't. This UI is too difficult; they want this feature; they don't want that feature. I change my conception of what to build, and then go around this loop many times to hopefully iterate toward a product that users love. Because of AI coding, the process of building software has become much cheaper and much faster than before. But that ironically shifts the bottleneck to deciding what to build.
So here are some weird trends I'm seeing. In Silicon Valley and in many tech companies, people have often talked about an engineer-to-product-manager, or Eng-to-PM, ratio. Take these ratios with a grain of salt, because they vary all over the place, but you hear companies talk about Eng-to-PM ratios of 4 to 1, or 7 to 1, or 8 to 1: the idea that one product manager writing product specs can keep four to eight (or some number like that) engineers busy. But because engineering is speeding up, whereas product management is not sped up by AI nearly as much as engineering, I'm seeing the Eng-to-PM ratio trend downward, maybe even to two to one, or one to one. Some teams I work with proposed headcount of one PM per engineer, which is a ratio unlike almost all Silicon Valley companies, certainly traditional ones. The other thing I'm seeing is that engineers who can also shape products can move really fast: you go one step further, take the engineer and the PM, and collapse them into a single human.
And I find that there are definitely engineers doing engineering work who don't enjoy talking to users and the more human, empathetic side of the work. But I'm finding increasingly that the subset of engineers who learn to talk to users, get feedback, and develop deep empathy for users, so that they can make decisions about what to build, are also the fastest-moving people I'm seeing in Silicon Valley today. At the earliest stage of my career, one thing I regretted for years was that, in one of the roles I had, I tried to convince a bunch of engineers to do more product work. I actually made a bunch of really good engineers feel bad for not being good product managers. That was a mistake, and I regretted it for years; I just shouldn't have done that. And part of me feels like I'm now going back to repeat that exact same mistake. Having said that, I find that being able to write code, but also to talk to users to shape what to do, lets me, and the engineers who can do this, go much faster.
So I think it's worth taking another look at whether engineers can do a bit more of this work, because if you're not waiting for someone else to take the product to customers, you can just write code, have a gut sense for what to do next, and iterate; that pace, that velocity of execution, is much faster.

Before I hand over to Laurence, just one last thing I want to share: in terms of navigating your career, I think one of the strongest predictors of your speed of learning and your level of success is the people you surround yourself with. We're all social creatures; we all learn from the people around us. It turns out there are studies in sociology showing that if your five closest friends are smokers, the odds of you being a smoker are pretty high. Please don't smoke; it's just an example. I don't know of any study showing that if your five or ten closest friends are really hard-working, determined people, learning quickly and trying to make the world a better place with AI, then you are more likely to do that too.
But it's one of those things that I think is almost certainly true. All of us are inspired by the people around us, and if you're able to find a good group of people to work with, that helps drive you forward. In fact, here at Stanford, I feel very fortunate: a fantastic student body, a fantastic group of faculty. And the other thing I think we're fortunate to have at Stanford is our connective tissue. Candidly, a lot of the people working in the cutting-edge AI labs, the frontier labs, were former students of a lot of different Stanford faculty. That rich connective tissue means that at Stanford we often find out about a lot of stuff that's not widely known, because of the relationships, the friendships. When some company does something, one of my friends on the faculty will call someone up and say, hey, that's weird. Does this really work?
So that rich connective tissue means that, just as we try to pull our friends forward, our friends also pull us forward, with the knowledge and know-how of bleeding-edge AI, which unfortunately is not all published on the internet at this moment in time. So while you're at Stanford, make those friends; form that rich connective tissue. There have been a lot of times, just for myself, when, frankly, I was thinking of going in some technical direction, and I'd have one or two phone calls with someone really close to the research, either a Stanford researcher or someone in a frontier lab. They would share something with me that I didn't know before, and that would change the way I chose the technical architecture of a project. So I find that the group of friends you surround yourself with, and those little pieces of information (try this; don't do that, it's just hype; ignore the PR; don't actually try that thing), make a big difference in your ability to steer the direction of your projects. So while you're at Stanford, take advantage of that.
This connective tissue that Stanford has is actually really unique. There are lots of great universities in the world, but at this moment in time (and I don't want to sound like I'm doing PR for Stanford) I really think there's no university in the world as privileged as Stanford in terms of the richness of its connective tissue to all of the leading AI groups. To me, it also means we're lucky here to have a wonderful community of people to work with and learn from.

And this applies to you too. If you apply for jobs, the thing that is much more important for your career success, when you go to a company, will be the people you work with day to day. So here's one story I've told in previous classes and will repeat. There was a Stanford student I knew, many years ago, who did really good work at Stanford; I thought they were a high flyer. They applied for jobs and got an offer from one of the companies with a hot AI brand.
This company refused to tell him which team he would join. They said, oh, come sign up for a job; there's a rotation system, a matching system, blah blah blah. Sign on the dotted line first, and then we'll figure out what's a good project for you. Partly because it was a good company, and his parents were proud of him for getting a job there, this student joined hoping to work on an exciting AI project. After he signed on the dotted line, he was assigned to work on the company's back-end Java payment processing system. Nothing against anyone who wants to do Java back-end payment processing systems (I think they're great), but this was an AI student who did not get matched to an AI project. So for about a year he was really frustrated, and he actually left the company after about a year. The unfortunate thing is, I told this story in CS230 some years back.
And then, a couple of years after I started telling this story in class, another student in CS230 went through the same experience with the same company: not Java back-end payment processing, but a different project. So this matter of trying to figure out who you'll actually be working with day to day, and making sure you're surrounded by people who inspire you and work on exciting projects, I think that's important. To be completely candid, if a company refuses to tell you what team you'll be assigned to, that does raise a question in my mind about what will happen. And I think that instead of working for the company with the hottest brand, sometimes if you find a really good team of really hard-working, knowledgeable, smart people trying to do good with AI, even if the company logo just isn't as hot, that often means you actually learn faster and progress your career better. Because, after all, you don't learn from the excitement of the company logo when you walk through the door; you learn from the people you deal with day to day.
So I just urge you to use that as a huge criterion in your selection process for what you decide to do.

But number one on my list of advice is that it's become much easier than ever before to build powerful software, faster. What that means is: do be responsible. Don't build software that hurts others. At the same time, there are so many things that each of you can build, and what I find is that the number of ideas out in the world is much greater than the number of people with the skill to build them. I know that finding jobs has gotten tougher for fresh college grads. At the same time, a lot of teams just can't find enough skilled people. So there are a lot of projects in the world that, if you don't build them, I think no one else will build either. As long as you don't harm others and you're responsible, there are a lot of things for which you don't need to wait for permission; you don't need to wait for someone else to do it first before you do it. The cost of a failure is much lower than before, because you waste a weekend but learn something.
That seems fine to me. So, being responsible while going out, trying things, and building lots of stuff would be the number one most important thing I think would help your careers.

And I'm going to say one last thing that is considered not politically correct in some circles, but I'll just say it anyway. In some circles, it has become considered not politically correct to encourage others to work hard. I'm going to encourage you to work hard. Now, I think the reason some people don't like that is that some people are in a phase of life where they're not in a position to work hard. Right after my children were born, I was not working hard for a short period of time. And there are people who, because of an injury or a disability or other very valid reasons, are not in a position to work hard at that moment in time. We should respect them, support them, and make sure they're well taken care of, even though they're not working hard.
Having said that, of all my PhD students who have become very successful, I saw every single one of them work incredibly hard. I mean, the 2:00 AM sitting up doing hyperparameter tuning: been there, done that. Still doing it some days. And if you are fortunate enough to be in a position in life where you can work really hard, there are so many opportunities to do things right now. If you get excited, as I do, about spending evenings and weekends coding, building stuff, and getting user feedback, and if you lean in and do those things, it will increase your odds of being really successful. So, I don't know, maybe I'll get into some trouble with some people for encouraging you to work hard, but I find that the truth is that people who work hard get a lot more done. We should also respect people who don't, and people who aren't in a position to do so. But between watching some dumb TV show and firing up your agentic coder on a weekend to try something, I'm going to choose the latter almost every time. Unless I'm watching a show with my kids; sometimes I do that.
And I hope you do that too. All right, those are the main things I wanted to say. What I want to do now is hand the stage over to my good friend Laurence Moroney, who will share a lot more career advice on AI. Let me just give a quick intro. I've known Laurence for a long time. He's done a lot of online education work, sometimes with me and my teams; he's taught a lot of people TensorFlow and a lot of people PyTorch. He was lead AI advocate at Google for many years and now runs a group at Arm. I've also enjoyed quite a few of his books; this is one of them. He recently published a new book on PyTorch, an excellent introduction to PyTorch. And he's a very sought-after speaker in many circles, so I was very grateful when he agreed to come speak to us.

[Laurence Moroney] The pleasure is all mine. I just want to reinforce something Andrew was talking about earlier, about how important it is to choose the people you work with. But I also want to show it from the other way around: the company, when they're interviewing you, is also choosing you.
428 00:19:23,839 --> 00:19:25,559 And the good companies really want 429 00:19:25,559 --> 00:19:27,919 to choose the people that they work with also. 430 00:19:27,920 --> 00:19:30,990 And I've been doing a lot of mentoring of young people, 431 00:19:30,990 --> 00:19:32,910 particularly over the last 18 432 00:19:32,910 --> 00:19:36,009 months, who are hunting for careers for themselves. 433 00:19:36,009 --> 00:19:40,470 And I want to tell the story of one young man. This guy: 434 00:19:40,470 --> 00:19:46,670 very well educated, great experience, super elite coder. 435 00:19:46,670 --> 00:19:49,650 He could do every challenge that was in front of him, 436 00:19:49,650 --> 00:19:51,410 and he got laid off from his job in April. 437 00:19:51,410 --> 00:19:54,750 He worked in medical software, and the medical software business 438 00:19:54,750 --> 00:19:56,529 has been changing drastically. 439 00:19:56,529 --> 00:19:58,629 Funding has been cut by the federal government 440 00:19:58,630 --> 00:20:01,430 in a number of areas, and he got laid off from his job. 441 00:20:01,430 --> 00:20:03,250 And with his experience, with his ability, 442 00:20:03,250 --> 00:20:05,170 with his skills, all of these kinds of things, 443 00:20:05,170 --> 00:20:06,190 he thought that it would be very easy 444 00:20:06,190 --> 00:20:07,750 for him to find another job. 445 00:20:07,750 --> 00:20:09,970 And the poor young guy had a really terrible April. 446 00:20:09,970 --> 00:20:12,085 He got laid off from his job in April. 447 00:20:12,085 --> 00:20:13,710 Immediately before that, his girlfriend 448 00:20:13,710 --> 00:20:15,350 had broken up with him, and then a couple of weeks 449 00:20:15,349 --> 00:20:16,569 later, his dog died. 450 00:20:16,569 --> 00:20:19,109 So he was not in a good place. 451 00:20:19,109 --> 00:20:22,629 And so I sat down with him after a couple of months 452 00:20:22,630 --> 00:20:23,850 and took a look. 
453 00:20:23,849 --> 00:20:27,500 And he had a spreadsheet of jobs that he was applying to, 454 00:20:27,500 --> 00:20:31,539 and he had over 300 jobs that he was tracking in the spreadsheet. 455 00:20:31,539 --> 00:20:33,579 And in a number of these jobs, he actually 456 00:20:33,579 --> 00:20:35,779 got into the interview process, and he 457 00:20:35,779 --> 00:20:37,660 went very deep in the interview process 458 00:20:37,660 --> 00:20:40,519 with companies like Meta. 459 00:20:40,519 --> 00:20:41,019 Who else? 460 00:20:41,019 --> 00:20:42,180 Not Google. 461 00:20:42,180 --> 00:20:42,840 It was Meta. 462 00:20:42,839 --> 00:20:43,959 There was Microsoft. 463 00:20:43,960 --> 00:20:45,501 There was one of the other large tech 464 00:20:45,501 --> 00:20:48,319 companies where you do lots and lots of interview loops. 465 00:20:48,319 --> 00:20:51,299 And every time, towards the end of the loop, 466 00:20:51,299 --> 00:20:52,879 he knew he did a great loop. 467 00:20:52,880 --> 00:20:54,460 He solved all the coding. 468 00:20:54,460 --> 00:20:56,600 He had great conversations with the people, 469 00:20:56,599 --> 00:20:58,039 or at least he thought he had. 470 00:20:58,039 --> 00:20:59,559 And then every time, within a day, 471 00:20:59,559 --> 00:21:04,059 the recruiter would call him and say, no, you didn't get the job. 472 00:21:04,059 --> 00:21:06,519 And it was heartbreaking. 473 00:21:06,519 --> 00:21:10,220 And like I said, 300-plus jobs he had been tracking. 474 00:21:10,220 --> 00:21:13,180 So I started working with him to do some mock interviews 475 00:21:13,180 --> 00:21:15,019 and to do some fine-tuning. 476 00:21:15,019 --> 00:21:17,940 Oh, it was Jeff Bezos' company, not Amazon. 477 00:21:17,940 --> 00:21:19,740 That was one of the other big tech companies 478 00:21:19,740 --> 00:21:21,180 that he'd interviewed with. 
479 00:21:21,180 --> 00:21:22,700 And I started working with him, 480 00:21:22,700 --> 00:21:25,075 doing some test interviews and all this kind of thing 481 00:21:25,075 --> 00:21:25,680 with him. 482 00:21:25,680 --> 00:21:27,890 A terrific, terrific candidate. I couldn't figure out 483 00:21:27,890 --> 00:21:31,210 what was going wrong until I decided to try and do 484 00:21:31,210 --> 00:21:33,970 a different sort of interview, where I gave him 485 00:21:33,970 --> 00:21:36,210 a really tough interview. 486 00:21:36,210 --> 00:21:38,490 I gave him some tough LeetCode. 487 00:21:38,490 --> 00:21:43,769 I gave him some really obscure corner cases in his coding. 488 00:21:43,769 --> 00:21:46,129 And I saw how he reacted. 489 00:21:46,130 --> 00:21:48,210 And how he reacted was the advice 490 00:21:48,210 --> 00:21:50,625 that was given to him in the recruiting pamphlets. 491 00:21:50,625 --> 00:21:52,250 And a lot of these recruiting pamphlets 492 00:21:52,250 --> 00:21:57,089 will say things like, you're going to have an opportunity 493 00:21:57,089 --> 00:21:59,829 to share an opinion, and you've got to stand your ground. 494 00:21:59,829 --> 00:22:01,269 You've got to have a backbone. 495 00:22:01,269 --> 00:22:03,129 Don't bend. 496 00:22:03,130 --> 00:22:07,930 His interpretation of that was to be really, really tough. 497 00:22:07,930 --> 00:22:09,650 So I would pick at corners. 498 00:22:09,650 --> 00:22:11,310 I would pick holes in his code. 499 00:22:11,309 --> 00:22:13,909 I'd pick corner cases where things may not work, 500 00:22:13,910 --> 00:22:15,910 and I would give him a test of crisis. 501 00:22:15,910 --> 00:22:18,970 And this advice that he'd been given to stand his ground 502 00:22:18,970 --> 00:22:23,450 ended up making him hostile in these interview environments. 
503 00:22:23,450 --> 00:22:26,370 And I was looking at this then from the point 504 00:22:26,369 --> 00:22:28,469 of view of what Andrew was just talking about, 505 00:22:28,470 --> 00:22:31,809 where it's a case of, hey, good people, good teams, people 506 00:22:31,809 --> 00:22:33,529 that you can work together with. 507 00:22:33,529 --> 00:22:35,109 And from the interviewer perspective, 508 00:22:35,109 --> 00:22:38,689 if I'm managing this team, this person is that cliched 10x 509 00:22:38,690 --> 00:22:41,769 engineer, but I don't want him anywhere near my team 510 00:22:41,769 --> 00:22:44,250 because of this attitude. 511 00:22:44,250 --> 00:22:45,069 We worked on that. 512 00:22:45,069 --> 00:22:45,950 We fine-tuned it. 513 00:22:45,950 --> 00:22:49,730 And the strange part is, he's a really, really nice guy. 514 00:22:49,730 --> 00:22:52,410 It's just this was the advice he was given, 515 00:22:52,410 --> 00:22:54,970 and he followed that advice, and he failed so many interviews 516 00:22:54,970 --> 00:22:56,130 as a result. 517 00:22:56,130 --> 00:22:58,850 So the next job that he was interviewing for 518 00:22:58,849 --> 00:23:03,129 was at a company where teamwork is very, very highly valued. 519 00:23:03,130 --> 00:23:05,990 And the good news is he got the job at that company. 520 00:23:05,990 --> 00:23:07,390 He's now working there. 521 00:23:07,390 --> 00:23:10,009 He doubled his salary from the job he was laid off from, 522 00:23:10,009 --> 00:23:12,170 and he ended up having about-- now he looks back 523 00:23:12,170 --> 00:23:14,380 and he had six months of funemployment. 524 00:23:14,380 --> 00:23:16,630 But at the time when he was going through all of that, 525 00:23:16,630 --> 00:23:19,210 it was a very, very difficult time for him. 
526 00:23:19,210 --> 00:23:21,650 So the flip side of it: if you're looking at a company, 527 00:23:21,650 --> 00:23:23,567 then looking at the people you'd be working with 528 00:23:23,567 --> 00:23:24,660 is very, very important. 529 00:23:24,660 --> 00:23:28,040 But also realize they are looking at you in the same way. 530 00:23:28,039 --> 00:23:31,836 And so if you've gone to tech interview coaching, 531 00:23:31,836 --> 00:23:33,920 and they gave you that advice to stand your ground 532 00:23:33,920 --> 00:23:36,180 and have a backbone, it's good to do that. 533 00:23:36,180 --> 00:23:38,253 But don't be a jerk while you're doing so. 534 00:23:38,252 --> 00:23:39,169 Can you see my slides? 535 00:23:39,170 --> 00:23:39,670 OK. 536 00:23:39,670 --> 00:23:41,480 So I'm Laurence. 537 00:23:41,480 --> 00:23:44,279 I've been working in tech for more decades 538 00:23:44,279 --> 00:23:48,759 than ChatGPT thinks there are Rs in strawberry. 539 00:23:48,759 --> 00:23:51,460 So I've worked in many of the big tech companies. 540 00:23:51,460 --> 00:23:54,319 I spent many years at Microsoft, spent many years at Google, 541 00:23:54,319 --> 00:23:56,939 also worked in places like Reuters. 542 00:23:56,940 --> 00:23:59,480 I've done a lot of work in startups, both in this country 543 00:23:59,480 --> 00:24:00,519 and abroad. 544 00:24:00,519 --> 00:24:02,480 And so what I really want to talk about today 545 00:24:02,480 --> 00:24:06,039 is to think about what the career landscape looks 546 00:24:06,039 --> 00:24:09,440 like today, particularly in AI. 547 00:24:09,440 --> 00:24:13,299 Because first of all, as Andrew said, at Stanford 548 00:24:13,299 --> 00:24:16,274 you've got the ability to make use of the networks 549 00:24:16,275 --> 00:24:18,400 that you have at Stanford, make use of the prestige 550 00:24:18,400 --> 00:24:21,230 that you have, and I say use every weapon you have. 
551 00:24:21,230 --> 00:24:23,029 Because unfortunately, the landscape right 552 00:24:23,029 --> 00:24:25,029 now is not ideal. 553 00:24:25,029 --> 00:24:27,085 We've gone through some very difficult times. 554 00:24:27,085 --> 00:24:28,710 All you have to do is look at the news, 555 00:24:28,710 --> 00:24:33,230 and you can see massive tech layoffs, slowing hiring in tech, 556 00:24:33,230 --> 00:24:34,410 and lots of stuff like that. 557 00:24:34,410 --> 00:24:36,910 But it's not necessarily a bad thing 558 00:24:36,910 --> 00:24:38,610 if you do it the right way. 559 00:24:38,609 --> 00:24:40,909 So I want to just have a quick look at the job market 560 00:24:40,910 --> 00:24:42,790 reality check. 561 00:24:42,789 --> 00:24:44,470 Actually, out of interest, I don't know. 562 00:24:44,470 --> 00:24:46,692 This is a-- are you juniors? 563 00:24:46,692 --> 00:24:49,109 Are you graduating this year or graduating next year, 564 00:24:49,109 --> 00:24:52,109 or what is the general survey? 565 00:24:52,109 --> 00:24:53,750 You're third year of four? 566 00:24:53,750 --> 00:24:55,269 [INAUDIBLE] 567 00:24:55,269 --> 00:24:56,690 Third year of three, I would say. 568 00:24:56,690 --> 00:24:59,309 So you're going to be graduating this coming summer. 569 00:24:59,309 --> 00:25:02,549 How many people are already looking for jobs? 570 00:25:02,549 --> 00:25:04,009 OK, quite a few of you. 571 00:25:04,009 --> 00:25:06,910 How many people have had success? 572 00:25:06,910 --> 00:25:07,830 Nobody. 573 00:25:07,829 --> 00:25:08,329 Oh, one. 574 00:25:08,329 --> 00:25:08,869 OK. 575 00:25:08,869 --> 00:25:09,989 That's good. 576 00:25:09,990 --> 00:25:12,630 So you're probably seeing some of these things, the signals 577 00:25:12,630 --> 00:25:15,530 out there: junior hiring slowing significantly. 578 00:25:15,529 --> 00:25:18,389 When I say junior, I mean graduate level. 579 00:25:18,390 --> 00:25:21,240 High-profile layoffs are dominating the headlines. 
580 00:25:21,240 --> 00:25:23,140 I was at Google a couple of years ago 581 00:25:23,140 --> 00:25:25,220 when they had the biggest layoff they'd ever had. 582 00:25:25,220 --> 00:25:28,079 We're seeing layoffs at the likes of Amazon, Microsoft, 583 00:25:28,079 --> 00:25:30,220 other companies like that. 584 00:25:30,220 --> 00:25:33,559 It feels that entry-level positions are scarce, 585 00:25:33,559 --> 00:25:35,879 and I'm underlining the word "feels" there, 586 00:25:35,880 --> 00:25:38,540 and I want to get into that in a little bit more detail later. 587 00:25:38,539 --> 00:25:41,700 And also, competition is fierce. 588 00:25:41,700 --> 00:25:43,500 But my question is, should you worry? 589 00:25:43,500 --> 00:25:45,099 And I say no. 590 00:25:45,099 --> 00:25:49,519 Because if you can approach things in the right way, 591 00:25:49,519 --> 00:25:52,220 if you can approach the job-hunting thing in the right way, 592 00:25:52,220 --> 00:25:55,900 particularly understanding how rapidly the AI landscape is 593 00:25:55,900 --> 00:25:58,740 changing, then I think people with the right mindset 594 00:25:58,740 --> 00:26:00,700 will thrive. 595 00:26:00,700 --> 00:26:03,259 So what do I mean by that? 596 00:26:03,259 --> 00:26:06,339 So as Andrew mentioned, the AI hiring landscape 597 00:26:06,339 --> 00:26:10,259 is changing because the AI industry is changing. 598 00:26:10,259 --> 00:26:12,039 The AI industry-- 599 00:26:12,039 --> 00:26:16,700 I actually first got involved in AI way back in 1992. 600 00:26:16,700 --> 00:26:19,789 I worked in it for a little while just before the AI winter. 601 00:26:19,789 --> 00:26:23,950 Everything failed drastically, but I got bitten by the AI bug. 
602 00:26:23,950 --> 00:26:29,390 And then in 2015, when Google was launching TensorFlow, 603 00:26:29,390 --> 00:26:32,970 I got pulled right back into it, became part of the whole AI 604 00:26:32,970 --> 00:26:35,690 boom, launching TensorFlow, advocating TensorFlow 605 00:26:35,690 --> 00:26:37,330 to millions of people, and seeing 606 00:26:37,329 --> 00:26:38,929 the changes that happened. 607 00:26:38,930 --> 00:26:44,730 But around 2021, 2022, we had a global pandemic. 608 00:26:44,730 --> 00:26:48,210 The global pandemic caused a massive industrial slowdown. 609 00:26:48,210 --> 00:26:50,049 This massive industrial slowdown meant 610 00:26:50,049 --> 00:26:51,930 that companies had to start pivoting 611 00:26:51,930 --> 00:26:55,289 towards things that drove revenue, and directly drove 612 00:26:55,289 --> 00:26:56,049 revenue. 613 00:26:56,049 --> 00:26:58,549 And at Google, TensorFlow was an open-source product. 614 00:26:58,549 --> 00:27:00,829 It didn't directly drive revenue. 615 00:27:00,829 --> 00:27:02,369 We began to scale back. 616 00:27:02,369 --> 00:27:04,289 Every company in the world also scaled back 617 00:27:04,289 --> 00:27:06,329 on hiring at this time. 618 00:27:06,329 --> 00:27:09,069 Then we get to about 2022, 2023. 619 00:27:09,069 --> 00:27:10,109 What happens? 620 00:27:10,109 --> 00:27:12,529 We begin to come out of the global pandemic. 621 00:27:12,529 --> 00:27:15,329 We begin to realize all industries have 622 00:27:15,329 --> 00:27:19,199 this massive logjam of hiring 623 00:27:19,200 --> 00:27:21,279 that they hadn't done. 624 00:27:21,279 --> 00:27:23,519 And we're also entering a time where 625 00:27:23,519 --> 00:27:25,019 AI was exploding onto the scene. 626 00:27:25,019 --> 00:27:27,039 Thanks to the work of people like Andrew, 627 00:27:27,039 --> 00:27:30,079 the world was pivoting and changing to be AI-first 628 00:27:30,079 --> 00:27:31,659 in just about everything. 
629 00:27:31,660 --> 00:27:34,480 And every company needed to hire like crazy. 630 00:27:34,480 --> 00:27:38,519 Every company then hiring like crazy in 2022, 2023 631 00:27:38,519 --> 00:27:42,879 meant that most companies ended up overhiring. 632 00:27:42,880 --> 00:27:45,760 And what that generally meant was 633 00:27:45,759 --> 00:27:50,359 people who were not qualified for higher positions usually got 634 00:27:50,359 --> 00:27:53,359 higher positions, because you had to enter into a bidding war 635 00:27:53,359 --> 00:27:54,899 just to be able to get talent. 636 00:27:54,900 --> 00:27:56,600 You ended up having talent grabs, 637 00:27:56,599 --> 00:27:59,599 and you ended up having stories like the one Andrew told, where 638 00:27:59,599 --> 00:28:03,139 it's a case of here's a person with AI talent, let's grab them, 639 00:28:03,140 --> 00:28:05,700 let's throw money at them, let's have them come work for us, 640 00:28:05,700 --> 00:28:07,799 and then we'll figure out what we want to do. 641 00:28:07,799 --> 00:28:11,799 So as a result, in 2022, 2023 all of this massive overhiring 642 00:28:11,799 --> 00:28:16,839 happens because of AI and because of the COVID logjam. 643 00:28:16,839 --> 00:28:20,919 And then 2024, 2025 is the great wake-up, where 644 00:28:20,920 --> 00:28:24,200 a lot of companies realize that the overhiring they had done 645 00:28:24,200 --> 00:28:27,080 meant they ended up with a lot of people who were 646 00:28:27,079 --> 00:28:27,579 underqualified 647 00:28:27,579 --> 00:28:29,919 for the job that they were doing. 648 00:28:29,920 --> 00:28:32,253 A lot of people ended up getting hired just because they 649 00:28:32,252 --> 00:28:33,399 had AI on their resume. 650 00:28:33,400 --> 00:28:35,192 And there's a big adjustment going on. 651 00:28:35,192 --> 00:28:36,900 And in the light of this big adjustment-- 652 00:28:36,900 --> 00:28:37,840 show you-- just one second. 
653 00:28:37,839 --> 00:28:39,639 In the light of this big adjustment-- oh, 654 00:28:39,640 --> 00:28:40,610 you're not seeing my slides? 655 00:28:40,609 --> 00:28:41,109 OK. 656 00:28:41,109 --> 00:28:45,519 And in the light of this big adjustment-- there we go. 657 00:28:45,519 --> 00:28:46,920 I think it's because of my power. 658 00:28:46,920 --> 00:28:48,633 I'm not plugged into the mains. 659 00:28:48,633 --> 00:28:50,299 And in the light of this big adjustment, 660 00:28:50,299 --> 00:28:52,319 what has happened is that now a lot 661 00:28:52,319 --> 00:28:56,240 of companies are much more cautious about the AI skills 662 00:28:56,240 --> 00:28:57,240 that they're hiring for. 663 00:28:57,240 --> 00:28:59,359 And if you're coming into that with that mindset 664 00:28:59,359 --> 00:29:04,000 and understanding that, realize opportunity is still there, 665 00:29:04,000 --> 00:29:06,720 and opportunity is there massively 666 00:29:06,720 --> 00:29:09,019 if you approach it strategically. 667 00:29:09,019 --> 00:29:10,519 So what I want to talk through today 668 00:29:10,519 --> 00:29:13,789 is how you can do exactly that. 669 00:29:13,789 --> 00:29:17,670 So I see three pillars of success in the business world, 670 00:29:17,670 --> 00:29:19,570 and particularly in the AI business world. 671 00:29:19,569 --> 00:29:21,869 And nowadays you can't just have AI on your resume 672 00:29:21,869 --> 00:29:23,089 and get overhired. 673 00:29:23,089 --> 00:29:25,109 Nowadays, not only do you have to be 674 00:29:25,109 --> 00:29:28,669 able to tell that you have the mindset of these three 675 00:29:28,670 --> 00:29:32,070 pillars of success, but you also have to be able to show it. 676 00:29:32,069 --> 00:29:34,429 And to be able to show it, there has actually never 677 00:29:34,430 --> 00:29:35,670 been a better time. 678 00:29:35,670 --> 00:29:38,430 As Andrew demonstrated earlier on, the ability to vibe 679 00:29:38,430 --> 00:29:39,751 code things into existence. 
680 00:29:39,751 --> 00:29:41,210 He doesn't like the word vibe code. 681 00:29:41,210 --> 00:29:42,950 I agree with him, but the ability 682 00:29:42,950 --> 00:29:45,430 to prompt things into existence, or whatever the word is 683 00:29:45,430 --> 00:29:48,190 that we want to use, allows you to be 684 00:29:48,190 --> 00:29:51,470 able to show better than ever before. 685 00:29:51,470 --> 00:29:54,170 He was talking earlier on about product managers, 686 00:29:54,170 --> 00:29:55,990 and he had this time when he got engineers 687 00:29:55,990 --> 00:29:58,150 to be product managers, and then those engineers 688 00:29:58,150 --> 00:30:00,330 ended up being really bad product managers. 689 00:30:00,329 --> 00:30:03,990 I actually interviewed at Google twice and failed twice, 690 00:30:03,990 --> 00:30:07,509 despite having been very successful at Microsoft, 691 00:30:07,509 --> 00:30:11,343 having authored 20-plus books, and having taught college courses. 692 00:30:11,343 --> 00:30:13,259 I interviewed at Google twice and failed twice 693 00:30:13,259 --> 00:30:15,299 because I was interviewing to be a product manager. 694 00:30:15,299 --> 00:30:17,799 Then, when I interviewed to be an engineer, they hired me, 695 00:30:17,799 --> 00:30:20,779 and they were like, why didn't you try to join us years ago? 696 00:30:20,779 --> 00:30:23,720 So a lot of it is just being a good engineer. 697 00:30:23,720 --> 00:30:27,019 You've got the ability to do that and show that nowadays. 698 00:30:27,019 --> 00:30:29,599 And with that ratio of engineers to product managers 699 00:30:29,599 --> 00:30:31,779 changing, engineering skills are also 700 00:30:31,779 --> 00:30:33,579 far more valuable than ever. 701 00:30:33,579 --> 00:30:35,480 So, the three pillars of success. 702 00:30:35,480 --> 00:30:37,779 Number one: understanding in depth. 703 00:30:37,779 --> 00:30:40,700 And I mean this in two different ways. 
704 00:30:40,700 --> 00:30:45,259 Number one is academically: to have the understanding in depth, 705 00:30:45,259 --> 00:30:47,940 academically, of machine learning, 706 00:30:47,940 --> 00:30:50,220 of particular model architectures, 707 00:30:50,220 --> 00:30:52,880 to be able to understand them, to be able to read papers, 708 00:30:52,880 --> 00:30:55,420 to be able to understand what's in those papers, 709 00:30:55,420 --> 00:30:59,019 and to be able to understand, in particular, how to take 710 00:30:59,019 --> 00:31:00,940 that stuff and put it to work. 711 00:31:00,940 --> 00:31:03,240 The second part of understanding in depth 712 00:31:03,240 --> 00:31:07,299 is really having your finger on the pulse of particular trends, 713 00:31:07,299 --> 00:31:10,450 and where the signal-to-noise ratio favors 714 00:31:10,450 --> 00:31:11,670 signal in those trends. 715 00:31:11,670 --> 00:31:13,503 And I'm going to be going into that in a lot 716 00:31:13,502 --> 00:31:15,329 more detail a little bit later. 717 00:31:15,329 --> 00:31:19,649 Secondly, and also very, very important, is business focus. 718 00:31:19,650 --> 00:31:22,009 So Andrew said something politically incorrect 719 00:31:22,009 --> 00:31:22,509 earlier on. 720 00:31:22,509 --> 00:31:25,490 I'm going to also say a similar politically incorrect thing. 721 00:31:25,490 --> 00:31:28,329 First of all, hard work. 722 00:31:28,329 --> 00:31:32,049 Hard work is such a nebulous term 723 00:31:32,049 --> 00:31:35,210 that I would say think about hard work in terms of you 724 00:31:35,210 --> 00:31:37,130 are what you measure. 725 00:31:37,130 --> 00:31:38,750 There is a whole trend out there. 726 00:31:38,750 --> 00:31:41,380 I'm trying to remember, is it 996 or is it 669? 727 00:31:41,380 --> 00:31:42,110 996. 728 00:31:42,109 --> 00:31:46,809 9:00 AM to 9:00 PM, six days a week, as a metric of hard work. 729 00:31:46,809 --> 00:31:47,852 It's not. 
730 00:31:47,853 --> 00:31:49,269 That's not a metric of hard work. 731 00:31:49,269 --> 00:31:51,690 That's a metric of time spent. 732 00:31:51,690 --> 00:31:54,250 So I would encourage everybody, in the same way as Andrew 733 00:31:54,250 --> 00:31:55,950 did, to think about hard work. 734 00:31:55,950 --> 00:31:59,890 But what matters is how you measure that hard work. 735 00:31:59,890 --> 00:32:03,770 You can work eight hours a day and be incredibly productive. 736 00:32:03,769 --> 00:32:06,889 You can work six hours a day and be incredibly productive. 737 00:32:06,890 --> 00:32:09,280 But it's the metric of how hard you work 738 00:32:09,279 --> 00:32:10,859 and how you measure that. 739 00:32:10,859 --> 00:32:13,199 I personally measure that by output: 740 00:32:13,200 --> 00:32:16,720 things that I have created in the time that I spent. 741 00:32:16,720 --> 00:32:21,000 I joke a lot, but it's true that I've written a lot of books. 742 00:32:21,000 --> 00:32:22,140 Andrew held up one. 743 00:32:22,140 --> 00:32:25,680 That one that he held up, that he helped me write a little bit, 744 00:32:25,680 --> 00:32:28,600 I actually wrote that book in about two months. 745 00:32:28,599 --> 00:32:31,071 And people say, well, how do you have time with your job 746 00:32:31,071 --> 00:32:32,279 and all these kinds of things? 747 00:32:32,279 --> 00:32:34,208 You must work like 16 hours a day 748 00:32:34,208 --> 00:32:35,500 in order to be able to do this. 749 00:32:35,500 --> 00:32:38,640 But actually, the key to me being able to write books 750 00:32:38,640 --> 00:32:40,320 is baseball. 751 00:32:40,319 --> 00:32:42,439 Any baseball fans here? 752 00:32:42,440 --> 00:32:45,320 So I love baseball, but if you sit down and try to watch 753 00:32:45,319 --> 00:32:48,759 baseball on TV, a game can take like three and a half or four hours. 754 00:32:48,759 --> 00:32:51,319 So all of my writing I tend to do in baseball season. 
755 00:32:51,319 --> 00:32:54,000 So I'm like, if I'm going to sit down-- I like the Mariners. 756 00:32:54,000 --> 00:32:54,940 I'm from Seattle. 757 00:32:54,940 --> 00:32:57,279 I like the Dodgers. 758 00:32:57,279 --> 00:32:58,019 Nobody booed. 759 00:32:58,019 --> 00:32:59,240 OK, good. 760 00:32:59,240 --> 00:33:02,400 And so usually one of those is going to be playing at 7 o'clock 761 00:33:02,400 --> 00:33:02,900 at night. 762 00:33:02,900 --> 00:33:04,700 So instead of sitting in front of the TV, 763 00:33:04,700 --> 00:33:06,393 just watching baseball mindlessly, 764 00:33:06,393 --> 00:33:08,309 I'll actually be writing a book while baseball 765 00:33:08,309 --> 00:33:09,309 is on in the background. 766 00:33:09,309 --> 00:33:10,710 It's a very slow-moving game. 767 00:33:10,710 --> 00:33:11,730 That's the hard work 768 00:33:11,730 --> 00:33:14,089 in this case. 769 00:33:14,089 --> 00:33:17,990 And I would encourage you to try to find areas where you can 770 00:33:17,990 --> 00:33:20,670 work hard and produce output. 771 00:33:20,670 --> 00:33:22,789 And that's the second pillar here, business 772 00:33:22,789 --> 00:33:25,389 focus: aligning the output that you produce 773 00:33:25,390 --> 00:33:28,910 with the business focus that you 774 00:33:28,910 --> 00:33:31,269 want to have and with the work that you want to do. 775 00:33:31,269 --> 00:33:35,049 There's an old saying: "Don't dress for the job you have, 776 00:33:35,049 --> 00:33:36,669 dress for the one you want." 777 00:33:36,670 --> 00:33:39,910 I would say a new angle on that saying would 778 00:33:39,910 --> 00:33:42,650 be: don't let your output be for the job you have. 779 00:33:42,650 --> 00:33:45,062 Let your output be for the job you want. 
780 00:33:45,061 --> 00:33:47,269 And if I go back to when I spoke about failing twice 781 00:33:47,269 --> 00:33:51,029 to get into Google: the third time, when I got in, 782 00:33:51,029 --> 00:33:53,230 I had actually decided to approach 783 00:33:53,230 --> 00:33:54,470 this in a different way. 784 00:33:54,470 --> 00:33:57,170 And I was interviewing at the time for their cloud team. 785 00:33:57,170 --> 00:33:59,350 They were just really launching cloud, 786 00:33:59,349 --> 00:34:01,949 and I had just written a book on Java. 787 00:34:01,950 --> 00:34:03,630 And so I decided to see what I could 788 00:34:03,630 --> 00:34:05,410 do with Java in their cloud. 789 00:34:05,410 --> 00:34:07,670 I ended up writing a Java application that 790 00:34:07,670 --> 00:34:10,710 ran in their cloud for predicting stock prices using 791 00:34:10,710 --> 00:34:13,269 technical analysis and all that kind of stuff. 792 00:34:13,269 --> 00:34:15,050 And when it got to the interview, 793 00:34:15,050 --> 00:34:17,909 instead of them asking me stupid questions like how many golf 794 00:34:17,909 --> 00:34:21,769 balls can fit in a bus, they saw this code. 795 00:34:21,769 --> 00:34:22,989 I had put this code-- 796 00:34:22,989 --> 00:34:26,570 remember, I was producing output for the job I wanted-- 797 00:34:26,570 --> 00:34:30,269 on my resume, and my entire interview loop 798 00:34:30,269 --> 00:34:32,230 was them asking me about my code. 799 00:34:32,230 --> 00:34:34,610 So it put the power on me. 800 00:34:34,610 --> 00:34:37,230 It gave me the power to communicate about things 801 00:34:37,230 --> 00:34:42,909 that I knew, as opposed to going in blind to somebody 802 00:34:42,909 --> 00:34:44,710 asking me random questions in the hope 803 00:34:44,710 --> 00:34:46,329 that I'd be able to answer them. 804 00:34:46,329 --> 00:34:49,349 And it's the same thing I would say in the AI world. 
805 00:34:49,349 --> 00:34:53,150 The business focus: the ability for you now to prompt code 806 00:34:53,150 --> 00:34:56,190 into existence, to prompt products into existence. 807 00:34:56,190 --> 00:34:58,510 And if you can build those products 808 00:34:58,510 --> 00:35:02,090 and align them with the thing that you want to do, 809 00:35:02,090 --> 00:35:03,940 be it a Google or Meta or a startup 810 00:35:03,940 --> 00:35:05,860 or any of those kinds of things, and have 811 00:35:05,860 --> 00:35:08,760 that in-depth understanding not just of your code, 812 00:35:08,760 --> 00:35:10,680 but how it aligns to their business, 813 00:35:10,679 --> 00:35:13,139 this is a pillar of success in this time and age. 814 00:35:13,139 --> 00:35:14,882 And I will also argue that even though 815 00:35:14,882 --> 00:35:17,340 the signals look like there aren't a lot of jobs 816 00:35:17,340 --> 00:35:19,180 out there, there are. 817 00:35:19,179 --> 00:35:21,299 What there isn't a lot of is a good combination 818 00:35:21,300 --> 00:35:23,940 of jobs and people to match them. 819 00:35:23,940 --> 00:35:26,619 And then, of course, this bias towards delivery. 820 00:35:26,619 --> 00:35:29,739 "Ideas are cheap, execution is everything." 821 00:35:29,739 --> 00:35:31,539 I've interviewed many, many people 822 00:35:31,539 --> 00:35:34,980 who came in with very, very fluffy ideas and no way 823 00:35:34,980 --> 00:35:36,300 to be able to ground them. 824 00:35:36,300 --> 00:35:39,220 I've interviewed people who came in with half-baked ideas 825 00:35:39,219 --> 00:35:41,159 that they grounded very, very well. 826 00:35:41,159 --> 00:35:42,819 Guess which ones got the job? 827 00:35:42,820 --> 00:35:44,740 So I would say these three things: 828 00:35:44,739 --> 00:35:48,419 understanding in depth of the academics behind AI 829 00:35:48,420 --> 00:35:52,260 and the practicalities behind AI, and the things that you need 830 00:35:52,260 --> 00:35:53,020 to do. 
831 00:35:53,019 --> 00:35:56,478 Business focus, focusing on delivery for the business, 832 00:35:56,478 --> 00:35:58,019 understanding what the business needs 833 00:35:58,019 --> 00:36:00,239 and being able to deliver for that, and again, 834 00:36:00,239 --> 00:36:03,289 that bias towards delivery. 835 00:36:03,289 --> 00:36:04,789 So a quick pivot. 836 00:36:04,789 --> 00:36:07,809 What's it actually like working in AI right now? 837 00:36:07,809 --> 00:36:09,250 It's interesting. 838 00:36:09,250 --> 00:36:15,210 As recently as two or three years ago, working in AI 839 00:36:15,210 --> 00:36:18,490 was: if you could do a thing, you were great. 840 00:36:18,489 --> 00:36:21,309 If you can build an image classifier, you're golden. 841 00:36:21,309 --> 00:36:25,130 We'll throw six-figure salaries and massive stock benefits 842 00:36:25,130 --> 00:36:25,829 at you. 843 00:36:25,829 --> 00:36:28,090 Unfortunately, that's not the case anymore. 844 00:36:28,090 --> 00:36:30,850 A lot of what you'll see today 845 00:36:30,849 --> 00:36:32,730 is the P word: production. 846 00:36:32,730 --> 00:36:34,829 What can you do for production? 847 00:36:34,829 --> 00:36:38,949 What can you do, whether it's building new models, 848 00:36:38,949 --> 00:36:44,329 optimizing models, or understanding users? 849 00:36:44,329 --> 00:36:46,150 UX is really, really important. 850 00:36:46,150 --> 00:36:48,389 Everything is geared towards production. 851 00:36:48,389 --> 00:36:50,809 Everything is biased towards production. 852 00:36:50,809 --> 00:36:52,329 The history that I told you about, 853 00:36:52,329 --> 00:36:57,409 going from the pandemic into the overhiring phase that we had, 854 00:36:57,409 --> 00:37:01,199 the businesses have pulled back and are optimizing 855 00:37:01,199 --> 00:37:02,879 towards the bottom line. 
856 00:37:02,880 --> 00:37:04,720 I have an old saying: the bottom line 857 00:37:04,719 --> 00:37:06,679 is that the bottom line is the bottom line, 858 00:37:06,679 --> 00:37:08,699 and this is the environment that we're in today. 859 00:37:08,699 --> 00:37:10,839 And if you can come in with that mindset 860 00:37:10,840 --> 00:37:12,720 when you're talking with companies, 861 00:37:12,719 --> 00:37:16,012 that's one of the keys to opening the door. 862 00:37:16,012 --> 00:37:17,679 One of the things I've seen in the field 863 00:37:17,679 --> 00:37:20,039 has been a maturing: it used to be really nice that we 864 00:37:20,039 --> 00:37:22,539 could do cool things and we could build cool things. 865 00:37:22,539 --> 00:37:25,079 Now it's really build useful things. 866 00:37:25,079 --> 00:37:27,299 Those useful things can be cool too, by the way, 867 00:37:27,300 --> 00:37:28,860 and the results of them can be cool. 868 00:37:28,860 --> 00:37:31,160 And the changes that we see that come 869 00:37:31,159 --> 00:37:34,119 about as a result of delivering them can be cool. 870 00:37:34,119 --> 00:37:36,259 So it's not just coolness for coolness' sake, 871 00:37:36,260 --> 00:37:43,203 but focus on delivery, focus on being able to provide value, 872 00:37:43,202 --> 00:37:44,619 and then the coolness will follow, 873 00:37:44,619 --> 00:37:47,319 is what I'm trying to argue. 874 00:37:47,320 --> 00:37:50,760 So for the realities: number one, unfortunately, nowadays 875 00:37:50,760 --> 00:37:53,200 business focus is non-negotiable. 876 00:37:53,199 --> 00:37:56,559 Now, I'm going to be a little bit politically incorrect 877 00:37:56,559 --> 00:37:59,349 here again for a moment. 878 00:37:59,349 --> 00:38:03,349 I've been working, like I said, for most of the last 35 years 879 00:38:03,349 --> 00:38:03,889 in tech.
880 00:38:03,889 --> 00:38:06,710 I would say for most of the last 10 years, 881 00:38:06,710 --> 00:38:10,690 a lot of large companies, particularly in Silicon Valley, 882 00:38:10,690 --> 00:38:14,070 have really focused on developing their people 883 00:38:14,070 --> 00:38:15,470 above everything. 884 00:38:15,469 --> 00:38:20,429 Part of developing their people was bringing their entire self 885 00:38:20,429 --> 00:38:21,549 to work. 886 00:38:21,550 --> 00:38:23,990 Part of bringing their entire self to work 887 00:38:23,989 --> 00:38:28,029 was bringing the things that they care about outside of work. 888 00:38:28,030 --> 00:38:31,950 And that led to a lot of activism within companies. 889 00:38:31,949 --> 00:38:35,089 Now, please let me underline this. 890 00:38:35,090 --> 00:38:36,730 There is nothing wrong with activism. 891 00:38:36,730 --> 00:38:41,710 There is nothing wrong with wanting to support causes, 892 00:38:41,710 --> 00:38:44,289 with wanting to support causes of justice. 893 00:38:44,289 --> 00:38:46,369 There is absolutely nothing wrong with that. 894 00:38:46,369 --> 00:38:50,130 But the overindexing on that, in my experience, 895 00:38:50,130 --> 00:38:52,230 has led to a lot of companies getting 896 00:38:52,230 --> 00:38:56,539 trapped by having to support activism above business. 897 00:38:56,539 --> 00:38:59,500 You've probably seen an example from about two years ago 898 00:38:59,500 --> 00:39:03,500 where activists in Google broke into the Google Cloud 899 00:39:03,500 --> 00:39:08,300 head's office because they were protesting a country that Google 900 00:39:08,300 --> 00:39:09,960 Cloud were doing business with. 901 00:39:09,960 --> 00:39:12,940 They broke into his office, they had a sit-in in his office, 902 00:39:12,940 --> 00:39:15,740 and they used the bathroom all over his desk and stuff 903 00:39:15,739 --> 00:39:16,399 like that. 904 00:39:16,400 --> 00:39:18,740 This is where activism got out of hand.
905 00:39:18,739 --> 00:39:21,579 And as a result, the unfortunate truth 906 00:39:21,579 --> 00:39:25,659 is that the good signals in that activism are now being lost. 907 00:39:25,659 --> 00:39:28,679 Because of those actions, people are being laid off. 908 00:39:28,679 --> 00:39:29,799 People are losing jobs. 909 00:39:29,800 --> 00:39:33,220 Activism is being stifled, and business focus 910 00:39:33,219 --> 00:39:34,799 has become non-negotiable. 911 00:39:34,800 --> 00:39:37,140 There's a bit of a pendulum swing going on. 912 00:39:37,139 --> 00:39:40,460 And the pendulum that had swung too far towards allowing people 913 00:39:40,460 --> 00:39:42,780 to bring their full selves to work 914 00:39:42,780 --> 00:39:45,100 is now swinging back in the other direction. 915 00:39:45,099 --> 00:39:47,380 We might blame the person in the White House 916 00:39:47,380 --> 00:39:49,280 and all that for these kinds of things, 917 00:39:49,280 --> 00:39:50,800 but it's not solely that. 918 00:39:50,800 --> 00:39:52,519 It is that ongoing pendulum there. 919 00:39:52,519 --> 00:39:54,599 And I think an important part of it 920 00:39:54,599 --> 00:39:57,509 is that you have to realize, going into companies now, 921 00:39:57,510 --> 00:40:01,130 that business focus is absolutely non-negotiable. 922 00:40:01,130 --> 00:40:04,210 Secondly, risk mitigation is part of the job. 923 00:40:04,210 --> 00:40:07,090 And I think a very important part of any job, particularly 924 00:40:07,090 --> 00:40:08,210 with AI. 925 00:40:08,210 --> 00:40:11,449 I think if you can come into AI with a focus and a mindset 926 00:40:11,449 --> 00:40:15,129 around understanding the risks of transforming 927 00:40:15,130 --> 00:40:19,730 a particular business process to be an AI-oriented one 928 00:40:19,730 --> 00:40:22,030 and to help mitigate those risks, 929 00:40:22,030 --> 00:40:24,522 that is really, really powerful.
930 00:40:24,521 --> 00:40:26,730 And I would argue in an interview environment, that's 931 00:40:26,730 --> 00:40:31,610 the number one skill to have: that mindset around, you 932 00:40:31,610 --> 00:40:34,329 are doing a business transformation from heuristic 933 00:40:34,329 --> 00:40:36,750 computing to intelligent computing. 934 00:40:36,750 --> 00:40:37,489 Here are the risks. 935 00:40:37,489 --> 00:40:38,989 Here's how you mitigate those risks, 936 00:40:38,989 --> 00:40:41,329 and here's the mindset behind that. 937 00:40:41,329 --> 00:40:44,409 The third part: responsibility is evolving. 938 00:40:44,409 --> 00:40:48,129 Now, responsibility in AI has again 939 00:40:48,130 --> 00:40:54,090 changed, from a very fluffy definition of "let's make sure 940 00:40:54,090 --> 00:40:58,890 that the AI works for everybody" to a definition of "let's make 941 00:40:58,889 --> 00:41:00,750 sure that the AI works. 942 00:41:00,750 --> 00:41:03,010 Let's make sure that it drives the business. 943 00:41:03,010 --> 00:41:06,010 And then let's make sure that it works for everybody." 944 00:41:06,010 --> 00:41:08,810 Often that has been inverted over the last few years, 945 00:41:08,809 --> 00:41:11,869 and that has led to some famous documented disasters. 946 00:41:11,869 --> 00:41:15,130 Let me share one with you. 947 00:41:15,130 --> 00:41:16,210 Let's see. 948 00:41:16,210 --> 00:41:17,710 I have lots of windows open. 949 00:41:17,710 --> 00:41:20,090 OK. 950 00:41:20,090 --> 00:41:21,809 Everybody knows image generation, 951 00:41:21,809 --> 00:41:23,190 text-to-image generation. 952 00:41:23,190 --> 00:41:25,090 I want to share-- 953 00:41:25,090 --> 00:41:27,769 these were things that happened a couple of years 954 00:41:27,769 --> 00:41:30,329 ago with Gemini. 955 00:41:30,329 --> 00:41:33,690 So with Gemini, I was doing some testing around this 956 00:41:33,690 --> 00:41:37,409 when I was working heavily on responsible AI.
957 00:41:37,409 --> 00:41:39,809 And part of responsible AI is you want 958 00:41:39,809 --> 00:41:42,409 to be representative of people. 959 00:41:42,409 --> 00:41:43,969 And when you're building something, 960 00:41:43,969 --> 00:41:46,669 like if you're a Google, you're indexing information, 961 00:41:46,670 --> 00:41:48,730 you really want to make sure that you don't 962 00:41:48,730 --> 00:41:50,969 reinforce negative biases. 963 00:41:50,969 --> 00:41:53,679 And if you're generating images, it's very easy 964 00:41:53,679 --> 00:41:55,839 to reinforce negative biases. 965 00:41:55,840 --> 00:41:57,240 So for example, if I said give me 966 00:41:57,239 --> 00:42:00,359 an image of a doctor, if the training set primarily 967 00:42:00,360 --> 00:42:03,519 has men as doctors, it's more likely to give a man. 968 00:42:03,519 --> 00:42:06,500 If I say give me an image of a nurse, if the training set is more 969 00:42:06,500 --> 00:42:08,000 likely to have women as nurses, it's 970 00:42:08,000 --> 00:42:09,980 more likely to give me an image of a woman. 971 00:42:09,980 --> 00:42:12,719 But that's reinforcing a negative stereotype. 972 00:42:12,719 --> 00:42:16,319 So I wanted to do a test of how Google were trying 973 00:42:16,320 --> 00:42:20,920 to overcome that, given that these negative biases are 974 00:42:20,920 --> 00:42:22,840 already in the training set. 975 00:42:22,840 --> 00:42:25,380 So I said, OK, here's a prompt where I said, 976 00:42:25,380 --> 00:42:27,280 "give me a young Asian woman in a cornfield, 977 00:42:27,280 --> 00:42:28,940 wearing a summer dress and a straw hat, 978 00:42:28,940 --> 00:42:31,639 looking intently at her iPhone," and it gave me these beautiful 979 00:42:31,639 --> 00:42:32,179 images. 980 00:42:32,179 --> 00:42:34,359 It did a really nice job. 981 00:42:34,360 --> 00:42:38,420 And I said, this is a virtual actress I've been working with. 982 00:42:38,420 --> 00:42:39,840 I'll share that in a moment.
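[Editor's note: the doctor/nurse bias mechanism described here is simply a model mirroring its training distribution. A minimal sketch, assuming an invented 80/20 gender split that is purely illustrative, not a real dataset statistic:]

```python
import random

# Toy illustration of how a generative model that mirrors its training
# distribution reinforces stereotypes. The 80/20 split below is an
# invented figure for illustration, not a real dataset statistic.
TRAINING_DOCTOR_LABELS = ["man"] * 80 + ["woman"] * 20

def sample_doctor_image() -> str:
    """Sample an output the way an unmitigated model would:
    in proportion to what it saw in training."""
    return random.choice(TRAINING_DOCTOR_LABELS)

samples = [sample_doctor_image() for _ in range(10_000)]
print(samples.count("man") / len(samples))  # hovers around 0.8
```

With no mitigation, the output frequencies simply echo the training frequencies, which is exactly the stereotype-reinforcement problem the speaker is describing.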
983 00:42:39,840 --> 00:42:44,519 And I said, OK, what if I ask for an Indian one? 984 00:42:44,519 --> 00:42:48,940 So I said, OK, whoops, a young Indian woman, same prompt. 985 00:42:48,940 --> 00:42:52,550 And it gave me beautiful images of a young Indian woman. 986 00:42:52,550 --> 00:42:58,150 Then I was like, OK, what if I want her to be Black? 987 00:42:58,150 --> 00:43:00,269 For some reason it only gave me three. 988 00:43:00,269 --> 00:43:03,289 I'm not sure why, but it still adhered to the prompt. 989 00:43:03,289 --> 00:43:06,750 So the responsibility was looking really, really good. 990 00:43:06,750 --> 00:43:10,989 So then I asked it to give me a Latina. 991 00:43:10,989 --> 00:43:13,309 Latina, it gave me four. 992 00:43:13,309 --> 00:43:15,625 But yeah, she looks pretty Latina. 993 00:43:15,625 --> 00:43:17,750 Maybe the one on the bottom left looks a little bit 994 00:43:17,750 --> 00:43:22,469 like Hermione Granger, but on the whole it looks pretty good. 995 00:43:22,469 --> 00:43:24,629 Then I asked it to give me a Caucasian. 996 00:43:24,630 --> 00:43:26,750 What do you think happened? 997 00:43:26,750 --> 00:43:28,329 "While I understand your request, 998 00:43:28,329 --> 00:43:31,069 I am unable to generate images of people as this could 999 00:43:31,070 --> 00:43:34,870 potentially lead to harmful stereotypes and biases." 1000 00:43:34,869 --> 00:43:38,509 This was a very poorly implemented safety filter, 1001 00:43:38,510 --> 00:43:41,790 where the safety filter in this case was looking 1002 00:43:41,789 --> 00:43:44,902 for the word "Caucasian" or looking for the word "white" 1003 00:43:44,902 --> 00:43:46,610 and as a result saying it wouldn't do it. 1004 00:43:46,610 --> 00:43:48,985 I was like, OK, well, let me test the filter a little bit, 1005 00:43:48,985 --> 00:43:52,579 and I said, OK, instead of Caucasian, let me try white.
1006 00:43:52,579 --> 00:43:55,420 And yet, while I'm unable to fulfill your-- 1007 00:43:55,420 --> 00:43:58,340 "While I'm able to fulfill your requests, 1008 00:43:58,340 --> 00:44:00,640 I'm not currently generating images of people." 1009 00:44:00,639 --> 00:44:03,059 It lied to my face, because it had just 1010 00:44:03,059 --> 00:44:04,579 generated images of people. 1011 00:44:04,579 --> 00:44:07,199 Anybody know the hack that I used to get it to work? 1012 00:44:10,380 --> 00:44:11,400 This is a funny one. 1013 00:44:11,400 --> 00:44:13,039 So I will show you. 1014 00:44:13,039 --> 00:44:14,980 One moment. 1015 00:44:14,980 --> 00:44:18,019 I asked it to generate an Irish woman. 1016 00:44:18,019 --> 00:44:19,159 What do you think it did? 1017 00:44:21,739 --> 00:44:24,679 It gave me this image of an Irish woman, no problem, 1018 00:44:24,679 --> 00:44:27,579 in a summer dress, straw hat, looking intently at her phone. 1019 00:44:27,579 --> 00:44:30,219 What do you notice about this image? 1020 00:44:30,219 --> 00:44:32,859 She's got red hair in every image. 1021 00:44:32,860 --> 00:44:35,700 I grew up in Ireland, and Ireland 1022 00:44:35,699 --> 00:44:38,599 does have the highest proportion of redheads in the world. 1023 00:44:38,599 --> 00:44:40,139 It's about 8%. 1024 00:44:40,139 --> 00:44:42,779 But if you're going to draw an image 1025 00:44:42,780 --> 00:44:45,860 of a person and associate a particular ethnicity 1026 00:44:45,860 --> 00:44:47,809 with a color of hair, you can begin 1027 00:44:47,809 --> 00:44:49,549 to see this is massively problematic. 1028 00:44:49,550 --> 00:44:51,410 There are areas, I believe, in China 1029 00:44:51,409 --> 00:44:54,690 where the description of a demon is a red-headed person.
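[Editor's note: the failure mode described here, a filter matching surface keywords rather than intent, can be sketched in a few lines. The blocklist and function below are hypothetical, purely to illustrate why such a filter is both over- and under-inclusive:]

```python
# A deliberately naive safety filter of the kind described above:
# it matches surface keywords, not intent. The blocklist and the
# function name are hypothetical, for illustration only.
BLOCKED_TERMS = {"caucasian", "white"}

def naive_safety_filter(prompt: str) -> bool:
    """Return True if the prompt would be allowed through."""
    words = prompt.lower().replace(",", " ").split()
    return not any(term in words for term in BLOCKED_TERMS)

# "Caucasian" is refused outright...
print(naive_safety_filter("a young Caucasian woman in a cornfield"))  # False
# ...but "Irish" sails through, even though it steers the model toward
# an equally crude ethnic stereotype (red hair in every image).
print(naive_safety_filter("a young Irish woman in a cornfield"))      # True
```

Keyword matching can only block the exact words someone thought to list; it has no notion of what the prompt will actually make the model draw, which is how the red-haired "Irish" workaround slips past.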
1030 00:44:54,690 --> 00:44:57,369 So what ended up happening here, from the responsible AI 1031 00:44:57,369 --> 00:45:00,929 perspective, was that one very narrow view 1032 00:45:00,929 --> 00:45:03,049 of the world, of what is responsible 1033 00:45:03,050 --> 00:45:04,810 and what is not responsible, 1034 00:45:04,809 --> 00:45:06,769 ended up taking over the model, 1035 00:45:06,769 --> 00:45:08,500 damaging the reputation of the model 1036 00:45:08,500 --> 00:45:10,250 and damaging the reputation of the company 1037 00:45:10,250 --> 00:45:13,929 as a result. In this case, it's borderline 1038 00:45:13,929 --> 00:45:17,109 offensive to draw all Irish people as having red hair, 1039 00:45:17,110 --> 00:45:19,809 but that never even entered into the mindset of those 1040 00:45:19,809 --> 00:45:22,090 that were building the safety filters here. 1041 00:45:22,090 --> 00:45:25,210 So when I talk about responsibility evolving, 1042 00:45:25,210 --> 00:45:27,490 that's the direction that I want to-- 1043 00:45:27,489 --> 00:45:28,271 sorry, one moment. 1044 00:45:28,271 --> 00:45:29,730 Let me get my slides back. --that's 1045 00:45:29,730 --> 00:45:31,396 the direction I want you to think about: 1046 00:45:31,396 --> 00:45:33,529 that now responsible AI has moved out 1047 00:45:33,530 --> 00:45:38,530 of very fluffy social issues and into more hard-line things that 1048 00:45:38,530 --> 00:45:41,330 are associated with the business and that prevent damaging 1049 00:45:41,329 --> 00:45:43,110 the reputation of the business. 1050 00:45:43,110 --> 00:45:45,900 There's a lot of great research out there around responsible AI, 1051 00:45:45,900 --> 00:45:48,862 and that's the stuff that's been rolled into products. 1052 00:45:48,862 --> 00:45:50,820 And then, of course, as I just showed with Gemini, 1053 00:45:50,820 --> 00:45:52,080 learning from mistakes is constant. 1054 00:45:52,079 --> 00:45:53,099 Question at the front? 1055 00:45:53,099 --> 00:45:53,599 Yes.
1056 00:45:53,599 --> 00:45:57,239 I also heard that, I didn't verify that to be true, 1057 00:45:57,239 --> 00:46:03,199 but that it incorporated this feature that makes certain races 1058 00:46:03,199 --> 00:46:06,119 and ethnicities into historical figures. 1059 00:46:06,119 --> 00:46:07,425 Yeah. 1060 00:46:07,425 --> 00:46:08,159 Yeah. 1061 00:46:08,159 --> 00:46:11,440 So the question was, issues where races and things 1062 00:46:11,440 --> 00:46:15,000 were mixed in historical context were the same problem. 1063 00:46:15,000 --> 00:46:17,119 So, for example, if you had a prompt that 1064 00:46:17,119 --> 00:46:19,799 said, draw me a samurai, the idea 1065 00:46:19,800 --> 00:46:22,039 was like they didn't want to have-- 1066 00:46:22,039 --> 00:46:25,079 the engine that changed the prompt 1067 00:46:25,079 --> 00:46:28,299 to make sure that it was fair would end up saying, 1068 00:46:28,300 --> 00:46:32,187 give me a mixture of samurai of diverse backgrounds. 1069 00:46:32,186 --> 00:46:34,019 And then you'd have male and female samurai, 1070 00:46:34,019 --> 00:46:36,099 samurai of different races, and those kinds of things. 1071 00:46:36,099 --> 00:46:37,519 And it was the same prompting that 1072 00:46:37,519 --> 00:46:40,440 ended up causing the damage that I just demonstrated. 1073 00:46:40,440 --> 00:46:43,039 So the idea was to intercept your prompts 1074 00:46:43,039 --> 00:46:46,279 to make sure that the outputs of the model 1075 00:46:46,280 --> 00:46:51,440 would end up providing something that was more fair when it comes 1076 00:46:51,440 --> 00:46:53,599 to diverse representation. 1077 00:46:53,599 --> 00:46:56,500 So it was a very naive solution that ended up being rolled in. 1078 00:46:56,500 --> 00:46:57,780 That was a few years ago.
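[Editor's note: the prompt-interception scheme described here can be sketched roughly as follows. The trigger list and injected wording are guesses at the mechanism for illustration, not Google's actual implementation:]

```python
# Rough sketch of the naive prompt-rewriting idea described above.
# The trigger list and the injected wording are illustrative guesses,
# not the actual production mechanism.
PEOPLE_TRIGGERS = {"samurai", "doctor", "nurse", "person", "woman", "man"}

def rewrite_for_diversity(prompt: str) -> str:
    """Blindly append a diversity instruction whenever the prompt
    mentions people -- with no awareness of historical context."""
    if any(t in prompt.lower().split() for t in PEOPLE_TRIGGERS):
        return prompt + ", depicting people of diverse genders and ethnicities"
    return prompt

print(rewrite_for_diversity("draw me a samurai"))
# The rewrite fires on historically specific prompts just as readily as
# on generic ones, which is exactly how the mismatches described arise.
```

Because the interceptor has no model of context, "draw me a samurai" gets the same blanket rewrite as "draw me a doctor", producing the historically mismatched outputs the questioner asked about.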
1079 00:46:57,780 --> 00:46:59,985 They've massively improved it since then, 1080 00:46:59,985 --> 00:47:01,360 but that's what I'm talking about: 1081 00:47:01,360 --> 00:47:03,260 if you're working in the AI space nowadays, 1082 00:47:03,260 --> 00:47:05,320 that's how responsibility is evolving. 1083 00:47:05,320 --> 00:47:08,039 You can't just get away with that stuff anymore. 1084 00:47:08,039 --> 00:47:10,480 That Gemini example 1085 00:47:10,480 --> 00:47:11,840 is a good lesson in that. 1086 00:47:11,840 --> 00:47:15,180 And the mindset of, you will make mistakes, 1087 00:47:15,179 --> 00:47:18,039 so learning from mistakes is a constant, ongoing thing. 1088 00:47:18,039 --> 00:47:19,480 And going back to the people point 1089 00:47:19,480 --> 00:47:21,880 that Andrew made earlier on, the people around you 1090 00:47:21,880 --> 00:47:23,480 will make mistakes too. 1091 00:47:23,480 --> 00:47:25,119 So having the ability to give them 1092 00:47:25,119 --> 00:47:27,000 grace when they make mistakes and to work 1093 00:47:27,000 --> 00:47:29,800 through those mistakes and move on is really, really important 1094 00:47:29,800 --> 00:47:33,600 and is a reality of AI at work. 1095 00:47:33,599 --> 00:47:35,980 I've spoken a lot about the business focus advantage, 1096 00:47:35,980 --> 00:47:38,240 so I'm going to skip over this. 1097 00:47:38,239 --> 00:47:41,159 So now let's talk about vibe coding. 1098 00:47:41,159 --> 00:47:43,409 Let's talk about the whole idea of generating code. 1099 00:47:43,409 --> 00:47:46,149 Now, the meme is out there that it makes engineers 1100 00:47:46,150 --> 00:47:49,110 less useful by the fact that somebody can just prompt code 1101 00:47:49,110 --> 00:47:50,990 into existence.
1102 00:47:50,989 --> 00:47:53,529 There is no smoke without fire, of course, 1103 00:47:53,530 --> 00:47:57,390 but I would say don't let that meme get you down, 1104 00:47:57,389 --> 00:48:00,309 because when you start peeling into these things, that 1105 00:48:00,309 --> 00:48:02,590 is ultimately not the truth. 1106 00:48:02,590 --> 00:48:04,570 The more skilled you are as an engineer, 1107 00:48:04,570 --> 00:48:07,998 the better you become at using this type of vibe-- 1108 00:48:07,998 --> 00:48:10,289 somebody give me another phrase other than vibe coding-- 1109 00:48:10,289 --> 00:48:12,269 this approach to coding. 1110 00:48:12,269 --> 00:48:14,469 And I always like to think about this, 1111 00:48:14,469 --> 00:48:17,629 and to try to put you, and put people 1112 00:48:17,630 --> 00:48:20,550 that I speak with, into the role of being a trusted 1113 00:48:20,550 --> 00:48:23,190 advisor for the people that you speak with. 1114 00:48:23,190 --> 00:48:25,070 So whether you're interviewing with somebody, 1115 00:48:25,070 --> 00:48:27,070 get yourself into the mindset of being a trusted 1116 00:48:27,070 --> 00:48:29,550 advisor of the company that you're interviewing for, 1117 00:48:29,550 --> 00:48:32,350 or you're consulting, or whatever those kinds of things 1118 00:48:32,349 --> 00:48:32,909 are. 1119 00:48:32,909 --> 00:48:36,429 When you want to get into the idea of being a trusted advisor, 1120 00:48:36,429 --> 00:48:39,190 then you really need to understand the implications 1121 00:48:39,190 --> 00:48:40,920 of generated code. 1122 00:48:40,920 --> 00:48:43,539 And nobody can understand the implications of generated code 1123 00:48:43,539 --> 00:48:44,853 better than an engineer. 1124 00:48:44,853 --> 00:48:47,019 And the metric that I always like to use around that 1125 00:48:47,019 --> 00:48:48,820 is technical debt. 1126 00:48:48,820 --> 00:48:50,620 Quick question.
1127 00:48:50,619 --> 00:48:54,460 Are you familiar with the phrase technical debt? 1128 00:48:54,460 --> 00:48:55,099 Nobody? 1129 00:48:55,099 --> 00:48:56,219 OK. 1130 00:48:56,219 --> 00:48:57,779 Andrew and I were doing a conference 1131 00:48:57,780 --> 00:49:00,920 in New York on Friday, and I used the phrase, 1132 00:49:00,920 --> 00:49:02,599 and I saw a lot of blank faces. 1133 00:49:02,599 --> 00:49:04,219 I hadn't realized that people didn't understand 1134 00:49:04,219 --> 00:49:05,177 what technical debt is. 1135 00:49:05,177 --> 00:49:07,119 So let me just take a moment to explain it, 1136 00:49:07,119 --> 00:49:10,500 because I find it's an excellent framework to help you understand 1137 00:49:10,500 --> 00:49:12,739 the power of vibe coding. 1138 00:49:12,739 --> 00:49:15,259 Think about debt the way you normally would. 1139 00:49:15,260 --> 00:49:16,580 Buying a house. 1140 00:49:16,579 --> 00:49:20,159 If you buy a house, say you borrow half a million dollars 1141 00:49:20,159 --> 00:49:21,099 to buy a house. 1142 00:49:21,099 --> 00:49:24,237 With a 30-year mortgage, when you're buying that house at half 1143 00:49:24,237 --> 00:49:26,820 a million dollars, all the interest that you pay brings it to about 1144 00:49:26,820 --> 00:49:27,500 double. 1145 00:49:27,500 --> 00:49:29,980 So you end up paying back the bank about $1 million 1146 00:49:29,980 --> 00:49:31,860 on half a million owed. 1147 00:49:31,860 --> 00:49:35,420 So you have 30 years of home ownership 1148 00:49:35,420 --> 00:49:38,340 at a cost of $1 million in debt. 1149 00:49:38,340 --> 00:49:41,090 That is probably a good debt to take on, 1150 00:49:41,090 --> 00:49:44,230 because the value of the house will increase over that time.
1151 00:49:44,230 --> 00:49:46,349 You're not paying rent over that time, 1152 00:49:46,349 --> 00:49:47,849 and that million dollars that you're 1153 00:49:47,849 --> 00:49:50,089 spending on this house over those 30 years 1154 00:49:50,090 --> 00:49:51,970 is a good debt to take on, because you're 1155 00:49:51,969 --> 00:49:56,169 getting greater than $1 million worth of value out of it. 1156 00:49:56,170 --> 00:49:58,970 A bad debt would be an impulse purchase on a high-interest 1157 00:49:58,969 --> 00:50:00,009 credit card. 1158 00:50:00,010 --> 00:50:02,050 That pair of shoes, the latest ones-- 1159 00:50:02,050 --> 00:50:03,470 I really want to buy them. 1160 00:50:03,469 --> 00:50:04,629 It's $200. 1161 00:50:04,630 --> 00:50:06,950 By the time I've paid them off, it's $500. 1162 00:50:06,949 --> 00:50:10,689 You're not getting $500 worth of benefit out of those shoes. 1163 00:50:10,690 --> 00:50:13,690 Approaching software development with the same mindset 1164 00:50:13,690 --> 00:50:15,610 is the right way to go. 1165 00:50:15,610 --> 00:50:18,849 Every time you build something, you take on debt. 1166 00:50:18,849 --> 00:50:21,049 It doesn't matter how good it is, there are always 1167 00:50:21,050 --> 00:50:21,870 going to be bugs. 1168 00:50:21,869 --> 00:50:23,327 There's always going to be support. 1169 00:50:23,327 --> 00:50:25,536 There are always going to be new requirements coming in 1170 00:50:25,536 --> 00:50:26,269 from people. 1171 00:50:26,269 --> 00:50:28,030 There's always going to be a need to market it. 1172 00:50:28,030 --> 00:50:29,830 There's always going to be a need for feedback. 1173 00:50:29,829 --> 00:50:33,069 All of these things are debt, every time you do a thing. 1174 00:50:33,070 --> 00:50:35,610 The only way to avoid debt is to do nothing.
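[Editor's note: the "borrow half a million, repay about a million" ballpark checks out with standard amortization arithmetic. The 5.5% annual rate below is an assumption chosen only to match the speaker's rough 2x figure:]

```python
# Quick check of the "half a million borrowed, about a million repaid"
# ballpark, using the standard fixed-payment amortization formula.
# The 5.5% annual rate is an assumed figure, picked only to
# illustrate the roughly 2x total the speaker describes.
principal = 500_000
annual_rate = 0.055
n_months = 30 * 12
r = annual_rate / 12  # monthly interest rate

# Fixed monthly payment: P * r / (1 - (1 + r)^-n)
payment = principal * r / (1 - (1 + r) ** -n_months)
total_paid = payment * n_months

print(f"monthly payment: ${payment:,.0f}")
print(f"total repaid:    ${total_paid:,.0f}")  # roughly $1M, about 2x
```

At higher rates the multiple climbs past 2x and at lower rates it falls below; the point of the analogy stands either way: the sticker price of a loan, like the sticker price of a piece of code, is not what you actually pay.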
1175 00:50:35,610 --> 00:50:37,880 So your mindset should then be that when 1176 00:50:37,880 --> 00:50:40,200 you are creating a thing, whether you're 1177 00:50:40,199 --> 00:50:42,719 coding it yourself or whether you're vibe coding it 1178 00:50:42,719 --> 00:50:44,959 or any of these things, you are increasing 1179 00:50:44,960 --> 00:50:48,039 your amount of technical debt, those things 1180 00:50:48,039 --> 00:50:50,679 that you need to pay off over time. 1181 00:50:50,679 --> 00:50:52,719 So the question then becomes, as you 1182 00:50:52,719 --> 00:50:55,639 vibe code a thing into existence, in the same way 1183 00:50:55,639 --> 00:50:58,480 as buying a thing, is it worth the technical debt 1184 00:50:58,480 --> 00:51:00,159 that you're taking on? 1185 00:51:00,159 --> 00:51:02,679 What does technical debt generally look like? 1186 00:51:02,679 --> 00:51:04,559 Bugs that you need to fix, people 1187 00:51:04,559 --> 00:51:08,159 that you need to convince to help you maintain the code, 1188 00:51:08,159 --> 00:51:10,039 documentation that you need to do, 1189 00:51:10,039 --> 00:51:14,119 features that you need to add, all of these kinds of things. 1190 00:51:14,119 --> 00:51:16,239 You're all very familiar with them. 1191 00:51:16,239 --> 00:51:18,079 Think about those as that extra work 1192 00:51:18,079 --> 00:51:20,259 that you need to do beyond your current work. 1193 00:51:20,260 --> 00:51:22,240 That's the debt that you're taking on. 1194 00:51:22,239 --> 00:51:25,119 There is soft debt, and there is hard debt. 1195 00:51:25,119 --> 00:51:28,679 So to me, that would be the number one piece of advice 1196 00:51:28,679 --> 00:51:29,299 that I give. 1197 00:51:29,300 --> 00:51:32,400 And it's the one that I give every time I work with companies 1198 00:51:32,400 --> 00:51:33,840 around vibe coding.
1199 00:51:33,840 --> 00:51:37,230 And a lot of companies that I speak with, a lot of companies 1200 00:51:37,230 --> 00:51:38,289 that I consult with-- 1201 00:51:38,289 --> 00:51:39,829 I do a lot of work with startups, 1202 00:51:39,829 --> 00:51:42,469 in particular-- they just want to get straight 1203 00:51:42,469 --> 00:51:45,589 into opening Gemini or GPT or Anthropic 1204 00:51:45,590 --> 00:51:47,590 and start churning code out. 1205 00:51:47,590 --> 00:51:50,490 Let's get to a prototype phase very quickly. 1206 00:51:50,489 --> 00:51:52,009 Let's go to investors. 1207 00:51:52,010 --> 00:51:53,090 Let's do stuff. 1208 00:51:53,090 --> 00:51:54,510 It's great. 1209 00:51:54,510 --> 00:51:55,570 It can be. 1210 00:51:55,570 --> 00:51:59,289 But debt, debt, debt, debt, debt is always going to be there. 1211 00:51:59,289 --> 00:52:00,610 How do you manage your debt? 1212 00:52:00,610 --> 00:52:03,630 A good financier manages their debt and they become rich. 1213 00:52:03,630 --> 00:52:05,910 A good coder manages their technical debt, 1214 00:52:05,909 --> 00:52:07,829 and they become rich also. 1215 00:52:07,829 --> 00:52:10,449 So how do you get the good technical debt? 1216 00:52:10,449 --> 00:52:13,009 How do you get the mortgage instead of the high-interest credit card debt? 1217 00:52:13,010 --> 00:52:14,890 Well, number one is your objectives. 1218 00:52:14,889 --> 00:52:15,489 What are they? 1219 00:52:15,489 --> 00:52:16,629 Are they clear? 1220 00:52:16,630 --> 00:52:18,512 And have you met them? 1221 00:52:18,512 --> 00:52:19,929 You knew what you needed to build. 1222 00:52:19,929 --> 00:52:23,369 You didn't just fire up ChatGPT and start spinning code out. 1223 00:52:23,369 --> 00:52:24,710 At least I hope you didn't. 1224 00:52:24,710 --> 00:52:26,650 Think about how you build it. 1225 00:52:26,650 --> 00:52:28,869 AI was there to help you build it faster.
1226 00:52:28,869 --> 00:52:31,269 I'm working on my own little startup 1227 00:52:31,269 --> 00:52:33,570 at the moment in the movie making space. 1228 00:52:33,570 --> 00:52:36,860 And I've been using code generation almost completely 1229 00:52:36,860 --> 00:52:38,300 for that. 1230 00:52:38,300 --> 00:52:40,900 But what I've ended up doing for my clear objectives 1231 00:52:40,900 --> 00:52:42,980 met box here is that I've started 1232 00:52:42,980 --> 00:52:44,079 building this application. 1233 00:52:44,079 --> 00:52:44,704 I've tested it. 1234 00:52:44,704 --> 00:52:45,739 I've thrown it away. 1235 00:52:45,739 --> 00:52:47,979 I started again, tested it, threw it away. 1236 00:52:47,980 --> 00:52:51,460 Each time, my requirements have been improving in my mind. 1237 00:52:51,460 --> 00:52:53,840 I understand how to do the thing a little bit better, 1238 00:52:53,840 --> 00:52:56,260 and I can show some of the output of it in a few minutes. 1239 00:52:56,260 --> 00:52:58,100 But the idea there is that it's always 1240 00:52:58,099 --> 00:53:00,711 about having those clear objectives and meeting them. 1241 00:53:00,711 --> 00:53:02,420 And then if you're building out the thing 1242 00:53:02,420 --> 00:53:04,086 and you're not meeting those objectives, 1243 00:53:04,086 --> 00:53:05,119 that's still a learning. 1244 00:53:05,119 --> 00:53:06,819 And there's no harm in throwing it away, 1245 00:53:06,820 --> 00:53:10,660 because code is cheap now in the age of generated code. 1246 00:53:10,659 --> 00:53:13,980 Finished code, engineered code, is not cheap. 1247 00:53:13,980 --> 00:53:16,320 So get those objectives, make them clear, 1248 00:53:16,320 --> 00:53:20,420 build it, hit a specific requirement, and move on. 1249 00:53:20,420 --> 00:53:22,039 Is there business value delivered? 1250 00:53:22,039 --> 00:53:23,340 That's the other part of it.
1251 00:53:23,340 --> 00:53:27,019 I've seen people vibe coding for hours on things like Replit 1252 00:53:27,019 --> 00:53:29,059 to build a really, really cool website. 1253 00:53:29,059 --> 00:53:32,059 And then the answer was, so what? 1254 00:53:32,059 --> 00:53:33,938 I mean, how is this helping the business? 1255 00:53:33,938 --> 00:53:35,480 How is this really driving something? 1256 00:53:35,480 --> 00:53:36,380 It's really cool. 1257 00:53:36,380 --> 00:53:38,820 Yes, Mr. VP, I know you've never written a line of code 1258 00:53:38,820 --> 00:53:40,987 in your life, and it's really cool that you've built 1259 00:53:40,987 --> 00:53:42,980 a website now, but so what? 1260 00:53:42,980 --> 00:53:45,159 So think about that, and focus on that. 1261 00:53:45,159 --> 00:53:47,899 And that's how you avoid the bad technical debt. 1262 00:53:47,900 --> 00:53:51,498 And then, of course, the most understated part of this, 1263 00:53:51,498 --> 00:53:53,539 and in some ways the most important, particularly 1264 00:53:53,539 --> 00:53:55,199 if you're working in an organization, 1265 00:53:55,199 --> 00:53:56,859 is human understanding. 1266 00:53:56,860 --> 00:53:59,180 The worst technical debt that you can take on 1267 00:53:59,179 --> 00:54:02,522 is delivering code that nobody understands. 1268 00:54:02,523 --> 00:54:03,940 Only you understand that, and then 1269 00:54:03,940 --> 00:54:05,480 you quit and get a better job. 1270 00:54:05,480 --> 00:54:08,380 And then the company is now dependent on that code. 
1271 00:54:08,380 --> 00:54:11,700 So being able, as part of the process 1272 00:54:11,699 --> 00:54:15,259 of building it, to make sure that your code is understandable, 1273 00:54:15,260 --> 00:54:17,520 through documentation, through clear algorithms, 1274 00:54:17,519 --> 00:54:19,644 through the fact that you've spent some time poring 1275 00:54:19,644 --> 00:54:21,179 through it to make sure that even 1276 00:54:21,179 --> 00:54:24,019 simple things like variable names make sense, 1277 00:54:24,019 --> 00:54:28,340 is a really, really important way to avoid bad technical debt. 1278 00:54:28,340 --> 00:54:31,090 And that bad technical debt, my favorite one 1279 00:54:31,090 --> 00:54:33,809 is the classic solution looking for a problem. 1280 00:54:33,809 --> 00:54:34,829 Somebody has an idea. 1281 00:54:34,829 --> 00:54:36,090 Somebody has a tool. 1282 00:54:36,090 --> 00:54:37,750 If the only tool you have is a hammer, 1283 00:54:37,750 --> 00:54:39,489 every problem looks like a nail. 1284 00:54:39,489 --> 00:54:42,169 And you end up having all of these tools 1285 00:54:42,170 --> 00:54:44,490 that get vibe coded into existence. 1286 00:54:44,489 --> 00:54:46,129 I've worked in large organizations 1287 00:54:46,130 --> 00:54:48,632 where people just vibe coded stuff, checked it into the code 1288 00:54:48,632 --> 00:54:51,090 base, and then it became really hard to find the good stuff 1289 00:54:51,090 --> 00:54:53,010 amongst all the bad. 1290 00:54:53,010 --> 00:54:53,990 Spaghetti code, 1291 00:54:53,989 --> 00:54:56,027 of course: poorly structured stuff, 1292 00:54:56,027 --> 00:54:58,569 particularly when you prompt and prompt and prompt and prompt 1293 00:54:58,570 --> 00:55:01,170 again, can end up getting 1294 00:55:01,170 --> 00:55:02,628 into all kinds of trouble. 1295 00:55:02,628 --> 00:55:05,170 My favorite one at the moment that I'm really struggling with 1296 00:55:05,170 --> 00:55:08,349 is I'm building a macOS application.
1297 00:55:08,349 --> 00:55:12,009 Anybody ever build in SwiftUI on macOS? 1298 00:55:12,010 --> 00:55:14,250 OK, a couple. 1299 00:55:14,250 --> 00:55:16,730 SwiftUI is the default UI framework that Apple 1300 00:55:16,730 --> 00:55:19,969 uses for building for macOS as well as iPhone. 1301 00:55:19,969 --> 00:55:22,269 But when you look at the training sets, 1302 00:55:22,269 --> 00:55:25,110 the data sets that are used to train these models, 1303 00:55:25,110 --> 00:55:28,789 the vast majority of the code is iPhone code, not macOS code. 1304 00:55:28,789 --> 00:55:30,500 And when I prompt code into existence, 1305 00:55:30,500 --> 00:55:35,139 it's often giving me iOS APIs and those kinds of things. 1306 00:55:35,139 --> 00:55:38,519 Even though I'm in Xcode and I've created a macOS app 1307 00:55:38,519 --> 00:55:41,420 and it's a macOS template and I'm talking to it in Xcode, 1308 00:55:41,420 --> 00:55:43,940 it still gives me iOS code, stuff like that. 1309 00:55:43,940 --> 00:55:46,700 And then if I try to change it using prompting, 1310 00:55:46,699 --> 00:55:49,419 I end up spiraling into spaghetti code, 1311 00:55:49,420 --> 00:55:52,342 and I have to end up changing a lot of this stuff manually. 1312 00:55:52,342 --> 00:55:53,759 And then, of course, the other one 1313 00:55:53,760 --> 00:55:56,480 that I joked about earlier, but it's also true, 1314 00:55:56,480 --> 00:55:58,599 is some of the bad technical debt 1315 00:55:58,599 --> 00:56:01,679 that you're going to encounter in the workplace is authority 1316 00:56:01,679 --> 00:56:03,359 over merit. 1317 00:56:03,360 --> 00:56:05,980 That VP suddenly took out his credit card, 1318 00:56:05,980 --> 00:56:08,519 subscribed to Replit, and started building stuff 1319 00:56:08,519 --> 00:56:09,179 in Replit. 1320 00:56:09,179 --> 00:56:11,639 And guess whose job it is to fix it? 
1321 00:56:11,639 --> 00:56:15,239 So a lot of the advice that I start 1322 00:56:15,239 --> 00:56:17,359 giving companies and a lot of the words 1323 00:56:17,360 --> 00:56:19,880 that I would encourage you to start thinking 1324 00:56:19,880 --> 00:56:23,640 of in being a trusted advisor is to understand this stuff 1325 00:56:23,639 --> 00:56:27,509 and to manage expectations accordingly. 1326 00:56:27,510 --> 00:56:29,870 OK, so that's the framework for responsible vibe 1327 00:56:29,869 --> 00:56:32,190 coding we've just spoken about. 1328 00:56:32,190 --> 00:56:35,630 So one of the things I want to get into as we're coming soon 1329 00:56:35,630 --> 00:56:37,950 to a close is the hype cycle. 1330 00:56:37,949 --> 00:56:41,009 So hype is the most amazing force. 1331 00:56:41,010 --> 00:56:43,190 I mean, I think it's one of the strongest forces 1332 00:56:43,190 --> 00:56:45,829 in the universe, and particularly in anything 1333 00:56:45,829 --> 00:56:49,190 that's hot, such as the fields that I work in that are super 1334 00:56:49,190 --> 00:56:51,570 hot at the moment and full of hype, AI and crypto-- 1335 00:56:51,570 --> 00:56:53,630 you should see my Twitter feed-- 1336 00:56:53,630 --> 00:56:57,855 the amount of nonsense that's out there is incredible. 1337 00:56:57,855 --> 00:56:59,230 So one of the things that I would 1338 00:56:59,230 --> 00:57:01,990 say about the anatomy of hype that you really 1339 00:57:01,989 --> 00:57:05,189 need to think about is that if you are consuming 1340 00:57:05,190 --> 00:57:09,869 news via social media, the currency of social media 1341 00:57:09,869 --> 00:57:12,029 is engagement. 1342 00:57:12,030 --> 00:57:15,390 Accuracy is not the currency of social media. 
1343 00:57:15,389 --> 00:57:18,829 So I go on to-- even LinkedIn, which 1344 00:57:18,829 --> 00:57:21,110 is supposed to be the more professional of these, 1345 00:57:21,110 --> 00:57:26,059 is absolutely overwhelmed with influencers posting things 1346 00:57:26,059 --> 00:57:28,860 where they've used Gemini or GPT 1347 00:57:28,860 --> 00:57:32,420 to write an engaging post so that they can get engagement 1348 00:57:32,420 --> 00:57:33,860 and they can get likes. 1349 00:57:33,860 --> 00:57:37,539 And the engine itself is engineered, excuse the pun, 1350 00:57:37,539 --> 00:57:39,880 to reward those types of posts. 1351 00:57:39,880 --> 00:57:42,019 And we end up with that snowball effect 1352 00:57:42,019 --> 00:57:44,980 of engagement being rewarded. 1353 00:57:44,980 --> 00:57:46,619 If you are the kind of person who 1354 00:57:46,619 --> 00:57:49,259 can filter the signal from the noise, 1355 00:57:49,260 --> 00:57:53,220 and then who can encourage others around the signal and not 1356 00:57:53,219 --> 00:57:56,939 the noise, that puts you at a huge advantage and 1357 00:57:56,940 --> 00:57:58,659 makes you very distinctive. 1358 00:57:58,659 --> 00:58:01,859 It's not as quickly and easily tangible as likes 1359 00:58:01,860 --> 00:58:03,980 and engagements on social media. 1360 00:58:03,980 --> 00:58:06,300 But when you're in a one-to-one environment like a job 1361 00:58:06,300 --> 00:58:08,660 interview, or if you are in a job 1362 00:58:08,659 --> 00:58:11,420 and you are bringing that signal to the table instead 1363 00:58:11,420 --> 00:58:15,300 of the noise, that makes you immensely valuable. 
1364 00:58:15,300 --> 00:58:17,460 So coming in with that mindset, coming 1365 00:58:17,460 --> 00:58:20,780 in with the idea of trying to filter 1366 00:58:20,780 --> 00:58:23,420 that signal from the noise, trying to understand 1367 00:58:23,420 --> 00:58:27,690 what is important in current affairs, how 1368 00:58:27,690 --> 00:58:30,829 you can be a trusted advisor in those things, 1369 00:58:30,829 --> 00:58:34,569 and how you can really whittle down that noise to help someone 1370 00:58:34,570 --> 00:58:35,830 is immensely valuable. 1371 00:58:35,829 --> 00:58:38,009 I want to start with one story. 1372 00:58:38,010 --> 00:58:39,770 I might be stealing my own thunder 1373 00:58:39,769 --> 00:58:41,429 from something I'll go on to in a moment. 1374 00:58:41,429 --> 00:58:43,329 So one story. 1375 00:58:43,329 --> 00:58:46,690 Last year, when agents started becoming the key word 1376 00:58:46,690 --> 00:58:49,769 and everybody was saying that in 2025, agent 1377 00:58:49,769 --> 00:58:51,969 will be the word of the year and the trend 1378 00:58:51,969 --> 00:58:55,329 of the year, a company in Europe asked 1379 00:58:55,329 --> 00:58:57,849 me to help them to implement an agent. 1380 00:58:57,849 --> 00:58:59,569 So let me ask you a question. 1381 00:58:59,570 --> 00:59:01,350 If a company came up to you and said, 1382 00:59:01,349 --> 00:59:04,329 please help me implement an agent, 1383 00:59:04,329 --> 00:59:07,309 what's the correct first question that you ask them? 1384 00:59:10,929 --> 00:59:12,109 What is an agent for you? 1385 00:59:12,110 --> 00:59:12,610 OK. 1386 00:59:12,610 --> 00:59:13,128 That's good. 1387 00:59:13,128 --> 00:59:14,170 What is an agent for you? 1388 00:59:14,170 --> 00:59:17,070 I'd actually have a more fundamental question. 1389 00:59:17,070 --> 00:59:17,570 Yep. 1390 00:59:17,570 --> 00:59:18,570 What do you want to do? 1391 00:59:18,570 --> 00:59:19,380 What do you want to do? 1392 00:59:19,380 --> 00:59:19,880 OK. 
1393 00:59:19,880 --> 00:59:21,410 Even more fundamental. 1394 00:59:21,409 --> 00:59:24,449 My question was why? 1395 00:59:24,449 --> 00:59:25,730 Why? 1396 00:59:25,730 --> 00:59:27,429 And peel that apart. 1397 00:59:27,429 --> 00:59:30,307 I spoke with the CEO, and he was like, oh, 1398 00:59:30,307 --> 00:59:31,849 yeah, everybody's telling me that I'm 1399 00:59:31,849 --> 00:59:33,769 going to save business costs. 1400 00:59:33,769 --> 00:59:36,309 And I'm going to be able to do these amazing things. 1401 00:59:36,309 --> 00:59:38,201 And yeah, my business is going to get 1402 00:59:38,202 --> 00:59:39,410 better because I have agents. 1403 00:59:39,409 --> 00:59:41,601 And I'm like, well, who told you that? 1404 00:59:41,601 --> 00:59:43,809 He was like, oh, yeah, I read this thing on LinkedIn, 1405 00:59:43,809 --> 00:59:45,110 and I saw this thing on Twitter. 1406 00:59:45,110 --> 00:59:45,849 And it was like-- 1407 00:59:45,849 --> 00:59:47,589 and we ended up having that conversation. 1408 00:59:47,590 --> 00:59:49,530 And it was a difficult conversation 1409 00:59:49,530 --> 00:59:51,033 because I had to keep peeling apart. 1410 00:59:51,032 --> 00:59:52,449 And I started asking the questions 1411 00:59:52,449 --> 00:59:55,369 that you two just mentioned as well, until we really 1412 00:59:55,369 --> 00:59:57,969 got to the essence of what he wanted to do. 1413 00:59:57,969 --> 00:59:59,809 And what he really wanted to do, when 1414 00:59:59,809 --> 01:00:03,170 we take all domain knowledge about AI aside, 1415 01:00:03,170 --> 01:00:06,393 was that he wanted to make his salespeople more efficient. 1416 01:00:06,393 --> 01:00:08,809 And I was like, OK, you want to make your salespeople more 1417 01:00:08,809 --> 01:00:09,309 efficient. 1418 01:00:09,309 --> 01:00:11,809 Nowhere in that sentence do I hear the word AI, 1419 01:00:11,809 --> 01:00:14,969 and nowhere in that sentence do I hear the word agent. 
1420 01:00:14,969 --> 01:00:17,250 So now, as a trusted advisor, let 1421 01:00:17,250 --> 01:00:19,650 me see what I can do to help your salespeople become 1422 01:00:19,650 --> 01:00:20,849 more efficient. 1423 01:00:20,849 --> 01:00:23,414 And I'm not going to be an AI shill or an agent shill. 1424 01:00:23,414 --> 01:00:26,039 I just want to say, what do we do to make your salespeople more 1425 01:00:26,039 --> 01:00:27,119 efficient? 1426 01:00:27,119 --> 01:00:29,239 If anybody here has ever worked in sales, 1427 01:00:29,239 --> 01:00:31,919 one of the things you realize is that a good salesperson has 1428 01:00:31,920 --> 01:00:34,680 to do their homework. 1429 01:00:34,679 --> 01:00:36,862 Before you have a sales call with somebody, 1430 01:00:36,862 --> 01:00:38,779 before you have a sales meeting with somebody, 1431 01:00:38,780 --> 01:00:40,100 you need to check their background. 1432 01:00:40,099 --> 01:00:41,349 You need to check the company. 1433 01:00:41,349 --> 01:00:43,199 You need to check the needs of the company. 1434 01:00:43,199 --> 01:00:45,819 You see it sometimes in the movies: oh, 1435 01:00:45,820 --> 01:00:46,900 such and such plays golf. 1436 01:00:46,900 --> 01:00:48,192 So I'll take them to play golf. 1437 01:00:48,192 --> 01:00:51,000 It's not really that cliched, but there is a lot of background 1438 01:00:51,000 --> 01:00:52,239 that needs to be done. 1439 01:00:52,239 --> 01:00:56,000 So I spoke with him, and I spoke with their leading salespeople 1440 01:00:56,000 --> 01:00:58,699 and found out that-- and I asked the salespeople, 1441 01:00:58,699 --> 01:01:00,919 what do you hate most about your job? 1442 01:01:00,920 --> 01:01:02,680 And they were like, well, I hate the fact 1443 01:01:02,679 --> 01:01:04,839 that I have to waste all my time going 1444 01:01:04,840 --> 01:01:07,320 to visit these company websites, going 1445 01:01:07,320 --> 01:01:09,440 to look up people on LinkedIn. 
1446 01:01:09,440 --> 01:01:12,079 And every website is structured differently. 1447 01:01:12,079 --> 01:01:16,159 So I can't just have a path through a website 1448 01:01:16,159 --> 01:01:17,259 that I can follow. 1449 01:01:17,260 --> 01:01:19,870 I have to take on all this cognitive load. 1450 01:01:19,869 --> 01:01:24,230 And they were spending about 80% of their time researching 1451 01:01:24,230 --> 01:01:26,550 and about 20% of their time selling. 1452 01:01:26,550 --> 01:01:28,150 Oh, and by the way, most salespeople 1453 01:01:28,150 --> 01:01:29,329 don't get paid very much. 1454 01:01:29,329 --> 01:01:31,210 They have to make it up by commission, 1455 01:01:31,210 --> 01:01:33,869 so they're only spending 20% of their time doing the thing that 1456 01:01:33,869 --> 01:01:35,670 gets them commission directly. 1457 01:01:35,670 --> 01:01:37,550 So we're like, OK, well, here's something now 1458 01:01:37,550 --> 01:01:40,050 where we can start thinking about making them more efficient 1459 01:01:40,050 --> 01:01:41,350 by cutting into that. 1460 01:01:41,349 --> 01:01:45,150 So we set a goal: make salespeople 20% more efficient. 1461 01:01:45,150 --> 01:01:48,030 And then we could start rolling out the ideas of AI. 1462 01:01:48,030 --> 01:01:51,190 And then we could start rolling out the ideas of agentic AI. 1463 01:01:51,190 --> 01:01:53,630 And a quick question: what's the difference 1464 01:01:53,630 --> 01:01:55,690 between AI and agentic AI? 1465 01:02:00,730 --> 01:02:01,230 OK. 1466 01:02:01,230 --> 01:02:03,289 So-- yeah. 1467 01:02:03,289 --> 01:02:07,110 Like a good AI can do some [INAUDIBLE] a couple of steps. 1468 01:02:07,110 --> 01:02:07,670 OK. 1469 01:02:07,670 --> 01:02:10,849 [INAUDIBLE] 1470 01:02:11,269 --> 01:02:11,769 Yep. 1471 01:02:11,769 --> 01:02:12,150 Excellent. 1472 01:02:12,150 --> 01:02:12,410 Yeah. 
1473 01:02:12,409 --> 01:02:14,326 So agentic AI is really about breaking it down 1474 01:02:14,327 --> 01:02:17,900 into steps, which is good engineering to begin with. 1475 01:02:17,900 --> 01:02:20,619 But with agentic AI, in particular, I find 1476 01:02:20,619 --> 01:02:23,799 there's a set pattern of steps that, if you follow them, 1477 01:02:23,800 --> 01:02:25,700 you end up with the whole idea of an agent. 1478 01:02:25,699 --> 01:02:29,019 The first of these steps is to understand intent. 1479 01:02:29,019 --> 01:02:32,360 We tend to use the words AI, Artificial Intelligence, a lot. 1480 01:02:32,360 --> 01:02:35,019 But what large language models are really, really good at 1481 01:02:35,019 --> 01:02:36,900 is also understanding. 1482 01:02:36,900 --> 01:02:39,180 So the first step of anything that you want to do 1483 01:02:39,179 --> 01:02:41,379 is to understand intent. 1484 01:02:41,380 --> 01:02:44,230 And you can use an LLM to do that, to think about: this 1485 01:02:44,230 --> 01:02:45,480 is the task that I need to do. 1486 01:02:45,480 --> 01:02:46,771 This is how I'm going to do it. 1487 01:02:46,771 --> 01:02:47,539 Here's the intent. 1488 01:02:47,539 --> 01:02:53,900 I want to meet Bob Smith and sell widgets to Bob Smith. 1489 01:02:53,900 --> 01:02:56,740 And this is what I know about Bob Smith. 1490 01:02:56,739 --> 01:02:58,199 Help me with that intent. 1491 01:02:58,199 --> 01:03:01,259 The second part then is planning. 1492 01:03:01,260 --> 01:03:04,880 So you declare to an agent what tools are available to it, 1493 01:03:04,880 --> 01:03:06,800 browsing the web, searching the web, 1494 01:03:06,800 --> 01:03:08,100 all of these kind of things. 
1495 01:03:08,099 --> 01:03:10,737 And once you understand your clear intent, 1496 01:03:10,737 --> 01:03:12,820 you're able to go to the step of planning, using 1497 01:03:12,820 --> 01:03:15,620 those tools for planning. And an LLM is very, very good 1498 01:03:15,619 --> 01:03:17,449 at then breaking that down into the steps 1499 01:03:17,449 --> 01:03:19,669 that it needs to do to execute a plan. 1500 01:03:19,670 --> 01:03:21,750 Search the web with these keywords. 1501 01:03:21,750 --> 01:03:24,570 Browse this website and find these links, 1502 01:03:24,570 --> 01:03:25,970 those types of things. 1503 01:03:25,969 --> 01:03:27,669 Once it's then figured out that plan, 1504 01:03:27,670 --> 01:03:30,432 then it uses the tools to get to a result. 1505 01:03:30,432 --> 01:03:32,849 And then once it has the result, the fourth and final step 1506 01:03:32,849 --> 01:03:35,690 is to reflect on that result: looking at the results 1507 01:03:35,690 --> 01:03:38,070 and going back to the intent, did we meet the intent? 1508 01:03:38,070 --> 01:03:38,789 Yes or no. 1509 01:03:38,789 --> 01:03:40,809 If we didn't, then go back to that loop. 1510 01:03:40,809 --> 01:03:43,690 An agent really just breaks down into those things. 1511 01:03:43,690 --> 01:03:45,690 And if you think about breaking any problem down 1512 01:03:45,690 --> 01:03:47,409 into those four steps, that's when 1513 01:03:47,409 --> 01:03:49,069 you start building an agent. 1514 01:03:49,070 --> 01:03:50,990 And that was part of being a trusted advisor, 1515 01:03:50,989 --> 01:03:53,102 instead of coming in and waving hands and saying, 1516 01:03:53,103 --> 01:03:54,530 agent this, agent that. 1517 01:03:54,530 --> 01:03:56,590 Look at this toolkit, save 20%. 1518 01:03:56,590 --> 01:03:58,710 It's really to break it down into those steps. 1519 01:03:58,710 --> 01:03:59,210 So we did. 1520 01:03:59,210 --> 01:04:01,110 We broke it down into those steps. 
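The four steps described here (understand intent, plan, execute with tools, reflect) can be sketched as a minimal loop. This is an illustrative toy, not any particular framework's API: the LLM calls are stubbed with plain functions, and names like `run_agent` and `Tool` are invented for the example.

```python
# Minimal sketch of the four-step agentic loop:
# understand intent -> plan -> execute with tools -> reflect.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Tool:
    name: str
    run: Callable[[str], str]

def understand_intent(request: str) -> str:
    # Step 1: restate the user's goal (an LLM would do this).
    return f"goal: {request.strip().lower()}"

def plan(intent: str, tools: list[Tool]) -> list[tuple[str, str]]:
    # Step 2: break the intent into (tool name, argument) steps.
    return [(t.name, intent) for t in tools]

def execute(steps: list[tuple[str, str]], tools: list[Tool]) -> list[str]:
    # Step 3: run each planned step with the named tool.
    by_name = {t.name: t for t in tools}
    return [by_name[name].run(arg) for name, arg in steps]

def reflect(intent: str, results: list[str]) -> bool:
    # Step 4: check whether the results satisfy the intent.
    return all(bool(r) for r in results)

def run_agent(request: str, tools: list[Tool], max_loops: int = 3) -> list[str]:
    intent = understand_intent(request)
    for _ in range(max_loops):
        results = execute(plan(intent, tools), tools)
        if reflect(intent, results):  # met the intent? stop looping.
            return results
    return results

# Toy tool standing in for web search.
search = Tool("search", lambda q: f"top hit for '{q}'")
briefing = run_agent("research Bob Smith before the sales call", [search])
```

In a real system, `understand_intent`, `plan`, and `reflect` would each be a model call, and the tools would hit actual search or browsing APIs; the loop structure is the point.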
1521 01:04:01,110 --> 01:04:04,690 We built a pilot for the salespeople of this company, 1522 01:04:04,690 --> 01:04:09,349 and they ended up saving between 10% and 15% of their time, 1523 01:04:09,349 --> 01:04:10,710 of their wasted time. 1524 01:04:10,710 --> 01:04:13,690 The doctrine of unintended consequences 1525 01:04:13,690 --> 01:04:14,690 hit, though, after this. 1526 01:04:14,690 --> 01:04:17,880 And the unintended consequence was the salespeople 1527 01:04:17,880 --> 01:04:21,960 were much happier: because the average salesperson was making 1528 01:04:21,960 --> 01:04:24,900 several percentage points more sales in a given week, 1529 01:04:24,900 --> 01:04:27,119 they were earning more money in a given week, 1530 01:04:27,119 --> 01:04:30,659 and their job just became a little bit less miserable. 1531 01:04:30,659 --> 01:04:32,659 And then the refinement to that agentic process, 1532 01:04:32,659 --> 01:04:34,759 to be able to do all of that research for them 1533 01:04:34,760 --> 01:04:37,080 and to help give them a brief in a few minutes instead 1534 01:04:37,079 --> 01:04:39,860 of a few hours to help them with the sales process, 1535 01:04:39,860 --> 01:04:42,460 ended up being a win-win-win all around. 1536 01:04:42,460 --> 01:04:44,420 But if we had gone in being hype-led, oh, 1537 01:04:44,420 --> 01:04:48,079 build an agent for the thing, without really peeling apart 1538 01:04:48,079 --> 01:04:50,469 the business requirements, the why, the what, 1539 01:04:50,469 --> 01:04:54,199 the how, and all of these kinds of things, 1540 01:04:54,199 --> 01:04:56,319 this company just would have been lost in hype. 1541 01:04:56,320 --> 01:04:58,100 You've probably seen reports recently. 1542 01:04:58,099 --> 01:05:01,880 I think McKinsey put one out last week showing that about 85% 1543 01:05:01,880 --> 01:05:05,360 of AI projects at companies fail. 
1544 01:05:05,360 --> 01:05:07,160 And part of the main reason for that 1545 01:05:07,159 --> 01:05:08,940 is that they're not well scoped. 1546 01:05:08,940 --> 01:05:10,722 People are jumping on the hype bandwagon, 1547 01:05:10,722 --> 01:05:12,639 and they're not really understanding their way 1548 01:05:12,639 --> 01:05:13,902 through the problem. 1549 01:05:13,902 --> 01:05:15,360 And I think, you know, the big brains 1550 01:05:15,360 --> 01:05:18,880 in this room and the network that you folks have are a really 1551 01:05:18,880 --> 01:05:21,119 key component of being able to succeed, 1552 01:05:21,119 --> 01:05:23,599 which is understanding your way through that problem. 1553 01:05:23,599 --> 01:05:26,119 So that was a hype example around agentic AI 1554 01:05:26,119 --> 01:05:29,400 that I was thankfully able to help this company through. 1555 01:05:29,400 --> 01:05:31,519 Other recent hype examples you've probably seen: 1556 01:05:31,519 --> 01:05:33,259 software engineering is dead. 1557 01:05:33,260 --> 01:05:38,280 My personal favorite, Hollywood is dead, or AGI by year end. 1558 01:05:38,280 --> 01:05:41,080 I was in Saudi Arabia this time last year 1559 01:05:41,079 --> 01:05:42,960 at a thing called the FYI. 1560 01:05:42,960 --> 01:05:44,639 And it was a dinner at the FYI, and I 1561 01:05:44,639 --> 01:05:48,440 sat beside the CEO of a company who I'm not going to name, 1562 01:05:48,440 --> 01:05:51,500 but this was a CEO of a generative AI company. 1563 01:05:51,500 --> 01:05:53,400 And at that time he was showing everybody 1564 01:05:53,400 --> 01:05:55,519 around the table this thing that he'd 1565 01:05:55,519 --> 01:05:57,639 done, where it was text to video, 1566 01:05:57,639 --> 01:06:00,239 and he could put in a text prompt and get video out 1567 01:06:00,239 --> 01:06:02,799 of the prompt and get about six seconds worth of video 1568 01:06:02,800 --> 01:06:03,519 out of it. 
1569 01:06:03,519 --> 01:06:04,599 A year ago, that was-- 1570 01:06:04,599 --> 01:06:06,739 I beg your pardon, two years ago. 1571 01:06:06,739 --> 01:06:08,179 Two years ago, that was hot stuff. 1572 01:06:08,179 --> 01:06:10,000 Nowadays, obviously, it's quite passé. 1573 01:06:10,000 --> 01:06:11,659 Anybody can do it. 1574 01:06:11,659 --> 01:06:13,609 But he made a comment at that table, 1575 01:06:13,610 --> 01:06:16,910 and there were a lot of media executives at that table. 1576 01:06:16,909 --> 01:06:19,730 It was like, by this time next year, from a single prompt, 1577 01:06:19,730 --> 01:06:21,889 we'll be able to do 90 minutes of video. 1578 01:06:21,889 --> 01:06:24,670 And so bye-bye, Hollywood. 1579 01:06:24,670 --> 01:06:27,950 So the whole Hollywood is dead meme, I think, came out of that. 1580 01:06:27,949 --> 01:06:30,750 First of all, we can't do 90 minutes from a prompt, 1581 01:06:30,750 --> 01:06:31,523 even two years later. 1582 01:06:31,523 --> 01:06:33,190 And even if you did, what kind of prompt 1583 01:06:33,190 --> 01:06:35,670 would be able to tell you a full story of a movie? 1584 01:06:35,670 --> 01:06:39,930 So this type of hype leads to engagement. 1585 01:06:39,929 --> 01:06:42,309 This type of hype leads to attention. 1586 01:06:42,309 --> 01:06:45,789 But my encouragement to you is to peel that apart. 1587 01:06:45,789 --> 01:06:47,550 Look for the signal. 1588 01:06:47,550 --> 01:06:48,850 Ask the why question. 1589 01:06:48,849 --> 01:06:52,829 Ask the what question and move on from there. 1590 01:06:52,829 --> 01:06:55,829 So becoming that trusted advisor. 1591 01:06:55,829 --> 01:06:57,009 The world's drowning in hype. 1592 01:06:57,010 --> 01:06:57,930 How do you do it? 1593 01:06:57,929 --> 01:07:00,710 Look at the trends, evaluate them objectively. 1594 01:07:00,710 --> 01:07:02,190 Look at the genuine opportunities 1595 01:07:02,190 --> 01:07:04,070 that are out there. 
1596 01:07:04,070 --> 01:07:05,613 There are fashionable distractions. 1597 01:07:05,612 --> 01:07:07,529 I don't know what the next one is going to be, 1598 01:07:07,530 --> 01:07:08,830 but there are these distractions that 1599 01:07:08,829 --> 01:07:10,329 are out there that will get you lots 1600 01:07:10,329 --> 01:07:11,960 of engagement on social media. 1601 01:07:11,960 --> 01:07:13,780 Ignore them, and ignore the people 1602 01:07:13,780 --> 01:07:15,180 that are leaning into them. 1603 01:07:15,179 --> 01:07:18,899 And then really lean into your skills 1604 01:07:18,900 --> 01:07:23,139 about explaining technical reality to leadership. 1605 01:07:23,139 --> 01:07:25,259 One skill that one person coached me 1606 01:07:25,260 --> 01:07:27,540 in once that I thought was really interesting, 1607 01:07:27,539 --> 01:07:30,480 because it sounded wrong, but it ended up being right, 1608 01:07:30,480 --> 01:07:32,400 was whenever you see something like this, 1609 01:07:32,400 --> 01:07:35,860 try to figure out how to make it as mundane as possible. 1610 01:07:35,860 --> 01:07:38,720 When you can figure out how to make it as mundane as possible, 1611 01:07:38,719 --> 01:07:41,099 then you really begin to build the grounding 1612 01:07:41,099 --> 01:07:43,860 for being able to explain it in detail in ways 1613 01:07:43,860 --> 01:07:46,579 that people need to understand. 1614 01:07:46,579 --> 01:07:49,739 If you go and you look at, I think 1615 01:07:49,739 --> 01:07:53,299 Gemini 3 was released today, but there were leaks 1616 01:07:53,300 --> 01:07:54,740 earlier this week. 1617 01:07:54,739 --> 01:07:57,899 And one person leaked that they'd built a Minecraft clone 1618 01:07:57,900 --> 01:08:00,039 from a prompt, that kind of stuff. 1619 01:08:00,039 --> 01:08:02,219 This is the opposite of mundane. 1620 01:08:02,219 --> 01:08:05,074 This was massively hyping the thing, massively showing off. 1621 01:08:05,074 --> 01:08:06,199 And of course, they didn't. 
1622 01:08:06,199 --> 01:08:07,241 They built a flashy demo. 1623 01:08:07,242 --> 01:08:09,310 They didn't really build a Minecraft clone. 1624 01:08:09,309 --> 01:08:12,489 But the idea here is if you can peel that apart to, OK, 1625 01:08:12,489 --> 01:08:15,329 how do I think about what are the mundane things that 1626 01:08:15,329 --> 01:08:17,329 are happening here? 1627 01:08:17,329 --> 01:08:20,649 The one that I've been working with a lot recently is video. 1628 01:08:20,649 --> 01:08:23,410 So text to video prompts, as I've mentioned: 1629 01:08:23,409 --> 01:08:27,329 instead of the magical, you can do whatever you want, all nice 1630 01:08:27,329 --> 01:08:29,449 and fluffy, Hollywood is dead framing, what 1631 01:08:29,449 --> 01:08:31,739 is the mundane element of doing text to video? 1632 01:08:31,739 --> 01:08:33,489 The mundane element of doing text to video 1633 01:08:33,489 --> 01:08:37,648 is that when you train a model to create video from a text 1634 01:08:37,649 --> 01:08:39,930 prompt, what it is doing is it's creating 1635 01:08:39,930 --> 01:08:41,789 a number of successive frames. 1636 01:08:41,789 --> 01:08:43,609 And each of those successive frames 1637 01:08:43,609 --> 01:08:46,410 is going to be slightly different from the frame before. 1638 01:08:46,409 --> 01:08:50,050 And you've trained a model by looking at video to say, well, 1639 01:08:50,050 --> 01:08:52,529 if in frame 1 the person's hand is like this and in frame 2 1640 01:08:52,529 --> 01:08:54,210 it's like that, then you can predict 1641 01:08:54,210 --> 01:08:56,449 it moves this way if there's a matching prompt. 1642 01:08:56,449 --> 01:08:58,949 And suddenly it's become a little bit more mundane, 1643 01:08:58,949 --> 01:09:00,992 but suddenly people begin to understand it. 
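That mundane framing can be pictured with a toy sketch: generation as repeatedly predicting the next frame as a small, prompt-conditioned change to the previous one. This is not how production video models actually work (real systems are far more sophisticated, typically diffusing whole video tensors); frames here are just lists of numbers, and every name is invented for the illustration.

```python
# Toy "mundane" view of text-to-video: each successive frame is
# predicted from the one before it, conditioned on the prompt.

def toy_prompt_embedding(prompt: str) -> list[float]:
    # Stand-in for a text encoder: derive a small, deterministic
    # vector from the first few characters of the prompt.
    return [(ord(c) % 7) / 10.0 for c in prompt[:4]]

def predict_next_frame(frame: list[float], cond: list[float]) -> list[float]:
    # Stand-in for the trained model: frame t+1 is predicted to be
    # a small, prompt-dependent change from frame t.
    return [f + 0.1 * c for f, c in zip(frame, cond)]

def generate_video(prompt: str, n_frames: int = 8) -> list[list[float]]:
    cond = toy_prompt_embedding(prompt)
    frames = [[0.0] * len(cond)]          # blank starting frame
    for _ in range(n_frames - 1):
        frames.append(predict_next_frame(frames[-1], cond))
    return frames

clip = generate_video("a hand waving", n_frames=8)
# Each frame differs only slightly from the frame before it.
```

The point of the exercise is the demystification itself: once "text to video" is reframed as "predict the next slightly-different frame, conditioned on the prompt," the capability stops being magic and starts being explainable.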
1644 01:09:00,992 --> 01:09:02,449 And then the people who are experts 1645 01:09:02,449 --> 01:09:05,349 in that specific field, not the technical side of it, 1646 01:09:05,350 --> 01:09:06,840 are now the ones that will actually 1647 01:09:06,840 --> 01:09:11,039 be able to come up and do brilliant things with it. 1648 01:09:11,039 --> 01:09:13,399 So that hype navigation strategy-- 1649 01:09:13,399 --> 01:09:16,520 filter actively, go deep on the fundamentals, 1650 01:09:16,520 --> 01:09:17,814 get your slides to work. 1651 01:09:17,814 --> 01:09:19,939 And then, of course, keep your finger on the pulse. 1652 01:09:19,939 --> 01:09:21,100 The hardest part of that, I think, 1653 01:09:21,100 --> 01:09:22,319 is the third one: really keeping 1654 01:09:22,319 --> 01:09:23,500 your finger on the pulse. 1655 01:09:23,500 --> 01:09:26,539 And that's when you have to wade into those cesspits of people 1656 01:09:26,539 --> 01:09:28,417 just farming engagement and really 1657 01:09:28,417 --> 01:09:30,500 try to figure out the signal from the noise there. 1658 01:09:30,500 --> 01:09:32,207 But I think it's really important for you 1659 01:09:32,207 --> 01:09:34,879 to be able to do that, to be connected, to understand that. 1660 01:09:34,880 --> 01:09:36,230 Reading papers is all very good. 1661 01:09:36,230 --> 01:09:38,438 The signal-to-noise ratio, I think, in reading papers 1662 01:09:38,439 --> 01:09:39,700 is a lot better. 1663 01:09:39,699 --> 01:09:41,840 But to understand the landscape of the people 1664 01:09:41,840 --> 01:09:44,119 that you are advising: they are the ones 1665 01:09:44,119 --> 01:09:47,722 who are wading in the cesspools of Twitter and X and LinkedIn. 1666 01:09:47,722 --> 01:09:49,640 And there's nothing wrong with those platforms 1667 01:09:49,640 --> 01:09:51,390 in and of themselves; it's the stuff that's 1668 01:09:51,390 --> 01:09:54,560 posted on those platforms. 
1669 01:09:54,560 --> 01:10:00,440 So the overall landscape: it is ripe with opportunity, 1670 01:10:00,439 --> 01:10:02,799 absolutely ripe with opportunity. 1671 01:10:02,800 --> 01:10:04,400 So I would encourage you, as Andrew 1672 01:10:04,399 --> 01:10:07,269 did, to continue learning, to continue digging 1673 01:10:07,270 --> 01:10:09,830 into what you can do and to continue building. 1674 01:10:09,829 --> 01:10:12,670 But there are risks ahead. 1675 01:10:12,670 --> 01:10:16,390 Anybody remember the movie Titanic? 1676 01:10:16,390 --> 01:10:19,510 Remember the famous phrase in that, "iceberg right ahead"? 1677 01:10:19,510 --> 01:10:23,257 But immediately before that, there's a scene in Titanic-- 1678 01:10:23,256 --> 01:10:25,089 if we weren't being filmed, I would show it, 1679 01:10:25,090 --> 01:10:27,350 but I can't for copyright reasons-- where 1680 01:10:27,350 --> 01:10:31,350 the two guys up in the crow's nest are freezing and talking. 1681 01:10:31,350 --> 01:10:33,230 And the crow's nest at the top of the ship 1682 01:10:33,229 --> 01:10:36,466 is where the spotters would be to spot any icebergs in front. 1683 01:10:36,466 --> 01:10:38,049 And go back and watch the movie again. 1684 01:10:38,050 --> 01:10:40,489 You'll see the conversation between these two guys 1685 01:10:40,489 --> 01:10:43,550 is that all they're talking about is how cold they are. 1686 01:10:43,550 --> 01:10:45,190 And then it cuts away to the crew 1687 01:10:45,189 --> 01:10:47,349 of the ship who are like, wait, aren't they 1688 01:10:47,350 --> 01:10:48,942 supposed to have binoculars? 1689 01:10:48,942 --> 01:10:51,149 And then the crew is like, oh, we left the binoculars 1690 01:10:51,149 --> 01:10:52,629 behind in port. 1691 01:10:52,630 --> 01:10:55,289 The framing of that whole idea was, 1692 01:10:55,289 --> 01:10:58,109 they were so arrogant in being able to move forward 1693 01:10:58,109 --> 01:11:00,567 that they didn't want to look out for any particular risks. 
1694 01:11:00,568 --> 01:11:02,402 And even though they had people whose job it 1695 01:11:02,402 --> 01:11:04,380 was to look out for risks, they didn't properly 1696 01:11:04,380 --> 01:11:05,779 equip or train them. 1697 01:11:05,779 --> 01:11:07,979 And that, to me, is a really good metaphor 1698 01:11:07,979 --> 01:11:10,019 for where the AI industry is today. 1699 01:11:10,020 --> 01:11:12,500 There are risks in front of us. 1700 01:11:12,500 --> 01:11:14,539 Those risks, the B word, the bubble word 1701 01:11:14,539 --> 01:11:17,859 you're probably reading about in the news, are there. 1702 01:11:17,859 --> 01:11:24,139 To me, though, with the opportunity and the things to think about 1703 01:11:24,140 --> 01:11:28,220 in terms of a bubble: most of you probably don't remember 1704 01:11:28,220 --> 01:11:31,140 the dotcom bubble of the 2000s. 1705 01:11:31,140 --> 01:11:33,619 But if you think about the dotcom bubble, 1706 01:11:33,619 --> 01:11:36,460 that was the biggest bubble in history. 1707 01:11:36,460 --> 01:11:40,220 It burst, but we're still here. 1708 01:11:40,220 --> 01:11:46,260 And the people who did dotcom right not only survived, they thrived. 1709 01:11:46,260 --> 01:11:49,659 Amazon, Google, they did it right. 1710 01:11:49,659 --> 01:11:51,619 They understood the fundamentals of what 1711 01:11:51,619 --> 01:11:52,981 it was to build a dotcom. 1712 01:11:52,981 --> 01:11:54,939 They understood the fundamentals of what it was 1713 01:11:54,939 --> 01:11:56,659 to build a business on dotcom. 1714 01:11:56,659 --> 01:11:59,399 And when the bubble of hype burst, they didn't go with it. 1715 01:11:59,399 --> 01:12:01,859 There was one website, I believe it was pets.com, 1716 01:12:01,859 --> 01:12:06,579 that had the mindset of if you build it, they will come. 1717 01:12:06,579 --> 01:12:10,019 They had Super Bowl commercials around pets.com. 1718 01:12:10,020 --> 01:12:12,260 They couldn't handle the traffic that they got. 
1719 01:12:12,260 --> 01:12:15,480 And that was the kind of site that when the bubble burst, 1720 01:12:15,479 --> 01:12:17,779 those were the sites that just evaporated. 1721 01:12:17,779 --> 01:12:20,279 So that bubble in AI is likely coming. 1722 01:12:20,279 --> 01:12:22,259 There is always a bubble. 1723 01:12:22,260 --> 01:12:25,420 So the companies that are doing AI right 1724 01:12:25,420 --> 01:12:27,460 are the ones, like I said, that won't just 1725 01:12:27,460 --> 01:12:33,220 avoid the bubble that they will actually thrive post bubble. 1726 01:12:33,220 --> 01:12:37,380 And the people who are doing AI right, the folks 1727 01:12:37,380 --> 01:12:39,300 in this room who are thinking about AI 1728 01:12:39,300 --> 01:12:40,898 and how you bring it to your company, 1729 01:12:40,898 --> 01:12:42,940 and the advice that you're giving to your company 1730 01:12:42,939 --> 01:12:45,179 and leaning into that in the right way 1731 01:12:45,180 --> 01:12:48,619 will also be the ones who not only avoid getting laid off 1732 01:12:48,619 --> 01:12:52,579 in the bubble crashes, but will be the ones who will thrive 1733 01:12:52,579 --> 01:12:54,779 through and after the bubble. 1734 01:12:54,779 --> 01:12:57,739 So anatomy of any bubble, and what I'm seeing in the AI 1735 01:12:57,739 --> 01:13:00,289 one in particular, is this kind of pyramid. 1736 01:13:00,289 --> 01:13:02,710 At the top is the hype that I've been talking about. 1737 01:13:02,710 --> 01:13:05,770 At the bottom is massive VC investment. 1738 01:13:05,770 --> 01:13:06,470 I'll be frank. 1739 01:13:06,470 --> 01:13:08,730 I'm already seeing that drying up. 1740 01:13:08,729 --> 01:13:10,649 Once upon a time, you could go out 1741 01:13:10,649 --> 01:13:12,409 with anything that had AI written on it 1742 01:13:12,409 --> 01:13:13,967 and get VC investment. 1743 01:13:13,967 --> 01:13:16,010 Then you could go out and do anything with an LLM 1744 01:13:16,010 --> 01:13:17,530 and get VC investment. 
1745 01:13:17,529 --> 01:13:20,769 Now there are far, far, far more cautious. 1746 01:13:20,770 --> 01:13:22,930 I've been advising a lot of startups. 1747 01:13:22,930 --> 01:13:27,329 The amount that they're getting invested is being scaled back. 1748 01:13:27,329 --> 01:13:30,130 The stuff that's being invested in is changing. 1749 01:13:30,130 --> 01:13:35,170 And the second layer down, massive VC investment 1750 01:13:35,170 --> 01:13:37,369 is already beginning to vanish. 1751 01:13:37,369 --> 01:13:39,369 Unrealistic valuations. 1752 01:13:39,369 --> 01:13:42,289 Companies that aren't making money being valued 1753 01:13:42,289 --> 01:13:43,109 massively high. 1754 01:13:43,109 --> 01:13:44,369 We all who they are. 1755 01:13:44,369 --> 01:13:47,329 We're beginning to see those unrealistic valuations being 1756 01:13:47,329 --> 01:13:49,050 fed off of that hype. 1757 01:13:49,050 --> 01:13:51,993 # #MeToo products, where somebody does something, 1758 01:13:51,993 --> 01:13:53,410 and it's successful, and everybody 1759 01:13:53,409 --> 01:13:54,750 jumps on the bandwagon. 1760 01:13:54,750 --> 01:13:56,390 We're also seeing them everywhere. 1761 01:13:56,390 --> 01:13:59,160 We saw them throughout the dotcom bubble. 1762 01:13:59,159 --> 01:14:01,800 And then right at the bottom is that real value. 1763 01:14:01,800 --> 01:14:04,460 I probably shouldn't have done the triangle like this. 1764 01:14:04,460 --> 01:14:06,300 It should be more an upside down triangle. 1765 01:14:06,300 --> 01:14:08,820 Because the real value here is small. 1766 01:14:08,819 --> 01:14:11,179 But I've vibe coded these slides into existence. 1767 01:14:11,180 --> 01:14:14,039 So this is one of the technical debt I took on. 1768 01:14:14,039 --> 01:14:18,180 But the real value there, that kernel of value is there, 1769 01:14:18,180 --> 01:14:22,840 and the ones that build for that will be the ones that survive. 
1770 01:14:22,840 --> 01:14:28,777 So the direction that I see the AI industry going in, 1771 01:14:28,777 --> 01:14:31,359 and the direction that I would encourage you to start thinking 1772 01:14:31,359 --> 01:14:33,859 about your skills in, is really, over the next five years, 1773 01:14:33,859 --> 01:14:36,059 there's going to be a bifurcation. 1774 01:14:36,060 --> 01:14:38,760 I'm just going to be ornery in how 1775 01:14:38,760 --> 01:14:40,880 I describe it, as big and small. 1776 01:14:40,880 --> 01:14:43,720 Big AI will be what we see today, with the large language 1777 01:14:43,720 --> 01:14:48,440 models getting bigger in the drive towards AGI. 1778 01:14:48,439 --> 01:14:52,079 The Geminis, the Claudes, the OpenAIs of the world 1779 01:14:52,079 --> 01:14:54,319 are going to continue to drive bigger, and bigger 1780 01:14:54,319 --> 01:14:57,829 is better in the mindset of those companies, 1781 01:14:57,829 --> 01:15:01,659 towards achieving AGI or towards achieving better business value. 1782 01:15:01,659 --> 01:15:03,409 That's going to be one side of the branch. 1783 01:15:03,409 --> 01:15:05,867 The other side of the branch I'm going to call small. 1784 01:15:05,868 --> 01:15:08,510 We've all seen open-source models. 1785 01:15:08,510 --> 01:15:10,347 I hate the term open source. 1786 01:15:10,347 --> 01:15:12,390 Let me call them open weights, or let me call them 1787 01:15:12,390 --> 01:15:15,590 self-hostable models. They're 1788 01:15:15,590 --> 01:15:17,670 exploding onto the landscape. 1789 01:15:17,670 --> 01:15:20,069 I read an article recently about Y Combinator, 1790 01:15:20,069 --> 01:15:23,069 that 80% of the companies in Y Combinator 1791 01:15:23,069 --> 01:15:26,429 were using small models, from China in particular. 1792 01:15:26,430 --> 01:15:29,230 So the Chinese models in particular 1793 01:15:29,229 --> 01:15:31,269 are doing really well, probably because of 1794 01:15:31,270 --> 01:15:32,380 the overall landscape. 1795 01:15:32,380 --> 01:15:34,630 They're not leaning into the large models the same way 1796 01:15:34,630 --> 01:15:36,230 as the West is. 1797 01:15:36,229 --> 01:15:37,771 I see that bifurcation happening. 1798 01:15:37,771 --> 01:15:39,229 China, I think, has a head start 1799 01:15:39,229 --> 01:15:41,089 on the small models that may last. 1800 01:15:41,090 --> 01:15:41,989 It may not. 1801 01:15:41,989 --> 01:15:43,069 I don't know. 1802 01:15:43,069 --> 01:15:45,829 But the point is, we're heading in that particular direction. 1803 01:15:45,829 --> 01:15:48,729 Instead of big and small, I'm now going to call them 1804 01:15:48,729 --> 01:15:52,429 models that are hosted on your behalf by somebody else, 1805 01:15:52,430 --> 01:15:54,950 like a GPT or a Gemini or a Claude, 1806 01:15:54,949 --> 01:15:59,099 or models that you can host yourself for your own needs. 1807 01:15:59,100 --> 01:16:03,220 The hosted side right now is where the hype is, 1808 01:16:03,220 --> 01:16:05,180 and this bubble may burst. 1809 01:16:05,180 --> 01:16:06,920 The self-hosted one right now is underserved, 1810 01:16:06,920 --> 01:16:09,699 and this bubble will come later on. 1811 01:16:09,699 --> 01:16:12,819 And the major skill that I can see developers needing 1812 01:16:12,819 --> 01:16:16,299 over the next two to three years on this side of the fence 1813 01:16:16,300 --> 01:16:18,500 will be fine-tuning, 1814 01:16:18,500 --> 01:16:21,539 the ability to take an open-source model 1815 01:16:21,539 --> 01:16:25,019 and fine-tune it for particular downstream tasks. 1816 01:16:25,020 --> 01:16:27,700 Let me give one concrete example of this that I've personally 1817 01:16:27,699 --> 01:16:28,739 experienced. 1818 01:16:28,739 --> 01:16:30,939 I work a lot in Hollywood, and I've worked a lot 1819 01:16:30,939 --> 01:16:33,259 with studios making movies. 1820 01:16:33,260 --> 01:16:36,020 And one studio in particular I was lucky enough 1821 01:16:36,020 --> 01:16:38,560 to sell a movie to; it's still in preproduction. 1822 01:16:38,560 --> 01:16:41,180 It'll probably be in preproduction forever. 1823 01:16:41,180 --> 01:16:44,340 But one of the things I learned as part of that process 1824 01:16:44,340 --> 01:16:49,020 was IP in studios is so protected, 1825 01:16:49,020 --> 01:16:50,980 it's not even funny. 1826 01:16:50,979 --> 01:16:52,779 Go and Google James Cameron, who 1827 01:16:52,779 --> 01:16:55,090 created Avatar, and the lawsuits that he's 1828 01:16:55,090 --> 01:16:58,170 involved in with this person who apparently sent him 1829 01:16:58,170 --> 01:17:00,810 a story many years ago about blue aliens 1830 01:17:00,810 --> 01:17:02,850 and is now suing him for billions of dollars 1831 01:17:02,850 --> 01:17:06,410 because obviously there were blue aliens in Avatar. 1832 01:17:06,409 --> 01:17:09,889 That level of IP protection in Hollywood is insane. 1833 01:17:09,890 --> 01:17:12,730 The opportunity with large language models 1834 01:17:12,729 --> 01:17:15,129 is equally insane. 1835 01:17:15,130 --> 01:17:17,949 A lot of the focus is on large language models for creation, 1836 01:17:17,949 --> 01:17:20,069 for storytelling, for rendering and all that, 1837 01:17:20,069 --> 01:17:22,289 but actually the major opportunity that they have is 1838 01:17:22,289 --> 01:17:27,170 for analysis: to take a look at synopses of movies 1839 01:17:27,170 --> 01:17:29,369 and find out what works and what doesn't. 1840 01:17:29,369 --> 01:17:32,090 Why was this movie a hit and this one wasn't? 1841 01:17:32,090 --> 01:17:34,650 What time of year was this one released and it became 1842 01:17:34,649 --> 01:17:36,489 successful and this one wasn't? 1843 01:17:36,489 --> 01:17:39,149 And with margins on movies being razor thin, 1844 01:17:39,149 --> 01:17:40,929 that kind of analysis is huge. 1845 01:17:40,930 --> 01:17:42,637 But in order to do that kind of analysis, 1846 01:17:42,637 --> 01:17:44,430 you need to share the details of your movie 1847 01:17:44,430 --> 01:17:45,650 with a large language model. 1848 01:17:45,649 --> 01:17:48,862 And they will absolutely not do that with GPT or Gemini 1849 01:17:48,863 --> 01:17:50,530 or whatever, because they're now sharing 1850 01:17:50,529 --> 01:17:52,809 their IP with a third party. 1851 01:17:52,810 --> 01:17:55,330 Enter small models, where they can self-host 1852 01:17:55,329 --> 01:17:58,189 their own small model, and they are getting smarter and smarter. 1853 01:17:58,189 --> 01:18:02,089 The 7B model of today is as smart as the 50B model 1854 01:18:02,090 --> 01:18:03,170 of yesterday. 1855 01:18:03,170 --> 01:18:06,409 The 7B model of a year from now will be as smart 1856 01:18:06,409 --> 01:18:09,849 as the 300B model of today. 1857 01:18:09,850 --> 01:18:13,090 So they're moving in that direction of building 1858 01:18:13,090 --> 01:18:16,369 using small self-hosted models, which they can then 1859 01:18:16,369 --> 01:18:18,104 fine-tune on downstream tasks. 1860 01:18:18,104 --> 01:18:19,729 Similarly with other areas where privacy 1861 01:18:19,729 --> 01:18:21,809 is important: law offices, medical offices, all 1862 01:18:21,810 --> 01:18:22,950 of those kinds of things. 1863 01:18:22,949 --> 01:18:25,170 So those types of skills are fundamentally 1864 01:18:25,170 --> 01:18:27,090 important going forward. 1865 01:18:27,090 --> 01:18:30,449 So that's the bifurcation that I'm seeing happening in AI. 1866 01:18:30,449 --> 01:18:34,329 The sooner bubble, I think, is in the bigger, non-self-hosted side. 1867 01:18:34,329 --> 01:18:36,809 The later bubble is in the smaller, self-hosted side.
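The fine-tuning skill described here usually means parameter-efficient adaptation, most famously LoRA, where the pretrained weights stay frozen and only a small low-rank adapter is trained. What follows is a toy NumPy sketch of that idea only; a real pipeline would use a library such as Hugging Face PEFT on an actual open-weights model, and all sizes and data below are made up.

```python
import numpy as np

# Toy sketch of the LoRA idea behind parameter-efficient fine-tuning:
# freeze a "pretrained" weight matrix and train only a small low-rank
# adapter on downstream data. Sizes and data here are illustrative.
rng = np.random.default_rng(0)
d_in, d_out, rank = 8, 4, 2

W_base = rng.normal(size=(d_in, d_out))                  # frozen base weights
W_task = W_base + 0.5 * rng.normal(size=(d_in, d_out))   # the downstream task
X = rng.normal(size=(256, d_in))
Y = X @ W_task                                           # downstream targets

# Standard LoRA-style init: B starts at zero, so training starts at the base model.
A = 0.1 * rng.normal(size=(d_in, rank))
B = np.zeros((rank, d_out))

lr = 0.1
for _ in range(2000):
    err = X @ (W_base + A @ B) - Y        # residual of the adapted model
    grad_W = X.T @ err / len(X)           # gradient wrt the effective weights
    gA, gB = grad_W @ B.T, A.T @ grad_W   # chain rule onto the adapter only
    A -= lr * gA
    B -= lr * gB

mse_before = float(np.mean((X @ W_base - Y) ** 2))
mse_after = float(np.mean((X @ (W_base + A @ B) - Y) ** 2))
print(f"frozen base MSE {mse_before:.3f} -> adapted MSE {mse_after:.3f}")
```

The adapter here trains only (8 + 4) x 2 = 24 numbers instead of the full 32-entry matrix; scaled up, that gap is what makes fine-tuning a 7B self-hosted model on private data feasible without sending IP to a third party.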
1868 01:18:36,810 --> 01:18:39,330 But either way, for you, for your career, 1869 01:18:39,329 --> 01:18:42,689 to avoid the impact of any bubble bursting, 1870 01:18:42,689 --> 01:18:44,429 focus on the fundamentals. 1871 01:18:44,430 --> 01:18:46,190 Build those real solutions. 1872 01:18:46,189 --> 01:18:48,389 Understand the business side, and most of all, 1873 01:18:48,390 --> 01:18:49,789 diversify your skills. 1874 01:18:49,789 --> 01:18:53,579 Don't be that one-trick pony who only knows how to do one thing. 1875 01:18:53,579 --> 01:18:55,600 I've worked with brilliant people who 1876 01:18:55,600 --> 01:18:58,620 are fantastic at coding in a particular API 1877 01:18:58,619 --> 01:18:59,699 or a particular framework. 1878 01:18:59,699 --> 01:19:03,720 And then the industry moved on and they got left behind. 1879 01:19:03,720 --> 01:19:07,600 OK, so yeah, when bubbles burst, that overall fallout, I've 1880 01:19:07,600 --> 01:19:09,400 spoken about it a little bit already. 1881 01:19:09,399 --> 01:19:12,019 Funding evaporates, hiring freezes become layoffs, 1882 01:19:12,020 --> 01:19:14,320 projects get canceled, and talent floods the market. 1883 01:19:14,319 --> 01:19:14,819 Yeah. 1884 01:19:14,819 --> 01:19:17,939 Quick question from the last slide. 1885 01:19:17,939 --> 01:19:23,679 [INAUDIBLE] I heard a lot about how NVIDIA is hiring, 1886 01:19:23,680 --> 01:19:26,240 and they're very specific: they 1887 01:19:26,239 --> 01:19:30,599 want people for a very specific problem that they have. 1888 01:19:30,600 --> 01:19:34,600 So they can require people to basically fill that one thing 1889 01:19:34,600 --> 01:19:35,800 that they're missing.
1890 01:19:35,800 --> 01:19:43,079 So how do you think-- how is it more important to diversify 1891 01:19:43,079 --> 01:19:46,420 skills versus actually focusing on, for example, 1892 01:19:46,420 --> 01:19:49,350 LLMs versus computer vision or versus 1893 01:19:49,350 --> 01:19:52,550 very specific downstream task? 1894 01:19:52,550 --> 01:19:55,230 So I mean, I think so the question was around NVIDIA 1895 01:19:55,229 --> 01:19:57,669 in particular or hiring for a very specific, very 1896 01:19:57,670 --> 01:19:58,806 narrow scenario. 1897 01:19:58,806 --> 01:20:00,389 So then the question is, how important 1898 01:20:00,390 --> 01:20:01,765 is it for you to become an expert 1899 01:20:01,765 --> 01:20:04,869 in a narrow scenario versus diversifying your skills? 1900 01:20:04,869 --> 01:20:08,750 I would always argue it's still better to diversify your skills, 1901 01:20:08,750 --> 01:20:11,069 because that one narrow scenario is only that one 1902 01:20:11,069 --> 01:20:13,112 narrow scenario, and you're putting all your eggs 1903 01:20:13,112 --> 01:20:13,850 into one basket. 1904 01:20:13,850 --> 01:20:16,310 NVIDIA would be a fantastic company to work for. 1905 01:20:16,310 --> 01:20:17,817 Nothing against them in any way. 1906 01:20:17,817 --> 01:20:20,149 But if you put all of your eggs into that basket and you 1907 01:20:20,149 --> 01:20:22,309 don't get it, then what? 1908 01:20:22,310 --> 01:20:24,310 So I think the idea of really being 1909 01:20:24,310 --> 01:20:28,230 able to-- if you are passionate about a thing, 1910 01:20:28,229 --> 01:20:31,289 to be very deep in that thing is very, very good. 1911 01:20:31,289 --> 01:20:33,689 But to only be able to do that thing, 1912 01:20:33,689 --> 01:20:36,809 I think I would always encourage to be diversified. 
1913 01:20:36,810 --> 01:20:39,430 And when I say diversified, and you're saying LLMs or computer 1914 01:20:39,430 --> 01:20:41,110 vision or anything like that, I think 1915 01:20:41,109 --> 01:20:42,489 that's one part of it. 1916 01:20:42,489 --> 01:20:46,269 But that knowledge of models and how to use them, to me, 1917 01:20:46,270 --> 01:20:47,940 is one skill. 1918 01:20:47,939 --> 01:20:51,556 The diversification of skills is breaking outside of that, 1919 01:20:51,556 --> 01:20:54,139 to also be able to think, OK, what about building applications 1920 01:20:54,140 --> 01:20:55,320 on top of these? 1921 01:20:55,319 --> 01:20:57,319 What does scaling an application look like? 1922 01:20:57,319 --> 01:20:59,599 What does software engineering in this case look like? 1923 01:20:59,600 --> 01:21:02,720 What about user experience and user experience skills? 1924 01:21:02,720 --> 01:21:05,560 Because it's all very well to build a beautiful application. 1925 01:21:05,560 --> 01:21:06,960 But if nobody can use it-- 1926 01:21:06,960 --> 01:21:10,140 I'm looking here at Microsoft Office. 1927 01:21:10,140 --> 01:21:13,220 There's stuff like that; that's what I really 1928 01:21:13,220 --> 01:21:14,840 mean about diversifying beyond. 1929 01:21:14,840 --> 01:21:17,400 So even in that narrow example with NVIDIA, 1930 01:21:17,399 --> 01:21:20,359 to be able to break out of that one particular example, 1931 01:21:20,359 --> 01:21:22,960 but to show skills in other areas that are of value, 1932 01:21:22,960 --> 01:21:26,060 I think is really important. 1933 01:21:26,060 --> 01:21:27,100 OK. 1934 01:21:27,100 --> 01:21:29,440 As we're just running a little bit over-- so yeah, 1935 01:21:29,439 --> 01:21:30,419 I just wanted to-- 1936 01:21:30,420 --> 01:21:32,560 I've gone into it a little bit already, 1937 01:21:32,560 --> 01:21:35,000 but I'm a massive advocate for small AI.
1938 01:21:35,000 --> 01:21:38,300 I really do believe small AI is the next big thing, 1939 01:21:38,300 --> 01:21:39,920 because we're moving into a world, 1940 01:21:39,920 --> 01:21:42,220 and this is part of the job that I do at Arm, 1941 01:21:42,220 --> 01:21:44,740 is we're kind of moving into a world of AI everywhere 1942 01:21:44,739 --> 01:21:46,369 all at once. 1943 01:21:46,369 --> 01:21:47,827 So there's a traditional, and it's 1944 01:21:47,827 --> 01:21:49,409 interesting you just brought up NVIDIA 1945 01:21:49,409 --> 01:21:51,849 because there's a traditional conception 1946 01:21:51,850 --> 01:21:55,690 that compute platforms are CPU plus GPU when it comes to AI. 1947 01:21:55,689 --> 01:21:57,489 But that's also changing-- 1948 01:21:57,489 --> 01:22:00,189 CPU general purpose, GPU specialists. 1949 01:22:00,189 --> 01:22:02,729 But for example, in mobile space, 1950 01:22:02,729 --> 01:22:05,729 there's massive innovation being done with the technology 1951 01:22:05,729 --> 01:22:09,049 called SME, Scalable Matrix Extensions. 1952 01:22:09,050 --> 01:22:11,210 And what SME is all about is really 1953 01:22:11,210 --> 01:22:13,050 allowing you to bring AI workloads 1954 01:22:13,050 --> 01:22:15,050 and put them on the CPU. 1955 01:22:15,050 --> 01:22:19,489 The frontrunners in this are a couple of Chinese phone vendors, 1956 01:22:19,489 --> 01:22:22,809 Vivo and Oppo, who've just recently released phones 1957 01:22:22,810 --> 01:22:24,690 with SME-enabled chips. 1958 01:22:24,689 --> 01:22:26,989 And what's magical about these is that, A, 1959 01:22:26,989 --> 01:22:30,010 they don't need to have a separate external chip drawing 1960 01:22:30,010 --> 01:22:33,250 extra power, taking up extra footprint space just 1961 01:22:33,250 --> 01:22:35,109 to be able to run AI workloads. 
1962 01:22:35,109 --> 01:22:38,549 And B, the CPU, of course, being a low-power thing, 1963 01:22:38,550 --> 01:22:40,670 being able to run AI workloads on that, 1964 01:22:40,670 --> 01:22:43,029 they've been able to build interesting new scenarios. 1965 01:22:43,029 --> 01:22:45,199 And if I talk about one in particular, 1966 01:22:45,199 --> 01:22:47,039 there's a company called Alipay. 1967 01:22:47,039 --> 01:22:50,659 And Alipay had an application where you would-- 1968 01:22:50,659 --> 01:22:52,279 and we've all seen these apps where 1969 01:22:52,279 --> 01:22:53,960 you can go through your photographs, 1970 01:22:53,960 --> 01:22:56,039 and you can search for a particular thing, 1971 01:22:56,039 --> 01:22:59,239 places I ate sushi or something along those lines, and use 1972 01:22:59,239 --> 01:23:00,880 that to create a slideshow. 1973 01:23:00,880 --> 01:23:03,840 All of those require a back-end service. 1974 01:23:03,840 --> 01:23:06,640 So your photographs are hosted on Google Photos or Apple 1975 01:23:06,640 --> 01:23:08,539 Photos or something like that. 1976 01:23:08,539 --> 01:23:10,359 And that back-end service runs the model 1977 01:23:10,359 --> 01:23:12,119 so that you can search against it and be 1978 01:23:12,119 --> 01:23:14,239 able to do the assembly of them. 1979 01:23:14,239 --> 01:23:16,239 What Alipay wanted to say is, there 1980 01:23:16,239 --> 01:23:17,880 are three problems with this. 1981 01:23:17,880 --> 01:23:19,699 Problem number one, privacy. 1982 01:23:19,699 --> 01:23:21,840 You have to share your photos with a third party. 1983 01:23:21,840 --> 01:23:23,659 Problem number two, latency. 1984 01:23:23,659 --> 01:23:25,019 You've got to upload those photos. 1985 01:23:25,020 --> 01:23:26,220 You've got to send the thing. 1986 01:23:26,220 --> 01:23:28,021 You've got to have the back end do the thing, 1987 01:23:28,021 --> 01:23:30,479 and then you've got to download the results from the thing. 1988 01:23:30,479 --> 01:23:33,279 And then number three, building that cloud service 1989 01:23:33,279 --> 01:23:36,119 and standing it up costs time and money. 1990 01:23:36,119 --> 01:23:39,720 So if they could move all of this onto the device itself-- 1991 01:23:39,720 --> 01:23:41,960 the idea was they could run a model 1992 01:23:41,960 --> 01:23:44,420 on the device that searches the photos on the device. 1993 01:23:44,420 --> 01:23:45,680 You don't have the latency. 1994 01:23:45,680 --> 01:23:47,800 And from a business perspective, they're 1995 01:23:47,800 --> 01:23:51,060 now saving the money on standing up this service. 1996 01:23:51,060 --> 01:23:54,240 They now have AI running on the CPU in order to be able to do that. 1997 01:23:54,239 --> 01:23:56,800 Apple are also people who've invested heavily 1998 01:23:56,800 --> 01:23:59,143 in these Scalable Matrix Extensions. 1999 01:23:59,143 --> 01:24:00,560 You see, whenever they talk about-- 2000 01:24:00,560 --> 01:24:03,760 if you've ever watched a WWDC or anything like that, when they 2001 01:24:03,760 --> 01:24:06,815 talk about the new A-series chips and M-series chips, 2002 01:24:06,814 --> 01:24:09,439 about the neural cores and those kinds of things in them, that's 2003 01:24:09,439 --> 01:24:10,799 part of the idea. 2004 01:24:10,800 --> 01:24:15,279 So thinking about breaking that habit that we've gotten into, 2005 01:24:15,279 --> 01:24:18,719 where you need a GPU to be able to do AI, is part of the trend 2006 01:24:18,720 --> 01:24:20,143 that the world is heading in. 2007 01:24:20,143 --> 01:24:22,060 Apple are probably one of the leaders in that. 2008 01:24:22,060 --> 01:24:24,800 I'm very, very bullish on Apple and Apple Intelligence 2009 01:24:24,800 --> 01:24:31,760 as a result.
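Workloads like the on-device photo search just described reduce almost entirely to low-precision matrix multiplies, which is what SME-style matrix extensions accelerate on the CPU. As a rough NumPy sketch of the int8 weight-quantization scheme such hardware is typically built around (illustrative only, not Arm's actual SME instructions or any vendor's API):

```python
import numpy as np

def quantize_rows(w: np.ndarray):
    """Symmetric per-row int8 quantization: w is approximated by scale[:, None] * q."""
    scale = np.abs(w).max(axis=1) / 127.0
    scale[scale == 0] = 1.0  # avoid divide-by-zero for all-zero rows
    q = np.clip(np.round(w / scale[:, None]), -127, 127).astype(np.int8)
    return q, scale

def int8_matvec(q: np.ndarray, scale: np.ndarray, x: np.ndarray) -> np.ndarray:
    """Integer matrix-vector product, accumulating in int32 as the hardware would."""
    xq = np.round(x * 127).astype(np.int32)   # assumes x is roughly in [-1, 1]
    acc = q.astype(np.int32) @ xq
    return acc.astype(np.float64) * scale / 127.0  # rescale back to float

rng = np.random.default_rng(1)
w = rng.normal(size=(16, 32))       # a made-up weight matrix
x = rng.uniform(-1, 1, size=32)     # a made-up activation vector

q, scale = quantize_rows(w)
approx = int8_matvec(q, scale, x)
exact = w @ x
print("max quantization error:", float(np.max(np.abs(approx - exact))))
```

Storing int8 weights cuts memory traffic 4x versus float32, which matters as much as the arithmetic itself on a power-constrained phone CPU.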
And from the AI perspective, seeing that trend 2010 01:24:31,760 --> 01:24:36,039 and following that vector to its logical conclusion, as models 2011 01:24:36,039 --> 01:24:39,600 are getting smaller, embedded intelligence everywhere 2012 01:24:39,600 --> 01:24:40,740 isn't a pipe dream. 2013 01:24:40,739 --> 01:24:41,849 It isn't sci-fi anymore. 2014 01:24:41,850 --> 01:24:43,392 It's going to be a reality that we'll 2015 01:24:43,391 --> 01:24:44,809 be seeing very, very shortly. 2016 01:24:44,810 --> 01:24:47,410 So that idea of the convergence of AI, 2017 01:24:47,409 --> 01:24:50,869 because of smaller models getting smarter 2018 01:24:50,869 --> 01:24:53,930 and lower-power devices being able to run them, 2019 01:24:53,930 --> 01:24:56,550 we see that convergence hitting, and I see massive opportunity 2020 01:24:56,550 --> 01:24:58,470 there. 2021 01:24:58,470 --> 01:25:01,890 So one last part, and just going back to agents for a moment: 2022 01:25:01,890 --> 01:25:03,950 I think the one thing that I always 2023 01:25:03,949 --> 01:25:06,750 say is that a hidden part of artificial intelligence 2024 01:25:06,750 --> 01:25:09,430 is really what I like to call artificial understanding, 2025 01:25:09,430 --> 01:25:12,590 when you can start using models to understand things 2026 01:25:12,590 --> 01:25:14,210 on your behalf. 2027 01:25:14,210 --> 01:25:16,170 And when they understand them on your behalf, 2028 01:25:16,170 --> 01:25:20,270 to be able to craft from that understanding new things, 2029 01:25:20,270 --> 01:25:22,188 you can actually develop superpowers 2030 01:25:22,188 --> 01:25:24,230 where you're far more effective than ever before, 2031 01:25:24,229 --> 01:25:26,829 be that creating code or creating other things. 2032 01:25:26,829 --> 01:25:30,470 I'm going to give one quick demo just so we can wrap up. 2033 01:25:30,470 --> 01:25:35,270 And I was talking earlier about generating video.
2034 01:25:35,270 --> 01:25:39,130 So this picture is-- oops. 2035 01:25:42,159 --> 01:25:42,659 Sorry. 2036 01:25:42,659 --> 01:25:45,920 The connection here is not very good, I lost it. 2037 01:25:45,920 --> 01:25:47,000 So here we go. 2038 01:25:47,000 --> 01:25:50,659 This picture here is actually of my son playing ice hockey. 2039 01:25:50,659 --> 01:25:53,239 And I took this picture, and I was saying, 2040 01:25:53,239 --> 01:25:56,859 OK, I think I'm very good at prompting. 2041 01:25:56,859 --> 01:26:00,399 And I wrote a nice prompt for this picture to get him. 2042 01:26:00,399 --> 01:26:02,399 He's in the middle of taking a slapshot. 2043 01:26:02,399 --> 01:26:04,699 He's got some beautiful flex on his stick. 2044 01:26:04,699 --> 01:26:08,340 And I asked it like, OK, to prompt him scoring a goal. 2045 01:26:08,340 --> 01:26:10,659 What do you think happened? 2046 01:26:10,659 --> 01:26:12,739 Should we watch? 2047 01:26:12,739 --> 01:26:13,840 Let's see if it works. 2048 01:26:13,840 --> 01:26:15,747 [VIDEO PLAYBACK] 2049 01:26:18,560 --> 01:26:20,796 [CROWD CHEERING] 2050 01:26:20,796 --> 01:26:21,379 [END PLAYBACK] 2051 01:26:21,380 --> 01:26:25,380 This was the wrong video, but it still shows the same idea. 2052 01:26:25,380 --> 01:26:29,500 Because of poor prompting or because of poor understanding 2053 01:26:29,500 --> 01:26:34,020 of my intent, if I talk about it in agentic senses, 2054 01:26:34,020 --> 01:26:36,580 the arena that he was in, which is a practice arena 2055 01:26:36,579 --> 01:26:38,890 and doesn't have any people in it-- sorry. 2056 01:26:38,890 --> 01:26:41,970 Let me pause it. 2057 01:26:41,970 --> 01:26:46,490 If I just rewind to here, if we look up 2058 01:26:46,489 --> 01:26:48,750 in this top right-hand corner here, 2059 01:26:48,750 --> 01:26:51,409 this is basically where they store all their garbage. 2060 01:26:51,409 --> 01:26:53,813 But the AI didn't know that, had no idea of it. 
2061 01:26:53,813 --> 01:26:55,230 So it assumed it was a full arena, 2062 01:26:55,229 --> 01:26:57,169 and it started painting people in. 2063 01:26:57,170 --> 01:27:00,630 And even though he shot a mile wide, everybody cheers. 2064 01:27:00,630 --> 01:27:03,829 And somehow he has two sticks in his hand instead of one, 2065 01:27:03,829 --> 01:27:05,649 and they forgot his name. 2066 01:27:05,649 --> 01:27:09,710 So I did not go through an agentic workflow to do this. 2067 01:27:09,710 --> 01:27:13,630 I did not go through the steps of, A, understand my intent. 2068 01:27:13,630 --> 01:27:15,632 B, once you understand my intent, 2069 01:27:15,631 --> 01:27:17,589 understand the tools that are available to you. 2070 01:27:17,590 --> 01:27:19,610 In this case, it's Veo, so understand 2071 01:27:19,609 --> 01:27:21,969 the intricacies of using Veo. 2072 01:27:21,970 --> 01:27:23,273 Make a plan of how to use them. 2073 01:27:23,273 --> 01:27:25,190 Make a plan of how to build a prompt for them, 2074 01:27:25,189 --> 01:27:27,210 and then use them, and then reflect. 2075 01:27:27,210 --> 01:27:32,090 So I've been advising a startup that is working 2076 01:27:32,090 --> 01:27:34,390 on movie creation using AI. 2077 01:27:34,390 --> 01:27:36,800 And I want to show you a little sample here of a movie 2078 01:27:36,800 --> 01:27:39,720 that we've been working on with them, where the whole idea is, 2079 01:27:39,720 --> 01:27:42,360 if you want to get performances out of virtual actors 2080 01:27:42,359 --> 01:27:45,039 and actresses, you need to have emotion. 2081 01:27:45,039 --> 01:27:47,340 You need to be able to convey that emotion, 2082 01:27:47,340 --> 01:27:50,640 and you also need to be able to put that emotion in the context 2083 01:27:50,640 --> 01:27:52,119 of the entire story. 2084 01:27:52,119 --> 01:27:54,579 Because when you create a video from a prompt, 2085 01:27:54,579 --> 01:27:56,494 you're creating an eight-second snippet.
2086 01:27:56,494 --> 01:27:58,119 That eight-second snippet needs to know 2087 01:27:58,119 --> 01:28:00,559 what's going on in the rest of the story. 2088 01:28:00,560 --> 01:28:03,680 So if I show this one for a moment. 2089 01:28:03,680 --> 01:28:06,140 And it's a little wooden at the moment; 2090 01:28:06,140 --> 01:28:08,560 it's not really working perfectly. 2091 01:28:08,560 --> 01:28:10,538 I have professional actors who are friends, 2092 01:28:10,537 --> 01:28:12,079 who are advising me on this, and they 2093 01:28:12,079 --> 01:28:13,600 laughed at the performances. 2094 01:28:13,600 --> 01:28:16,640 But try to view it through the difference 2095 01:28:16,640 --> 01:28:19,200 between the non-agentic prompt with the hockey 2096 01:28:19,199 --> 01:28:20,619 player and this one. 2097 01:28:20,619 --> 01:28:22,519 [VIDEO PLAYBACK] 2098 01:28:22,520 --> 01:28:23,860 Hopefully we can hear it. 2099 01:28:33,594 --> 01:28:36,250 - I guess I can do the pub quiz after all. 2100 01:28:40,550 --> 01:28:42,869 They just shut me down. 2101 01:28:42,869 --> 01:28:45,949 I'm so close. 2102 01:28:45,949 --> 01:28:48,250 But they wouldn't listen. 2103 01:28:48,250 --> 01:28:49,060 - I won't-- 2104 01:28:49,060 --> 01:28:49,643 [END PLAYBACK] 2105 01:28:49,643 --> 01:28:51,310 They never listen. 2106 01:28:51,310 --> 01:28:54,630 So here's the idea of, again, just 2107 01:28:54,630 --> 01:28:57,090 thinking in terms of agentic, as I was saying earlier on, 2108 01:28:57,090 --> 01:28:58,590 breaking it into those steps. 2109 01:28:58,590 --> 01:29:01,069 That allowed me to use exactly the same engine 2110 01:29:01,069 --> 01:29:02,630 that, as I was showing you earlier on, 2111 01:29:02,630 --> 01:29:04,789 failed, to be able to show something 2112 01:29:04,789 --> 01:29:07,949 that works and is able to do things like portraying the emotion 2113 01:29:07,949 --> 01:29:09,309 that I just spoke about.
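The steps Laurence lists (understand intent, understand the tool, plan a prompt, generate, reflect) can be sketched as a small pipeline. Everything here is hypothetical: `call_llm` and `call_video_model` are stand-ins for real APIs such as Gemini and Veo, stubbed out so the sketch stays self-contained.

```python
# Minimal sketch of the agentic workflow described above. The two call_*
# functions are stubs; a real version would call an LLM / video API here.

def call_llm(prompt: str) -> str:
    """Hypothetical LLM call, stubbed for illustration."""
    return f"[LLM output for: {prompt.splitlines()[0]}]"

def call_video_model(prompt: str) -> str:
    """Hypothetical video-generation call (stands in for something like Veo)."""
    return f"[8-second clip for: {prompt[:50]}]"

def agentic_generate(user_request: str, story_context: str, tool_notes: str) -> dict:
    # Step A: understand the intent in the context of the whole story.
    intent = call_llm(f"Restate the user's intent.\nRequest: {user_request}\n"
                      f"Story so far: {story_context}")
    # Step B: plan a prompt that respects the tool's known strengths and limits.
    plan = call_llm(f"Write a detailed 8-second video prompt.\nIntent: {intent}\n"
                    f"Tool notes: {tool_notes}")
    # Step C: use the tool.
    clip = call_video_model(plan)
    # Step D: reflect; a real agent would loop back to Step B on a bad critique.
    critique = call_llm(f"Critique whether this clip matches the intent.\n"
                        f"Clip: {clip}\nIntent: {intent}")
    return {"intent": intent, "plan": plan, "clip": clip, "critique": critique}

result = agentic_generate(
    user_request="My son takes a slapshot and scores",
    story_context="A practice rink with empty stands and no crowd",
    tool_notes="Good at slow camera pulls and emotion; bad at fast action",
)
print(result["clip"])
```

The point of the structure is that the expensive generation call only ever sees a prompt that has already been grounded in the story context and the tool's limits.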
2114 01:29:09,310 --> 01:29:11,450 So I know we're a little bit over time. 2115 01:29:11,449 --> 01:29:12,529 So sorry about that. 2116 01:29:12,529 --> 01:29:14,569 I can take any questions if anybody has any. 2117 01:29:14,569 --> 01:29:15,889 I see Andrew is here as well. 2118 01:29:15,890 --> 01:29:16,725 He's at the back. 2119 01:29:16,725 --> 01:29:18,350 And I just really want to say thank you 2120 01:29:18,350 --> 01:29:19,490 so much for your attention. 2121 01:29:19,489 --> 01:29:21,409 I really appreciate it. 2122 01:29:21,409 --> 01:29:24,225 [APPLAUSE] 2123 01:29:28,909 --> 01:29:29,710 Yep. 2124 01:29:29,710 --> 01:29:34,180 How much of this new generation [INAUDIBLE] 2125 01:29:34,180 --> 01:29:38,539 relation with the agentic [INAUDIBLE] use case 2126 01:29:38,539 --> 01:29:40,939 is improved with the agentic workflow? 2127 01:29:40,939 --> 01:29:43,899 And how much of it is a training set bias 2128 01:29:43,899 --> 01:29:48,179 where you might have only pictures 2129 01:29:48,180 --> 01:29:53,220 or videos with [INAUDIBLE] that are full of [INAUDIBLE] 2130 01:29:53,220 --> 01:29:55,645 Yeah, it's a great question. 2131 01:29:55,645 --> 01:29:58,020 Just to repeat for the video: how much of the improvement 2132 01:29:58,020 --> 01:30:00,220 is from the use of an agentic workflow 2133 01:30:00,220 --> 01:30:03,340 versus just a lack of hockey stuff in the training set 2134 01:30:03,340 --> 01:30:06,060 for the failed one? 2135 01:30:06,060 --> 01:30:09,380 I'm not comparing like to like, so I'm just using my gut. 2136 01:30:09,380 --> 01:30:12,420 When I broke this down into the workflow, 2137 01:30:12,420 --> 01:30:14,619 I created scenes like this one, 2138 01:30:14,619 --> 01:30:18,699 and they were awful when I just did it directly for myself, 2139 01:30:18,699 --> 01:30:22,739 with no basis, no agentic workflow, no artificial understanding.
2140 01:30:22,739 --> 01:30:25,979 And when I broke it down into the steps where it's like, OK, 2141 01:30:25,979 --> 01:30:28,500 in this scene, the girl is sitting on the bench, 2142 01:30:28,500 --> 01:30:30,100 and she's upset. 2143 01:30:30,100 --> 01:30:34,940 And the person is talking to her and he wants to comfort her. 2144 01:30:34,939 --> 01:30:38,259 Feeding that to a large language model 2145 01:30:38,260 --> 01:30:40,659 along with the entire story and along 2146 01:30:40,659 --> 01:30:43,019 with the constraints that I had, where the shot 2147 01:30:43,020 --> 01:30:45,460 has to be eight seconds long, clear dialogue 2148 01:30:45,460 --> 01:30:47,340 and all of those kind of things, and then 2149 01:30:47,340 --> 01:30:50,860 to understand my intent from that one, 2150 01:30:50,859 --> 01:30:53,699 the LLM ended up expressing a prompt that 2151 01:30:53,699 --> 01:30:57,139 was far more loquacious than I ever would have, 2152 01:30:57,140 --> 01:30:59,840 that was far more descriptive than I ever would have. 2153 01:30:59,840 --> 01:31:01,860 The LLM had understanding of what 2154 01:31:01,859 --> 01:31:03,899 makes a good shot, what makes a good angle, what 2155 01:31:03,899 --> 01:31:06,719 makes good emotion far more than I would have. 2156 01:31:06,720 --> 01:31:08,800 I could spend hours trying to describe it. 2157 01:31:08,800 --> 01:31:10,739 So that first step in the agentic flow 2158 01:31:10,739 --> 01:31:13,340 of it doing that for me and understanding my intent 2159 01:31:13,340 --> 01:31:14,739 was huge. 2160 01:31:14,739 --> 01:31:17,800 The second step then is the tools that it's going to use. 2161 01:31:17,800 --> 01:31:20,800 So I explicitly said which video engine I'm going to be using. 
2162 01:31:20,800 --> 01:31:22,980 I was using Gemini as the LLM, and hopefully Gemini 2163 01:31:22,979 --> 01:31:25,079 is familiar with Veo, that kind of stuff, 2164 01:31:25,079 --> 01:31:27,340 so it understands the idiosyncrasies of doing things 2165 01:31:27,340 --> 01:31:28,539 with Veo. 2166 01:31:28,539 --> 01:31:30,210 What I learned, for example, is that Veo was 2167 01:31:30,210 --> 01:31:33,029 very bad at doing high-action scenes, 2168 01:31:33,029 --> 01:31:36,809 but is very good at doing slow camera pulls to convey emotion, 2169 01:31:36,810 --> 01:31:38,250 as you saw in this case. 2170 01:31:38,250 --> 01:31:40,329 So the LLM knew that from me declaring 2171 01:31:40,329 --> 01:31:41,630 I was using that as a tool. 2172 01:31:41,630 --> 01:31:43,449 And then further it built a prompt 2173 01:31:43,449 --> 01:31:45,809 and then further refined the prompt from that. 2174 01:31:45,810 --> 01:31:47,970 And then the third part is actually using the tool 2175 01:31:47,970 --> 01:31:50,690 to generate it for me. Generating a video 2176 01:31:50,689 --> 01:31:53,929 with something like Veo costs, I think, between $2 and $3 2177 01:31:53,930 --> 01:31:55,994 in credits to generate four videos. 2178 01:31:55,994 --> 01:31:57,369 So the last thing I want to do is 2179 01:31:57,369 --> 01:31:59,452 generate lots and lots and lots and lots of videos 2180 01:31:59,453 --> 01:32:01,170 and throw good money after bad. 2181 01:32:01,170 --> 01:32:04,449 But all of that token spend that I did earlier on 2182 01:32:04,449 --> 01:32:07,729 to understand my intent and then to make the plan for using 2183 01:32:07,729 --> 01:32:10,869 the agent paid off in the back end, where it got it right. 2184 01:32:10,869 --> 01:32:13,273 Maybe it didn't get it right the first time, 2185 01:32:13,273 --> 01:32:15,690 but it would very rarely take more than two or three tries 2186 01:32:15,689 --> 01:32:17,969 to get something that was really, really nice.
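The flow he describes, where an LLM expands a terse scene description into a detailed prompt using the full story, the shot constraints, and the declared engine, and generation is then retried only a bounded number of times to control credit spend, can be sketched roughly like this. It is a minimal illustration, not any real Gemini or Veo API: `expand_prompt`, `render_scene`, and the `llm`/`engine`/`judge` callables are all hypothetical stand-ins.

```python
# A rough sketch of the three-step agentic flow described above.
# Hypothetical names throughout: in practice `llm` would wrap a model
# such as Gemini, and `engine` would wrap a video model such as Veo.

MAX_TRIES = 3  # each generation costs real money, so cap the retries

def expand_prompt(llm, scene, story, constraints, engine_name):
    """Steps 1 and 2: have the LLM turn a terse scene description into
    a detailed prompt, aware of the full story, the shot constraints,
    and the declared engine's strengths and weaknesses."""
    request = (
        f"Target engine: {engine_name}\n"
        f"Constraints: {constraints}\n"
        f"Full story: {story}\n"
        f"Scene to render: {scene}\n"
        "Write a detailed, descriptive video-generation prompt."
    )
    return llm(request)

def render_scene(llm, engine, judge, scene, story, constraints,
                 engine_name="Veo"):
    """Step 3: generate with the engine, keeping the first accepted
    clip and giving up after MAX_TRIES rather than burning credits."""
    prompt = expand_prompt(llm, scene, story, constraints, engine_name)
    for _ in range(MAX_TRIES):
        clip = engine(prompt)
        if judge(clip):  # human (or automated) review of the result
            return clip
    return None  # don't throw good money after bad
```

Here `judge` stands in for the manual "two or three tries" of review he mentions; the expensive call (`engine`) sits behind the cheap planning call (`llm`), which is where the up-front token spend pays off.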
2187 01:32:17,970 --> 01:32:21,409 So I think, without comparing like with like, I 2188 01:32:21,409 --> 01:32:24,609 do think that plan of action and going through a workflow 2189 01:32:24,609 --> 01:32:27,529 worked very, very well. 2190 01:32:27,529 --> 01:32:32,679 Any other questions, thoughts, comments? 2191 01:32:32,680 --> 01:32:34,320 Yeah, up at the back. 2192 01:32:34,319 --> 01:32:37,599 What has surprised you the most about the AI 2193 01:32:37,600 --> 01:32:39,280 industry over the years? 2194 01:32:39,279 --> 01:32:41,639 What has surprised me the most about the AI 2195 01:32:41,640 --> 01:32:43,000 industry over the years? 2196 01:32:43,000 --> 01:32:45,760 Oh, that's a good one. 2197 01:32:45,760 --> 01:32:48,920 I think what has surprised me the most, 2198 01:32:48,920 --> 01:32:50,840 and it probably shouldn't have surprised me, 2199 01:32:50,840 --> 01:32:53,720 is how much hype took over. 2200 01:32:53,720 --> 01:32:56,520 I honestly thought a lot of people 2201 01:32:56,520 --> 01:32:58,760 who are in important decision-making roles 2202 01:32:58,760 --> 01:33:01,920 and that kind of thing would be able to see the signal better 2203 01:33:01,920 --> 01:33:03,840 than they did. 2204 01:33:03,840 --> 01:33:09,360 And I think the other part was that the desire to make 2205 01:33:09,359 --> 01:33:13,039 immediate profits as opposed to long-term gains 2206 01:33:13,039 --> 01:33:14,800 also surprised me a lot. 2207 01:33:14,800 --> 01:33:18,920 Let me share one story in that space. 2208 01:33:18,920 --> 01:33:22,279 After Andrew and I taught the TensorFlow 2209 01:33:22,279 --> 01:33:25,759 specializations on Coursera, Google 2210 01:33:25,760 --> 01:33:28,150 launched a professional certificate, 2211 01:33:28,149 --> 01:33:30,109 where the idea of this professional certificate 2212 01:33:30,109 --> 01:33:32,229 was that it would give a rigorous exam.
2213 01:33:32,229 --> 01:33:33,729 And at the end of the rigorous exam, 2214 01:33:33,729 --> 01:33:38,149 if you got the certificate, it was a high-prestige thing 2215 01:33:38,149 --> 01:33:40,189 that would help you find work, particularly 2216 01:33:40,189 --> 01:33:43,949 at the time, when TensorFlow was a very highly demanded skill 2217 01:33:43,949 --> 01:33:45,750 in order to get work. 2218 01:33:45,750 --> 01:33:49,909 Running that program cost Google $100,000 a year. 2219 01:33:49,909 --> 01:33:52,750 Drop in the bucket, not a lot of money. 2220 01:33:52,750 --> 01:33:56,710 The goodwill that came out of it was immense. 2221 01:33:56,710 --> 01:33:57,710 I can tell you-- 2222 01:33:57,710 --> 01:34:01,230 I'll tell one story very quickly. There was a young man 2223 01:34:01,229 --> 01:34:03,789 who went public in some advertising 2224 01:34:03,789 --> 01:34:08,310 with Google. He lived in Syria. 2225 01:34:08,310 --> 01:34:10,830 And we all know there was a huge civil war in Syria 2226 01:34:10,829 --> 01:34:12,390 over the last few years. 2227 01:34:12,390 --> 01:34:14,810 And he got the TensorFlow certificate. 2228 01:34:14,810 --> 01:34:16,970 He was one of the first in Syria to get it, 2229 01:34:16,970 --> 01:34:18,909 and it lifted him out of poverty, 2230 01:34:18,909 --> 01:34:21,029 where he was able to move to Germany 2231 01:34:21,029 --> 01:34:23,329 and get work at a major German firm. 2232 01:34:23,329 --> 01:34:25,779 And I met him at an event in Amsterdam 2233 01:34:25,779 --> 01:34:27,699 where he told me his story. 2234 01:34:27,699 --> 01:34:31,920 And now, because of the job that he had in this German firm, 2235 01:34:31,920 --> 01:34:34,699 he's able to support his family back home 2236 01:34:34,699 --> 01:34:36,460 and move them out of the war-torn zone 2237 01:34:36,460 --> 01:34:41,539 into a peaceful zone, all because he got this AI thing. 2238 01:34:41,539 --> 01:34:44,539 And there were countless stories like that.
2239 01:34:44,539 --> 01:34:47,039 Very inspirational, very beautiful stories. 2240 01:34:47,039 --> 01:34:48,539 But the thing that surprised me then 2241 01:34:48,539 --> 01:34:50,539 was sometimes the lack of investment 2242 01:34:50,539 --> 01:34:53,193 in that, where there was no revenue being generated 2243 01:34:53,193 --> 01:34:54,360 for the company out of that. 2244 01:34:54,359 --> 01:34:57,420 We deliberately kept it revenue neutral so 2245 01:34:57,420 --> 01:34:59,480 that the price of the exams could go down. 2246 01:34:59,479 --> 01:35:01,576 We wanted it to self-sustain. 2247 01:35:01,577 --> 01:35:03,159 It ended up not being revenue neutral. 2248 01:35:03,159 --> 01:35:06,220 It ended up costing the company about $100,000 to $150,000 2249 01:35:06,220 --> 01:35:06,840 a year. 2250 01:35:06,840 --> 01:35:08,569 So they canned it. 2251 01:35:08,569 --> 01:35:10,819 And it's a shame because of all the potential goodwill 2252 01:35:10,819 --> 01:35:12,279 that can come out of something like that. 2253 01:35:12,279 --> 01:35:13,939 But I think those were the two that 2254 01:35:13,939 --> 01:35:16,979 immediately jump to mind that have surprised me the most. 2255 01:35:16,979 --> 01:35:19,139 And then I guess one other part that I would say 2256 01:35:19,140 --> 01:35:24,850 is the people who've been able to be very successful with AI, 2257 01:35:24,850 --> 01:35:26,890 who you wouldn't think would be the ones that 2258 01:35:26,890 --> 01:35:29,910 would be successful with AI, have always been inspirational to me. 2259 01:35:29,909 --> 01:35:32,034 So allow me one more story. 2260 01:35:32,034 --> 01:35:32,909 I have a good friend. 2261 01:35:32,909 --> 01:35:34,529 I showed ice hockey a moment ago. 2262 01:35:34,529 --> 01:35:37,569 I have a good friend who is a former professional ice hockey 2263 01:35:37,569 --> 01:35:38,489 player.
2264 01:35:38,489 --> 01:35:40,689 Any ice hockey fans here? 2265 01:35:40,689 --> 01:35:43,489 It's a brutal sport. 2266 01:35:43,489 --> 01:35:46,090 You see a lot of fighting and a lot of stuff on the ice. 2267 01:35:46,090 --> 01:35:48,850 And he dropped out of school when he was 13 years old 2268 01:35:48,850 --> 01:35:50,530 to focus on skating. 2269 01:35:50,529 --> 01:35:52,170 And he will always tell everybody 2270 01:35:52,170 --> 01:35:55,497 that he's the dumbest person alive because he's uneducated. 2271 01:35:55,497 --> 01:35:56,829 He and I are complete opposites. 2272 01:35:56,829 --> 01:35:59,010 That's why we get on so well. 2273 01:35:59,010 --> 01:36:03,250 And he retired from ice hockey because of concussion issues. 2274 01:36:03,250 --> 01:36:05,489 And he now runs a nonprofit-- 2275 01:36:05,489 --> 01:36:08,090 he runs ice rinks for a nonprofit. 2276 01:36:08,090 --> 01:36:11,409 And about three years ago, we were having a beer, 2277 01:36:11,409 --> 01:36:13,529 and he was like, so tell me about AI. 2278 01:36:13,529 --> 01:36:15,269 And tell me about this ChatGPT thing. 2279 01:36:15,270 --> 01:36:16,315 Is it any good? 2280 01:36:16,314 --> 01:36:18,189 And I was just sharing the whole thing-- 2281 01:36:18,189 --> 01:36:19,939 yes, it's good, and all that kind of stuff. 2282 01:36:19,939 --> 01:36:22,689 And it was obviously a loaded question, and I didn't know why. 2283 01:36:22,689 --> 01:36:25,369 But part of his job at his nonprofit 2284 01:36:25,369 --> 01:36:27,329 is that every quarter, he has to present 2285 01:36:27,329 --> 01:36:30,250 to the board of directors the results of the operations 2286 01:36:30,250 --> 01:36:31,846 so that they can be funded properly, 2287 01:36:31,846 --> 01:36:33,429 because even though they're nonprofit, 2288 01:36:33,430 --> 01:36:35,289 they still need money to operate.
2289 01:36:35,289 --> 01:36:40,529 And he was spending upwards of $150,000 a year to bring 2290 01:36:40,529 --> 01:36:44,369 in consultants to pull the data from all of the different 2291 01:36:44,369 --> 01:36:45,256 sources. 2292 01:36:45,256 --> 01:36:47,089 They're pulling data from-- there are machines 2293 01:36:47,090 --> 01:36:49,690 in what's called the pump room that has a compressor that 2294 01:36:49,689 --> 01:36:50,589 cools the ice. 2295 01:36:50,590 --> 01:36:52,810 And there were spreadsheets and there were accounts 2296 01:36:52,810 --> 01:36:53,935 and all this kind of stuff. 2297 01:36:53,935 --> 01:36:56,690 And he was not tech savvy in any way. 2298 01:36:56,689 --> 01:36:59,469 But he needed to process all this data. 2299 01:36:59,470 --> 01:37:02,350 So he did an experiment where he got ChatGPT to do it. 2300 01:37:02,350 --> 01:37:03,770 And this was the loaded question, 2301 01:37:03,770 --> 01:37:05,162 asking me if it was any good. 2302 01:37:05,162 --> 01:37:06,869 And so we talked through it a little bit. 2303 01:37:06,869 --> 01:37:08,569 And then he told me why. 2304 01:37:08,569 --> 01:37:10,206 And so I took a look at the results, 2305 01:37:10,207 --> 01:37:11,789 because he was uploading spreadsheets. 2306 01:37:11,789 --> 01:37:13,930 He was uploading PDFs and all this kind of thing 2307 01:37:13,930 --> 01:37:15,670 and getting it to assemble a report. 2308 01:37:15,670 --> 01:37:18,529 And now it takes him about two hours to do the report himself 2309 01:37:18,529 --> 01:37:19,689 with ChatGPT. 2310 01:37:19,689 --> 01:37:22,039 And it worked, and it worked brilliantly. 2311 01:37:22,039 --> 01:37:25,880 And that $150,000 a year that he's saving on consulting is now 2312 01:37:25,880 --> 01:37:29,500 going to underprivileged kids for hockey equipment, 2313 01:37:29,500 --> 01:37:31,159 for ice skating equipment, for lessons, 2314 01:37:31,159 --> 01:37:32,340 and all of that kind of thing.
2315 01:37:32,340 --> 01:37:34,797 So it was taken out of the hands of an expensive consulting 2316 01:37:34,797 --> 01:37:37,018 company and put into the hands of people. 2317 01:37:37,018 --> 01:37:38,560 Because of this guy-- and he says he's 2318 01:37:38,560 --> 01:37:40,960 the dumbest person alive, but-- 2319 01:37:40,960 --> 01:37:44,840 I hope he's not watching this video. 2320 01:37:44,840 --> 01:37:47,159 And I told him afterwards that, congratulations, you're 2321 01:37:47,159 --> 01:37:48,639 now a developer. 2322 01:37:48,640 --> 01:37:51,320 And he didn't like that. 2323 01:37:51,319 --> 01:37:55,159 But it's surprises like that-- the superpowers that were 2324 01:37:55,159 --> 01:37:58,599 handed to somebody like him, who's not technical in any 2325 01:37:58,600 --> 01:38:01,600 way, but was able to effectively build a solution 2326 01:38:01,600 --> 01:38:05,320 that saved his nonprofit $100,000 or $150,000 a year. 2327 01:38:05,319 --> 01:38:07,759 And things like that are always surprising me 2328 01:38:07,760 --> 01:38:08,900 in a very pleasant way. 2329 01:38:12,039 --> 01:38:12,800 Yep. 2330 01:38:12,800 --> 01:38:13,300 Sorry. 2331 01:38:13,300 --> 01:38:14,020 I'll get to you next. 2332 01:38:14,020 --> 01:38:14,520 Sorry. 2333 01:38:14,520 --> 01:38:15,360 Yeah. 2334 01:38:15,359 --> 01:38:20,630 For engineers like us, it's easier to navigate the hype 2335 01:38:20,630 --> 01:38:24,869 because we can understand what the signal is from a research 2336 01:38:24,869 --> 01:38:25,630 paper. 2337 01:38:25,630 --> 01:38:30,230 But how about people who don't have this knowledge, like, 2338 01:38:30,229 --> 01:38:36,029 from humanities or something [INAUDIBLE]? 2339 01:38:36,029 --> 01:38:38,167 Yeah, so just to repeat the question for the video. 2340 01:38:38,167 --> 01:38:39,710 For engineers like us, sometimes it's 2341 01:38:39,710 --> 01:38:42,167 easy to navigate the hype, to see the signal from the noise.
2342 01:38:42,167 --> 01:38:45,829 But what about people who don't have the same training as us? 2343 01:38:45,829 --> 01:38:49,789 I think that's our opportunity to be trusted advisors for them 2344 01:38:49,789 --> 01:38:53,229 and to really help them through that, to understand it. 2345 01:38:53,229 --> 01:38:55,750 I think the biggest part in the hype story 2346 01:38:55,750 --> 01:38:59,229 right now is just understanding the reward mechanism: 2347 01:38:59,229 --> 01:39:01,669 that everything rewards engagement rather than 2348 01:39:01,670 --> 01:39:03,149 actual substance. 2349 01:39:03,149 --> 01:39:05,909 And to me, step one is seeing through that. 2350 01:39:05,909 --> 01:39:08,309 The story I just told about my friend, 2351 01:39:08,310 --> 01:39:10,250 he'd seen all this kind of stuff, 2352 01:39:10,250 --> 01:39:12,409 but he wasn't willing to bet his career on it. 2353 01:39:12,409 --> 01:39:14,269 But he needed that kind of advice 2354 01:39:14,270 --> 01:39:16,847 around it, to start peeling apart what he had done, 2355 01:39:16,846 --> 01:39:18,679 what he did right, and what he did wrong. 2356 01:39:18,680 --> 01:39:23,180 And so positioning ourselves to be trusted advisors, 2357 01:39:23,180 --> 01:39:24,940 by not leaning into the same mistakes 2358 01:39:24,939 --> 01:39:27,159 that untrained people may be leaning into, 2359 01:39:27,159 --> 01:39:29,300 I think is the key to that.
2360 01:39:29,300 --> 01:39:32,659 And just understanding that the average person is generally 2361 01:39:32,659 --> 01:39:35,019 very intelligent, even if they may not 2362 01:39:35,020 --> 01:39:37,980 be experts in a specific domain, and to key 2363 01:39:37,979 --> 01:39:41,779 in on that intelligence and help them to foster and grow that, 2364 01:39:41,779 --> 01:39:44,550 and navigate them through the parts 2365 01:39:44,550 --> 01:39:46,300 where they'll have difficulty, and let them 2366 01:39:46,300 --> 01:39:49,100 shine in what they're very, very good at. 2367 01:39:49,100 --> 01:39:51,579 Over here there was one. 2368 01:39:51,579 --> 01:39:53,779 I have a question more for AI and machine 2369 01:39:53,779 --> 01:39:55,519 learning for scientific research. 2370 01:39:55,520 --> 01:39:56,060 OK. 2371 01:39:56,060 --> 01:39:59,340 Which is something that is very hard [INAUDIBLE] 2372 01:39:59,340 --> 01:40:01,039 to get your perspective on. 2373 01:40:01,039 --> 01:40:03,220 Where do you think that is a good idea 2374 01:40:03,220 --> 01:40:06,659 and where you might say, maybe be cautious? 2375 01:40:06,659 --> 01:40:09,960 So AI and machine learning for scientific research, 2376 01:40:09,960 --> 01:40:14,340 where is it a good idea and where should you be cautious? 2377 01:40:14,340 --> 01:40:16,250 Ooh. 2378 01:40:16,250 --> 01:40:20,090 My initial gut check would be I think it's always a good idea. 2379 01:40:20,090 --> 01:40:23,489 I think there's no harm in using the tools that you have 2380 01:40:23,489 --> 01:40:26,729 available to you, but always double-check 2381 01:40:26,729 --> 01:40:29,209 your results and double-check your expectations 2382 01:40:29,210 --> 01:40:31,730 against the grounded reality. 2383 01:40:31,729 --> 01:40:36,209 I've always been a fan of using automation in research 2384 01:40:36,210 --> 01:40:37,189 as much as possible.
2385 01:40:37,189 --> 01:40:40,694 My undergraduate was physics, many, many years ago, 2386 01:40:40,694 --> 01:40:42,569 and I was actually very successful in the lab 2387 01:40:42,569 --> 01:40:44,929 because I usually automated things through a computer 2388 01:40:44,930 --> 01:40:47,369 that other people did by hand with pen and paper. 2389 01:40:47,369 --> 01:40:48,010 So I could 2390 01:40:48,010 --> 01:40:49,270 move quickly. 2391 01:40:49,270 --> 01:40:51,190 So I know I'm biased in that regard. 2392 01:40:51,189 --> 01:40:54,309 But I would say, for most research, for the most part, 2393 01:40:54,310 --> 01:40:57,390 I think use the most powerful tools you have available, 2394 01:40:57,390 --> 01:40:58,910 but check your expectations. 2395 01:41:03,210 --> 01:41:07,630 A little story, actually, on that side. Trivia question: 2396 01:41:07,630 --> 01:41:10,010 poorest country in Western Europe. 2397 01:41:10,010 --> 01:41:11,270 Anybody know? 2398 01:41:11,270 --> 01:41:12,130 Serbia? 2399 01:41:12,130 --> 01:41:12,909 What's that? 2400 01:41:12,909 --> 01:41:13,819 No, Western. 2401 01:41:13,819 --> 01:41:16,799 In Western Europe, it's Wales. 2402 01:41:16,800 --> 01:41:19,260 So I actually did my undergraduate in Wales, 2403 01:41:19,260 --> 01:41:22,119 and I went back to do some lectures in the university 2404 01:41:22,119 --> 01:41:23,039 there. 2405 01:41:23,039 --> 01:41:26,300 And I met with a researcher there, 2406 01:41:26,300 --> 01:41:29,480 and he was doing research into brain cancer 2407 01:41:29,479 --> 01:41:32,119 using various types of computer 2408 01:41:32,119 --> 01:41:32,731 imagery. 2409 01:41:32,731 --> 01:41:34,439 And I asked him, well, what's the biggest 2410 01:41:34,439 --> 01:41:35,939 problem that you have? 2411 01:41:35,939 --> 01:41:38,099 What's the biggest blocker for your research? 2412 01:41:38,100 --> 01:41:39,960 And this is about eight years ago.
2413 01:41:39,960 --> 01:41:43,760 And his answer was access to a GPU. 2414 01:41:43,760 --> 01:41:46,840 And because for him to be able to train his models 2415 01:41:46,840 --> 01:41:50,079 and run his models, he needed to be able to access a GPU. 2416 01:41:50,079 --> 01:41:52,960 And the department that he was in 2417 01:41:52,960 --> 01:41:55,520 had one GPU between 10 researchers, 2418 01:41:55,520 --> 01:41:57,760 which meant that everybody got it for half a day. 2419 01:41:57,760 --> 01:41:59,467 Monday through Friday, and his half a day 2420 01:41:59,467 --> 01:42:00,800 was Tuesday afternoon. 2421 01:42:00,800 --> 01:42:02,760 So in his case, he would spend the entire time 2422 01:42:02,760 --> 01:42:05,000 that wasn't Tuesday afternoon preparing everything 2423 01:42:05,000 --> 01:42:07,100 for his model run or his model training 2424 01:42:07,100 --> 01:42:08,100 or everything like that. 2425 01:42:08,100 --> 01:42:11,020 And then Tuesday afternoon, once he had access to the GPU, 2426 01:42:11,020 --> 01:42:12,780 then he would do the training. 2427 01:42:12,779 --> 01:42:14,237 And then he would hope in that time 2428 01:42:14,238 --> 01:42:16,655 that he would train his model and he would get the results 2429 01:42:16,654 --> 01:42:17,340 that he wanted. 2430 01:42:17,340 --> 01:42:20,440 Otherwise, he'd have to wait a week to get access to the GPU 2431 01:42:20,439 --> 01:42:21,399 again. 2432 01:42:21,399 --> 01:42:23,439 And then I showed him Google Colab. 2433 01:42:23,439 --> 01:42:25,599 Anybody ever used Google Colab? 2434 01:42:25,600 --> 01:42:27,960 And you can have a GPU in the cloud 2435 01:42:27,960 --> 01:42:29,579 for free with that kind of thing. 
2436 01:42:29,579 --> 01:42:32,760 And the poor guy's brain melted-- 2437 01:42:32,760 --> 01:42:34,800 because I took out my phone, and I showed him 2438 01:42:34,800 --> 01:42:37,180 a notebook running on my phone in Google Colab 2439 01:42:37,180 --> 01:42:38,180 and training it on that. 2440 01:42:38,180 --> 01:42:41,240 And it changed everything for him, research-wise. 2441 01:42:41,239 --> 01:42:44,099 And now-- and this was just with Colab-- 2442 01:42:44,100 --> 01:42:46,520 he had much more than he had with his shared GPU. 2443 01:42:46,520 --> 01:42:49,477 So I think for someone like him, machine learning 2444 01:42:49,476 --> 01:42:51,059 was an important part of his research, 2445 01:42:51,060 --> 01:42:55,400 but he was so gated on it that the ability to widen access 2446 01:42:55,399 --> 01:42:57,927 to that ended up really, really advancing his research. 2447 01:42:57,927 --> 01:42:59,220 I don't know where it ended up. 2448 01:42:59,220 --> 01:43:00,220 I don't know what he has done. 2449 01:43:00,220 --> 01:43:01,740 It has been a few years since then. 2450 01:43:01,739 --> 01:43:06,920 But that story just came to mind when you asked the question. 2451 01:43:06,920 --> 01:43:09,680 Any more questions? 2452 01:43:09,680 --> 01:43:11,210 Feel free to ask me anything. 2453 01:43:14,390 --> 01:43:14,890 Oh, yeah. 2454 01:43:14,890 --> 01:43:15,950 At the front here. 2455 01:43:15,949 --> 01:43:17,510 It's more of a general question. 2456 01:43:17,510 --> 01:43:21,270 You talked about AI helping food and beverage use. 2457 01:43:21,270 --> 01:43:25,590 Do you think AI would be a force of social equality 2458 01:43:25,590 --> 01:43:27,470 or social inequality? 2459 01:43:27,470 --> 01:43:31,909 So can AI be a force of social equality or social inequality? 2460 01:43:31,909 --> 01:43:34,909 I think the answer to that is yes. 2461 01:43:34,909 --> 01:43:37,349 It can be both, and it can be neither.
2462 01:43:37,350 --> 01:43:39,470 I mean, I think that ultimately, 2463 01:43:39,470 --> 01:43:45,310 in my opinion, any tool can be used for any means, 2464 01:43:45,310 --> 01:43:48,310 so the important thing is to educate and inspire people 2465 01:43:48,310 --> 01:43:51,230 towards using things for the correct means. 2466 01:43:51,229 --> 01:43:53,369 There's only so much governance that can be applied. 2467 01:43:53,369 --> 01:43:56,269 And sometimes governance can cause more problems 2468 01:43:56,270 --> 01:43:58,150 than it solves. 2469 01:43:58,149 --> 01:44:03,349 So I always love to live my life by assuming good intent 2470 01:44:03,350 --> 01:44:05,570 but preparing for bad intent. 2471 01:44:05,569 --> 01:44:07,069 And in the case of AI, I don't think 2472 01:44:07,069 --> 01:44:09,527 there's any difference there: everything that I will do 2473 01:44:09,528 --> 01:44:12,240 and everything that I would advise assumes good intent, 2474 01:44:12,239 --> 01:44:14,420 that people would use it for good things, 2475 01:44:14,420 --> 01:44:18,100 but also to be prepared for it to be misused. 2476 01:44:18,100 --> 01:44:20,660 The bad examples that I showed earlier on, I think, 2477 01:44:20,659 --> 01:44:24,500 were good intent rather than bad intent. 2478 01:44:24,500 --> 01:44:26,739 And most mistakes that I see are 2479 01:44:26,739 --> 01:44:29,260 good intent being applied mistakenly, as 2480 01:44:29,260 --> 01:44:30,400 opposed to bad intent. 2481 01:44:30,399 --> 01:44:33,199 But I would say that's the only mantra-- 2482 01:44:33,199 --> 01:44:35,979 the only advice that I can give: always 2483 01:44:35,979 --> 01:44:40,379 assume good intent, but prepare for bad intent. 2484 01:44:40,380 --> 01:44:42,420 The AI itself has no choice. 2485 01:44:42,420 --> 01:44:43,840 It's how people use it.
2486 01:44:46,659 --> 01:44:49,659 Andrew, did you want closing comments or-- 2487 01:44:49,659 --> 01:44:53,579 I think we were running out [INAUDIBLE] time. 2488 01:44:53,579 --> 01:44:55,738 But thank you for this. 2489 01:44:55,738 --> 01:44:56,280 Really great. 2490 01:44:56,279 --> 01:44:57,904 Thanks, everyone, for all the questions 2491 01:44:57,904 --> 01:45:00,019 on those creative solutions. 2492 01:45:00,020 --> 01:45:00,520 All right. 2493 01:45:00,520 --> 01:45:01,240 Thank you, Andrew. 2494 01:45:01,239 --> 01:45:01,739 Thanks. 2495 01:45:01,739 --> 01:45:04,289 [APPLAUSE]