Hello and welcome to the program. The AI revolution runs on electricity. Lots of it. Vast amounts of it. The data centers being built right now to power the next generation of AI consume as much energy as a small city. Last year, the world's biggest tech companies spent more than $400 billion building them. And they need more of them, many more of them. But there's a problem. In the United States, four in ten of the data centers planned for this year are at serious risk of delay. Not enough power, not enough equipment, and not enough people to build them. Why?
Also on the program this week: Coachella, the world's most glamorous music festival. But some of the online influencers you may have seen pictured in the California desert alongside the biggest stars were fake. Not real. They don't exist. AI-generated, to promote brands and to make money. Plus, we'll also talk this week about the late Val Kilmer appearing in a new film one year after his death.
With us this week to talk about it: Parmy Olson, who is a technology columnist at Bloomberg. Welcome to you.
>> Thank you.
>> Also here, Dr. Sasha Luccioni, a computer scientist specializing in AI and its environmental impact. And also in the studio with us, good to have her back here, Dr. Stephanie Hare: colleague, author, and AI expert, to give you your full title. Parmy, let us start with this issue of data centers. We're building lots of them. They are powering the AI revolution. Why are so many of them on hold?
>> Yeah, so this is really an issue of bottlenecks, and you sort of alluded to it in your introduction. There's the issue of power: getting access to actual power on electricity grids that are very, very old and haven't received a lot of investment over many decades in the United States.
Also equipment: getting access to things like transformers or switchgear, which are the types of equipment you need to actually run data centers. There's a huge bottleneck getting them. It could take up to five years to get some of that equipment, particularly because a lot of it comes from China, and recent US tariffs on Chinese goods have made that even more difficult. And then there's also just getting the people, the talent, the electricians, the people with the skill set who can actually construct and run these data centers. So all those things combined have meant that at a time when there is this rapacious demand for energy from tech companies, it's actually very difficult to build them fast enough.
>> Does that problem become more acute in the energy crisis we're in currently?
>> I think it does to some extent, and it's more of a problem for the so-called hyperscalers, the big tech companies like Meta, Alphabet, Amazon, etc., who are actually the ones who have to shell out for these energy costs. And so they're setting up these kind of mini nuclear reactors that can provide energy specifically for the data center and use renewable sources instead of gas. But at the moment, even getting those up and running is logistically very, very difficult. It's time-consuming. There aren't actually any that are operational just yet. And so right now, I think the main source of energy is gas.
>> Should we give our viewers a sense of the scale of what we're talking about here? Scott Galloway, who might be coming on the program next week over in New York, says OpenAI alone, Stephanie, will need 20% of current US electricity capacity, at $10 trillion.
>> That's extraordinary. It's extraordinary for a company that is yet to turn a profit and which is under huge pressure to demonstrate value ahead of an IPO as well. So it's just worth saying that all of these companies, not just OpenAI, were making very big promises about their infrastructure buildout plans as of last year. We've already seen the US-UK technology deal that's on hold. We've seen OpenAI have to pull back on some of Stargate, the big US data infrastructure plan. So maybe we are walking back. So that question of, will the AI bubble burst? It might not burst. It might just sort of slightly deflate.
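For scale, here is a rough back-of-envelope check of that claim, as a minimal sketch. The capacity and household figures are outside assumptions, not numbers from the program: US installed generating capacity is commonly put at roughly 1.2 terawatts, and an average US home draws a bit over 1 kW on average.

```python
# Back-of-envelope check of the "20% of US electricity capacity" claim.
# All inputs are rough outside assumptions, not figures from the program.

US_CAPACITY_GW = 1200   # assumed: ~1.2 TW of US installed generating capacity
CLAIMED_SHARE = 0.20    # the claim: OpenAI alone would need 20% of it
AVG_HOME_KW = 1.2       # assumed: average continuous draw of a US home (~10,500 kWh/yr)

needed_gw = US_CAPACITY_GW * CLAIMED_SHARE
homes_equivalent = needed_gw * 1e6 / AVG_HOME_KW  # GW -> kW, divided by per-home draw

print(f"Claimed need: {needed_gw:.0f} GW")
print(f"Roughly the average draw of {homes_equivalent / 1e6:.0f} million US homes")
# -> ~240 GW, on the order of 200 million homes' average consumption
```

Under those assumptions the claim amounts to roughly 240 GW of demand, which is why the panel treats it as an extraordinary figure.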
>> It's a difficult situation, isn't it? When you think that, actually, if you're going to plug these into the grid, and you don't know whether these companies are going to survive at this scale, or perhaps become even bigger than they are right now, it's a difficult thing to plan.
>> It's a difficult thing to plan when you're not a planned economy. Which is why, when we're looking at the data center rollout in a country like China and comparing that to the United States, or indeed here in Europe, we get very different pictures.
>> Right. Dr. Luccioni, Sasha, I'm going to call you Sasha. Does it make sense for the US president to be so vehemently opposed to renewable energy, given the scale we're talking about?
>> Well, the problem is that the data centers are being built so quickly that renewable capacity has trouble keeping up, especially in rural areas, especially outside of places where renewables are typically the norm. So I think that currently the emphasis is build faster, build bigger, and they don't want to wait around for solar or wind. Which is why, essentially, most of the data centers that are coming online as quickly are bringing in turbines on the back of trucks: natural gas. It's like bring your own energy, essentially, and most of that is non-renewable.
>> I mean, these companies have the money.
>> They have the money, but actually there's currently a bottleneck even when you have the money, because there aren't enough turbines to power all these data centers. There's a backlog nowadays, and even these turbines can't be produced fast enough to respond to demand.
>> Parmy, at Bloomberg recently you highlighted an issue in northern Spain with the data center buildout there, which has actually been held up as a model for the rest of Europe. But for the people who live around these projects, the reality is sometimes very different. Why?
>> I think it's a common story we're also seeing in the United States: a lot of pushback from local residents in areas where companies want to build data centers. And in northern Spain, the situation is that AWS, which is the cloud business of Amazon, sent letters to local people saying, we want to buy your land, giving them sometimes four days' notice to say yes or no. And some of these people in northern Spain actually thought it was a scam at first. One lady went to her local town hall, and even they didn't know. So it's almost a real land grab, to try and get land that is relatively cheap, in an area where energy costs are relatively low and that is sparsely populated as well. It seems like an ideal situation for building a data center, but at the same time there is the reality for the people who do live there, and there are people who live there: they have to give up that land, or suddenly you've got this eyesore in a place that you've lived in for many generations.
>> If you're in a community like that, though, and you've already struggled to get natural resources, or to get electricity, to get yourselves on the grid, does the arrival of a big AI company help in that process? Perhaps it could help a community?
>> In some respects. And the funny part is that governments, local governments, often frame data center buildouts as being great for jobs.
>> Yeah.
>> But I think you're conflating, in that situation, permanent jobs with construction jobs, which are temporary. And so when you build out a data center...
>> They're not necessarily big employers once the kit is there?
>> No. I think in a typical data center you might have about 100 people, most of them cleaners and security staff. Whereas for the buildout, sure, hundreds, maybe thousands of people. But that's only temporary.
>> All right, I'm going to bring in an audience question quite early in the program this week, because it fits what you're talking about. It's from James in the UK. He says: "Sasha, AI companies continue to minimize their environmental impact." He points specifically to Sam Altman's recent claim that AI's water usage is minimal. James says that's simply not true. He also tells us that younger generations are increasingly boycotting generative AI for environmental reasons. So here's his question: should mainstream media be doing more to hold these companies to account?
>> Definitely. Actually, a recent Guardian study found that the big tech companies were lobbying very, very hard against transparency, citing confidentiality, to avoid including any energy figures or water figures about data centers. So we're seeing them play dirty, and I think it's time to ask for accountability. Especially at a time when people are increasingly sustainability-conscious, when we make our decisions based on environmental and ethical concerns, we need this information, whether it's for choosing one AI model over another, or for using AI or not using AI, right? There are lots of decisions that we make on an everyday basis that we just don't have the information for. And especially since AI has become such a common technology, we definitely need these numbers, and these companies have them. It's just a matter of giving them positive, and less positive, incentives for sharing them.
>> Well, let's try and choose to look at this positively, because we're all using the technology. We're going to use it in our work. So we need these companies to be successful if we're going to employ AI fully. What does a responsible data center look like, Sasha?
>> You can definitely create them in a way that's more integrated into existing infrastructure. Currently, data centers are being built out in a very bigger-is-better kind of way. Typically they're outside of cities, and they're huge, warehouse-sized. But they can really be integrated: smaller data centers can go in basements, and the heat can be reused to heat office buildings or university campuses. It's much easier to use renewable energy, or at least a mix of renewable energy, if less capacity is needed.
>> You think that maybe part of the answer is then partnering with other companies?
>> Partnering, and rethinking the paradigm. Currently it's: we need the biggest data centers, we need sovereign AI, we need bigger, let's build it out. Even in Canada it's the same thing: we need our own data center, let's build it out. But instead of the when-you-have-a-hammer-everything's-a-nail approach, I think we should be thinking about the nails that we have: okay, so what do we need this data center for? Is it for a university? Is it for a private company? Is there a way of, for example, incentivizing some mix of renewables, or helping them build it out in a way that isn't a bring-your-own-turbine-on-a-truck kind of situation? So I think there are ways of being more agile if we rethink our way of doing AI. And it's not only for data centers; the same goes for AI models. Instead of saying we need the biggest, most energy-intensive model for every single task, we can have smaller models, for example on-device models. Instead of having every query dispatched to the cloud, we can have AI models running locally on our smartphones and computers. So I think we should really be rethinking a little the way that we design and deploy AI currently.
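As a concrete illustration of that last point, here is a minimal sketch of the kind of routing Sasha describes: serve a query from a small local model when it can handle it, and only fall back to a large cloud-hosted model when needed. The heuristic, the energy figures, and the `local_generate`/`cloud_generate` helpers are hypothetical placeholders, not anything from the program.

```python
# Hypothetical sketch: route queries to a small on-device model by default,
# falling back to a large cloud model only for hard queries.
# Names, thresholds, and per-query energy figures are illustrative assumptions.

WH_PER_QUERY = {"local-small": 0.05, "cloud-large": 3.0}  # assumed energy costs

def looks_hard(query: str) -> bool:
    # Toy heuristic standing in for a real difficulty classifier.
    return len(query.split()) > 50 or "step by step" in query.lower()

def local_generate(query: str) -> str:
    return f"[local-small answer to: {query!r}]"  # placeholder on-device model

def cloud_generate(query: str) -> str:
    return f"[cloud-large answer to: {query!r}]"  # placeholder hosted model

def answer(query: str) -> tuple[str, float]:
    """Return (answer, estimated watt-hours used)."""
    if looks_hard(query):
        return cloud_generate(query), WH_PER_QUERY["cloud-large"]
    return local_generate(query), WH_PER_QUERY["local-small"]

text, wh = answer("What's the capital of France?")
print(text, f"~{wh} Wh")  # simple query served locally at a fraction of the energy
```

Under these assumed figures, every simple query kept on-device uses a small fraction of the energy of a cloud round trip, which is the design choice Sasha is pointing at.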
>> A quick question, just to satisfy my curiosity, Sasha, and quick answers if you could. There are some companies that are developing air-cooling systems to reduce water consumption. Do they work?
>> Yes, but it's often a trade-off of using more energy and less water. It's true that you can, for example, recycle water. Essentially, water gets cycled through and it heats up, and you have to cool it down. So either you need cooling towers, or sometimes you cool it down with energy, with electricity. And so it's often a trade-off where they're using more energy but less water. It's a closed-loop system.
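To make that trade-off concrete, here is a toy comparison under stated assumptions. The water and energy coefficients below are illustrative placeholders (evaporative cooling is often quoted in the low liters-per-kWh range), not measured figures from any operator or from the program.

```python
# Toy comparison of two cooling strategies for the same IT load.
# All coefficients are illustrative assumptions, not operator data.

IT_LOAD_MWH = 1000  # one month of IT energy for a hypothetical facility

def evaporative(it_mwh: float) -> dict:
    # Assumed: ~1.8 L of water evaporated per kWh of IT load, small energy overhead.
    return {"water_m3": it_mwh * 1000 * 1.8 / 1000, "extra_energy_mwh": it_mwh * 0.05}

def dry_chillers(it_mwh: float) -> dict:
    # Assumed: near-zero water, but a larger electricity overhead for chillers/fans.
    return {"water_m3": 0.0, "extra_energy_mwh": it_mwh * 0.15}

for name, fn in [("evaporative", evaporative), ("dry chillers", dry_chillers)]:
    r = fn(IT_LOAD_MWH)
    print(f"{name:>12}: {r['water_m3']:6.0f} m^3 water, "
          f"{r['extra_energy_mwh']:4.0f} MWh extra energy")
# Same IT load, opposite resource pressure: saving water costs extra electricity.
```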
>> Yeah. And at the outset, you said that very often these data centers are outstripping what the renewable industry can provide for them. But there are good examples, and I wanted to point to them, where data centers have been sited very close to renewable energy. Iceland is using geothermal, Norway hydroelectric. Is that an example that other countries should be following?
>> Yes, but I think that very few countries, in the current state of things, have that extra capacity. And if these data centers continue at this scale: for example, if a data center uses as much energy as 100,000 homes, there are very few renewable grids that can provide that energy at such short notice. Even in Quebec, where I live, we have hydro, but we don't have the extra capacity for the equivalent of an extra hundred thousand homes to be built in two years. It has to be gradual. And so it's really the timelines that often don't line up, and this is why natural gas is the cheapest, fastest solution. And often there are long-term plans. Often it's, well, in ten years we're going to do renewables, in ten years we're going to do this. But in the meantime, it adds a lot of emissions.
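For a sense of what that 100,000-homes figure implies in grid terms, a quick back-of-envelope conversion, using the same assumed per-home draw as before (an outside assumption, not a figure from the program):

```python
# Rough conversion of "as much energy as 100,000 homes" into grid terms.
# Per-home figure is an outside assumption (~10,500 kWh/yr, ~1.2 kW average draw).

HOMES = 100_000
AVG_HOME_KW = 1.2

demand_mw = HOMES * AVG_HOME_KW / 1000  # kW -> MW
print(f"~{demand_mw:.0f} MW of continuous demand")
# -> ~120 MW: a mid-sized power plant's worth of load arriving at a single site
```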
>> Okay. Well, you might have questions on what you've been hearing about data centers. You might have some strong thoughts on it: aidecoded@bbc.co.uk. Now, since Stephanie has been focusing on clarity and regulation, I've got a story for her. Let me show you some images. These are images that look entirely real, but the people in them are fake.
>> My Coachella week was so much fun. Let me take you around. It's a secret. You can be in Coachella as me, just by a few prompts. Stay with me till the end for the prompts.
>> They are computer-generated influencers who were seen photographed on Instagram alongside some of the most famous people at Coachella, which, as those in the know will know, is this very trendy music festival in one of the desert valleys in California. How many of those engaging with these photographs knew that the people they were pictured alongside were fake? I would suggest not very many. I'm not even sure that Coachella knew that there were fake influencers in the crowd. Stephanie, we've talked on the program before about AI-generated beauty and the impact it has on young people. This, for me, actually, I was reading about it this week, feels like the next chapter of that.
>> Yeah. And again, the law is just not fit for purpose on this. I think we're really going to have to get to a point where we have laws on the books that say if you have something pretending to be a human being, it has to be labeled. It just has to, because you're dealing with children first of all, so anyone that's under the age of 18 needs to be protected. But you're also dealing with older people. You're also dealing with the potential for scams, for fraud, for misinformation and disinformation. So this would just solve a lot of things.
>> Parmy, who's behind these images? What do they want?
>> From what I understand, it is mostly agencies. It's not a cottage industry of people working from home. There are agencies, most of the time in Europe, in places like London and on the continent, who are producing these as branding exercises and as an opportunity for a brand to get a sponsorship.
If you think about it, an influencer who has a brand sponsorship deal will be quite costly, because if they want to go to Coachella, they want to fly business class, maybe. They want a hotel, they want some other freebies. But if you have an influencer who you are sponsoring to hold your can of whatever in the photograph, they're not going to have a bad day, or get old, or look weird in the photo. They're always going to look great. It's funny: ahead of this program, I actually looked at some of these influencers who were "at" Coachella, and it was amazing. One of them had about 170,000 followers, and there were pictures of her with Justin Bieber, with the Kardashians, with Madonna, and no one in the comments was saying this isn't real. All the comments were kind of congratulatory, and there was no disclosure at all on the Instagram profile that it was AI-generated. So I think a lot of people in good faith would look at it and think this is real.
>> The obvious problem, Sasha, is that the very famous person who's gone to Coachella can say to someone who might be advertising kryptonite next to them, look, I don't want to be advertising kryptonite, and they can push them away. Here they have no choice. They have no say in an AI-generated person being put next to them in a photograph that they posed for unknowingly.
>> Yeah. In a world of AI agents, humans lose their own agency, I think, to some extent, and especially famous people, because there are so many likenesses of them on the internet that it's now very, very easy to generate a false image or video of a celebrity.
>> Stephanie, didn't we talk about New York bringing in new regulation to stop this?
I think you had to declare on your website whether you were using an AI-generated influencer. And some of these pictures from Coachella do do that, but plenty of them don't.
>> Yeah. And that's the enforcement thing. There are all sorts of laws that are obvious...
>> And from state to state it's different. Right?
>> Exactly. And whose job is it to police that, and how are they able to get the accountability that they need? So again, this is a case of: if you were to take them to court, that's going to take years, right? It's going to cost a lot of money, etc. So it's kind of like everything we saw about accountability with social media not being very effective.
>> I mean, Coachella themselves could just say: enough, you can't do this. It's up to the organizer.
>> They absolutely could. And, you know, what you mentioned earlier about the celebrities: you might want to play the world's tiniest violin for these people for being in these photographs and potentially compromising situations. But ultimately, if they do get upset, and if the brands get upset, I think that's perhaps going to be even more effective than actual regulation in getting enforcement.
>> What about the platforms, though? Instagram, TikTok: they're profiting from these engagements.
>> Yes. And technically, on these platforms, you are supposed to disclose if something is AI-generated. The fact is, nobody actually follows that rule. And Meta does have automated systems that will try and look for things that are AI-generated and tag them, but it's an almost impossible task, because there are hundreds of thousands of posts made every day, and many, many are slipping through the net. Now, the thing is, it is possible, if they really wanted to: there are ways to put a cryptographic signature on actual photographic images, through a standard called C2PA. But that's just not something the tech companies are investing in, because, if you think about it, there is a commercial incentive to just let this carry on. Because, to your point earlier, do people actually like this? Weirdly, the public don't hate AI avatars. They're kind of okay with it.
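For the curious, here is a minimal sketch of the core idea behind provenance signatures: the capture device or publishing tool signs the image bytes with a private key, and anyone can verify them against the matching public key. Real C2PA embeds a richer signed manifest (edit history, issuer certificates) in the file itself; this toy version, using the Python `cryptography` package, shows only the sign-and-verify step.

```python
# Toy illustration of content-provenance signing: the core idea behind C2PA,
# not the actual C2PA manifest format.
from cryptography.hazmat.primitives.asymmetric import ed25519
from cryptography.exceptions import InvalidSignature

# The publisher (e.g. a camera maker or platform) holds a signing key.
private_key = ed25519.Ed25519PrivateKey.generate()
public_key = private_key.public_key()

image_bytes = b"...raw image data..."      # stand-in for a real file's bytes
signature = private_key.sign(image_bytes)  # distributed alongside the image

# Anyone with the public key can check the image hasn't been altered.
try:
    public_key.verify(signature, image_bytes)
    print("signature valid: bytes match what the publisher signed")
except InvalidSignature:
    print("signature invalid: image was modified or signature is forged")

# A single changed byte breaks verification:
try:
    public_key.verify(signature, image_bytes + b"!")
except InvalidSignature:
    print("tampered copy detected")
```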
>> If they know. I mean, how many people really know? Because some of these are really good, and they're getting better.
>> I think that's just going to make it harder to deal with.
>> It's so weird. Everyone talks about being authentic, and brands are all about your values, and then they do this stuff that's so fake, and people eat it up.
>> Weird.
>> Do we still call California the Wild West? No, maybe not.
>> But this is the Wild West.
>> Yeah, absolutely.
>> Now, the late Val Kilmer was one of the greats. Do we agree on that?
>> Yeah. Top Gun, Batman Forever.
>> Real Genius.
>> Yeah, The Doors. I liked him in that. One of the Hollywood greats, one of the versatile actors of Hollywood as well. And he died a year ago, as many of you will know, aged 65, after a long and sad battle with throat cancer. But he had been cast in a film a few years earlier. It's called As Deep as the Grave, a historical drama about the American Southwest. And of course, he didn't make it to set, because he was too ill at the end. But this week, the trailer for that movie debuted at CinemaCon in Las Vegas. He's in it. And every scene he is in, and every line that he speaks, of course, is generated entirely by AI.
>> Don't fear the dead, and don't fear me.
>> His children gave their blessing to this, Stephanie.
And just so everybody knows, the filmmakers followed the guidelines and spoke to the unions. Kilmer, in fact, himself embraced AI in his final movie, Top Gun: Maverick; his voice was recreated by AI, so he wasn't oblivious to this. Is this the blueprint, do you think, for AI in Hollywood, an ethical way of using it? Or does it, for you, open a door that we can't close?
>> I think it's just about choice. So I like the idea that if directors and other artists want to experiment with AI, they are doing so mindfully, and that they're trying to come up with an ethical standard that is no doubt going to be discussed and may eventually be formalized. I also think it's really important for any creative person who doesn't want their likeness, their biometrics, or their creative output to be used in this way to be able to say no. Right? So in that case, anybody who's a Hollywood actor listening to and watching our show, as we know they are, would want to be speaking with their agents, their team, and their lawyers to be really clear about that: how do you want your likeness being used while you're alive, and how do you want it being used after your death? In this case, Val Kilmer's children were fine with it, his estate's fine with it, and everything was done with everyone being, I think, as ethical as they can be. Other actors have made different choices.
>> Can I just say, Stephanie, for the record, that if you're going to use my likeness for AI Decoded in the future, I am happy with that, so long as you pay the royalties to my estate. Just on the record, so we're all clear. Sasha, obviously Hollywood actors are a gift, aren't they, for AI, because there are hundreds of hours of film of them. They've been in lots of performances.
And so, in fact, I think Kilmer's performance in this movie was reconstructed from 40 films, hundreds of hours of footage. So as long as you've signed up to this, the sky is the limit?
>> I think that's a very hard ethical question. I've seen a lot of papers, I even saw a theater play, on this topic, especially on after death, actually. I think that's a really important point. Who can opt in? How can you opt out if you're dead? And also, what does this mean for the community? Because there can be peer pressure. I think AI was actually one of the sticking points during the strikes a couple of years ago, right? To what extent is there union pressure, is there community pressure, to opt in? Can you continue opting out in this new world? And it's similar to what a lot of workers are facing as well. I'm hearing a lot of people saying, well, I'm forced to use AI in my workplace; we even have dashboards for tracking it. So it's really this pressure to use AI that we're seeing, and I think that does make people give up some of their individual choices if they feel pressured. So, for example, if you're a young actor and you want to make a name for yourself, but you don't want to use AI, and you have this peer pressure around you, can you really opt out without a negative impact on your career?
>> Yeah. And I think Hollywood already has this history of recycling old films and making sequels and remakes. And I think there is already this tendency to want to maximize profits by going back to whatever works.
And if that's the incentive driving the recreation of an actor who has died, I think in the end that can actually put younger, fresh talent out of work, if it's just the same icons appearing over and over again for the next hundred years. And to Sasha's point as well, about where that pressure comes from: I've spoken to a company that does virtual reality concerts, avatar concerts, and they have had pressure from the families of artists who have died to try and recreate the deceased artist for a concert.
>> Not knowing whether there's consent for that.
>> And it's a very gray area, because we're talking about people who died maybe 10, 20 years ago, icons, and for the family left behind, for the estate, it's a new revenue source.
>> We're out of time. Parmy, Sasha, Stephanie, thank you very much indeed. AI Decoded is back next week. We think Scott Galloway is coming on; I'm putting that out there so he does come on next week. So do tune in for that. If you have any thoughts on anything we've discussed, do email us: aidecoded@bbc.co.uk. And I'm going to put on screen for you the QR code for the AI Decoded playlist, which is on YouTube. Some of you have been struggling to find it. There it is. If you scan the QR code, you'll be able to find it. All the back episodes are there, so do take a look at that. And don't forget, if you want to watch us again, we are on BBC iPlayer. That's all the housekeeping. Thank you very much for watching. Thank you to our guests this week. We'll see you next time.