Hello and welcome to the program. The AI revolution runs on electricity. Lots of it. Vast amounts of it. The data centers being built right now to power the next generation of AI consume as much energy as a small city. Last year, the world's biggest tech companies spent more than $400 billion building them. And they need more of them, many more of them. But there's a problem. In the United States, four in 10 of the data centers being planned for this year are at serious risk of delay. Not enough power, not enough equipment, and not enough people to build them. Why? Also on the program this week: Coachella, the world's most glamorous music festival. But some of the online influencers you may have seen pictured in the California desert alongside the biggest stars were fake, not real. They don't exist. AI generated, to promote brands and to make money. Plus, we'll also talk this week about the late Val Kilmer appearing in a new film one year after his death. With us this week to talk about it: Parmy Olson, a technology columnist at Bloomberg. Welcome to you.
>> Thank you.
>> Also here, Dr. Sasha Luccioni, a computer scientist specializing in AI and its environmental impact. And also in the studio with us, good to have her back, Dr. Stephanie Hare, colleague, author and AI expert, to give you your full title. Parmy, let us start with this issue of data centers. We're building lots of them. They are powering the AI revolution. Why are so many of them on hold?
>> Yeah, so this is really an issue of bottlenecks, and you alluded to it in your introduction. There's the issue of power: getting access to actual power on electricity grids that are very old and haven't received a lot of investment over many decades in the United States. There's also equipment: getting access to things like transformers or switchgear, the kinds of equipment you need to actually run data centers. There's a huge bottleneck there too. It can take up to five years to get some of that equipment, particularly because a lot of it comes from China, and recent US tariffs on Chinese goods have made that even more difficult. And then there's also just getting the people, the talent, the electricians and the people with the skill set who can actually construct and run these data centers. All of those things combined have meant that at a time when there is this rapacious demand for energy from tech companies, it's actually very difficult to build them fast enough.
>> Does that problem become more acute in the energy crisis we're in currently?
>> I think it does to some extent, and it's more of a problem for the so-called hyperscalers, the big tech companies like Meta, Alphabet, Amazon and so on, who are actually the ones who have to shell out for these energy costs.
And so they're setting up these kinds of mini nuclear reactors that can provide energy specifically for the data center and use renewable sources instead of gas. But at the moment even getting those up and running is logistically very difficult. It's time-consuming, and there aren't actually any that are operational just yet. So right now I think the main source of energy is gas.
>> Should we give our viewers a sense of the scale of what we're talking about here? Scott Galloway, who might be coming on the program next week over in New York, says OpenAI alone, Stephanie, will need 20% of current US electricity capacity, at a cost of $10 trillion.
>> That's extraordinary. It's extraordinary for a company that is yet to turn a profit and which is under huge pressure to demonstrate value ahead of an IPO as well. So it's just worth saying that all of these companies, not just OpenAI, were making very big promises about their data center buildout plans as of last year. We've already seen the US-UK technology deal put on hold. We've seen OpenAI have to pull back on some of its Stargate commitments, the big US data infrastructure plan. Maybe we are walking some of this back. So that question of whether the AI bubble will burst: it might not burst. It might just slowly deflate.
>> It's a difficult situation, isn't it? When you think that if you're going to plug these into the grid, and you don't know whether these companies are going to survive at this scale, or perhaps grow even bigger than they are right now, it's a difficult thing to plan.
>> It's a difficult thing to plan when you're not a planned economy. Which is why, when we're looking at the data center rollout in a country like China and comparing that to the United States, or indeed here in Europe, we get very different pictures.
>> Right. Dr. Luccioni, Sasha, I'm going to call you Sasha. Does it make sense for the US president to be so vehemently opposed to renewable energy, given the scale that we're talking about?
>> Well, the problem is that the data centers are being built so quickly that renewable capacity has trouble keeping up, especially in rural areas, especially outside of places where renewables are typically available. So I think that currently the emphasis is build faster, build bigger, and they don't want to wait around for solar or wind, which is why most of the data centers that are coming online this quickly are essentially bringing in turbines on the back of trucks, natural gas. It's bring your own energy, essentially, and most of that is...
>> ...is non-renewable.
>> I mean, these companies...
>> ...they have the money.
>> They have the money, but currently there's a bottleneck even when you have the money, because there aren't enough turbines to power all these data centers. There's a backlog now, and even these turbines can't be produced fast enough to respond to demand.
>> Parmy, at Bloomberg recently you highlighted an issue in northern Spain with the data center buildout there, which has actually been held up as a model for the rest of Europe. But for the people who live around these projects, the reality is sometimes very different. Why?
>> I think it's a common story we're also seeing in the United States: a lot of pushback from local residents in areas where companies want to build data centers. And in northern Spain, the situation is that AWS, which is the cloud business of Amazon, sent letters to local people saying, we want to buy your land, sometimes giving them four days' notice to say yes or no. Some of these people in northern Spain actually thought it was a scam at first. One lady went to her local town hall and even they didn't know. So it's almost a land grab, to try and get land that is relatively cheap, in an area where energy costs are relatively low and that is sparsely populated as well. It seems like an ideal situation for building a data center, but at the same time there is the reality for the people who do live there, and there are people who live there, that they have to give up that land, or suddenly you've got this eyesore in a place your family has lived for many generations.
>> If you're in a community like that, though, and you've already struggled to get natural resources, or to get electricity, to get yourselves on the grid, does the arrival of a big AI company help in that process? Perhaps it could help a community?
>> In some respects. And the funny part is that governments, local governments, often frame data center buildouts as being great for jobs.
>> Yeah.
>> But I think that conflates permanent jobs with construction jobs, which are temporary. And so when you build out a data center...
>> Are they not necessarily big employers once the kit is there?
>> No. In a typical data center you might have about 100 people, most of them cleaners and security staff. Whereas for the buildout, sure, hundreds, maybe thousands of people, but that's only temporary.
>> All right, I'm going to bring in an audience question quite early in the program this week, because it fits what you're talking about. It's from James in the UK. He says, "Sasha, AI companies continue to minimize their environmental impact." He points specifically to Sam Altman's recent claim that AI's water usage is minimal. James says that's simply not true.
He also tells us that younger generations are increasingly boycotting generative AI for environmental reasons. So here's his question: should mainstream media be doing more to hold these companies to account?
>> Definitely. Actually, a recent Guardian study found that the big tech companies were lobbying very hard against transparency, citing confidentiality to avoid publishing any energy or water figures for their data centers. So we're seeing them play dirty, and I think it's time to ask for accountability, especially at a time when people are increasingly sustainability conscious. We make our decisions based on the environment and on ethical concerns, and we need this information, whether it's for choosing one AI model over another, or for deciding whether to use AI at all. There are lots of decisions that we make on an everyday basis that we just don't have the information for. And especially since AI has become such a common technology, we definitely need these numbers, and these companies have them. It's just a matter of giving them incentives, positive and less positive, to share them.
>> Well, let's try and choose to look at this positively, because we're all using the technology. We're going to use it in our work. So we need these companies to be successful if we're going to employ AI fully. What does a responsible data center look like, Sasha?
>> You can definitely create them in a way that's more integrated into existing infrastructure. Currently the data centers are being built out in a very bigger-is-better kind of way: typically they're outside of cities, and they're huge, warehouse-sized. But smaller data centers can really be integrated. They can be in basements, and the heat can be reused to heat office buildings or university campuses. It's much easier to use renewable energy, or at least a mix that includes renewables, if less capacity is needed.
>> You think that maybe part of the answer, then, is partnering with other companies?
>> Partnering, and rethinking the paradigm. Currently it's: we need the biggest data centers, we need sovereign AI, we need bigger, let's build it out. Even in Canada it's the same thing: we need our own data center, let's build it out. But instead of thinking that way (when you have a hammer, everything's a nail), I think we should be thinking about the nails we actually have, and asking: what do we need this data center for? Is it for a university? Is it for a private company? Is there a way of, for example, incentivizing some mix of renewables, or helping them build it out in a way that isn't a bring-your-own-turbine-on-a-truck kind of situation?
And so I think there are ways of being more agile if we rethink our way of doing AI. And it's not only for data centers; the same thing goes for AI models. Instead of saying we need the biggest, most energy-intensive model for every single task, we can have smaller models, for example on-device models. Instead of having every query dispatched to the cloud, we can have AI models running locally on our smartphones and computers. So I think we should really be rethinking a little bit the way that we design and deploy AI currently.
>> A quick question, just to satisfy my curiosity, Sasha, and quick answers if you could. Some companies are developing air cooling systems to reduce water consumption. Do they work?
>> Yes, but it's often a trade-off of using more energy and less water. It's true that you can, for example, recycle water: essentially the water gets cycled through, it heats up, and you have to cool it down. So either you need cooling towers, or sometimes you cool it down with energy, with electricity. And so it's often a trade-off where they're using more energy but less water. It's a closed-loop system.
>> Yeah. And at the outset you said that very often these data centers are outstripping what the renewable industry can provide for them. But there are good examples, and I wanted to point to them, where data centers have been sited very close to renewable energy. Iceland is using geothermal, Norway hydroelectric. Is that an example other countries should be following?
>> Yes, but I think very few countries, in the current state of things, have that extra capacity. And if these data centers continue to be so big, if a data center uses as much energy as 100,000 homes, there are very few renewable grids that can provide that energy on such short notice. Even in Quebec, where I live, we have hydro, but we don't have the extra capacity for another 100,000 homes' worth of demand within two years. It has to be gradual. And so it's really the timelines that often don't line up, and this is why natural gas is the cheapest, fastest solution. Often there are long-term plans: well, in 10 years we're going to do renewables, in 10 years we're going to do this. But in the meantime, it adds a lot of emissions.
>> Okay. Well, you might have questions on what you've been hearing about data centers. You might have some strong thoughts on it. AI Decoded at bbc.co.uk.
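[Editor's note: a minimal sketch of the on-device approach Sasha describes above, using the Hugging Face transformers library in Python. The model name distilgpt2 is only an illustrative stand-in for "a small model that fits on a laptop or phone"; any compact model would do, and the quality trade-off against large cloud-hosted models is a separate question. The point is simply that the query is answered locally rather than dispatched to a data center.

# Sketch: answer a query with a small model running locally (CPU is enough),
# instead of sending every request to a cloud-hosted frontier model.
from transformers import pipeline

# distilgpt2 (~82M parameters) stands in for "a small on-device model".
local_model = pipeline("text-generation", model="distilgpt2")

reply = local_model(
    "Three ways a data center can reuse its waste heat:",
    max_new_tokens=40,
    num_return_sequences=1,
)
print(reply[0]["generated_text"])

In practice the same pattern shows up on phones via quantized models and mobile runtimes; the trade-off Sasha describes is capability versus the energy and infrastructure cost of routing every query through the cloud.]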
Now, since Stephanie has been focusing on clarity and regulation, I've got a story for her. Let me show you some images. These are images that look entirely real, but the people in them are fake.
>> My Coachella week was so much fun. Let me take you around. It's a secret: you can be in Coachella as me, just with a few prompts. Stay with me till the end for the prompts.
>> They are computer-generated influencers who were seen photographed on Instagram alongside some of the most famous people at Coachella, which, as those in the know will know, is a very trendy music festival in one of the desert valleys in California. How many of those appearing in these photographs knew that the people they were pictured alongside were fake? I would suggest not very many. I'm not even sure that Coachella knew there were fake influencers in the crowd. Stephanie, we've talked on this program before about AI-generated beauty and the impact it has on young people. This, for me, and I was reading about it this week, feels like the next chapter of that.
>> Yeah. And again, the law is just not fit for purpose on this. I think we're really going to have to get to a point where we have laws on the books that say if you have something pretending to be a human being, it has to be labeled. It just has to, because you're dealing with children first of all, so anyone under the age of 18 needs to be protected, but you're also dealing with older people. You're also dealing with the potential for scams, for fraud, for misinformation and disinformation. So this would just solve a lot of things.
>> Parmy, who's behind these images? What do they want?
>> From what I understand, it is mostly agencies. It's not a cottage industry of people working from home. There are agencies, most of the time in Europe, in places like London and on the continent, who are producing these as branding exercises and as an opportunity for a brand to get a sponsorship.
>> But for...?
>> Yeah, absolutely. If you think about it, an influencer who has a brand sponsorship deal will be quite costly, because if they want to go to Coachella, they want to go business class, maybe. They want to get a hotel, they want to get some other freebies. But if you have an influencer who you are sponsoring to hold your can of whatever in the photograph, they're not going to have a bad day or get old or look weird in the photo. They're always going to look great. It's funny: ahead of this program I actually looked at some of these influencers who were at Coachella, and it was amazing. One of them had about 170,000 followers, and there were pictures of her with Justin Bieber, with the Kardashians, with Madonna, and no one in the comments was saying this isn't real. All the comments were congratulatory, and there was no disclosure at all on the Instagram profile that it was AI generated.
So I think a lot of people, in good faith, would look at it and think this is real.
>> The obvious problem, Sasha, is that the very famous person who's gone to Coachella can say to someone who might be advertising kryptonite next to them: look, I don't want to be advertising kryptonite, and they can push them away. Here they have no choice. They have no say in an AI-generated person being put next to them in a photograph they posed for unknowingly.
>> Yeah. In a world of AI agents, humans lose their own agency, I think, to some extent, and especially famous people, because there are so many likenesses of them on the internet that it's now very easy to generate a false image or video of a celebrity.
>> Stephanie, didn't we talk about New York bringing in new regulation to stop this? I think you had to declare on your website whether you were using an AI-generated influencer. Some of these pictures from Coachella do do that, but plenty of them don't.
>> Yeah, and that's the enforcement thing. There are all sorts of laws that are obvious...
>> And it's different from state to state.
>> Exactly. And whose job is it to police that, and how are they able to get the accountability they need? So again, this is a case where, if you were to take them to court, that's going to take years, it's going to cost a lot of money, and so on. It's kind of like everything we saw about accountability with social media not being very effective.
>> I mean, Coachella themselves could just say: enough, you can't do this. It's up to the organizer.
>> They absolutely could. And, you know, what you mentioned earlier about the celebrities: you might want to play the world's tiniest violin for these people for being in these photographs and potentially compromising situations. But ultimately, if they do get upset, and if the brands get upset, I think that's perhaps going to be even more effective than actual regulation in getting enforcement.
>> What about the platforms, though? Instagram, TikTok, they're profiting from these engagements.
>> Yes. And technically, on these platforms, you are supposed to disclose if something is AI generated. The fact is nobody actually follows that rule. And Meta does have automated systems that will try to find things that are AI generated and tag them, but it's an almost impossible task, because there are hundreds of thousands of posts made every day, and many, many are slipping through the net. Now, the thing is, it is possible if they really wanted to. There are ways to put a cryptographic signature on actual photographic images, a standard called C2PA. But that's just not something the tech companies are investing in, because if you think about it, there is a commercial incentive to just let this carry on. Because, to your point earlier, do people actually like this? Weirdly, the public don't hate AI avatars. They're kind of okay with it.
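[Editor's note: C2PA, the standard Parmy refers to, attaches cryptographically signed provenance metadata ("Content Credentials") to an image, so a platform can later check who produced it and whether it is declared as AI generated. Below is a rough conceptual sketch of that idea in Python using the cryptography package; it is not the real C2PA manifest format or SDK, and the field names and example claims are invented for illustration.

# Conceptual sketch of a C2PA-style content credential (not the real C2PA
# format): sign the image bytes plus a small provenance manifest, then verify.
import json
import hashlib
from cryptography.hazmat.primitives.asymmetric import ed25519

def make_credential(image_bytes, claims, private_key):
    manifest = {
        "image_sha256": hashlib.sha256(image_bytes).hexdigest(),
        "claims": claims,  # e.g. {"ai_generated": True, "generator": "example-model"}
    }
    payload = json.dumps(manifest, sort_keys=True).encode()
    return {"manifest": manifest, "signature": private_key.sign(payload).hex()}

def verify_credential(image_bytes, credential, public_key):
    manifest = credential["manifest"]
    # A hash mismatch means the image was altered after it was signed.
    if manifest["image_sha256"] != hashlib.sha256(image_bytes).hexdigest():
        return False
    payload = json.dumps(manifest, sort_keys=True).encode()
    try:
        public_key.verify(bytes.fromhex(credential["signature"]), payload)
        return True
    except Exception:
        return False

# Example: a generation tool labels its output at creation time,
# and a platform checks the credential at upload time.
key = ed25519.Ed25519PrivateKey.generate()
image = b"...image bytes..."
cred = make_credential(image, {"ai_generated": True, "generator": "example-model"}, key)
print(verify_credential(image, cred, key.public_key()))                # True
print(verify_credential(image + b"tampered", cred, key.public_key()))  # False

In practice the signing key would belong to the camera maker, the generation tool or the publisher; as Parmy says, the hard part is adoption and the commercial incentive, not the cryptography.]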
>> If they know. I mean, how many people really know? Because some of these are really good, and they're getting better.
>> I think that's just going to make it harder to deal with.
>> It's so weird. Everyone talks about being authentic, and brands are all about your values, and then they do this stuff that's so fake and people eat it up.
>> Weird.
>> Do we still call California the Wild West? No, maybe not.
>> But this is the Wild West.
>> Yeah, absolutely.
>> Now, the late Val Kilmer was one of the greats. Do we agree on that?
>> Yeah. Top Gun, Batman Forever.
>> Real Genius.
>> Yeah, The Doors, I liked him in that. One of the Hollywood greats, and one of its most versatile actors as well. And he died a year ago, as many of you will know, aged 65, after a long and sad battle with throat cancer. But he had been cast in a film a few years earlier. It's called As Deep as the Grave, a historical drama about the American Southwest. And of course he didn't make it to set, because he was too ill at the end. But this week the trailer for that movie debuted at CinemaCon in Las Vegas. He's in it. And every scene in which he appears, and every line that he speaks, is generated entirely by AI.
[Clip from the trailer plays.]
>> Don't fear the dead, and don't fear me.
>> His children gave their blessing to this, Stephanie. And just so everybody knows, the filmmakers followed the guidelines and spoke to the unions. Kilmer himself, in fact, embraced AI in his final movie, Top Gun: Maverick, where his voice was recreated by AI, so he wasn't oblivious to this. Is this the blueprint, do you think, for AI in Hollywood, an ethical way of using it? Or does it, for you, open a door that we can't close?
>> I think it's just about choice. So I like the idea that if directors and other artists want to experiment with AI, they are doing so mindfully, that they're trying to come up with an ethical standard that is no doubt going to be discussed and may eventually be formalized. I also think it's really important for any creative person who doesn't want their likeness, their biometrics or their creative output to be used in this way to be able to say no. Right? So in that case, anybody who's a Hollywood actor listening to and watching our show, as we know they are, would want to be speaking with their agents and with their team and their lawyers to be really clear about that.
So, you know, how do you want your likeness to be used while you're alive, and how do you want it to be used after your death? In this case, Val Kilmer's children were fine with it, his estate is fine with it, and everything was done with everyone being, I think, as ethical as they can be. Other actors have made different choices.
>> Can I just say, Stephanie, for the record, that if you're going to use my likeness for AI Decoded in the future, I am happy for that, so long as you pay the royalties to my estate. Just on the record, so we're all clear. Sasha, obviously Hollywood actors are a gift, aren't they, for AI, because there are hundreds of hours of film of them. They've been in lots of performances. In fact, I think this performance of Kilmer's in this movie was reconstructed from 40 films, hundreds of hours of footage. So as long as you've signed up to this, the sky's the limit.
>> I think that's a very hard ethical question. I've seen a lot of papers, I even saw a theater play, on this topic, especially on what happens after death, actually, and I think that's a really important point. Who can opt in? How can you opt out if you're dead? And also, what does this mean for the community, if there's peer pressure, for example? I think AI was actually one of the sticking points during the strikes a couple of years ago, right? To what extent is there union pressure, or community pressure, to opt in? Can you continue opting out in this new world? And it's similar to what a lot of workers are facing as well. I'm hearing a lot of people saying, well, I'm forced to use AI in my workplace, we even have dashboards for tracking it. And so it's really this pressure that we're seeing to use AI, and I think it does make people give up some of their individual choices if they feel pressured to. So, for example, if you're a young actor and you want to make a name for yourself, but you don't want to use AI, and you have this peer pressure around you, can you really opt out without having a negative impact on your career?
>> Yeah. And I think Hollywood already has this history of recycling old films and making sequels and remakes, and there is already this tendency to want to maximize profits by going back to whatever works. And if that's the incentive driving the recreation of an actor who has died, I think in the end that can actually put younger, fresh talent out of work, if it's just the same icons appearing over and over again for the next 100 years.
And to Sasha's point about where that pressure comes from: I've spoken to a company that does virtual reality concerts, avatar concerts, and they have had pressure from the families of artists who have died to try and recreate the deceased artist for a concert.
>> Not knowing whether there's consent or not for that.
>> And it's a very gray area, because we're talking about people who died maybe 10 or 20 years ago, icons, and the pressure comes because it's a new revenue source for the family left behind, for the estate.
>> We're out of time. Parmy, Sasha, Stephanie, thank you very much indeed. AI Decoded is back next week. We think Scott Galloway is coming on; I'm putting that out there so he does come on next week. So do tune in for that. If you have any thoughts on anything we've discussed, do email us: AI Decoded at bbc.co.uk. And I'm going to put on screen for you the QR code for the AI Decoded playlist, which is on YouTube, as some of you have been struggling to find it. There it is. If you scan the QR code, you'll be able to find it. All the back episodes are there, so do take a look. And don't forget, if you want to watch us again, we are on BBC iPlayer. That's all the housekeeping. Thank you very much for watching, thank you to our guests this week, and we'll see you next time.