WEBVTT

00:00:06.719 --> 00:00:13.759
Hello and welcome to the program. The AI

00:00:10.080 --> 00:00:16.320
revolution runs on electricity. Lots of

00:00:13.759 --> 00:00:18.880
it. Vast amounts of it. The data centers

00:00:16.320 --> 00:00:21.679
being built right now to power the next

00:00:18.879 --> 00:00:25.038
generation of AI consume as much energy

00:00:21.679 --> 00:00:27.118
as a small city. Last year, the world's

00:00:25.039 --> 00:00:30.400
biggest tech companies spent more than

00:00:27.118 --> 00:00:32.799
$400 billion building them. And they

00:00:30.399 --> 00:00:34.558
need more of them, many more of them.

00:00:32.799 --> 00:00:37.280
But there's a problem. In the United

00:00:34.558 --> 00:00:39.280
States, four in 10 of the data centers

00:00:37.280 --> 00:00:42.000
that are being planned for this year are

00:00:39.280 --> 00:00:44.879
at serious risk of delay. Not enough

00:00:42.000 --> 00:00:48.399
power, not enough equipment, and not

00:00:44.878 --> 00:00:51.039
enough people to build them. Why? Also

00:00:48.399 --> 00:00:53.359
on the program this week, Coachella, the

00:00:51.039 --> 00:00:55.679
world's most glamorous music festival.

00:00:53.359 --> 00:00:57.759
But some of the online influencers you

00:00:55.679 --> 00:01:00.799
may have seen pictured in the California

00:00:57.759 --> 00:01:03.439
desert alongside the biggest stars were

00:01:00.799 --> 00:01:05.920
fake, not real. They don't exist. AI

00:01:03.439 --> 00:01:08.478
generated to promote brands and to make

00:01:05.920 --> 00:01:11.200
money. Plus, we'll also talk this week

00:01:08.478 --> 00:01:14.640
about the late Val Kilmer appearing in a

00:01:11.200 --> 00:01:16.719
new film one year after his death. with

00:01:14.640 --> 00:01:19.359
us this week to talk about it, Parmy

00:01:16.719 --> 00:01:20.959
Olson, who is a technology columnist

00:01:19.359 --> 00:01:21.599
at Bloomberg. Welcome to you.

00:01:20.959 --> 00:01:24.559
>> Thank you.

00:01:21.599 --> 00:01:26.798
>> Also here, Dr. Sasha Luccioni, a computer

00:01:24.560 --> 00:01:29.359
scientist specializing in AI and its

00:01:26.799 --> 00:01:31.520
environmental impact and also in the

00:01:29.359 --> 00:01:34.560
studio with us. Uh good to have her back

00:01:31.519 --> 00:01:36.959
here. Dr. Stephanie Hare, colleague,

00:01:34.560 --> 00:01:39.680
author and AI expert to give you your

00:01:36.959 --> 00:01:41.919
full title. Parmy, let us start with

00:01:39.680 --> 00:01:44.240
this issue of data centers. Um we're

00:01:41.920 --> 00:01:46.399
building lots of them. They are powering

00:01:44.239 --> 00:01:48.798
the AI revolution.

00:01:46.399 --> 00:01:51.280
Why are so many of them on hold?

00:01:48.799 --> 00:01:52.960
>> Yeah. So this is really an issue of

00:01:51.280 --> 00:01:54.799
bottlenecks and you sort of alluded to

00:01:52.959 --> 00:01:57.599
it in your introduction. There's the

00:01:54.799 --> 00:01:59.840
issue of power. So getting access to

00:01:57.599 --> 00:02:01.839
actual power on electricity grids that

00:01:59.840 --> 00:02:03.439
are very very old and haven't received a

00:02:01.840 --> 00:02:06.240
lot of investment over many decades in

00:02:03.438 --> 00:02:07.519
the United States. Um also equipment,

00:02:06.239 --> 00:02:09.519
getting access to things like

00:02:07.519 --> 00:02:11.038
transformers or switchgear, which are

00:02:09.520 --> 00:02:13.280
the types of equipment you need to

00:02:11.038 --> 00:02:14.958
actually run data centers. There's also

00:02:13.280 --> 00:02:16.479
a huge bottleneck getting them. It could

00:02:14.959 --> 00:02:18.080
take up to five years to get some of

00:02:16.479 --> 00:02:20.719
that equipment, particularly because a

00:02:18.080 --> 00:02:22.719
lot of it comes from China and recent US

00:02:20.719 --> 00:02:24.318
tariffs on Chinese goods have made that

00:02:22.719 --> 00:02:25.919
even more difficult. And then there's

00:02:24.318 --> 00:02:28.639
also just getting the people, the

00:02:25.919 --> 00:02:30.159
talent, um the electricians um and the

00:02:28.639 --> 00:02:33.039
people with the skill set who can

00:02:30.159 --> 00:02:36.318
actually uh construct and run these data

00:02:33.039 --> 00:02:38.560
centers. Um so all those things combined

00:02:36.318 --> 00:02:42.159
have meant that at a time when there is

00:02:38.560 --> 00:02:44.000
this rapacious demand for energy

00:02:42.159 --> 00:02:47.120
from tech companies, it's actually very

00:02:44.000 --> 00:02:49.199
difficult to build them fast enough.

00:02:47.120 --> 00:02:51.680
>> Does that problem become more acute in

00:02:49.199 --> 00:02:53.598
the energy crisis we're in currently?

00:02:51.680 --> 00:02:56.080
>> I think it does to some extent and it's

00:02:53.598 --> 00:02:57.919
more of a problem for the so-called

00:02:56.080 --> 00:03:03.519
hyperscalers, the big tech companies

00:02:57.919 --> 00:03:04.958
like Meta, Alphabet, Amazon, etc., who

00:03:03.519 --> 00:03:07.039
are actually the ones who have to shell

00:03:04.959 --> 00:03:10.080
out for these energy costs. And so

00:03:07.039 --> 00:03:12.479
they're setting up these kinds of mini

00:03:10.080 --> 00:03:15.599
nuclear reactors that can actually

00:03:12.479 --> 00:03:18.799
provide energy specifically for the

00:03:15.598 --> 00:03:20.878
data center and use renewable sources

00:03:18.800 --> 00:03:23.280
instead of gas. But at the moment even

00:03:20.878 --> 00:03:25.199
getting those up and running is

00:03:23.280 --> 00:03:26.560
logistically very very difficult. It's

00:03:25.199 --> 00:03:29.119
time-consuming. There aren't any

00:03:26.560 --> 00:03:31.360
actually that are operational just yet.

00:03:29.120 --> 00:03:33.120
and and so right now I think the main

00:03:31.360 --> 00:03:35.519
source of energy is gas.

00:03:33.120 --> 00:03:36.878
>> Should we give our viewers just a scale

00:03:35.519 --> 00:03:38.158
of what we're talking about here? Scott

00:03:36.878 --> 00:03:40.000
Galloway who might be coming on the

00:03:38.158 --> 00:03:44.000
program next week over in New York, he

00:03:40.000 --> 00:03:47.280
says OpenAI alone, Stephanie, will need

00:03:44.000 --> 00:03:50.479
20% of current US electricity capacity

00:03:47.280 --> 00:03:52.318
at a cost of $10 trillion.

00:03:50.479 --> 00:03:54.158
>> That's extraordinary. It's extraordinary

00:03:52.318 --> 00:03:56.158
for a company that is yet to turn a

00:03:54.158 --> 00:03:59.280
profit and which is under huge pressure

00:03:56.158 --> 00:04:02.000
to demonstrate value ahead of an IPO as

00:03:59.280 --> 00:04:03.680
well. So it's just worth saying that all

00:04:02.000 --> 00:04:06.318
of these companies, not just OpenAI,

00:04:03.680 --> 00:04:08.239
were making very big promises about

00:04:06.318 --> 00:04:10.719
their infrastructure buildout plans as

00:04:08.239 --> 00:04:13.759
of last year. We've already seen the US

00:04:10.719 --> 00:04:15.598
UK technology deal that's on hold. We've

00:04:13.759 --> 00:04:20.319
seen OpenAI have to pull out of some of

00:04:15.598 --> 00:04:21.759
its Stargate, the big US infrastructure

00:04:20.319 --> 00:04:23.918
plan. And they've had to pull back on

00:04:21.759 --> 00:04:25.439
some of those things. Maybe we are

00:04:23.918 --> 00:04:27.918
walking back. So that question of will

00:04:25.439 --> 00:04:29.439
the AI bubble burst? It might not burst.

00:04:27.918 --> 00:04:30.959
It might just sort of slightly

00:04:29.439 --> 00:04:32.160
>> situation, isn't it? When you think that

00:04:30.959 --> 00:04:35.439
actually if you're going to plug these

00:04:32.160 --> 00:04:37.199
into the grid and you don't know whether

00:04:35.439 --> 00:04:39.680
these companies are going to survive in

00:04:37.199 --> 00:04:42.080
the scale or perhaps they're even bigger

00:04:39.680 --> 00:04:44.079
than they are right now, it's a

00:04:42.079 --> 00:04:45.599
difficult thing to plan.

00:04:44.079 --> 00:04:47.279
>> It's a difficult thing to plan when

00:04:45.600 --> 00:04:48.400
you're not a planned economy. Which is

00:04:47.279 --> 00:04:50.559
why when we're looking at the data

00:04:48.399 --> 00:04:52.159
center rollout in a country like China

00:04:50.560 --> 00:04:54.319
and comparing that to the United States

00:04:52.160 --> 00:04:55.439
or indeed here in Europe, we get very

00:04:54.319 --> 00:04:58.159
different pictures.

00:04:55.439 --> 00:05:00.478
>> Right. Dr. Luccioni, Sasha, I'm going to

00:04:58.160 --> 00:05:02.479
call you Sasha. Does it make sense for

00:05:00.478 --> 00:05:05.120
the US president to be so vehemently

00:05:02.478 --> 00:05:07.680
opposed to renewable energy given the

00:05:05.120 --> 00:05:09.840
scale that we're talking?

00:05:07.680 --> 00:05:11.680
Well, the problem is that the data

00:05:09.839 --> 00:05:13.679
centers are being built so quickly that

00:05:11.680 --> 00:05:15.759
renewable capacity has trouble keeping

00:05:13.680 --> 00:05:18.079
up, especially in rural areas,

00:05:15.759 --> 00:05:19.759
especially outside of places where

00:05:18.079 --> 00:05:21.279
renewables are typically the

00:05:19.759 --> 00:05:24.080
case. So, I think that currently the

00:05:21.279 --> 00:05:26.399
emphasis is build faster, build bigger,

00:05:24.079 --> 00:05:28.639
and they don't want to wait around for

00:05:26.399 --> 00:05:30.879
solar or wind, which is why

00:05:28.639 --> 00:05:32.639
essentially most of the data centers

00:05:30.879 --> 00:05:34.079
that are coming online so quickly are

00:05:32.639 --> 00:05:35.918
essentially bringing in turbines on the

00:05:34.079 --> 00:05:37.758
back of trucks, natural gas. It's

00:05:35.918 --> 00:05:39.519
like bring your own energy

00:05:37.759 --> 00:05:39.840
essentially and most of that is and they

00:05:39.519 --> 00:05:40.799
have the money

00:05:39.839 --> 00:05:41.519
>> is non-renewable.

00:05:40.800 --> 00:05:42.960
>> I mean these companies

00:05:41.519 --> 00:05:44.000
>> they have the money

00:05:42.959 --> 00:05:45.359
>> they have the money but actually

00:05:44.000 --> 00:05:46.959
currently there's a bottleneck even when

00:05:45.360 --> 00:05:48.879
you have the money because there's not

00:05:46.959 --> 00:05:50.478
enough turbines to power all these

00:05:48.879 --> 00:05:52.959
data centers because there's a

00:05:50.478 --> 00:05:54.560
backlog nowadays, and even

00:05:52.959 --> 00:05:57.198
these turbines can't be produced fast

00:05:54.560 --> 00:05:59.439
enough to respond to demand. Parmy, at

00:05:57.199 --> 00:06:02.478
Bloomberg recently you highlighted an

00:05:59.439 --> 00:06:03.680
issue in northern Spain with the data

00:06:02.478 --> 00:06:05.599
center build out there which has

00:06:03.680 --> 00:06:06.959
actually been held up as a model for

00:06:05.600 --> 00:06:09.840
the rest of Europe but for the people

00:06:06.959 --> 00:06:12.318
who live around these projects the

00:06:09.839 --> 00:06:14.000
reality is sometimes very different. Why?

00:06:12.319 --> 00:06:15.520
>> I think it's a common story we're also

00:06:14.000 --> 00:06:17.680
seeing in the United States a lot of

00:06:15.519 --> 00:06:19.038
push back from local residents in areas

00:06:17.680 --> 00:06:21.439
where companies want to build data

00:06:19.038 --> 00:06:24.478
centers. And in northern Spain,

00:06:21.439 --> 00:06:27.839
the situation is that AWS,

00:06:24.478 --> 00:06:31.038
which is the cloud business of Amazon,

00:06:27.839 --> 00:06:33.119
sent letters to local people saying

00:06:31.038 --> 00:06:35.839
we want to buy your land, giving them

00:06:33.120 --> 00:06:38.959
sometimes four days' notice to say yes or

00:06:35.839 --> 00:06:40.560
no. And some of these people in

00:06:38.959 --> 00:06:42.959
northern Spain actually thought it was a

00:06:40.560 --> 00:06:44.560
scam at first. One lady went to her

00:06:42.959 --> 00:06:47.758
local town hall and even they didn't

00:06:44.560 --> 00:06:50.879
know. So, it's a real kind of land

00:06:47.759 --> 00:06:53.919
grab almost to try and get land that is

00:06:50.879 --> 00:06:56.240
relatively cheap in an area where energy

00:06:53.918 --> 00:06:58.399
costs are relatively low and that is

00:06:56.240 --> 00:07:00.478
sparsely populated as well. It seems

00:06:58.399 --> 00:07:02.638
like an ideal situation for building a

00:07:00.478 --> 00:07:04.719
data center, but at the same time there

00:07:02.639 --> 00:07:06.160
is the reality for people who do live

00:07:04.720 --> 00:07:08.720
there and there are people who live

00:07:06.160 --> 00:07:10.639
there, that they have to give up that

00:07:08.720 --> 00:07:12.319
land, or maybe suddenly you've got

00:07:10.639 --> 00:07:14.400
this eyesore in a place that you've

00:07:12.319 --> 00:07:15.919
lived in for many generations. If you're

00:07:14.399 --> 00:07:18.079
in a community like that though and

00:07:15.918 --> 00:07:20.959
you've already struggled to get

00:07:18.079 --> 00:07:22.399
natural resources or to get

00:07:20.959 --> 00:07:24.560
electricity to get yourselves on the

00:07:22.399 --> 00:07:26.719
grid, does the arrival of a big AI

00:07:24.560 --> 00:07:28.639
company help in that process? Perhaps it

00:07:26.720 --> 00:07:31.520
could help a community

00:07:28.639 --> 00:07:33.759
>> In some respects. And the funny part in

00:07:31.519 --> 00:07:36.478
that is that local

00:07:33.759 --> 00:07:38.960
governments often frame data center

00:07:36.478 --> 00:07:40.959
buildouts as being great for jobs. Yeah.

00:07:38.959 --> 00:07:43.198
But I think they're conflating in that

00:07:40.959 --> 00:07:45.120
situation permanent jobs with

00:07:43.199 --> 00:07:46.639
construction jobs, which are temporary.

00:07:45.120 --> 00:07:47.598
And so when you build out a data center,

00:07:46.639 --> 00:07:49.120
you're going to hire many.

00:07:47.598 --> 00:07:50.639
>> They're not necessarily big

00:07:49.120 --> 00:07:52.319
employers once the kit is there?

00:07:50.639 --> 00:07:54.079
>> No, I think in a typical data center you

00:07:52.319 --> 00:07:56.639
might have about 100 people, most of

00:07:54.079 --> 00:07:58.399
them cleaners and security people.

00:07:56.639 --> 00:08:00.160
whereas for the buildout, sure,

00:07:58.399 --> 00:08:01.839
hundreds, maybe thousands of people, but

00:08:00.160 --> 00:08:02.960
then that's only temporary.

00:08:01.839 --> 00:08:04.478
>> All right, I'm going to bring in an

00:08:02.959 --> 00:08:06.560
audience question quite early into the

00:08:04.478 --> 00:08:08.000
program this week because it fits

00:08:06.560 --> 00:08:10.959
what you're talking about. It's from

00:08:08.000 --> 00:08:12.879
James in the UK. He says, "Sasha, AI

00:08:10.959 --> 00:08:15.038
companies continue to minimize their

00:08:12.879 --> 00:08:17.280
environmental impact." He points

00:08:15.038 --> 00:08:20.318
specifically to Sam Altman's recent claim

00:08:17.279 --> 00:08:22.318
that AI's water usage is minimal. James

00:08:20.319 --> 00:08:24.000
says that's simply not true. He also

00:08:22.319 --> 00:08:26.240
tells us that younger generations are

00:08:24.000 --> 00:08:28.160
increasingly boycotting generative AI

00:08:26.240 --> 00:08:30.800
for environmental reasons. So, here's

00:08:28.160 --> 00:08:33.200
his question. Should mainstream media be

00:08:30.800 --> 00:08:34.879
doing more to hold these companies to

00:08:33.200 --> 00:08:37.120
account?

00:08:34.879 --> 00:08:38.958
>> Definitely. Actually, a recent Guardian

00:08:37.120 --> 00:08:40.799
study found that the big tech

00:08:38.958 --> 00:08:43.119
companies were lobbying very very hard

00:08:40.799 --> 00:08:45.199
against transparency, citing

00:08:43.120 --> 00:08:48.480
confidentiality, to not

00:08:45.200 --> 00:08:50.640
include any energy figures or water

00:08:48.480 --> 00:08:52.639
figures about data centers. And so

00:08:50.639 --> 00:08:54.080
we're seeing them play dirty, and I

00:08:52.639 --> 00:08:55.439
think it's time to ask for

00:08:54.080 --> 00:08:57.120
accountability and I think that

00:08:55.440 --> 00:08:59.200
especially in a time where people are

00:08:57.120 --> 00:09:00.720
increasingly sustainability-conscious, so

00:08:59.200 --> 00:09:03.440
you know, we make our decisions based

00:09:00.720 --> 00:09:05.360
on the environment and ethical

00:09:03.440 --> 00:09:07.279
concerns, we need this information,

00:09:05.360 --> 00:09:09.519
whether it be for choosing one AI model

00:09:07.278 --> 00:09:11.600
over the other, or for using AI or not

00:09:09.519 --> 00:09:12.879
using AI, right? There's lots of

00:09:11.600 --> 00:09:14.240
decisions that we make on an everyday

00:09:12.879 --> 00:09:16.399
basis that we just don't have the

00:09:14.240 --> 00:09:18.480
information for. And especially since AI

00:09:16.399 --> 00:09:20.480
has become such a common technology, we

00:09:18.480 --> 00:09:22.000
definitely need these numbers, and

00:09:20.480 --> 00:09:24.000
these companies have them. It's just a

00:09:22.000 --> 00:09:26.080
matter of giving them maybe positive

00:09:24.000 --> 00:09:26.879
and less positive incentives for

00:09:26.080 --> 00:09:28.959
sharing them.

00:09:26.879 --> 00:09:30.559
>> Well, let's try and choose

00:09:28.958 --> 00:09:32.399
to look at this positively because we're

00:09:30.559 --> 00:09:34.799
all using the technology. We're going to

00:09:32.399 --> 00:09:36.958
use it in our work. So we need these

00:09:34.799 --> 00:09:39.679
companies to be successful if

00:09:36.958 --> 00:09:41.838
we're going to employ AI fully. What

00:09:39.679 --> 00:09:43.838
does a responsible data center look

00:09:41.839 --> 00:09:45.920
like, Sasha?

00:09:43.839 --> 00:09:47.839
>> So you can definitely create them in a

00:09:45.919 --> 00:09:50.159
way that's more integrated into the

00:09:47.839 --> 00:09:51.760
existing infrastructure. So currently

00:09:50.159 --> 00:09:54.079
the data centers are being built out in

00:09:51.759 --> 00:09:55.439
a very bigger-is-better kind of

00:09:54.080 --> 00:09:57.120
way. So typically they're outside of

00:09:55.440 --> 00:09:59.440
cities, they're huge, like warehouse-

00:09:57.120 --> 00:10:00.959
sized, but they can really be

00:09:59.440 --> 00:10:03.040
integrated like the smaller data centers

00:10:00.958 --> 00:10:04.639
can be in basements. The heat can be

00:10:03.039 --> 00:10:06.879
reused to heat office buildings or

00:10:04.639 --> 00:10:08.799
university campuses. It's much easier to

00:10:06.879 --> 00:10:11.439
use renewable energy, or a mix at

00:10:08.799 --> 00:10:13.199
least of renewable energy, if

00:10:11.440 --> 00:10:14.959
there's less capacity that's needed.

00:10:13.200 --> 00:10:16.720
>> You think that maybe part of this

00:10:14.958 --> 00:10:18.319
answer is then partnering with other

00:10:16.720 --> 00:10:19.920
companies,

00:10:18.320 --> 00:10:21.519
>> partnering and rethinking the paradigm.

00:10:19.919 --> 00:10:23.199
So, currently it's like we need the

00:10:21.519 --> 00:10:25.200
biggest data centers, we need sovereign

00:10:23.200 --> 00:10:26.720
AI, we need, you know, bigger, let's

00:10:25.200 --> 00:10:28.000
build it out. Even in

00:10:26.720 --> 00:10:29.440
Canada it's the same thing. We need our

00:10:28.000 --> 00:10:31.519
own data center. Let's let's build it

00:10:29.440 --> 00:10:33.040
out. But instead of thinking of that as

00:10:31.519 --> 00:10:34.240
the answer. You know, when you have a

00:10:33.039 --> 00:10:35.360
hammer, everything's a nail. I think we

00:10:34.240 --> 00:10:37.278
should be thinking about the nails that

00:10:35.360 --> 00:10:39.120
we have, and thinking about, okay, what

00:10:37.278 --> 00:10:41.039
do we need this data center for? Is it

00:10:39.120 --> 00:10:43.278
for a university? Is it for a private

00:10:41.039 --> 00:10:45.599
company? Is there a way of for example

00:10:43.278 --> 00:10:47.519
incentivizing some mix of renewables

00:10:45.600 --> 00:10:49.440
or, for example, helping them build it

00:10:47.519 --> 00:10:51.839
out in a way that isn't, you know, bring

00:10:49.440 --> 00:10:53.680
your own turbine on a truck

00:10:51.839 --> 00:10:56.000
kind of situation. And so I think there

00:10:53.679 --> 00:10:58.319
are ways of being more agile if we

00:10:56.000 --> 00:10:59.600
rethink our way of doing AI. And it's

00:10:58.320 --> 00:11:01.519
not only for data centers. Same thing

00:10:59.600 --> 00:11:02.959
for AI models. Instead of being like we

00:11:01.519 --> 00:11:04.639
need the biggest, we need the most

00:11:02.958 --> 00:11:06.799
energy intensive model for every single

00:11:04.639 --> 00:11:09.199
task. We can have smaller models for

00:11:06.799 --> 00:11:11.199
example on-device models. Instead of

00:11:09.200 --> 00:11:12.879
having every query be dispatched to the

00:11:11.200 --> 00:11:14.720
cloud, we can have AI models running

00:11:12.879 --> 00:11:16.000
locally on our smartphones and

00:11:14.720 --> 00:11:17.759
computers. So I think we should really

00:11:16.000 --> 00:11:20.399
be rethinking a little bit the way that

00:11:17.759 --> 00:11:23.838
we design and deploy AI currently.

00:11:20.399 --> 00:11:26.639
>> A quick question just to satisfy my

00:11:23.839 --> 00:11:28.399
curiosity, Sasha. Quick answers if

00:11:26.639 --> 00:11:30.399
you could. There are some

00:11:28.399 --> 00:11:32.879
companies that are developing these air

00:11:30.399 --> 00:11:35.600
cooling systems to reduce water

00:11:32.879 --> 00:11:37.919
consumption. Do they work?

00:11:35.600 --> 00:11:40.959
>> Yes, but it's often a trade-off of using

00:11:37.919 --> 00:11:42.559
more energy and less water. So often

00:11:40.958 --> 00:11:44.479
it's true that you can for example

00:11:42.559 --> 00:11:46.159
recycle water. So essentially water gets

00:11:44.480 --> 00:11:48.079
cycled through and it heats up and you

00:11:46.159 --> 00:11:50.078
have to cool it down. So either you need

00:11:48.078 --> 00:11:52.000
cooling towers, or sometimes, you know,

00:11:50.078 --> 00:11:53.838
you cool it down with

00:11:52.000 --> 00:11:55.200
electricity. And so it's often a

00:11:53.839 --> 00:11:57.120
trade-off where they're using more

00:11:55.200 --> 00:11:57.680
energy but less water. It's a closed-

00:11:57.120 --> 00:11:59.679
loop system.

00:11:57.679 --> 00:12:01.599
>> Yeah. And at the outset you said

00:11:59.679 --> 00:12:03.679
that very often these data centers are

00:12:01.600 --> 00:12:06.079
outstripping what the renewable industry

00:12:03.679 --> 00:12:08.239
can provide for them. But there are good

00:12:06.078 --> 00:12:11.519
examples and I wanted to point to them

00:12:08.240 --> 00:12:13.919
where data centers have been sited very

00:12:11.519 --> 00:12:16.078
close to renewable energy. So Iceland is

00:12:13.919 --> 00:12:18.479
using geothermal, Norway is using

00:12:16.078 --> 00:12:21.120
hydroelectric. Is that an example that

00:12:18.480 --> 00:12:23.759
other countries should be following?

00:12:21.120 --> 00:12:26.320
>> Yes. But I think that very few

00:12:23.759 --> 00:12:28.319
countries, in the current

00:12:26.320 --> 00:12:30.000
state of things, have that extra capacity

00:12:28.320 --> 00:12:31.519
and also if these data centers continue

00:12:30.000 --> 00:12:33.839
to be so big. For example, if a data

00:12:31.519 --> 00:12:36.320
center uses as much energy as 100,000

00:12:33.839 --> 00:12:38.480
homes, there are very few

00:12:36.320 --> 00:12:40.480
renewable grids that

00:12:38.480 --> 00:12:42.079
can provide that energy on such short

00:12:40.480 --> 00:12:43.680
notice. Even for example in Quebec where

00:12:42.078 --> 00:12:45.919
I live we have hydro but we don't have

00:12:43.679 --> 00:12:47.759
the extra capacity for, you know, in two

00:12:45.919 --> 00:12:49.679
years an extra 100,000 homes to be

00:12:47.759 --> 00:12:51.439
built. It has to be gradual. And so it's

00:12:49.679 --> 00:12:53.759
really the timelines that often don't

00:12:51.440 --> 00:12:55.839
line up. And this is why natural gas is

00:12:53.759 --> 00:12:57.519
the cheapest, fastest solution. And

00:12:55.839 --> 00:12:58.800
often there are long-term plans. Often

00:12:57.519 --> 00:13:00.159
it's like, well, in 10 years we're going

00:12:58.799 --> 00:13:02.000
to do renewables. In 10 years, we're

00:13:00.159 --> 00:13:04.078
going to do this. But in the meantime,

00:13:02.000 --> 00:13:05.679
it adds a lot of emissions.

00:13:04.078 --> 00:13:07.519
>> Okay. Well, you might have questions on

00:13:05.679 --> 00:13:08.719
what you've been hearing about data

00:13:07.519 --> 00:13:12.240
centers. You might have some strong

00:13:08.720 --> 00:13:14.000
thoughts on it. AI Decoded, at bbc.co.uk.

00:13:12.240 --> 00:13:15.839
Now, since Stephanie has been focusing

00:13:14.000 --> 00:13:17.839
on clarity and regulation, I've got a

00:13:15.839 --> 00:13:20.720
story for her. Let me show you some

00:13:17.839 --> 00:13:23.920
images. These are images that look

00:13:20.720 --> 00:13:27.160
entirely real, but the people in them

00:13:23.919 --> 00:13:27.159
are fake.

00:13:27.440 --> 00:13:32.399
>> My Coachella week was so much fun. Let

00:13:30.399 --> 00:13:34.399
me take you around. It's a secret. You

00:13:32.399 --> 00:13:36.000
can be in Coachella as me just by a few

00:13:34.399 --> 00:13:38.639
prompts. Stay with me till the end for

00:13:36.000 --> 00:13:38.639
the prompts.

00:13:42.720 --> 00:13:49.519
They are computer-generated influencers

00:13:46.000 --> 00:13:52.078
who were seen photographed on

00:13:49.519 --> 00:13:56.159
Instagram alongside some of the most

00:13:52.078 --> 00:13:57.838
famous people at Coachella, which, as

00:13:56.159 --> 00:14:00.159
those in the know will know, is a very

00:13:57.839 --> 00:14:03.600
trendy music festival in the desert in

00:14:00.159 --> 00:14:05.759
one of the desert valleys in California.

00:14:03.600 --> 00:14:09.040
How many of those engaging in these

00:14:05.759 --> 00:14:11.198
photographs knew that the people they

00:14:09.039 --> 00:14:12.639
were pictured alongside were fake? I

00:14:11.198 --> 00:14:14.559
would suggest not very many. I'm not

00:14:12.639 --> 00:14:16.959
even sure that Coachella knew that there

00:14:14.559 --> 00:14:18.319
were fake influencers in the crowd.

00:14:16.958 --> 00:14:21.119
Stephanie, we've talked about this on

00:14:18.320 --> 00:14:23.199
the program before about AI generated

00:14:21.120 --> 00:14:25.679
beauty, the impact it has on young

00:14:23.198 --> 00:14:26.958
people. This for me, actually, I was

00:14:25.679 --> 00:14:28.719
reading about it this week. This feels

00:14:26.958 --> 00:14:30.799
like the next chapter of that.

00:14:28.720 --> 00:14:32.480
>> Yeah. And again, the law is just not fit

00:14:30.799 --> 00:14:33.919
for purpose on this. I think we're

00:14:32.480 --> 00:14:36.079
really going to have to get to a point

00:14:33.919 --> 00:14:37.759
where we have laws on the books that say

00:14:36.078 --> 00:14:39.838
if you have someone that's

00:14:37.759 --> 00:14:42.159
pretending to be a human being, it has

00:14:39.839 --> 00:14:43.600
to be labeled. It just has to because

00:14:42.159 --> 00:14:44.958
you're dealing with children first of

00:14:43.600 --> 00:14:47.519
all, so like anyone that's under the age

00:14:44.958 --> 00:14:49.359
of 18 needs to be protected, but you're

00:14:47.519 --> 00:14:50.720
also dealing with older people. You're

00:14:49.360 --> 00:14:53.120
also dealing with the potential for

00:14:50.720 --> 00:14:55.680
scam, for fraud, for misinformation and

00:14:53.120 --> 00:14:56.639
disinformation. So this would just solve

00:14:55.679 --> 00:14:58.239
a lot of things.

00:14:56.639 --> 00:15:00.480
>> Parmy, who's behind these images? What

00:14:58.240 --> 00:15:03.198
do they want? From what I

00:15:00.480 --> 00:15:05.039
understand, it is mostly agencies. It's

00:15:03.198 --> 00:15:06.639
not, you know, a cottage

00:15:05.039 --> 00:15:08.799
industry of people working from home.

00:15:06.639 --> 00:15:11.039
There are, you know, agencies most of

00:15:08.799 --> 00:15:14.319
the time in Europe, in places like

00:15:11.039 --> 00:15:16.399
London, and on the continent, who are

00:15:14.320 --> 00:15:18.079
producing these as branding exercises

00:15:16.399 --> 00:15:20.639
and as an opportunity for a brand to get

00:15:18.078 --> 00:15:24.000
a sponsorship. Yeah, absolutely.

00:15:20.639 --> 00:15:27.120
If you think about it, an influencer

00:15:24.000 --> 00:15:28.879
who has a brand sponsorship deal will

00:15:27.120 --> 00:15:30.560
be quite costly because if they want to

00:15:28.879 --> 00:15:33.120
go to Coachella, they want to go

00:15:30.559 --> 00:15:34.958
business class maybe. They want to

00:15:33.120 --> 00:15:36.959
get a hotel, they want to get some other

00:15:34.958 --> 00:15:39.439
freebies. But if you have an influencer

00:15:36.958 --> 00:15:42.159
who you are sponsoring to hold your can

00:15:39.440 --> 00:15:44.639
of whatever in the photograph,

00:15:42.159 --> 00:15:46.879
they're not going to have a bad day or

00:15:44.639 --> 00:15:48.399
get old or look weird in the photo.

00:15:46.879 --> 00:15:50.078
They're always going to look great. It's

00:15:48.399 --> 00:15:51.600
funny, ahead of this program, I actually

00:15:50.078 --> 00:15:55.039
looked at some of these influencers who

00:15:51.600 --> 00:15:57.278
were at Coachella, and it

00:15:55.039 --> 00:15:59.198
was amazing like one of them had about

00:15:57.278 --> 00:16:02.240
170,000 followers and had

00:15:59.198 --> 00:16:05.599
pictures of her with Justin Bieber um

00:16:02.240 --> 00:16:07.360
with the Kardashians, with Madonna, and no

00:16:05.600 --> 00:16:09.440
one in the comments was saying this

00:16:07.360 --> 00:16:12.079
isn't real. All the comments

00:16:09.440 --> 00:16:14.320
were kind of congratulatory

00:16:12.078 --> 00:16:16.879
and there was no disclosure at all on

00:16:14.320 --> 00:16:18.560
the Instagram profile that it was AI

00:16:16.879 --> 00:16:20.078
generated. So I think a lot of people in

00:16:18.559 --> 00:16:20.879
good faith would look at it and think

00:16:20.078 --> 00:16:22.799
this is real.

00:16:20.879 --> 00:16:24.399
>> The obvious problem is, Sasha, that the

00:16:22.799 --> 00:16:27.039
very famous person who's gone to

00:16:24.399 --> 00:16:29.198
Coachella can say to someone who might

00:16:27.039 --> 00:16:30.639
be advertising kryptonite next to them

00:16:29.198 --> 00:16:32.399
look, I don't want to be advertising

00:16:30.639 --> 00:16:34.720
kryptonite and they can push them away.

00:16:32.399 --> 00:16:37.278
They have no choice. They have no say

00:16:34.720 --> 00:16:39.120
in an AI-generated person being put next

00:16:37.278 --> 00:16:41.278
to them in a photograph that they pose

00:16:39.120 --> 00:16:43.360
for unknowingly.

00:16:41.278 --> 00:16:46.320
>> Yeah. In a world of AI agents,

00:16:43.360 --> 00:16:48.079
humans lose their own agency. I think to

00:16:46.320 --> 00:16:49.360
some extent and especially famous people

00:16:48.078 --> 00:16:51.519
because there's so many likenesses of

00:16:49.360 --> 00:16:53.680
them on the internet that it's very very

00:16:51.519 --> 00:16:55.679
easy to generate a false image or video

00:16:53.679 --> 00:16:57.758
now of a celebrity.

00:16:55.679 --> 00:17:00.799
>> Didn't Stephanie, didn't we talk about

00:16:57.759 --> 00:17:02.800
New York bringing in new regulation to

00:17:00.799 --> 00:17:04.720
stop this? I think you had to put on

00:17:02.799 --> 00:17:07.198
your website whether you were

00:17:04.720 --> 00:17:08.798
using an AI generated influencer, but

00:17:07.199 --> 00:17:10.959
there's I mean some of these pictures

00:17:08.798 --> 00:17:11.918
from Coachella do that, but plenty of

00:17:10.959 --> 00:17:13.439
them don't.

00:17:11.919 --> 00:17:14.720
>> Yeah. And that's the enforcement thing.

00:17:13.439 --> 00:17:15.199
Like there are all sorts of laws that

00:17:14.720 --> 00:17:16.798
are obvious

00:17:15.199 --> 00:17:19.759
>> from state to state is different. Right.

00:17:16.798 --> 00:17:22.959
>> Exactly. And how you know whose job is

00:17:19.759 --> 00:17:24.400
it to police that and how are they able

00:17:22.959 --> 00:17:25.919
to get the accountability that they

00:17:24.400 --> 00:17:27.519
need? So again this is a case of if you

00:17:25.919 --> 00:17:29.440
were to take them to court that's going

00:17:27.519 --> 00:17:31.519
to take years right it's going to cost a

00:17:29.440 --> 00:17:32.480
lot of money etc. So it's kind of like

00:17:31.519 --> 00:17:34.079
everything that we saw about

00:17:32.480 --> 00:17:35.440
accountability with social media not

00:17:34.079 --> 00:17:37.119
being very effective.

00:17:35.440 --> 00:17:39.679
>> I mean Coachella themselves could just

00:17:37.119 --> 00:17:40.558
say enough. You can't do this. It's up

00:17:39.679 --> 00:17:42.400
to the organizer.

00:17:40.558 --> 00:17:43.839
>> They absolutely could. And I think, you

00:17:42.400 --> 00:17:45.600
know, what you mentioned earlier about

00:17:43.839 --> 00:17:47.678
the uh the celebrities, you might want

00:17:45.599 --> 00:17:49.359
to just play the world's tiniest violin

00:17:47.679 --> 00:17:51.200
for these people for being in these

00:17:49.359 --> 00:17:53.439
photographs and potentially compromising

00:17:51.200 --> 00:17:55.519
situations. But I mean, ultimately, if

00:17:53.440 --> 00:17:57.840
they do get upset and if the brands get

00:17:55.519 --> 00:17:59.839
upset, I think that's perhaps going to

00:17:57.839 --> 00:18:03.119
be potentially even more effective than

00:17:59.839 --> 00:18:04.240
actual regulation in um in getting

00:18:03.119 --> 00:18:05.759
enforcement. So,

00:18:04.240 --> 00:18:08.000
>> what about the platforms though? I mean,

00:18:05.759 --> 00:18:09.200
in Instagram, Tik Tok, they're profiting

00:18:08.000 --> 00:18:11.440
from these engagements.

00:18:09.200 --> 00:18:13.519
>> Yes. And um technically on these

00:18:11.440 --> 00:18:15.759
platforms you're you are supposed to

00:18:13.519 --> 00:18:18.079
disclose if something is AI generated.

00:18:15.759 --> 00:18:22.720
The fact is nobody actually follows that

00:18:18.079 --> 00:18:24.558
rule. And Meta does have um automated

00:18:22.720 --> 00:18:26.319
systems that will try and look for

00:18:24.558 --> 00:18:28.960
things that are AI generated and tag

00:18:26.319 --> 00:18:30.639
them. But it's an almost impossible task

00:18:28.960 --> 00:18:33.440
because there's hundreds of thousands of

00:18:30.640 --> 00:18:35.200
posts made every day and many many many

00:18:33.440 --> 00:18:37.440
are slipping through the net. Now the

00:18:35.200 --> 00:18:41.279
thing is it is possible if they really

00:18:37.440 --> 00:18:43.919
wanted to. There are ways to put um a

00:18:41.279 --> 00:18:48.240
cryptographic signature on actual

00:18:43.919 --> 00:18:50.400
photographic images um called C2PA. Um

00:18:48.240 --> 00:18:52.319
but that's just not something that the

00:18:50.400 --> 00:18:54.240
tech companies are investing in because

00:18:52.319 --> 00:18:56.079
if you think about it, there is a

00:18:54.240 --> 00:18:57.599
commercial incentive to just let this

00:18:56.079 --> 00:18:59.439
carry on because to to your point

00:18:57.599 --> 00:19:02.480
earlier, do people actually like this?

00:18:59.440 --> 00:19:04.558
Weirdly, the public don't hate AI

00:19:02.480 --> 00:19:06.640
avatars. They're kind of okay with it

00:19:04.558 --> 00:19:07.839
>> if they know. I mean, how many people

00:19:06.640 --> 00:19:09.200
really know? because some of these are

00:19:07.839 --> 00:19:10.240
really good and they're getting better.

00:19:09.200 --> 00:19:11.759
>> I think that's just going to make it

00:19:10.240 --> 00:19:13.440
harder to deal with.

00:19:11.759 --> 00:19:15.519
>> So weird. Everyone talks about being

00:19:13.440 --> 00:19:17.038
authentic and brands are all about your

00:19:15.519 --> 00:19:19.759
values and then they do this stuff

00:19:17.038 --> 00:19:20.240
that's so fake and people eat it up.

00:19:19.759 --> 00:19:21.839
>> Weird.

00:19:20.240 --> 00:19:23.839
>> Do we still call California the Wild

00:19:21.839 --> 00:19:25.038
West? No, maybe not.

00:19:23.839 --> 00:19:26.798
>> But this is the Wild West.

00:19:25.038 --> 00:19:28.720
>> Yeah, absolutely.

00:19:26.798 --> 00:19:30.240
>> Now, the late Val Kilmer was one of the

00:19:28.720 --> 00:19:32.240
greats. Do we agree on that?

00:19:30.240 --> 00:19:33.200
>> Yeah. Top Gun Batman forever.

00:19:32.240 --> 00:19:35.519
>> Real genius.

00:19:33.200 --> 00:19:37.440
>> Yeah. The Doors. I liked him in that. uh

00:19:35.519 --> 00:19:39.359
one of the Hollywood uh greats, one of

00:19:37.440 --> 00:19:41.200
the versatile actors of Hollywood as

00:19:39.359 --> 00:19:43.519
well. And he died a year ago, as many of

00:19:41.200 --> 00:19:46.080
you will know, age 65 after a long and

00:19:43.519 --> 00:19:48.879
sad battle with throat cancer. But he

00:19:46.079 --> 00:19:50.798
had been cast in a film uh a few years

00:19:48.880 --> 00:19:53.919
earlier. It's called As Deep as the

00:19:50.798 --> 00:19:56.960
Grave. It's a historical drama about uh

00:19:53.919 --> 00:19:58.400
the American Southwest. And uh of

00:19:56.960 --> 00:20:00.640
course, he didn't make it to set because

00:19:58.400 --> 00:20:03.120
he was too ill at the end. But

00:20:00.640 --> 00:20:06.240
this week, the trailer for that movie

00:20:03.119 --> 00:20:08.558
debuted at CinemaCon in Las Vegas. He's

00:20:06.240 --> 00:20:10.400
in it. And every scene in which he is in

00:20:08.558 --> 00:20:16.599
and every line that he speaks, of

00:20:10.400 --> 00:20:16.600
course, is generated entirely by AI.

00:20:18.558 --> 00:20:22.759
>> It's me.

00:20:29.839 --> 00:20:35.639
Hey, hey,

00:20:32.640 --> 00:20:35.640
hey.

00:20:55.119 --> 00:20:58.119
Hey,

00:21:04.960 --> 00:21:09.600
>> don't fear the dead and don't fear me.

00:21:07.759 --> 00:21:12.400
His children gave their blessing to this

00:21:09.599 --> 00:21:14.000
Stephanie. Um, and just so everybody

00:21:12.400 --> 00:21:16.240
knows, the filmmakers followed the

00:21:14.000 --> 00:21:20.000
guidelines, spoke to the unions. Kilmer,

00:21:16.240 --> 00:21:22.720
in fact, himself uh embraced AI in his

00:21:20.000 --> 00:21:25.200
final movie, Top Gun Maverick. uh his

00:21:22.720 --> 00:21:28.079
voice was recreated by AI, so he

00:21:25.200 --> 00:21:30.240
wasn't oblivious to this.

00:21:28.079 --> 00:21:33.359
Is this the blueprint, do you think, for

00:21:30.240 --> 00:21:35.519
for for AI in Hollywood, an ethical way

00:21:33.359 --> 00:21:37.839
of using it, or does it for you open a

00:21:35.519 --> 00:21:40.240
door that we can't close?

00:21:37.839 --> 00:21:42.079
>> I think it's just about choice. So, I

00:21:40.240 --> 00:21:43.759
like the idea that if directors and

00:21:42.079 --> 00:21:46.000
other artists want to experiment with

00:21:43.759 --> 00:21:47.200
AI, that they are doing so mindfully,

00:21:46.000 --> 00:21:49.279
that they're trying to come up with an

00:21:47.200 --> 00:21:51.200
ethical standard that is no doubt going

00:21:49.279 --> 00:21:53.519
to be discussed and may eventually be

00:21:51.200 --> 00:21:55.840
formalized. I also think it's really

00:21:53.519 --> 00:21:58.558
important for any creative person who

00:21:55.839 --> 00:22:00.720
doesn't want their likeness, their

00:21:58.558 --> 00:22:02.960
biometrics or their creative output to

00:22:00.720 --> 00:22:05.759
be used in this way to be able to say

00:22:02.960 --> 00:22:07.360
no. Right? So in that case for anybody

00:22:05.759 --> 00:22:09.599
who's a Hollywood actor listening and

00:22:07.359 --> 00:22:11.918
watching our show as we know they are

00:22:09.599 --> 00:22:12.959
>> um they would want to be

00:22:11.919 --> 00:22:14.320
speaking with their

00:22:12.960 --> 00:22:16.480
agents and with their team and their

00:22:14.319 --> 00:22:17.918
lawyers to be really clear about that.

00:22:16.480 --> 00:22:19.759
So you know how do you want your

00:22:17.919 --> 00:22:21.360
likeness being used while you're alive

00:22:19.759 --> 00:22:23.038
and then how do you want it being used

00:22:21.359 --> 00:22:25.119
after your death. So in this case Val

00:22:23.038 --> 00:22:26.640
Kilmer's children were fine with it. His

00:22:25.119 --> 00:22:28.000
estate's fine with it and everything was

00:22:26.640 --> 00:22:30.559
done with everyone being I think as

00:22:28.000 --> 00:22:32.640
ethical as they can be. Other actors

00:22:30.558 --> 00:22:34.158
have made different choices. Can I

00:22:32.640 --> 00:22:35.919
just say Stephanie for the record that

00:22:34.159 --> 00:22:38.480
if you're going to use my likeness for

00:22:35.919 --> 00:22:40.000
AI decoded into the future I am happy

00:22:38.480 --> 00:22:42.640
for that so long as you pay the

00:22:40.000 --> 00:22:44.159
royalties to my estate just on the

00:22:42.640 --> 00:22:46.000
record so we're all cleared.

00:22:44.159 --> 00:22:48.960
>> Um Sasha I mean obviously Hollywood

00:22:46.000 --> 00:22:50.960
actors are a gift, aren't

00:22:48.960 --> 00:22:53.279
they for AI because there's

00:22:50.960 --> 00:22:55.679
hundreds of hours of film of them.

00:22:53.279 --> 00:22:58.240
They've been in lots of performances. Uh

00:22:55.679 --> 00:23:01.038
and so in fact I think this performance

00:22:58.240 --> 00:23:04.079
of Kilmer's in this movie was

00:23:01.038 --> 00:23:06.319
reconstructed from 40 films hundreds of

00:23:04.079 --> 00:23:09.279
hours of footage. So as long as you

00:23:06.319 --> 00:23:10.558
signed up to this, the sky's the limit.

00:23:09.279 --> 00:23:13.038
>> I think that's a very hard ethical

00:23:10.558 --> 00:23:15.038
question. Um I've seen a lot of papers.

00:23:13.038 --> 00:23:16.640
I even saw a theater play on this topic

00:23:15.038 --> 00:23:18.558
especially after death actually. I think

00:23:16.640 --> 00:23:20.480
that's a really important point. Uh who

00:23:18.558 --> 00:23:22.879
can opt in? How can you opt out if

00:23:20.480 --> 00:23:24.079
you're dead? Um, and also what does this

00:23:22.880 --> 00:23:25.520
mean for the community? Because if

00:23:24.079 --> 00:23:27.678
there's peer pressure, for example, and

00:23:25.519 --> 00:23:29.200
I think that actually AI was one of the

00:23:27.679 --> 00:23:30.720
one of the sticking points during the

00:23:29.200 --> 00:23:32.319
strikes a couple of years ago, right?

00:23:30.720 --> 00:23:33.679
Uh, to what extent is there

00:23:32.319 --> 00:23:35.918
union pressure? Is there community

00:23:33.679 --> 00:23:38.640
pressure to opt in? Can you

00:23:35.919 --> 00:23:40.480
continue opting out in this new

00:23:38.640 --> 00:23:41.919
world, right? And it's it's similar to

00:23:40.480 --> 00:23:43.360
what a lot of workers are facing as

00:23:41.919 --> 00:23:44.880
well. Um, I'm hearing a lot of people

00:23:43.359 --> 00:23:46.558
being like, well, I'm forced to use AI

00:23:44.880 --> 00:23:48.640
in my workplace. We even have

00:23:46.558 --> 00:23:49.839
dashboards for tracking it. And so, it's

00:23:48.640 --> 00:23:52.720
really this pressure that we're seeing

00:23:49.839 --> 00:23:54.879
to use AI. And I think that that does uh

00:23:52.720 --> 00:23:56.240
uh make people give up some of their

00:23:54.880 --> 00:23:57.600
individual choices if they feel

00:23:56.240 --> 00:23:59.919
pressured to. So for example, if you're

00:23:57.599 --> 00:24:01.119
a young actor and you want to uh you

00:23:59.919 --> 00:24:02.640
know, make a name for yourself,

00:24:01.119 --> 00:24:04.158
but you don't want to use AI but you

00:24:02.640 --> 00:24:05.840
have this peer pressure around you, can

00:24:04.159 --> 00:24:08.400
you really opt out without having a

00:24:05.839 --> 00:24:10.399
negative impact on your career?

00:24:08.400 --> 00:24:13.440
>> Yeah. Yeah. And and I think um Hollywood

00:24:10.400 --> 00:24:16.000
has already this history of recycling

00:24:13.440 --> 00:24:17.840
old films and making sequels and making

00:24:16.000 --> 00:24:19.519
remakes. And I think there is already

00:24:17.839 --> 00:24:21.918
this tendency to want to maximize

00:24:19.519 --> 00:24:24.639
profits by going back to whatever works.

00:24:21.919 --> 00:24:28.240
And if that's the incentive driving the

00:24:24.640 --> 00:24:30.880
remake of an actor who has died, I think

00:24:28.240 --> 00:24:32.960
in the end that can actually put younger

00:24:30.880 --> 00:24:34.720
fresh talent out of work if it's just

00:24:32.960 --> 00:24:36.480
the kind of the same icons appearing

00:24:34.720 --> 00:24:39.038
over and over again for the next 100

00:24:36.480 --> 00:24:41.200
years. Um, and to Sasha's point as well,

00:24:39.038 --> 00:24:43.119
I think where that pressure comes from,

00:24:41.200 --> 00:24:45.120
like I've spoken to a company that does

00:24:43.119 --> 00:24:48.479
virtual reality concerts or Avatar

00:24:45.119 --> 00:24:51.278
concerts, um, and they have had pressure

00:24:48.480 --> 00:24:55.120
from the families of artists who have

00:24:51.278 --> 00:24:57.278
died to try and recreate the

00:24:55.119 --> 00:24:58.798
deceased artist for a concert,

00:24:57.278 --> 00:24:59.359
>> not knowing whether there's consent or

00:24:58.798 --> 00:25:01.359
not for that.

00:24:59.359 --> 00:25:02.319
>> And it's a very gray area. It's a very

00:25:01.359 --> 00:25:06.079
because we're talking about people who

00:25:02.319 --> 00:25:07.678
died maybe 10 20 years ago, icons um

00:25:06.079 --> 00:25:09.199
putting pressure on them as a it's a new

00:25:07.679 --> 00:25:10.320
revenue source for the family left

00:25:09.200 --> 00:25:13.278
behind for the estate.

00:25:10.319 --> 00:25:15.359
We're out of time. Parmy, uh, Sasha,

00:25:13.278 --> 00:25:18.240
Stephanie, thank you very much indeed.

00:25:15.359 --> 00:25:20.158
Um uh AI decoded next week. Uh we think

00:25:18.240 --> 00:25:21.679
Scott Galloway is coming on. I'm putting

00:25:20.159 --> 00:25:24.400
that out there so he does come on next

00:25:21.679 --> 00:25:25.840
week. Um so do tune in for that. If you

00:25:24.400 --> 00:25:29.919
have any thoughts on anything we've

00:25:25.839 --> 00:25:32.000
discussed, uh do email us aidecoded@bbc.co.uk.

00:25:29.919 --> 00:25:35.120
And I'm going to put on screen for

00:25:32.000 --> 00:25:36.798
you the QR code for the AI decoded

00:25:35.119 --> 00:25:38.158
playlist which is on YouTube. Some of

00:25:36.798 --> 00:25:39.918
you struggling to find it. There it is.

00:25:38.159 --> 00:25:42.559
If you scan the QR code, you'll be able

00:25:39.919 --> 00:25:44.480
to find it. All the back episodes are

00:25:42.558 --> 00:25:45.759
there. So, do take a look at that. And

00:25:44.480 --> 00:25:48.720
don't forget if you want to watch us

00:25:45.759 --> 00:25:50.158
again, we are on BBC iPlayer. Uh,

00:25:48.720 --> 00:25:51.759
that's all the housekeeping. Thank you

00:25:50.159 --> 00:25:53.278
very much for watching. Thank you to our

00:25:51.759 --> 00:25:56.278
guests this week. We'll see you next

00:25:53.278 --> 00:25:56.278
time.
