Transcript
0:06
Hello and welcome to the program. The AI
0:10
revolution runs on electricity. Lots of
0:13
it. Vast amounts of it. The data centers
0:16
being built right now to power the next
0:18
generation of AI consume as much energy
0:21
as a small city. Last year, the world's
0:25
biggest tech companies spent more than
0:27
$400 billion building them. And they
0:30
need more of them, many more of them.
0:32
But there's a problem. In the United
0:34
States, four in 10 of the data centers
0:37
that are being planned for this year are
0:39
at serious risk of delay. Not enough
0:42
power, not enough equipment, and not
0:44
enough people to build them. Why? Also
0:48
on the program this week, Coachella, the
0:51
world's most glamorous music festival.
0:53
But some of the online influencers you
0:55
may have seen pictured in the California
0:57
desert alongside the biggest stars were
1:00
fake, not real. They don't exist. AI
1:03
generated to promote brands and to make
1:05
money. Plus, we'll also talk this week
1:08
about the late Val Kilmer appearing in a
1:11
new film one year after his death. With
1:14
us this week to talk about it, Parmy
1:16
Olson, who is a technology columnist
1:19
at Bloomberg. Welcome to you.
1:20
>> Thank you.
1:21
>> Also here, Dr. Sasha Luccioni, a computer
1:24
scientist specializing in AI and its
1:26
environmental impact and also in the
1:29
studio with us. Uh good to have her back
1:31
here. Dr. Stephanie Hare, colleague,
1:34
author and AI expert to give you your
1:36
full title. Parmy, let us start with
1:39
this issue of data centers. Um we're
1:41
building lots of them. They are powering
1:44
the AI revolution.
1:46
Why are so many of them on hold?
1:48
>> Yeah. So this is really an issue of
1:51
bottlenecks and you sort of alluded to
1:52
it in your introduction. There's the
1:54
issue of power. So getting access to
1:57
actual power on electricity grids that
1:59
are very very old and haven't received a
2:01
lot of investment over many decades in
2:03
the United States. Also equipment:
2:06
getting access to things like
2:07
transformers or switchgear, which are
2:09
the types of equipment you need to
2:11
actually run data centers. There's also
2:13
a huge bottleneck getting them. It could
2:14
take up to five years to get some of
2:16
that equipment, particularly because a
2:18
lot of it comes from China and recent US
2:20
tariffs on Chinese goods have made that
2:22
even more difficult. And then there's
2:24
also just getting the people, the
2:25
talent, the electricians, and the
2:28
people with the skill set who can
2:30
actually construct and run these data
2:33
centers. So all those things combined
2:36
have meant that at a time when there is
2:38
this rapacious demand for energy
2:42
from tech companies, it's actually very
2:44
difficult to build them fast enough.
2:47
>> Does that problem become more acute in
2:49
the energy crisis we're in currently?
2:51
>> I think it does to some extent and it's
2:53
more of a problem for the so-called
2:56
hyperscalers, the big tech companies
2:57
like Meta, Alphabet, Amazon, etc., who
3:03
are actually the ones who have to shell
3:04
out for these energy costs. And so
3:07
they're setting up these kinds of mini
3:10
nuclear reactors that can actually
3:12
provide energy specifically for the
3:15
data center and use renewable sources
3:18
instead of gas. But at the moment even
3:20
getting those up and running is
3:23
logistically very very difficult. It's
3:25
time-consuming. There aren't
3:26
actually any that are operational just yet.
3:29
And so right now I think the main
3:31
source of energy is gas.
3:33
>> Should we give our viewers just a scale
3:35
of what we're talking about here? Scott
3:36
Galloway who might be coming on the
3:38
program next week over in New York, he
3:40
says OpenAI alone, Stephanie, will need
3:44
20% of current US electricity capacity
3:47
at $10 trillion.
3:50
>> That's extraordinary. It's extraordinary
3:52
for a company that is yet to turn a
3:54
profit and which is under huge pressure
3:56
to demonstrate value ahead of an IPO as
3:59
well. So it's just worth saying that all
4:02
of these companies, not just OpenAI,
4:03
were making very big promises about
4:06
their data center buildout plans as
4:08
of last year. We've already seen the US
4:10
UK technology deal that's on hold. We've
4:13
seen OpenAI have to pull out of some of
4:15
its Stargate commitments, the big US data center
4:20
plan, and they've had to pull back on
4:21
some of those things. Maybe we are
4:23
walking back. So that question of will
4:25
the AI bubble burst? It might not burst.
4:27
It might just sort of slightly
4:29
>> situation, isn't it? When you think that
4:30
actually if you're going to plug these
4:32
into the grid and you don't know whether
4:35
these companies are going to survive at
4:37
the scale or perhaps they're even bigger
4:39
than they are right now, it's a
4:42
difficult thing to plan.
4:44
>> It's a difficult thing to plan when
4:45
you're not a planned economy. Which is
4:47
why when we're looking at the data
4:48
center rollout in a country like China
4:50
and comparing that to the United States
4:52
or indeed here in Europe, we get very
4:54
different pictures.
4:55
>> Right. Dr. Luccioni, Sasha, I'm going to
4:58
call you Sasha. Does it make sense for
5:00
the US president to be so vehemently
5:02
opposed to renewable energy given the
5:05
scale that we're talking about?
5:07
Well, the problem is that the data
5:09
centers are being built so quickly that
5:11
renewable capacity has trouble keeping
5:13
up, especially in rural areas,
5:15
especially outside of places where
5:18
renewables are typically the
5:19
case. So, I think that currently the
5:21
emphasis is build faster, build bigger,
5:24
and they don't want to wait around for
5:26
solar or wind, which is why
5:28
most of the data centers
5:30
that are coming online quickly are
5:32
essentially bringing in turbines on the
5:34
back of trucks: natural gas. It's
5:35
like bring your own energy,
5:37
essentially, and most of that is and they
5:39
have the money
5:39
>> is non-renewable.
5:40
>> I mean these companies
5:41
>> they have the money
5:42
>> they have the money but actually
5:44
currently there's a bottleneck even when
5:45
you have the money because there's not
5:46
enough turbines to power all these
5:48
data centers because there's a
5:50
backlog nowadays, and even
5:52
these turbines can't be produced fast
5:54
enough to respond to demand. Parmy, at
5:57
Bloomberg recently you highlighted an
5:59
issue in Northern Spain um with the data
6:02
center build out there which has
6:03
actually been held up as a a model for
6:05
the rest of Europe but for the people
6:06
who live around these projects the
6:09
reality is sometimes very different. Why?
6:12
>> I think it's a common story. We're also
6:14
seeing in the United States a lot of
6:15
push back from local residents in areas
6:17
where companies want to build data
6:19
centers. And in northern Spain,
6:21
the situation is that AWS,
6:24
which is the cloud business of Amazon,
6:27
sent letters to local people saying
6:31
we want to buy your land, giving them
6:33
sometimes four days' notice to say yes or
6:35
no. And some of these people in
6:38
northern Spain actually thought it was a
6:40
scam at first. Um one lady went to her
6:42
local town hall and even they didn't
6:44
know. So, it's a real kind of land
6:47
grab almost to try and get land that is
6:50
relatively cheap in an area where energy
6:53
costs are relatively low and that are
6:56
sparsely populated as well. It seems
6:58
like an ideal situation for building a
7:00
data center, but at the same time there
7:02
is the reality for people who do live
7:04
there and there are people who live
7:06
there that they have to give up that
7:08
land, or maybe suddenly you've got
7:10
this eyesore in a place that you've
7:12
lived in for many generations. If you're
7:14
in a community like that though and
7:15
you've already struggled to get
7:18
natural resources or to get
7:20
electricity, to get yourselves on the
7:22
grid, does the arrival of a big AI
7:24
company help in that process? Perhaps
7:26
it could help a community
7:28
>> in some respects. And the funny part in
7:31
that is that local
7:33
governments often frame data center
7:36
buildouts as being great for jobs. Yeah.
7:38
But I think you're conflating in that
7:40
situation permanent jobs with
7:43
construction jobs which are temporary.
7:45
And so when you build out a data center,
7:46
you're you're going to hire me.
7:47
>> They're not necessarily big
7:49
employers once the kit is there?
7:50
>> No, I think in a typical data center you
7:52
might have about 100 people, most of
7:54
them cleaners and security people. Um
7:56
whereas for the buildout, sure,
7:58
hundreds, maybe thousands of people, but
8:00
then that's only temporary.
8:01
>> All right, I'm going to bring in an
8:02
audience question quite early into the
8:04
program this week because it fits
8:06
what you're talking about. It's from
8:08
James in the UK. He says, "Sasha, AI
8:10
companies continue to minimize their
8:12
environmental impact." He points
8:15
specifically to Sam Altman's recent claim
8:17
that AI's water usage is minimal. James
8:20
says that's simply not true. He also
8:22
tells us that younger generations are
8:24
increasingly boycotting generative AI
8:26
for environmental reasons. So, here's
8:28
his question. Should mainstream media be
8:30
doing more to hold these companies to
8:33
account?
8:34
>> Definitely. Actually, a recent Guardian
8:37
study found that the big tech
8:38
companies were lobbying very very hard
8:40
against transparency, citing
8:43
confidentiality to not
8:45
include any energy figures or water
8:48
figures about data centers. And so
8:50
we're seeing them play dirty, and I
8:52
think it's time to ask for
8:54
accountability and I think that
8:55
especially in a time where people are
8:57
increasingly sustainability conscious.
8:59
We make our decisions based
9:00
on the environment and ethical
9:03
concerns, so we need this information,
9:05
whether it be for choosing one AI model
9:07
over the other, for using AI or not
9:09
using AI, right? There's lots of
9:11
decisions that we make on an everyday
9:12
basis that we just don't have the
9:14
information for. And especially since AI
9:16
has become such a common technology, we
9:18
definitely need these numbers and and
9:20
these companies have them. It's just a
9:22
matter of giving them maybe positive
9:24
and less positive incentives for
9:26
sharing them.
9:26
>> Well, let's try and choose
9:28
to look at this positively because we're
9:30
all using the technology. We're going to
9:32
use it in our work. So we need these
9:34
companies to be successful if
9:36
we're going to employ AI fully. What
9:39
does a responsible data center look
9:41
like, Sasha?
9:43
>> So you can definitely create them in a
9:45
way that's more integrated into the
9:47
existing infrastructure. So currently um
9:50
the data centers are being built out in
9:51
a very bigger-is-better kind of
9:54
way. So typically they're outside of
9:55
cities, they're huge like warehouse
9:57
sized, but they can really be
9:59
integrated like the smaller data centers
10:00
can be in basements. The heat can be
10:03
reused to heat office buildings or
10:04
university campuses. It's much easier to
10:06
use renewable energy, or at
10:08
least a mix of renewable energy, if
10:11
there's less capacity that's needed.
10:13
>> You think that maybe part of this
10:14
answer is then partnering with other
10:16
companies,
10:18
>> partnering and rethinking the paradigm.
10:19
So, currently it's like we need the
10:21
biggest data centers, we need sovereign
10:23
AI, we need, you know, bigger, let's
10:25
build it out. Even in
10:26
Canada it's the same thing. We need our
10:28
own data center. Let's build it
10:29
out. But instead of thinking of that
10:31
as, you know, when you have a
10:33
hammer everything's a nail, I think we
10:34
should be thinking about the nails that
10:35
we have and asking, okay, so what
10:37
do we need this data center for? Is it
10:39
for a university? Is it for a private
10:41
company? Is there a way of, for example,
10:43
incentivizing some mix of renewables,
10:45
or helping them build it
10:47
out in a way that isn't, you know, a
10:49
bring-your-own-turbine-on-a-truck
10:51
kind of situation. And so I think there
10:53
are ways of being more agile if we
10:56
rethink our way of doing AI. And it's
10:58
not only for data centers. Same thing
10:59
for AI models. Instead of being like we
11:01
need the biggest, we need the most
11:02
energy intensive model for every single
11:04
task. We can have smaller models for
11:06
example on-device models. Instead of
11:09
having every query be dispatched to the
11:11
cloud, we can have AI models running
11:12
locally on our smartphones and
11:14
computers. So I think we should really
11:16
be rethinking a little bit the way that
11:17
we design and deploy AI currently.
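The "smaller models on-device, big models only when needed" idea Sasha describes can be sketched as a simple router. This is an illustrative sketch only; the model names and the complexity heuristic below are hypothetical placeholders, not any real product's API.

```python
# Illustrative sketch of "small model first" routing: answer simple queries
# with a local on-device model and only dispatch complex ones to a large
# cloud model. The heuristic and tier names are made up for the demo.

def estimate_complexity(query: str) -> float:
    """Crude proxy for task difficulty: longer, multi-step prompts score higher."""
    words = query.split()
    multi_step = sum(query.count(k) for k in ("then", "compare", "analyze"))
    return len(words) / 50 + multi_step

def route(query: str, threshold: float = 1.0) -> str:
    """Return which tier should serve the query."""
    if estimate_complexity(query) < threshold:
        return "local-small-model"   # runs on the phone or laptop, low energy
    return "cloud-large-model"       # dispatched to a data center

print(route("What time is it in Tokyo?"))                              # local-small-model
print(route("Compare these papers, then analyze their methods " * 5))  # cloud-large-model
```

The design point is the one made in the discussion: if most everyday queries never leave the device, the aggregate demand placed on data centers drops.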
11:20
>> A quick question just to satisfy my
11:23
curiosity, Sasha. Quick answers if
11:26
you could. There are some
11:28
companies that are developing these air
11:30
cooling systems to reduce water
11:32
consumption. Do they work?
11:35
>> Yes, but it's often a trade-off of using
11:37
more energy and less water. So often
11:40
it's true that you can for example
11:42
recycle water. So essentially water gets
11:44
cycled through and it heats up and you
11:46
have to cool it down. So either you need
11:48
cooling towers or sometimes
11:50
you cool it down with
11:52
electricity. And so it's often a
11:53
trade-off where they're using more
11:55
energy but but less water. It's a closed
11:57
loop system.
11:57
>> Yeah. And at the outset you you said
11:59
that very often these data centers are
12:01
outstripping what the renewable industry
12:03
can provide for them. But there are good
12:06
examples and I wanted to point to them
12:08
where data centers have been sited very
12:11
close to renewable energy. So Iceland is
12:13
using geothermal, Norway using
12:16
hydroelectric. Is that an example that
12:18
other countries should be following?
12:21
>> Yes. But I think that very few
12:23
countries, in the current
12:26
state of things, have that extra capacity
12:28
and also if these data centers continue
12:30
to be so big, for example, if a data
12:31
center uses as much energy as 100,000
12:33
homes, there are very few
12:36
renewable grids that can take that, that
12:38
can provide that energy on such short
12:40
notice. Even for example in Quebec where
12:42
I live we have hydro but we don't have
12:43
the extra capacity for, you know, in two
12:45
years, an extra 100,000 homes to be
12:47
built. It has to be gradual. And so it's
12:49
really the timelines that often don't
12:51
line up. And this is why natural gas is
12:53
the cheapest, fastest solution. And
12:55
often there are long-term plans. Often
12:57
it's like, well, in 10 years we're going
12:58
to do renewables. In 10 years, we're
13:00
going to do this. But in the meantime,
13:02
it adds a lot of emissions.
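The 100,000-homes comparison can be put in rough numbers. This is a back-of-envelope sketch; the roughly 10,500 kWh per US household per year figure is an assumption for illustration (a commonly cited average), not something stated in the discussion.

```python
# Back-of-envelope scale of a data center that uses as much energy as
# 100,000 homes, assuming roughly 10,500 kWh per household per year.

homes = 100_000
kwh_per_home_per_year = 10_500

annual_kwh = homes * kwh_per_home_per_year   # total yearly energy
annual_twh = annual_kwh / 1e9                # same figure in terawatt-hours
avg_load_mw = annual_kwh / 8_760 / 1_000     # average continuous draw in megawatts

print(f"{annual_twh:.2f} TWh per year")          # 1.05 TWh per year
print(f"about {avg_load_mw:.0f} MW continuous")  # about 120 MW continuous
```

That steady draw on the order of a hundred-plus megawatts is why Sasha's point about grid headroom matters: it has to be available around the clock, not just when the wind blows.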
13:04
>> Okay. Well, you might have questions on
13:05
what you've been hearing about data
13:07
centers. You might have some strong
13:08
thoughts on it: aidecoded@bbc.co.uk.
13:12
Now, since Stephanie has been focusing
13:14
on clarity and regulation, I've got a
13:15
story for her. Um, let me show you some
13:17
images. Uh, these are images that look
13:20
entirely real, but the people in them
13:23
are fake.
13:27
>> My Coachella week was so much fun. Let
13:30
me take you around. It's a secret. You
13:32
can be in Coachella as me just by a few
13:34
prompts. Stay with me till the end for
13:36
the prompts.
13:42
They are computer-generated influencers
13:46
who were seen photographed on
13:49
Instagram alongside some of the most
13:52
famous people at Coachella, which,
13:56
those in the know will know, is this very
13:57
trendy music festival in the desert in
14:00
one of the desert valleys in California.
14:03
How many of those engaging in these
14:05
photographs knew that the people they
14:09
were pictured alongside were fake? I
14:11
would suggest not very many. I'm not
14:12
even sure that Coachella knew that there
14:14
were fake influencers in the crowd.
14:16
Stephanie, we've talked about this on
14:18
the program before about AI generated
14:21
beauty, the impact it has on young
14:23
people. This for me, actually, I was
14:25
reading about it this week. This feels
14:26
like the next chapter of that.
14:28
>> Yeah. And again, the law is just not fit
14:30
for purpose on this. I think we're
14:32
really going to have to get to a point
14:33
where we have laws on the books that say
14:36
if you have someone that's
14:37
pretending to be a human being, it has
14:39
to be labeled. It just has to because
14:42
you're dealing with children first of
14:43
all, so like anyone that's under the age
14:44
of 18 needs to be protected, but you're
14:47
also dealing with older people. You're
14:49
also dealing with the potential for
14:50
scam, for fraud, for misinformation and
14:53
disinformation. So this would just solve
14:55
a lot of things.
14:56
>> Parmy, who's behind these images? What
14:58
do they want? From what I
15:00
understand, it is mostly agencies. It's
15:03
not, you know, it's not a cottage
15:05
industry of people working from home.
15:06
There are, you know, agencies most of
15:08
the time in Europe, in places like
15:11
London, and on the continent, who are
15:14
producing these as branding exercises
15:16
and as an opportunity for a brand to get
15:18
a sponsorship. Yeah, absolutely.
15:20
If you think about it, an influencer
15:24
who has a brand sponsorship deal will
15:27
be quite costly because if they want to
15:28
go to Coachella, they want to go
15:30
business class maybe. They want to
15:33
get a hotel, they want to get some other
15:34
freebies. But if you have an influencer
15:36
who you are sponsoring to hold your can
15:39
of whatever in the photograph,
15:42
they're not going to have a bad day or
15:44
get old or look weird in the photo.
15:46
They're always going to look great. It's
15:48
funny, ahead of this program, I actually
15:50
looked at some of these influencers who
15:51
were in Coachella, and it
15:55
was amazing. One of them had about
15:57
170,000 followers, and it was
15:59
pictures of her with Justin Bieber,
16:02
with the Kardashians, with Madonna, and no
16:05
one in the comments was saying this
16:07
isn't real. All the comments
16:09
were kind of congratulatory
16:12
and there was no disclosure at all on
16:14
the Instagram profile that it was AI
16:16
generated. So I think a lot of people in
16:18
good faith would look at it and think
16:20
this is really
16:20
>> The obvious problem is, Sasha, that the
16:22
very famous person who's gone to
16:24
Coachella can say to someone who might
16:27
be advertising kryptonite next to them
16:29
look I don't want to be advertising
16:30
kryptonite and they can push them away.
16:32
They have no choice. They have no say in
16:34
an AI-generated person being put next
16:37
to them in a photograph that they pose
16:39
for unknowingly.
16:41
>> Yeah. In a world of AI agents,
16:43
humans lose their own agency. I think to
16:46
some extent and especially famous people
16:48
because there's so many likenesses of
16:49
them on the internet that it's very very
16:51
easy to generate a false image or video
16:53
now of of a celebrity.
16:55
>> Didn't Stephanie, didn't we talk about
16:57
New York bringing in new regulation to
17:00
stop this? I think you had to put on
17:02
your website whether you were you were
17:04
using an AI generated influencer, but
17:07
there's I mean some of these pictures
17:08
from Coachella do that, but plenty of
17:10
them don't.
17:11
>> Yeah. And that's the enforcement thing.
17:13
Like there are all sorts of laws that
17:14
are obvious
17:15
>> From state to state it's different. Right.
17:16
>> Exactly. And, you know, whose job is
17:19
it to police that and how are they able
17:22
to get the accountability that they
17:24
need? So again this is a case of if you
17:25
were to take them to court that's going
17:27
to take years, right? It's going to cost a
17:29
lot of money, etc. So it's kind of like
17:31
everything that we saw about
17:32
accountability with social media not
17:34
being very effective.
17:35
>> I mean Coachella themselves could just
17:37
say enough. You can't do this. It's up
17:39
to the organizer.
17:40
>> They absolutely could. And I think, you
17:42
know, what you mentioned earlier about
17:43
the celebrities, you might want
17:45
to just play the world's tiniest violin
17:47
for these people for being in these
17:49
photographs in potentially compromising
17:51
situations. But I mean, ultimately, if
17:53
they do get upset and if the brands get
17:55
upset, I think that's perhaps going to
17:57
be potentially even more effective than
17:59
actual regulation in getting
18:03
enforcement. So,
18:04
>> what about the platforms though? I mean,
18:05
Instagram, TikTok, they're profiting
18:08
from these engagements.
18:09
>> Yes. And technically on these
18:11
platforms you are supposed to
18:13
disclose if something is AI generated.
18:15
The fact is nobody actually follows that
18:18
rule. And Meta does have automated
18:22
systems that will try and look for
18:24
things that are AI generated and tag
18:26
them. But it's an almost impossible task
18:28
because there's hundreds of thousands of
18:30
posts made every day, and many
18:33
are slipping through the net. Now the
18:35
thing is it is possible if they really
18:37
wanted to. There are ways to put a
18:41
cryptographic signature on actual
18:43
photographic images, called C2PA.
18:48
but that's just not something that the
18:50
tech companies are investing in because
18:52
if you think about it, there is a
18:54
commercial incentive to just let this
18:56
carry on because, to your point
18:57
earlier, do people actually like this?
18:59
Weirdly, the public don't hate AI
19:02
avatars. They're kind of okay with it
19:04
>> if they know. I mean, how many people
19:06
really know? because some of these are
19:07
really good and they're getting better.
19:09
>> I think that's just going to make it
19:10
harder to deal with.
19:11
>> So weird. Everyone talks about being
19:13
authentic and brands are all about your
19:15
values and then they do this stuff
19:17
that's so fake and people eat it up.
19:19
>> Weird.
19:20
>> Do we still call California the Wild
19:21
West? No, maybe not.
19:23
>> But this is the Wild West.
19:25
>> Yeah, absolutely.
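The content-provenance signing Parmy mentions (the idea behind C2PA) can be sketched with ordinary cryptography. This is a toy illustration of the tamper-evidence idea only: real C2PA embeds a signed manifest with a certificate chain in the file, whereas this sketch uses a keyed HMAC with a made-up shared secret, just to show signing and verification of image bytes.

```python
# Toy sketch of content provenance: a publisher attaches a signature to
# image bytes at creation time, and anyone holding the verification key
# can later check the bytes were not swapped or altered. Real C2PA uses
# certificate-based signatures and embedded manifests; this HMAC version
# only illustrates the tamper-evidence idea.
import hashlib
import hmac

SECRET_KEY = b"publisher-signing-key"  # hypothetical key for the demo

def sign_image(image_bytes: bytes) -> str:
    """Produce a provenance tag for the exact bytes of an image."""
    return hmac.new(SECRET_KEY, image_bytes, hashlib.sha256).hexdigest()

def verify_image(image_bytes: bytes, tag: str) -> bool:
    """True only if the bytes match the tag they were published with."""
    return hmac.compare_digest(sign_image(image_bytes), tag)

original = b"...jpeg bytes of a real Coachella photo..."
tag = sign_image(original)

print(verify_image(original, tag))                    # True
print(verify_image(b"...AI-edited version...", tag))  # False
```

As the discussion notes, the technical piece is not the hard part; the open question is whether platforms have a commercial incentive to deploy and enforce it.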
19:26
>> Now, the late Val Kilmer was one of the
19:28
greats. Do we agree on that?
19:30
>> Yeah. Top Gun Batman forever.
19:32
>> Real genius.
19:33
>> Yeah. The Doors. I liked him in that.
19:35
One of the Hollywood greats, one of
19:37
the most versatile actors of Hollywood as
19:39
well. And he died a year ago, as many of
19:41
you will know, age 65 after a long and
19:43
sad battle with throat cancer. But he
19:46
had been cast in a film a few years
19:48
earlier. It's called As Deep as the
19:50
Grave. It's a historical drama about
19:53
the American Southwest. And of
19:56
course, he didn't make it to set because
19:58
he was too ill at the end. But
20:00
this week, the trailer for that movie
20:03
debuted at CinemaCon in Las Vegas. He's
20:06
in it. And every scene in which he is in
20:08
and every line that he speaks, of
20:10
course, is generated entirely by AI.
20:18
>> Is me.
20:29
Hey, hey,
20:32
hey.
20:55
Hey,
21:04
>> don't fear the dead and don't fear me.
21:07
His children gave their blessing to this
21:09
Stephanie. And just so everybody
21:12
knows, the filmmakers followed the
21:14
guidelines, spoke to the unions. Kilmer,
21:16
in fact, himself embraced AI in his
21:20
final movie, Top Gun: Maverick.
21:22
His voice was recreated by AI, so he
21:25
wasn't oblivious to this.
21:28
Is this the blueprint, do you think, for
21:30
for for AI in Hollywood, an ethical way
21:33
of using it, or does it for you open a
21:35
door that we can't close?
21:37
>> I think it's just about choice. So, I
21:40
like the idea that if directors and
21:42
other artists want to experiment with
21:43
AI, that they are doing so mindfully,
21:46
that they're trying to come up with an
21:47
ethical standard that is no doubt going
21:49
to be discussed and may eventually be
21:51
formalized. I also think it's really
21:53
important for any creative person who
21:55
doesn't want their likeness, their
21:58
biometrics or their creative output to
22:00
be used in this way to be able to say
22:02
no. Right? So in that case for anybody
22:05
who's a Hollywood actor listening and
22:07
watching our show as we know they are
22:09
>> They would want to be
22:11
speaking with their
22:12
agents and with their team and their
22:14
lawyers to be really clear about that.
22:16
So you know how do you want your
22:17
likeness being used while you're alive
22:19
and then how do you want it being used
22:21
after your death. So in this case Val
22:23
Kilmer's children were fine with it. His
22:25
estate's fine with it and everything was
22:26
done with everyone being I think as
22:28
ethical as they can be. Other actors
22:30
have made different choices. Can I
22:32
just say, Stephanie, for the record, that
22:34
if you're going to use my likeness for
22:35
AI decoded into the future I am happy
22:38
for that so long as you pay the
22:40
royalties to my estate just on the
22:42
record, so we're all clear.
22:44
>> Sasha, I mean, obviously Hollywood
22:46
actors are a gift, aren't
22:48
they, for AI, because there's
22:50
hundreds of hours of film of them.
22:53
They've been in lots of performances.
22:55
And so in fact I think this performance
22:58
of Kilmer in this movie was
23:01
reconstructed from 40 films, hundreds of
23:04
hours of footage. So as long as you
23:06
signed up to this, the sky's the limit.
23:09
>> I think that's a very hard ethical
23:10
question. Um I've seen a lot of papers.
23:13
I even saw a theater play on this topic
23:15
especially after death actually. I think
23:16
that's a really important point. Uh who
23:18
can opt in? How can you opt out if
23:20
you're dead? And also, what does this
23:22
mean for the community? Because if
23:24
there's peer pressure, for example, and
23:25
I think that actually AI was one of the
23:29
sticking points during the
23:29
strikes a couple of years ago, right?
23:30
To what extent is there
23:32
union pressure? Is there community
23:33
pressure to opt in? Can you
23:35
continue opting out in this new
23:38
world, right? And it's it's similar to
23:40
what a lot of workers are facing as
23:41
well. Um, I'm hearing a lot of people
23:43
being like, well, I'm forced to use AI
23:44
in my workplace. We even have
23:46
dashboards for tracking it. And so, it's
23:48
really this pressure that we're seeing
23:49
to use AI. And I think that that does uh
23:52
uh make people give up some of their
23:54
individual choices if they feel
23:56
pressured to. So for example, if you're
23:57
a young actor and you want to
23:59
make a name for yourself,
24:01
but you don't want to use AI but you
24:02
have this peer pressure around you, can
24:04
you really opt out without having a
24:05
negative impact on your career?
24:08
>> Yeah. And I think Hollywood
24:10
has already this history of recycling
24:13
old films and making sequels and making
24:16
remakes. And I think there is already
24:17
this tendency to want to maximize
24:19
profits by going back to whatever works.
24:21
And if that's the incentive driving the
24:24
remake of an actor who has died, I think
24:28
in the end that can actually put younger
24:30
fresh talent out of work if it's just
24:32
the same icons appearing
24:34
over and over again for the next 100
24:36
years. And to Sasha's point as well,
24:39
I think where that pressure comes from,
24:41
like I've spoken to a company that does
24:43
virtual reality concerts or avatar
24:45
concerts, and they have had pressure
24:48
from the families of artists who have
24:51
died to try and recreate the
24:55
deceased artist for a concert,
24:57
>> not knowing whether there's consent or
24:58
not for that.
24:59
>> And it's a very gray area,
25:01
because we're talking about people who
25:02
died maybe 10 or 20 years ago, icons,
25:06
putting pressure on them because it's a new
25:07
revenue source for the family left
25:09
behind for the estate.
25:10
>> We're out of time. Parmy, Sasha,
25:13
Stephanie, thank you very much indeed.
25:15
AI Decoded next week. We think
25:18
Scott Galloway is coming on. I'm putting
25:20
that out there so he does come on next
25:21
week. So do tune in for that. If you
25:24
have any thoughts on anything we've
25:25
discussed, do email us at aidecoded@bbc.co.uk.
25:29
And I'm going to put on screen for
25:32
you the QR code for the AI decoded
25:35
playlist which is on YouTube. Some of
25:36
you have been struggling to find it. There it is.
25:38
If you scan the QR code, you'll be able
25:39
to find it. All the back episodes are
25:42
there. So, do take a look at that. And
25:44
don't forget if you want to watch us
25:45
again, we are on BBC iPlayer.
25:48
that's all the housekeeping. Thank you
25:50
very much for watching. Thank you to our
25:51
guests this week. We'll see you next
25:53
time.
— end of transcript —