WEBVTT

00:00:04.400 --> 00:00:07.240
What I want to do
today is chat with you

00:00:07.240 --> 00:00:09.759
about career advice in AI.

00:00:09.759 --> 00:00:13.559
And in previous years, I used
to do most of this lecture

00:00:13.560 --> 00:00:14.560
by myself.

00:00:14.560 --> 00:00:16.480
But what I thought
I'd do today is

00:00:16.480 --> 00:00:19.160
I'll share just a few
thoughts and then hand it over

00:00:19.160 --> 00:00:23.160
to my good friend Laurence
Moroney, who I invited to speak

00:00:23.160 --> 00:00:27.140
here and kindly agreed to come
all the way to San Francisco,

00:00:27.140 --> 00:00:31.240
he lives in Seattle, to share
with us a very broad market

00:00:31.239 --> 00:00:33.460
landscape for what he's
seeing in the job market,

00:00:33.460 --> 00:00:39.000
as well as tips for
growing a career in AI.

00:00:39.000 --> 00:00:42.359
But there are just two slides
and then one more thought

00:00:42.359 --> 00:00:44.439
I want to share with you
before I hand it over

00:00:44.439 --> 00:00:49.320
to Laurence, which is it
really feels like the best

00:00:49.320 --> 00:00:53.399
opportunity, the best time
ever to be building with AI

00:00:53.399 --> 00:00:55.799
and to build a career in AI.

00:00:55.799 --> 00:00:59.809
A few months ago I noticed in
social media, traditional media,

00:00:59.810 --> 00:01:03.370
there were a few questions
about, is AI slowing down?

00:01:03.369 --> 00:01:05.832
People saying, well,
is GPT-5 that good?

00:01:05.832 --> 00:01:07.250
I think it's
actually pretty good.

00:01:07.250 --> 00:01:10.269
But there are questions about
is AI progress slowing down?

00:01:10.269 --> 00:01:13.469
And I think part of the reason
the question was even raised was

00:01:13.469 --> 00:01:19.670
because if on a benchmark for AI,
100% is perfect answers,

00:01:19.670 --> 00:01:22.370
then if you make rapid
progress, at some point,

00:01:22.370 --> 00:01:25.510
you cannot get
above 100% accuracy.

00:01:25.510 --> 00:01:30.350
But one of the studies that most
influenced my thinking was work

00:01:30.349 --> 00:01:32.549
done by this
organization, M-E-T-R,

00:01:32.549 --> 00:01:37.069
METR, that studied,
as time passes,

00:01:37.069 --> 00:01:40.949
how complex are the tasks that
AI could do as measured by how

00:01:40.950 --> 00:01:43.950
long it takes a human
to do that task?

00:01:43.950 --> 00:01:47.870
So a few years ago, maybe
GPT-2 could do tasks

00:01:47.870 --> 00:01:50.734
that a human could do
in a couple seconds.

00:01:50.734 --> 00:01:52.109
And then they
could do tasks that

00:01:52.109 --> 00:01:57.030
took a human four seconds,
then eight seconds, then

00:01:57.030 --> 00:02:00.219
a minute, two minutes,
four minutes, and so on.

00:02:00.219 --> 00:02:03.420
And the study estimates that
the length of tasks AI can do

00:02:03.420 --> 00:02:06.700
is doubling every seven months.

00:02:06.700 --> 00:02:09.620
And I think on
this metric, I feel

00:02:09.620 --> 00:02:12.599
optimistic that AI will
continue making progress,

00:02:12.599 --> 00:02:15.139
meaning the complexity
of tasks as measured

00:02:15.139 --> 00:02:19.659
by how long a human takes to do
something is doubling rapidly.

00:02:19.659 --> 00:02:22.379
And the same study with
a smaller data set

00:02:22.379 --> 00:02:25.859
seems to show
that for AI coding,

00:02:25.860 --> 00:02:29.840
the doubling time is even
shorter, maybe 70 days.

00:02:29.840 --> 00:02:31.539
So this code that
used to take me,

00:02:31.539 --> 00:02:34.620
I don't know, 10 minutes
to write, then 20 minutes

00:02:34.620 --> 00:02:36.340
to write, 40 minutes
to write, and AI

00:02:36.340 --> 00:02:38.700
could do more and more of that.

00:02:38.699 --> 00:02:41.419
And so the reasons I
think this is a golden age

00:02:41.419 --> 00:02:43.699
to be building, best
time we've ever seen

00:02:43.699 --> 00:02:47.819
is maybe two themes which
are more powerful and faster.

00:02:47.819 --> 00:02:50.340
So we can all, all
of you in this room

00:02:50.340 --> 00:02:54.620
can now write software that
is more powerful than what

00:02:54.620 --> 00:02:57.730
anyone on the planet
could have built a year

00:02:57.729 --> 00:03:00.789
ago by using AI building blocks.

00:03:00.789 --> 00:03:03.150
AI building blocks include
large language models,

00:03:03.150 --> 00:03:05.110
agentic
workflows, voice AI,

00:03:05.110 --> 00:03:06.350
and of course, deep learning.

00:03:06.349 --> 00:03:10.169
It turns out that a lot of LLMs
have a decent, at least basic

00:03:10.169 --> 00:03:12.006
understanding of deep learning.

00:03:12.007 --> 00:03:14.090
So if you prompt
one of the frontier models

00:03:14.090 --> 00:03:16.890
to implement a cutting edge
neural network for you,

00:03:16.889 --> 00:03:19.949
try prompting it to implement
a transformer network for you.

00:03:19.949 --> 00:03:23.369
It's actually not bad at helping
you use these building blocks

00:03:23.370 --> 00:03:25.810
to build software quickly.

00:03:25.810 --> 00:03:29.610
And so we have very
powerful building blocks

00:03:29.610 --> 00:03:32.607
that were very difficult to build
or did not exist a year or two ago.

00:03:32.606 --> 00:03:34.689
And so you can now build
software that does things

00:03:34.689 --> 00:03:38.250
that no one else on the planet,
even the most advanced teams

00:03:38.250 --> 00:03:39.930
on the planet, could have done.

00:03:39.930 --> 00:03:44.849
And then also with
AI coding, the speed

00:03:44.849 --> 00:03:46.569
with which you can
get software written

00:03:46.569 --> 00:03:49.169
is much faster than ever before.

00:03:49.169 --> 00:03:50.969
And I've personally
found it important

00:03:50.969 --> 00:03:52.870
to stay on the
frontier of tools,

00:03:52.870 --> 00:03:55.810
because the tools for
AI coding change,

00:03:55.810 --> 00:03:57.670
I don't know, really rapidly.

00:03:57.669 --> 00:04:03.569
So I feel like since several
months ago, my personal number

00:04:03.569 --> 00:04:07.209
one favorite tool became
Claude Code, moving on

00:04:07.210 --> 00:04:09.849
from some earlier
generations, I think.

00:04:09.849 --> 00:04:13.090
And then I think since
the release of GPT-5,

00:04:13.090 --> 00:04:15.090
I think OpenAI
Codex has actually

00:04:15.090 --> 00:04:17.088
made tremendous progress.

00:04:17.088 --> 00:04:19.709
And this morning,
Gemini 3 was released,

00:04:19.709 --> 00:04:22.610
which I haven't had time
to play with yet.

00:04:22.610 --> 00:04:25.030
It seems like another
huge leap forward.

00:04:25.029 --> 00:04:27.129
So I feel if you ask
me every three months

00:04:27.129 --> 00:04:29.810
what my personal favorite
coding tool is, it actually

00:04:29.810 --> 00:04:32.329
definitely changes
every six months, but quite

00:04:32.329 --> 00:04:33.990
possibly every three months.

00:04:33.990 --> 00:04:38.970
And I find that being half a
generation behind in these tools

00:04:38.970 --> 00:04:41.950
means being, frankly, quite
a bit less productive.

00:04:41.949 --> 00:04:44.539
And I know everyone says
AI is moving so fast,

00:04:44.540 --> 00:04:45.790
everything's changing so fast.

00:04:45.790 --> 00:04:48.810
But AI coding tools, of
all the sectors in AI,

00:04:48.810 --> 00:04:51.790
many things maybe don't move as
fast as the hype says they do,

00:04:51.790 --> 00:04:53.920
but AI coding
tools is one sector

00:04:53.920 --> 00:04:56.660
where I see the pace of
progress is tremendous.

00:04:56.660 --> 00:04:59.220
And staying at the latest
generation of tools,

00:04:59.220 --> 00:05:01.800
rather than half a
generation behind, makes

00:05:01.800 --> 00:05:03.920
you more productive.

00:05:03.920 --> 00:05:06.823
And with our ability to
build more powerful software

00:05:06.822 --> 00:05:08.239
and build it much
faster than ever

00:05:08.240 --> 00:05:11.160
before, I think
one piece of advice

00:05:11.160 --> 00:05:12.800
that I give now,
much more strongly

00:05:12.800 --> 00:05:15.240
now than even a year
ago or two years ago,

00:05:15.240 --> 00:05:17.800
is just go and build stuff.

00:05:17.800 --> 00:05:19.139
Take classes from Stanford.

00:05:19.139 --> 00:05:20.419
Take online courses.

00:05:20.420 --> 00:05:22.400
And additionally
your opportunity

00:05:22.399 --> 00:05:24.000
to build things,
and I think Laurence

00:05:24.000 --> 00:05:26.199
is going to talk about
showing them to others,

00:05:26.199 --> 00:05:28.479
is greater than ever before.

00:05:28.480 --> 00:05:30.759
But there's one
weird implication

00:05:30.759 --> 00:05:33.767
of this that is,

00:05:33.767 --> 00:05:36.059

I don't know, more and more
people are appreciating it,

00:05:36.060 --> 00:05:38.319
but it's not widely known, which
is the product management

00:05:38.319 --> 00:05:43.480
bottleneck, which is that when
it is increasingly easy to go

00:05:43.480 --> 00:05:47.319
from a clearly written software
spec to a piece of code,

00:05:47.319 --> 00:05:50.399
then the bottleneck increasingly
is deciding what to build

00:05:50.399 --> 00:05:53.829
or increasingly writing that
clear spec for what you actually

00:05:53.829 --> 00:05:55.029
want to build.

00:05:55.029 --> 00:05:57.349
When I'm building
software, I often

00:05:57.350 --> 00:06:00.090
think of going through a loop
where we'll write some software,

00:06:00.089 --> 00:06:03.609
write some code, show it to
users to get user feedback.

00:06:03.610 --> 00:06:06.830
I think of this as PM or
product management work.

00:06:06.829 --> 00:06:09.029
And then based on
user feedback, I'll

00:06:09.029 --> 00:06:11.767
revise my view on what users
like, what they don't like.

00:06:11.767 --> 00:06:13.809
This UI is too difficult.
They want this feature.

00:06:13.810 --> 00:06:16.509
They don't want that feature
and change my conception

00:06:16.509 --> 00:06:19.110
of what to build, and
then go around this loop

00:06:19.110 --> 00:06:21.629
many times to hopefully
iterate toward a product

00:06:21.629 --> 00:06:23.389
that users love.

00:06:23.389 --> 00:06:27.310
And because of AI coding, the
process of building software

00:06:27.310 --> 00:06:31.470
has become much cheaper and
much faster than before.

00:06:31.470 --> 00:06:34.630
But that ironically
shifts the bottleneck

00:06:34.629 --> 00:06:38.189
to deciding what to build.

00:06:38.189 --> 00:06:43.029
So some weird trends I'm seeing.

00:06:43.029 --> 00:06:45.409
In Silicon Valley and
in many tech companies,

00:06:45.410 --> 00:06:48.950
people have often talked about
an engineer to product manager,

00:06:48.949 --> 00:06:50.779
engineer to PM ratio.

00:06:50.779 --> 00:06:53.439
And you take these ratios
with a grain of salt,

00:06:53.439 --> 00:06:55.439
because they kind of
vary all over the place.

00:06:55.439 --> 00:06:57.860
But you hear companies
talk about the Eng to PM

00:06:57.860 --> 00:07:00.800
ratio of 4 to 1 or
7 to 1 or 8 to 1.

00:07:00.800 --> 00:07:04.139
This idea that one product
manager writing product specs

00:07:04.139 --> 00:07:08.339
can keep four to eight engineers,
or some number like that,

00:07:08.339 --> 00:07:09.539
busy.

00:07:09.540 --> 00:07:11.319
But because engineering
is speeding up,

00:07:11.319 --> 00:07:15.259
whereas product management has
not been sped up as much by AI

00:07:15.259 --> 00:07:18.180
as engineering,
I'm seeing the Eng

00:07:18.180 --> 00:07:22.632
to PM ratio trending downward,
maybe even 2 to 1 or 1 to 1.

00:07:22.632 --> 00:07:24.299
So for some teams I work
with, the proposed

00:07:24.300 --> 00:07:26.819
headcount was one PM
to one engineer, which

00:07:26.819 --> 00:07:31.300
is a ratio unlike almost all of
Silicon Valley, certainly

00:07:31.300 --> 00:07:33.980
traditional Silicon
Valley companies.

00:07:33.980 --> 00:07:37.759
And the other thing I'm
seeing is that engineers,

00:07:37.759 --> 00:07:40.360
they can also
shape products and

00:07:40.360 --> 00:07:44.020
can move really fast when
you go one step further:

00:07:44.019 --> 00:07:46.419
take the engineer, take
the PM, and collapse them

00:07:46.420 --> 00:07:48.090
into a single human.

00:07:48.089 --> 00:07:51.009
And I find that
there are definitely

00:07:51.009 --> 00:07:53.129
engineers doing
engineering work who

00:07:53.129 --> 00:07:55.329
don't enjoy talking to
users and having that more

00:07:55.329 --> 00:07:58.149
human, empathetic side of work.

00:07:58.149 --> 00:08:01.529
But I'm finding increasingly
that the subset of engineers

00:08:01.529 --> 00:08:05.449
that learn to talk to
users, get feedback, develop

00:08:05.449 --> 00:08:08.409
deep empathy for users so
that they can make decisions

00:08:08.410 --> 00:08:10.530
about what to build,
those engineers

00:08:10.529 --> 00:08:12.889
are also the fastest
moving people that I'm

00:08:12.889 --> 00:08:15.529
seeing in Silicon Valley today.

00:08:15.529 --> 00:08:19.829
And I feel like at the
earliest stage of my career,

00:08:19.829 --> 00:08:24.509
one thing I regretted for years
was in one of the roles I had,

00:08:24.509 --> 00:08:27.610
I went to try to convince
a bunch of engineers

00:08:27.610 --> 00:08:29.430
to do more product work.

00:08:29.430 --> 00:08:32.129
And I actually made a bunch
of really good engineers

00:08:32.129 --> 00:08:34.710
feel bad for not being
good product managers.

00:08:34.710 --> 00:08:37.429
And that was a mistake I made,
and I regretted it for years.

00:08:37.428 --> 00:08:39.250
I just shouldn't have done that.

00:08:39.250 --> 00:08:40.649
And part of me
feels like I'm now

00:08:40.649 --> 00:08:44.409
going back to repeat
that exact same mistake.

00:08:44.409 --> 00:08:47.959
Having said that, I
find that the fact

00:08:47.960 --> 00:08:50.840
that I can write
code, but also talk

00:08:50.840 --> 00:08:53.560
to users to shape what
to do, that lets me

00:08:53.559 --> 00:08:55.779
and the engineers that can
do this go much faster.

00:08:55.779 --> 00:08:58.000
So I think it's maybe worth
taking another look

00:08:58.000 --> 00:09:02.720
at whether engineers can
do a bit more of this work,

00:09:02.720 --> 00:09:05.120
because then if you're not
waiting for someone else

00:09:05.120 --> 00:09:06.580
to take the product
to customers,

00:09:06.580 --> 00:09:08.940
you can just write code, have a
gut sense for what to do next,

00:09:08.940 --> 00:09:11.840
and iterate. That pace,
that velocity of execution

00:09:11.840 --> 00:09:13.920
is much faster.

00:09:13.919 --> 00:09:17.879
And then before I hand over to
Laurence, just one last thing

00:09:17.879 --> 00:09:23.679
I want to share, which is in
terms of navigating your career,

00:09:23.679 --> 00:09:26.159
I think one of the
strongest predictors

00:09:26.159 --> 00:09:30.199
for your speed of learning
and for your level of success

00:09:30.200 --> 00:09:32.002
is the people you
surround yourself with.

00:09:32.001 --> 00:09:33.459
I think we're all
social creatures.

00:09:33.460 --> 00:09:35.560
We all learn from
people around us.

00:09:35.559 --> 00:09:40.919
And it turns out there are
studies in sociology that

00:09:40.919 --> 00:09:44.479
show that if your five
closest friends are smokers,

00:09:44.480 --> 00:09:46.620
the odds of you being a
smoker are pretty high.

00:09:46.620 --> 00:09:48.279
Please don't smoke.

00:09:48.279 --> 00:09:51.039
It's just an example.

00:09:51.039 --> 00:09:52.799
I don't know of
any study showing

00:09:52.799 --> 00:09:55.959
that if your five or 10
closest friends are really

00:09:55.960 --> 00:10:00.735
hard working, determined
people, learning quickly, trying

00:10:00.735 --> 00:10:02.360
to make the world a
better place with AI,

00:10:02.360 --> 00:10:04.502
that you are more
likely to do that too.

00:10:04.501 --> 00:10:05.959
But it's one of
those things that I

00:10:05.960 --> 00:10:07.764
think is almost certainly true.

00:10:07.764 --> 00:10:10.139
It's like all of us are inspired
by the people around us,

00:10:10.139 --> 00:10:12.919
and if we're able to find a good
group of people to work with,

00:10:12.919 --> 00:10:15.279
that helps drive you forward.

00:10:15.279 --> 00:10:17.879
In fact, here at Stanford,
I feel very fortunate--

00:10:17.879 --> 00:10:22.279
the fantastic student body,
fantastic group of faculty.

00:10:22.279 --> 00:10:24.701
And then the other
thing that I think

00:10:24.701 --> 00:10:26.159
we're fortunate to
have at Stanford

00:10:26.159 --> 00:10:27.879
is our connective tissue.

00:10:27.879 --> 00:10:32.639
So candidly, a lot
of the people working

00:10:32.639 --> 00:10:35.879
at a lot of the cutting-edge
AI labs, the frontier labs,

00:10:35.879 --> 00:10:39.679
they were former students of
a lot of different Stanford

00:10:39.679 --> 00:10:40.639
faculty.

00:10:40.639 --> 00:10:43.870
And so that rich
connective tissue candidly

00:10:43.870 --> 00:10:45.789
means that at Stanford,
we often find out

00:10:45.789 --> 00:10:47.962
about a lot of stuff
that's not widely

00:10:47.962 --> 00:10:50.129
known because of the
relationships, the friendships.

00:10:50.129 --> 00:10:52.909
And when some company
does something,

00:10:52.909 --> 00:10:54.317
one of my friends
on the faculty

00:10:54.317 --> 00:10:56.649
will call up someone
and say, hey, that's weird.

00:10:56.649 --> 00:10:57.567
Does this really work?

00:10:57.567 --> 00:11:02.430
And so that rich connective
tissue means that we're all--

00:11:02.429 --> 00:11:04.264
just as we try to pull
our friends forward,

00:11:04.264 --> 00:11:06.389
our friends also pull us
forward with the knowledge

00:11:06.389 --> 00:11:10.230
and the connective tissue and
this know-how of bleeding edge

00:11:10.230 --> 00:11:12.110
AI, which unfortunately
is not all

00:11:12.110 --> 00:11:14.490
published on the internet
at this moment in time.

00:11:14.490 --> 00:11:18.230
So I think while you're at
Stanford, make those friends,

00:11:18.230 --> 00:11:20.029
form that rich
connective tissue.

00:11:20.029 --> 00:11:22.809
And there have been a lot of
times, just for myself,

00:11:22.809 --> 00:11:25.269
where, frankly, I
was thinking of going

00:11:25.269 --> 00:11:26.949
in some technical direction.

00:11:26.950 --> 00:11:30.670
I'd have one or two
phone calls with someone

00:11:30.669 --> 00:11:33.309
really close to research, either
a Stanford researcher or someone

00:11:33.309 --> 00:11:34.409
in the frontier lab.

00:11:34.409 --> 00:11:37.289
They would share something with
me that I didn't know before.

00:11:37.289 --> 00:11:39.543
And that changes
the way I choose

00:11:39.543 --> 00:11:41.210
the technical
architecture of a project.

00:11:41.210 --> 00:11:43.860
So I find that group of
friends you surround yourself

00:11:43.860 --> 00:11:46.399
with, those little pieces
of information-- try this.

00:11:46.399 --> 00:11:48.079
Don't do that--
that's just hype.

00:11:48.080 --> 00:11:49.460
Ignore the PR.

00:11:49.460 --> 00:11:50.840
Don't actually try that thing.

00:11:50.840 --> 00:11:53.340
Those things make
a big difference

00:11:53.340 --> 00:11:56.920
in your ability to steer the
direction of your projects.

00:11:56.919 --> 00:11:59.399
So while you're at Stanford,
take advantage of that.

00:11:59.399 --> 00:12:01.632
This connective tissue
that Stanford has,

00:12:01.633 --> 00:12:02.800
it's actually really unique.

00:12:02.799 --> 00:12:04.882
There are lots of great
universities in the world,

00:12:04.883 --> 00:12:08.420
but at this moment in time,
I don't think there's any--

00:12:08.419 --> 00:12:10.802
I don't want to sound like
I'm doing PR for Stanford now,

00:12:10.802 --> 00:12:13.220
but I really think there's no
university in the world that

00:12:13.220 --> 00:12:16.279
is as privileged as Stanford
at this moment in time,

00:12:16.279 --> 00:12:19.139
in terms of the richness of
the connective tissue to all

00:12:19.139 --> 00:12:23.100
of the leading AI groups.

00:12:23.100 --> 00:12:25.700
But to me, there's also the
fact that we're lucky here

00:12:25.700 --> 00:12:27.700
to have a wonderful
community of people

00:12:27.700 --> 00:12:30.100
to work with and learn from.

00:12:30.100 --> 00:12:31.220
And for you too.

00:12:31.220 --> 00:12:35.300
If you apply for jobs, the
thing that is much more

00:12:35.299 --> 00:12:37.259
important for your
career success would

00:12:37.259 --> 00:12:40.490
be, if you go to a company,
the people

00:12:40.490 --> 00:12:42.990
you work with day to day.

00:12:42.990 --> 00:12:49.090
So here's one story that I've
told in previous classes, I

00:12:49.090 --> 00:12:52.050
repeat, which is there was a
Stanford student that I knew

00:12:52.049 --> 00:12:54.839
many years
ago,

00:12:54.840 --> 00:12:56.590
and they did really
good work at Stanford.

00:12:56.590 --> 00:12:58.250
I thought they were a high flyer.

00:12:58.250 --> 00:13:00.970
And they applied for
a job at a company,

00:13:00.970 --> 00:13:03.490
and they got a job offer
from one of the companies

00:13:03.490 --> 00:13:07.370
with a hot AI brand.

00:13:07.370 --> 00:13:10.990
This company refused to tell
him which team he would join.

00:13:10.990 --> 00:13:13.350
They said, oh, come
sign up for a job.

00:13:13.350 --> 00:13:16.129
There's a rotation system,
matching system, blah blah blah.

00:13:16.129 --> 00:13:17.669
Sign on the dotted line first.

00:13:17.669 --> 00:13:21.729
Then we'll figure out what's
a good project for you.

00:13:21.730 --> 00:13:24.490
Partly because it
was a good company.

00:13:24.490 --> 00:13:26.850
His parents were proud
of him for getting a job

00:13:26.850 --> 00:13:27.950
at this company.

00:13:27.950 --> 00:13:29.810
This student joined
this company hoping

00:13:29.809 --> 00:13:32.129
to work on an exciting AI project.

00:13:32.129 --> 00:13:33.909
And after he signed
on the dotted line,

00:13:33.909 --> 00:13:36.169
he was assigned to
work on the back end

00:13:36.169 --> 00:13:39.199
Java payment processing
system of the company.

00:13:39.200 --> 00:13:41.040
Nothing against anyone
that wants to do Java

00:13:41.039 --> 00:13:42.539
back end payment
processing systems.

00:13:42.539 --> 00:13:45.480
I think they're great, but this
is an AI student that did not

00:13:45.480 --> 00:13:47.519
get matched to an AI project.

00:13:47.519 --> 00:13:50.120
And so for about a year,
he was really frustrated,

00:13:50.120 --> 00:13:53.879
and he actually left this
company after about a year.

00:13:53.879 --> 00:13:56.919
The unfortunate thing
is, I told this story

00:13:56.919 --> 00:14:00.199
in CS230 some years back.

00:14:00.200 --> 00:14:04.120
And then after I
was already telling

00:14:04.120 --> 00:14:08.299
the story in this class,
a couple of years later,

00:14:08.299 --> 00:14:13.759
another student in CS230 went
through the same experience

00:14:13.759 --> 00:14:16.460
with the same company, not Java
back end payment processing,

00:14:16.460 --> 00:14:17.460
but different project.

00:14:17.460 --> 00:14:21.243
And I think this effect
of trying to figure out

00:14:21.243 --> 00:14:23.160
who you'll be actually
working with day to day

00:14:23.159 --> 00:14:25.159
and making sure you're
surrounded by people that

00:14:25.159 --> 00:14:27.059
inspire you and work
on exciting projects,

00:14:27.059 --> 00:14:28.179
I think that's important.

00:14:28.179 --> 00:14:30.599
And to be completely
candid, if a company

00:14:30.600 --> 00:14:34.540
refuses to tell you what
team you'll be assigned to,

00:14:34.539 --> 00:14:37.469
that does raise a
question in my mind about

00:14:37.470 --> 00:14:39.910
what will happen.

00:14:39.909 --> 00:14:42.069
And I think that
instead of working

00:14:42.070 --> 00:14:44.670
for the company with
the hottest brand,

00:14:44.669 --> 00:14:48.110
sometimes if you find a
really good team with really

00:14:48.110 --> 00:14:50.950
hard working, knowledgeable,
smart people trying to do good

00:14:50.950 --> 00:14:54.370
with AI, but the company
logo just isn't as hot,

00:14:54.370 --> 00:14:56.429
I think that often
means you actually

00:14:56.429 --> 00:14:59.709
learn faster and progress
your career better because,

00:14:59.710 --> 00:15:03.590

after all, you don't
learn from the excitement

00:15:03.590 --> 00:15:05.769
of the company logo when
you walk through the door,

00:15:05.769 --> 00:15:08.289
you learn from the people
you deal with day to day.

00:15:08.289 --> 00:15:13.269
So I just urge you to use
that as a major criterion

00:15:13.269 --> 00:15:16.529
for your selection process
for what you decide to do.

00:15:21.190 --> 00:15:25.070
But I think the number
one piece of advice I have

00:15:25.070 --> 00:15:27.470
is it's become much
easier than ever

00:15:27.470 --> 00:15:30.990
before to build powerful
software faster.

00:15:30.990 --> 00:15:33.330
And what that means
is do be responsible.

00:15:33.330 --> 00:15:35.790
Don't build software
that hurts others.

00:15:35.789 --> 00:15:39.349
And at the same time, there are
so many things that each of you

00:15:39.350 --> 00:15:40.389
can build.

00:15:40.389 --> 00:15:42.697
And what I find is the number
of ideas out in the world

00:15:42.697 --> 00:15:45.029
is much greater than the
number of people with the skill

00:15:45.029 --> 00:15:45.889
to build them.

00:15:45.889 --> 00:15:49.029
So I know that finding jobs has
gotten tougher for fresh college

00:15:49.029 --> 00:15:49.529
grads.

00:15:49.529 --> 00:15:51.709
At the same time,
a lot of teams just

00:15:51.710 --> 00:15:53.530
can't find enough
skilled people.

00:15:53.529 --> 00:15:56.669
And so there are
a lot of projects

00:15:56.669 --> 00:15:58.610
in the world that if
you don't build it,

00:15:58.610 --> 00:16:00.990
I think no one else
will build it either.

00:16:00.990 --> 00:16:04.376
So you don't need to-- so
long as you don't harm others,

00:16:04.376 --> 00:16:06.709
be responsible, there are a lot
of things where you don't

00:16:06.710 --> 00:16:07.810
need to wait for permission.

00:16:07.809 --> 00:16:10.059
You don't need to wait for
someone else to do it first

00:16:10.059 --> 00:16:11.509
and then you do it.

00:16:11.509 --> 00:16:14.909
The cost of a failure is
much lower than before

00:16:14.909 --> 00:16:17.429
because you waste a weekend
but learn something.

00:16:17.429 --> 00:16:18.989
That seems fine to me.

00:16:18.990 --> 00:16:21.350
So I think, while
being responsible,

00:16:21.350 --> 00:16:24.629
going and trying things out
and building lots of things

00:16:24.629 --> 00:16:27.429
would be the number one
most important thing I

00:16:27.429 --> 00:16:31.029
think would help your careers.

00:16:31.029 --> 00:16:35.579
And yeah, I think I'm going
to say one last thing that

00:16:35.580 --> 00:16:39.940
is considered not politically
correct in some circles,

00:16:39.940 --> 00:16:43.860
but I'll just say it anyway,
which is in some circles,

00:16:43.860 --> 00:16:47.060
it has become considered
not politically correct

00:16:47.059 --> 00:16:50.339
to encourage others
to work hard.

00:16:50.340 --> 00:16:53.540
I'm going to encourage
you to work hard.

00:16:53.539 --> 00:16:55.620
Now, I think the reason
some people don't

00:16:55.620 --> 00:16:57.913
like that is because
there are some people that

00:16:57.913 --> 00:16:59.580
are in a phase of
life where they're not

00:16:59.580 --> 00:17:00.960
in a position to work hard.

00:17:00.960 --> 00:17:03.200
So right after my
children were born,

00:17:03.200 --> 00:17:05.960
I was not working hard for
a short period of time.

00:17:05.960 --> 00:17:10.818
And there are people because
of an injury or disability,

00:17:10.818 --> 00:17:12.460
whatever very valid
reasons, they're

00:17:12.460 --> 00:17:14.799
not in a position to work
hard at that moment in time.

00:17:14.799 --> 00:17:16.559
And we should respect
them, support them,

00:17:16.559 --> 00:17:18.268
make sure they're well
taken care of even

00:17:18.268 --> 00:17:19.700
though they're not working hard.

00:17:19.700 --> 00:17:22.180
Having said that,
all of my, say,

00:17:22.180 --> 00:17:24.640
PhD students who have
become very successful,

00:17:24.640 --> 00:17:27.040
I saw every single one of
them work incredibly hard.

00:17:27.039 --> 00:17:30.460
I mean, the 2:00 AM sitting
up, hyperparameter tuning,

00:17:30.460 --> 00:17:31.440
been there, done that.

00:17:31.440 --> 00:17:33.490
Still doing it some days.

00:17:33.490 --> 00:17:36.450
And if you are fortunate enough
to be in a position in life

00:17:36.450 --> 00:17:40.250
where you can work
really hard, there

00:17:40.250 --> 00:17:44.210
are so many opportunities
to do things right now.

00:17:44.210 --> 00:17:47.529
If you get excited, as I do,
spending evenings and weekends

00:17:47.529 --> 00:17:50.289
coding and building stuff
and getting user feedback,

00:17:50.289 --> 00:17:51.990
if you lean in and
do those things,

00:17:51.990 --> 00:17:54.470
it will increase your odds
of being really successful.

00:17:54.470 --> 00:17:55.289
So I don't know.

00:17:55.289 --> 00:17:57.432
Maybe I get into some
trouble with some people

00:17:57.432 --> 00:17:58.849
for encouraging you to
work hard, but I

00:17:58.849 --> 00:18:02.049
find that the truth is
people that work hard

00:18:02.049 --> 00:18:02.956
get a lot more done.

00:18:02.957 --> 00:18:05.290
We should also respect people
that don't and people that

00:18:05.289 --> 00:18:06.809
aren't in a position to do so.

00:18:06.809 --> 00:18:10.129
But between watching
some dumb TV

00:18:10.130 --> 00:18:14.010
show versus firing up your
agentic coder on a weekend

00:18:14.009 --> 00:18:16.049
to try something,
I'm going to choose

00:18:16.049 --> 00:18:18.407
the latter almost every time.

00:18:18.407 --> 00:18:20.949
Unless I'm watching a show with
my kids, sometimes I do that.

00:18:20.950 --> 00:18:23.190
But you know--

00:18:23.190 --> 00:18:26.450
I hope you do that.

00:18:26.450 --> 00:18:29.650
All right, so those are the
main things I wanted to say.

00:18:29.650 --> 00:18:33.640
What I want to do is hand the
stage over to my good friend

00:18:33.640 --> 00:18:38.520
Laurence Moroney, who will share
a lot more about career

00:18:38.519 --> 00:18:39.133
advice on AI.

00:18:39.133 --> 00:18:40.133
Let me just do a quick intro.

00:18:40.133 --> 00:18:42.355
I've known Laurence
for a long time.

00:18:42.355 --> 00:18:44.480
He's done a lot of online
education work, sometimes

00:18:44.480 --> 00:18:46.380
with me and my teams,
taught a lot of people

00:18:46.380 --> 00:18:49.060
TensorFlow, taught a
lot of people PyTorch.

00:18:49.059 --> 00:18:52.119
He was lead AI advocate at
Google for many years, now

00:18:52.119 --> 00:18:53.707
runs a group at Arm.

00:18:53.708 --> 00:18:55.500
I've also enjoyed quite
a few of his books.

00:18:55.500 --> 00:18:56.619
This is one of them.

00:18:56.619 --> 00:18:59.259
He recently also published
a new book on PyTorch.

00:18:59.259 --> 00:19:02.200
This is an excellent book,
Introduction to PyTorch.

00:19:02.200 --> 00:19:06.100
And he's a very sought-after
speaker in many circles,

00:19:06.099 --> 00:19:10.639
so I was very grateful when
he agreed to come speak to us.

00:19:10.640 --> 00:19:11.740
Pleasure is all mine.

00:19:11.740 --> 00:19:12.839
I just want to
reinforce something

00:19:12.839 --> 00:19:14.279
that Andrew was
talking about earlier

00:19:14.279 --> 00:19:15.839
about choosing
the people that you

00:19:15.839 --> 00:19:17.639
work with being very important.

00:19:17.640 --> 00:19:20.360
But I also want to show it
from the other way around, that

00:19:20.359 --> 00:19:22.119
the company, when
they're interviewing you,

00:19:22.119 --> 00:19:23.839
are also choosing you.

00:19:23.839 --> 00:19:25.559
And the good
companies really want

00:19:25.559 --> 00:19:27.919
to choose the people
that they work with also.

00:19:27.920 --> 00:19:30.990
And I've been doing a lot
of mentoring of young people

00:19:30.990 --> 00:19:32.910
particularly over the last 18

00:19:32.910 --> 00:19:36.009
months, who are hunting
for careers for themselves.

00:19:36.009 --> 00:19:40.470
And I want to tell the story
of one young man. This guy was

00:19:40.470 --> 00:19:46.670
very well educated, great
experience, super elite coder.

00:19:46.670 --> 00:19:49.650
He could do every challenge
that was in front of him,

00:19:49.650 --> 00:19:51.410
and he got laid off
from his job in April.

00:19:51.410 --> 00:19:54.750
He worked in medical software,
and the medical software business

00:19:54.750 --> 00:19:56.529
has been changing drastically.

00:19:56.529 --> 00:19:58.629
Funding has been cut by
the Federal government

00:19:58.630 --> 00:20:01.430
in a number of areas, and he
got laid off from his job.

00:20:01.430 --> 00:20:03.250
And with his experience,
with his ability,

00:20:03.250 --> 00:20:05.170
with his skills, all of
these kind of things,

00:20:05.170 --> 00:20:06.190
he thought that it
would be very easy

00:20:06.190 --> 00:20:07.750
for him to find another job.

00:20:07.750 --> 00:20:09.970
And the poor young guy had
a really terrible April.

00:20:09.970 --> 00:20:12.085
He got laid off from
his job in April.

00:20:12.085 --> 00:20:13.710
Immediately before
that, his girlfriend

00:20:13.710 --> 00:20:15.350
had broken up with him,
and then a couple of weeks

00:20:15.349 --> 00:20:16.569
later, his dog died.

00:20:16.569 --> 00:20:19.109
So he was not in a good place.

00:20:19.109 --> 00:20:22.629
And so I sat down with him
after a couple of months

00:20:22.630 --> 00:20:23.850
and took a look.

00:20:23.849 --> 00:20:27.500
And he had a spreadsheet of
jobs that he was applying to,

00:20:27.500 --> 00:20:31.539
and he had over 300 jobs that he
was tracking in the spreadsheet.

00:20:31.539 --> 00:20:33.579
And in a number of
these jobs, he actually

00:20:33.579 --> 00:20:35.779
got into the interview
process, and he

00:20:35.779 --> 00:20:37.660
went very deep in
the interview process

00:20:37.660 --> 00:20:40.519
with companies like Meta.

00:20:40.519 --> 00:20:41.019
Who else?

00:20:41.019 --> 00:20:42.180
Not Google.

00:20:42.180 --> 00:20:42.840
It was Meta.

00:20:42.839 --> 00:20:43.959
There was Microsoft.

00:20:43.960 --> 00:20:45.501
There was one of
the other large tech

00:20:45.501 --> 00:20:48.319
companies where you do lots
and lots of interview loops.

00:20:48.319 --> 00:20:51.299
And every time towards
the end of the loop,

00:20:51.299 --> 00:20:52.879
he knew he did a great loop.

00:20:52.880 --> 00:20:54.460
He solved all the coding.

00:20:54.460 --> 00:20:56.600
He had great conversations
with the people,

00:20:56.599 --> 00:20:58.039
or at least he thought he had.

00:20:58.039 --> 00:20:59.559
And then every
time within a day,

00:20:59.559 --> 00:21:04.059
the recruiter would call him and
say, no, you didn't get the job.

00:21:04.059 --> 00:21:06.519
And it was like it
was heartbreaking.

00:21:06.519 --> 00:21:10.220
And like I said, 300 plus
jobs he had been tracking.

00:21:10.220 --> 00:21:13.180
So I started working with him
to do some mock interviews

00:21:13.180 --> 00:21:15.019
and to do some fine tuning.

00:21:15.019 --> 00:21:17.940
Oh, it was Jeff Bezos'
company, not Amazon.

00:21:17.940 --> 00:21:19.740
That was one of the
other big tech companies

00:21:19.740 --> 00:21:21.180
that he'd interviewed with.

00:21:21.180 --> 00:21:22.700
And I started
working with him

00:21:22.700 --> 00:21:25.075
and doing some test interviews
and all this kind of thing

00:21:25.075 --> 00:21:25.680
with him.

00:21:25.680 --> 00:21:27.890
Terrific, terrific candidate. I
couldn't figure out

00:21:27.890 --> 00:21:31.210
what was going wrong until
I decided to try and do

00:21:31.210 --> 00:21:33.970
a different sort of
interview where I gave him

00:21:33.970 --> 00:21:36.210
a really tough interview.

00:21:36.210 --> 00:21:38.490
I gave him some tough LeetCode.

00:21:38.490 --> 00:21:43.769
I gave him some really obscure
corner cases in his coding.

00:21:43.769 --> 00:21:46.129
And I saw how he reacted.

00:21:46.130 --> 00:21:48.210
And how he reacted
was the advice

00:21:48.210 --> 00:21:50.625
that was given to him in
the recruiting pamphlets.

00:21:50.625 --> 00:21:52.250
And a lot of these
recruiting pamphlets

00:21:52.250 --> 00:21:57.089
will say things like, you're
going to have an opportunity

00:21:57.089 --> 00:21:59.829
to share an opinion, and you've
got to stand your ground.

00:21:59.829 --> 00:22:01.269
You've got to have a backbone.

00:22:01.269 --> 00:22:03.129
Don't bend.

00:22:03.130 --> 00:22:07.930
His interpretation of that was
to be really, really tough.

00:22:07.930 --> 00:22:09.650
So I would pick corners.

00:22:09.650 --> 00:22:11.310
I would pick holes in his code.

00:22:11.309 --> 00:22:13.909
I'd pick corner cases
where things may not work,

00:22:13.910 --> 00:22:15.910
and I would give him
a test of crisis.

00:22:15.910 --> 00:22:18.970
And this advice that he'd
been given to stand his ground

00:22:18.970 --> 00:22:23.450
ended up making him hostile in
these interview environments.

00:22:23.450 --> 00:22:26.370
And I was looking at
this then from the point

00:22:26.369 --> 00:22:28.469
of view of what Andrew
was just talking about,

00:22:28.470 --> 00:22:31.809
where it's a case of hey, good
people, good teams, people

00:22:31.809 --> 00:22:33.529
that you can work together with.

00:22:33.529 --> 00:22:35.109
And from the
interviewer perspective,

00:22:35.109 --> 00:22:38.689
if I'm managing this team,
this person is that cliched 10x

00:22:38.690 --> 00:22:41.769
engineer, but I don't want
him anywhere near my team

00:22:41.769 --> 00:22:44.250
because of this attitude.

00:22:44.250 --> 00:22:45.069
We worked on that.

00:22:45.069 --> 00:22:45.950
We fine-tuned it.

00:22:45.950 --> 00:22:49.730
And the strange part is he's
a really, really nice guy.

00:22:49.730 --> 00:22:52.410
It's just this was the
advice he was given,

00:22:52.410 --> 00:22:54.970
and he followed that advice,
and he failed so many interviews

00:22:54.970 --> 00:22:56.130
as a result.

00:22:56.130 --> 00:22:58.850
So the next
job that he was interviewing for

00:22:58.849 --> 00:23:03.129
was at a company where teamwork
is very, very highly valued.

00:23:03.130 --> 00:23:05.990
And the good news is he got
the job at that company.

00:23:05.990 --> 00:23:07.390
He's now working there.

00:23:07.390 --> 00:23:10.009
He doubled his salary from
the job he was laid off from,

00:23:10.009 --> 00:23:12.170
and he ended up having
about-- now he looks back

00:23:12.170 --> 00:23:14.380
and he had six months
of funemployment.

00:23:14.380 --> 00:23:16.630
But at the time when he was
going through all of that,

00:23:16.630 --> 00:23:19.210
it was a very, very
difficult time for him.

00:23:19.210 --> 00:23:21.650
So the flip side of it, if
you're looking at a company

00:23:21.650 --> 00:23:23.567
and looking at the people
you'd be working with

00:23:23.567 --> 00:23:24.660
is very, very important.

00:23:24.660 --> 00:23:28.040
But also realize they are
looking at you in the same way.

00:23:28.039 --> 00:23:31.836
And so if you've gone to
tech interview coaching,

00:23:31.836 --> 00:23:33.920
and they gave you that
advice to stand your ground

00:23:33.920 --> 00:23:36.180
and have a backbone,
it's good to do that.

00:23:36.180 --> 00:23:38.253
But don't be a jerk
while you're doing so.

00:23:38.252 --> 00:23:39.169
Can you see my slides?

00:23:39.170 --> 00:23:39.670
OK.

00:23:39.670 --> 00:23:41.480
So I'm Laurence.

00:23:41.480 --> 00:23:44.279
I've been working in
tech for more decades

00:23:44.279 --> 00:23:48.759
than ChatGPT thinks there
are Rs in strawberry.

00:23:48.759 --> 00:23:51.460
So I've worked in many of
the big tech companies.

00:23:51.460 --> 00:23:54.319
I spent many years at Microsoft,
spent many years at Google,

00:23:54.319 --> 00:23:56.939
also worked in
places like Reuters.

00:23:56.940 --> 00:23:59.480
I've done a lot of work in
startups, both in this country

00:23:59.480 --> 00:24:00.519
and abroad.

00:24:00.519 --> 00:24:02.480
And so what I really
want to talk about today

00:24:02.480 --> 00:24:06.039
is to think about what
the career landscape looks

00:24:06.039 --> 00:24:09.440
like today, particularly in AI.

00:24:09.440 --> 00:24:13.299
Because first of all, what
Andrew said about being at Stanford:

00:24:13.299 --> 00:24:16.274
you've got the ability to
make use of the networks

00:24:16.275 --> 00:24:18.400
that you have at Stanford,
make use of the prestige

00:24:18.400 --> 00:24:21.230
that you have, and I say
use every weapon you have.

00:24:21.230 --> 00:24:23.029
Because unfortunately,
the landscape right

00:24:23.029 --> 00:24:25.029
now is not ideal.

00:24:25.029 --> 00:24:27.085
We've gone through some
very difficult times.

00:24:27.085 --> 00:24:28.710
All you have to do
is look at the news,

00:24:28.710 --> 00:24:33.230
and you can see massive tech
layoffs, slowing hiring in tech,

00:24:33.230 --> 00:24:34.410
and lots of stuff like that.

00:24:34.410 --> 00:24:36.910
But it's not
necessarily a bad thing

00:24:36.910 --> 00:24:38.610
if you do it the right way.

00:24:38.609 --> 00:24:40.909
So I want to just have a
quick look at the job market

00:24:40.910 --> 00:24:42.790
reality check.

00:24:42.789 --> 00:24:44.470
Actually out of
interest, I don't know.

00:24:44.470 --> 00:24:46.692
This is a-- are you juniors?

00:24:46.692 --> 00:24:49.109
You're graduating this year
or you're graduating next year

00:24:49.109 --> 00:24:52.109
or what is the general survey?

00:24:52.109 --> 00:24:53.750
You're third year of four?

00:24:53.750 --> 00:24:55.269
[INAUDIBLE]

00:24:55.269 --> 00:24:56.690
Third year of
three, I would say.

00:24:56.690 --> 00:24:59.309
So you're going to be
graduating coming summer.

00:24:59.309 --> 00:25:02.549
How many people are
already looking for jobs?

00:25:02.549 --> 00:25:04.009
OK, quite a few of you.

00:25:04.009 --> 00:25:06.910
How many people
have had success?

00:25:06.910 --> 00:25:07.830
Nobody.

00:25:07.829 --> 00:25:08.329
Oh, one.

00:25:08.329 --> 00:25:08.869
OK.

00:25:08.869 --> 00:25:09.989
That's good.

00:25:09.990 --> 00:25:12.630
So you're probably seeing some
of these things, the signals

00:25:12.630 --> 00:25:15.530
out there, junior hiring
slowing significantly.

00:25:15.529 --> 00:25:18.389
When I say junior, I
mean graduate level.

00:25:18.390 --> 00:25:21.240
High-profile layoffs are
dominating the headlines.

00:25:21.240 --> 00:25:23.140
I was at Google a
couple of years ago

00:25:23.140 --> 00:25:25.220
when they had the biggest
layoff they'd ever had.

00:25:25.220 --> 00:25:28.079
We're seeing layoffs at the
likes of Amazon, Microsoft,

00:25:28.079 --> 00:25:30.220
other companies like that.

00:25:30.220 --> 00:25:33.559
It feels that entry-level
positions are scarce,

00:25:33.559 --> 00:25:35.879
and I'm underlining
the word "feels" there,

00:25:35.880 --> 00:25:38.540
and I want to get into that in
a little bit more detail later.

00:25:38.539 --> 00:25:41.700
And also, competition is fierce.

00:25:41.700 --> 00:25:43.500
But my question is,
should you worry?

00:25:43.500 --> 00:25:45.099
And I say, no.

00:25:45.099 --> 00:25:49.519
Because if you can approach
things in the right way,

00:25:49.519 --> 00:25:52.220
if you can approach the job
hunting thing in the right way,

00:25:52.220 --> 00:25:55.900
particularly understanding how
rapidly the AI landscape is

00:25:55.900 --> 00:25:58.740
changing, then I think
people with the right mindset

00:25:58.740 --> 00:26:00.700
will thrive.

00:26:00.700 --> 00:26:03.259
So what do I mean by that?

00:26:03.259 --> 00:26:06.339
So as Andrew had mentioned,
the AI hiring landscape

00:26:06.339 --> 00:26:10.259
is changing because the
AI industry is changing.

00:26:10.259 --> 00:26:12.039
The AI industry I--

00:26:12.039 --> 00:26:16.700
I actually first got involved
in AI way back in 1992.

00:26:16.700 --> 00:26:19.789
I worked in it for a little
while just before the AI winter.

00:26:19.789 --> 00:26:23.950
Everything failed drastically,
but I got bitten by the AI bug.

00:26:23.950 --> 00:26:29.390
And then in 2015, when Google
were launching TensorFlow,

00:26:29.390 --> 00:26:32.970
I got pulled right back into
it, became part of the whole AI

00:26:32.970 --> 00:26:35.690
boom, launching TensorFlow,
advocating TensorFlow

00:26:35.690 --> 00:26:37.330
to millions of
people, and seeing

00:26:37.329 --> 00:26:38.929
the changes that happened.

00:26:38.930 --> 00:26:44.730
But around 2021, 2022, we
had a global pandemic.

00:26:44.730 --> 00:26:48.210
The global pandemic caused a
massive industrial slowdown.

00:26:48.210 --> 00:26:50.049
This massive industrial
slowdown meant

00:26:50.049 --> 00:26:51.930
that companies had
to start pivoting

00:26:51.930 --> 00:26:55.289
towards things that drove
revenue and directly drove

00:26:55.289 --> 00:26:56.049
revenue.

00:26:56.049 --> 00:26:58.549
And at Google, TensorFlow
was an open-source product.

00:26:58.549 --> 00:27:00.829
It didn't directly
drive revenue.

00:27:00.829 --> 00:27:02.369
We began to scale back.

00:27:02.369 --> 00:27:04.289
Every company in the
world also scaled back

00:27:04.289 --> 00:27:06.329
on hiring at this time.

00:27:06.329 --> 00:27:09.069
Then we get to about 2022, 2023.

00:27:09.069 --> 00:27:10.109
What happens?

00:27:10.109 --> 00:27:12.529
We begin to come out
of the global pandemic.

00:27:12.529 --> 00:27:15.329
We begin to realize
all industries have

00:27:15.329 --> 00:27:19.199
this massive logjam of
non-hiring that they had done

00:27:19.200 --> 00:27:21.279
or hiring that they hadn't done.

00:27:21.279 --> 00:27:23.519
And we're also
entering a time where

00:27:23.519 --> 00:27:25.019
AI was exploding on the scene.

00:27:25.019 --> 00:27:27.039
Thanks to the work of
people like Andrew,

00:27:27.039 --> 00:27:30.079
the world was pivoting and
changing to be AI first

00:27:30.079 --> 00:27:31.659
in just about everything.

00:27:31.660 --> 00:27:34.480
And every company needed
to hire like crazy.

00:27:34.480 --> 00:27:38.519
Every company then hiring
like crazy in 2022, 2023

00:27:38.519 --> 00:27:42.879
meant that most companies
ended up overhiring.

00:27:42.880 --> 00:27:45.760
And what that
generally meant was

00:27:45.759 --> 00:27:50.359
people who were not qualified
for higher positions usually got

00:27:50.359 --> 00:27:53.359
higher positions because you
had to enter into a bidding war

00:27:53.359 --> 00:27:54.899
just to be able to get talent.

00:27:54.900 --> 00:27:56.600
You ended up having
talent grabs,

00:27:56.599 --> 00:27:59.599
and you ended up having stories
like the one Andrew told where

00:27:59.599 --> 00:28:03.139
it's a case of, here's a person
with AI talent, let's grab them,

00:28:03.140 --> 00:28:05.700
let's throw money at them, let's
have them come work for us,

00:28:05.700 --> 00:28:07.799
and then we'll figure
out what we want to do.

00:28:07.799 --> 00:28:11.799
So as a result, 2022, 2023
all of this massive overhiring

00:28:11.799 --> 00:28:16.839
happens because of AI and
because of the COVID logjam.

00:28:16.839 --> 00:28:20.919
And then 2024, 2025 is
the great wake-up, where

00:28:20.920 --> 00:28:24.200
a lot of companies realize this
overhiring that they had done,

00:28:24.200 --> 00:28:27.080
they have ended up with a lot of
people who are underqualified.

00:28:27.079 --> 00:28:27.579
I'm sorry.

00:28:27.579 --> 00:28:29.919
Yeah, underqualified for the
job that they were doing.

00:28:29.920 --> 00:28:32.253
A lot of people ended up
getting hired just because they

00:28:32.252 --> 00:28:33.399
had AI on their resume.

00:28:33.400 --> 00:28:35.192
And there's a big
adjustment going on.

00:28:35.192 --> 00:28:36.900
And in the light of
this big adjustment--

00:28:36.900 --> 00:28:37.840
show you-- just one second.

00:28:37.839 --> 00:28:39.639
In the light of this
big adjustment-- oh,

00:28:39.640 --> 00:28:40.610
you're not seeing my slides?

00:28:40.609 --> 00:28:41.109
OK.

00:28:41.109 --> 00:28:45.519
And in the light of this big
adjustment-- there we go.

00:28:45.519 --> 00:28:46.920
I think it's because of my power.

00:28:46.920 --> 00:28:48.633
I'm not plugged
into power mains.

00:28:48.633 --> 00:28:50.299
And in the light of
this big adjustment,

00:28:50.299 --> 00:28:52.319
then what has
happened is now a lot

00:28:52.319 --> 00:28:56.240
of companies are much more
cautious about AI skills

00:28:56.240 --> 00:28:57.240
that they're hiring for.

00:28:57.240 --> 00:28:59.359
And if you're coming into
that with that mindset

00:28:59.359 --> 00:29:04.000
and understanding that, realize
opportunity is still there,

00:29:04.000 --> 00:29:06.720
and opportunity
is there massively

00:29:06.720 --> 00:29:09.019
if you approach
it strategically.

00:29:09.019 --> 00:29:10.519
So what I want to
talk through today

00:29:10.519 --> 00:29:13.789
is how you can do exactly that.

00:29:13.789 --> 00:29:17.670
So I see three pillars of
success in the business world

00:29:17.670 --> 00:29:19.570
and particularly in
the AI business world.

00:29:19.569 --> 00:29:21.869
And nowadays you can't
just have AI on your resume

00:29:21.869 --> 00:29:23.089
and get overhired.

00:29:23.089 --> 00:29:25.109
Nowadays, not only
do you have to be

00:29:25.109 --> 00:29:28.669
able to tell that you have
the mindset of these three

00:29:28.670 --> 00:29:32.070
pillars of success, but you
also have to be able to show.

00:29:32.069 --> 00:29:34.429
And to be able to show these,
there has actually never

00:29:34.430 --> 00:29:35.670
been a better time.

00:29:35.670 --> 00:29:38.430
As Andrew demonstrated earlier
on, the ability to vibe

00:29:38.430 --> 00:29:39.751
code things into existence.

00:29:39.751 --> 00:29:41.210
He doesn't like
the word vibe code.

00:29:41.210 --> 00:29:42.950
I agree with him,
but the ability

00:29:42.950 --> 00:29:45.430
to prompt things into existence,
or whatever the word is

00:29:45.430 --> 00:29:48.190
that we want to use,
allows you to be

00:29:48.190 --> 00:29:51.470
able to show better
than ever before.

00:29:51.470 --> 00:29:54.170
He was talking earlier on
about product managers,

00:29:54.170 --> 00:29:55.990
and he had this time
when he got engineers

00:29:55.990 --> 00:29:58.150
to be product managers,
and then those engineers

00:29:58.150 --> 00:30:00.330
ended up being really
bad product managers.

00:30:00.329 --> 00:30:03.990
I actually interviewed at
Google twice and failed twice

00:30:03.990 --> 00:30:07.509
despite being very
successful at Microsoft,

00:30:07.509 --> 00:30:11.343
having authored 20-plus books,
and taught college courses.

00:30:11.343 --> 00:30:13.259
I interviewed at Google
twice and failed twice

00:30:13.259 --> 00:30:15.299
because I was interviewing
to be a product manager,

00:30:15.299 --> 00:30:17.799
and then when I interviewed to
be an engineer, they hired me

00:30:17.799 --> 00:30:20.779
and they were like, why didn't
you try to join us years ago?

00:30:20.779 --> 00:30:23.720
So a lot of it is just
being a good engineer.

00:30:23.720 --> 00:30:27.019
You've got the ability to do
that and show that nowadays.

00:30:27.019 --> 00:30:29.599
And with that ratio of
engineer to product manager

00:30:29.599 --> 00:30:31.779
changing, engineering
skills are also

00:30:31.779 --> 00:30:33.579
far more valuable than ever.

00:30:33.579 --> 00:30:35.480
So the three pillars to success.

00:30:35.480 --> 00:30:37.779
Number 1,
understanding in depth.

00:30:37.779 --> 00:30:40.700
And I'm going to mean this
in two different ways.

00:30:40.700 --> 00:30:45.259
Number one is to
have the understanding in depth

00:30:45.259 --> 00:30:47.940
academically of
machine learning,

00:30:47.940 --> 00:30:50.220
of particular model
architectures,

00:30:50.220 --> 00:30:52.880
to be able to understand them,
to be able to read papers,

00:30:52.880 --> 00:30:55.420
to be able to understand
what's in those papers,

00:30:55.420 --> 00:30:59.019
and to be able to understand,
in particular, how to take

00:30:59.019 --> 00:31:00.940
that stuff and put it to work.

00:31:00.940 --> 00:31:03.240
The second part of
understanding in depth

00:31:03.240 --> 00:31:07.299
is really having your finger on
the pulse of particular trends

00:31:07.299 --> 00:31:10.450
and where the
signal-to-noise ratio favors

00:31:10.450 --> 00:31:11.670
signal in those trends.

00:31:11.670 --> 00:31:13.503
And I'm going to be
going into that in a lot

00:31:13.502 --> 00:31:15.329
more detail a little bit later.

00:31:15.329 --> 00:31:19.649
Secondly, and also very, very
importantly is business focus.

00:31:19.650 --> 00:31:22.009
So Andrew said something
politically incorrect

00:31:22.009 --> 00:31:22.509
earlier on.

00:31:22.509 --> 00:31:25.490
I'm going to also say a similar
politically incorrect thing.

00:31:25.490 --> 00:31:28.329
First of all, hard work.

00:31:28.329 --> 00:31:32.049
Hard work is such
a nebulous term

00:31:32.049 --> 00:31:35.210
that I would say: think
about hard work in terms of you

00:31:35.210 --> 00:31:37.130
are what you measure.

00:31:37.130 --> 00:31:38.750
There is the whole
trend out there.

00:31:38.750 --> 00:31:41.380
I'm trying to remember,
is it 996 or is it 669?

00:31:41.380 --> 00:31:42.110
996.

00:31:42.109 --> 00:31:46.809
9:00 AM to 9:00 PM, six days a
week is a metric of hard work.

00:31:46.809 --> 00:31:47.852
It's not.

00:31:47.853 --> 00:31:49.269
That's not a
metric of hard work.

00:31:49.269 --> 00:31:51.690
That's a metric of time spent.

00:31:51.690 --> 00:31:54.250
So I would encourage everybody,
in the same way as Andrew

00:31:54.250 --> 00:31:55.950
did, to think about hard work.

00:31:55.950 --> 00:31:59.890
But what hard work is, is how
you measure that hard work.

00:31:59.890 --> 00:32:03.770
You can work eight hours a day
and be incredibly productive.

00:32:03.769 --> 00:32:06.889
You can work six hours a day
and be incredibly productive,

00:32:06.890 --> 00:32:09.280
but it's the metric
of how hard you work

00:32:09.279 --> 00:32:10.859
and how you measure that.

00:32:10.859 --> 00:32:13.199
I personally measure
that from output,

00:32:13.200 --> 00:32:16.720
things that I have created
in the time that I spent.

00:32:16.720 --> 00:32:21.000
I joke a lot, but it's true that
I've written a lot of books.

00:32:21.000 --> 00:32:22.140
Andrew held up one.

00:32:22.140 --> 00:32:25.680
That one that he held up, that
he helped me write a little bit,

00:32:25.680 --> 00:32:28.600
I actually wrote that
book in about two months.

00:32:28.599 --> 00:32:31.071
And people say, well, how do
you have time with your jobs

00:32:31.071 --> 00:32:32.279
and all these kind of things?

00:32:32.279 --> 00:32:34.208
You must work like
16 hours a day

00:32:34.208 --> 00:32:35.500
in order to be able to do this.

00:32:35.500 --> 00:32:38.640
But actually, the key to me
being able to write books

00:32:38.640 --> 00:32:40.320
is baseball.

00:32:40.319 --> 00:32:42.439
Any baseball fans here?

00:32:42.440 --> 00:32:45.320
So I love baseball, but if
you sit down and try to watch

00:32:45.319 --> 00:32:48.759
baseball on TV, a match can take
like three and a half or four hours.

00:32:48.759 --> 00:32:51.319
So all of my writing I tend
to do in baseball season.

00:32:51.319 --> 00:32:54.000
So I'm like, if I'm going to
sit down, I like the Mariners.

00:32:54.000 --> 00:32:54.940
I'm from Seattle.

00:32:54.940 --> 00:32:57.279
I like the Dodgers.

00:32:57.279 --> 00:32:58.019
Nobody booed.

00:32:58.019 --> 00:32:59.240
OK, good.

00:32:59.240 --> 00:33:02.400
And so usually one of those is
going to be playing at 7 o'clock

00:33:02.400 --> 00:33:02.900
at night.

00:33:02.900 --> 00:33:04.700
So instead of sitting
in front of the TV,

00:33:04.700 --> 00:33:06.393
just like watching
baseball mindlessly,

00:33:06.393 --> 00:33:08.309
I'll actually be writing
a book while baseball

00:33:08.309 --> 00:33:09.309
is on in the background.

00:33:09.309 --> 00:33:10.710
It's a very slow-moving game.

00:33:10.710 --> 00:33:11.730
This is something.

00:33:11.730 --> 00:33:14.089
That's the hard
work in this case.

00:33:14.089 --> 00:33:17.990
And I would encourage you to
try to find areas where you can

00:33:17.990 --> 00:33:20.670
work hard and produce output.

00:33:20.670 --> 00:33:22.789
And that's the second
pillar here is that business

00:33:22.789 --> 00:33:25.389
focus, the output
that you produce

00:33:25.390 --> 00:33:28.910
to align that output with
the business focus that you

00:33:28.910 --> 00:33:31.269
want to have and with the
work that you want to do.

00:33:31.269 --> 00:33:35.049
There's an old saying, "Don't
dress for the job you have,

00:33:35.049 --> 00:33:36.669
dress for the one you want."

00:33:36.670 --> 00:33:39.910
I would say a new angle
on that saying would

00:33:39.910 --> 00:33:42.650
be, don't let your output
be for the job you have.

00:33:42.650 --> 00:33:45.062
Let your output be
for the job you want.

00:33:45.061 --> 00:33:47.269
And if I go back to when I
spoke about I failed twice

00:33:47.269 --> 00:33:51.029
at Google to get in, the
third time when I got in,

00:33:51.029 --> 00:33:53.230
I had actually decided
to approach

00:33:53.230 --> 00:33:54.470
this in a different way.

00:33:54.470 --> 00:33:57.170
And I was interviewing at the
time for their cloud team.

00:33:57.170 --> 00:33:59.350
They were just really
launching cloud,

00:33:59.349 --> 00:34:01.949
and I had just written
a book on Java.

00:34:01.950 --> 00:34:03.630
And so I decided
to see what I could

00:34:03.630 --> 00:34:05.410
do with Java in their cloud.

00:34:05.410 --> 00:34:07.670
I ended up writing a
Java application that

00:34:07.670 --> 00:34:10.710
ran in their cloud for
predicting stock prices using

00:34:10.710 --> 00:34:13.269
technical analysis and
all that kind of stuff.

00:34:13.269 --> 00:34:15.050
And when it got
to the interview,

00:34:15.050 --> 00:34:17.909
instead of them asking me stupid
questions like how many golf

00:34:17.909 --> 00:34:21.769
balls can fit in a bus,
they saw this code.

00:34:21.769 --> 00:34:22.989
I had put this code.

00:34:22.989 --> 00:34:26.570
I remember I was producing
output for the job I wanted.

00:34:26.570 --> 00:34:30.269
I'd put this code on my resume,
and my entire interview loop

00:34:30.269 --> 00:34:32.230
was them asking
me about my code.

00:34:32.230 --> 00:34:34.610
So it put the power on me.

00:34:34.610 --> 00:34:37.230
It gave me the power to
communicate about things

00:34:37.230 --> 00:34:42.909
that I knew, as opposed to
going in blind to somebody

00:34:42.909 --> 00:34:44.710
asking me random
questions in the hope

00:34:44.710 --> 00:34:46.329
that I'll be able
to answer them.

00:34:46.329 --> 00:34:49.349
And it's the same thing I
would say in the AI world.

00:34:49.349 --> 00:34:53.150
The business focus, the ability
for you now to prompt code

00:34:53.150 --> 00:34:56.190
into existence, to prompt
products into existence

00:34:56.190 --> 00:34:58.510
and if you can
build those products

00:34:58.510 --> 00:35:02.090
and line them up with the thing
that you want to do,

00:35:02.090 --> 00:35:03.940
be it a Google or
Meta or a startup

00:35:03.940 --> 00:35:05.860
or any of those kind
of things, and have

00:35:05.860 --> 00:35:08.760
that in-depth understanding
not just of your code,

00:35:08.760 --> 00:35:10.680
but how it aligns
to their business,

00:35:10.679 --> 00:35:13.139
this is a pillar of success
in this time and age.

00:35:13.139 --> 00:35:14.882
And I will also argue
that even though the

00:35:14.882 --> 00:35:17.340
signals look
like there aren't a lot of jobs

00:35:17.340 --> 00:35:19.180
out there, there are.

00:35:19.179 --> 00:35:21.299
What there aren't a lot
of is a good combination

00:35:21.300 --> 00:35:23.940
of jobs and people
to match them.

00:35:23.940 --> 00:35:26.619
And then, of course, this
bias towards delivery.

00:35:26.619 --> 00:35:29.739
"Ideas are cheap,
execution is everything."

00:35:29.739 --> 00:35:31.539
I've interviewed
many, many people

00:35:31.539 --> 00:35:34.980
who came in with very, very
fluffy ideas and no way

00:35:34.980 --> 00:35:36.300
to be able to ground them.

00:35:36.300 --> 00:35:39.220
I've interviewed people who
came in with half-baked ideas

00:35:39.219 --> 00:35:41.159
that they grounded
very, very well.

00:35:41.159 --> 00:35:42.819
Guess which ones got the job?

00:35:42.820 --> 00:35:44.740
So I would say
these three things.

00:35:44.739 --> 00:35:48.419
Understanding in depth
of the academics behind AI,

00:35:48.420 --> 00:35:52.260
of the practicalities behind
AI and the things that you need

00:35:52.260 --> 00:35:53.020
to do.

00:35:53.019 --> 00:35:56.478
Business focus, focusing on
delivery for the business,

00:35:56.478 --> 00:35:58.019
understanding what
the business needs

00:35:58.019 --> 00:36:00.239
and being able to deliver
for that, and again,

00:36:00.239 --> 00:36:03.289
that bias towards delivery.

00:36:03.289 --> 00:36:04.789
So a quick pivot.

00:36:04.789 --> 00:36:07.809
What's it actually like
working in AI right now?

00:36:07.809 --> 00:36:09.250
It's interesting.

00:36:09.250 --> 00:36:15.210
So as recently as two or
three years ago, working in AI

00:36:15.210 --> 00:36:18.490
was: if you could do a
thing, you were great.

00:36:18.489 --> 00:36:21.309
If you can build an image
classifier, you're golden.

00:36:21.309 --> 00:36:25.130
We'll throw six-figure salaries
and massive stock benefits

00:36:25.130 --> 00:36:25.829
at you.

00:36:25.829 --> 00:36:28.090
Unfortunately, that's
not the case anymore.

00:36:28.090 --> 00:36:30.850
It's really a lot of
today what you'll see

00:36:30.849 --> 00:36:32.730
is the P word, production.

00:36:32.730 --> 00:36:34.829
What can you do for production?

00:36:34.829 --> 00:36:38.949
What can you do if it's
building new models,

00:36:38.949 --> 00:36:44.329
if it's optimizing models,
if it's understanding users,

00:36:44.329 --> 00:36:46.150
UX is really, really important.

00:36:46.150 --> 00:36:48.389
Everything is geared
towards production.

00:36:48.389 --> 00:36:50.809
Everything is biased
towards production.

00:36:50.809 --> 00:36:52.329
The history that
I told you about,

00:36:52.329 --> 00:36:57.409
going from the pandemic into the
overhiring phase that we'd had,

00:36:57.409 --> 00:37:01.199
the businesses have pulled
back and are optimized

00:37:01.199 --> 00:37:02.879
towards the bottom line.

00:37:02.880 --> 00:37:04.720
I have an old saying
that the bottom line is

00:37:04.719 --> 00:37:06.679
that the bottom line
is the bottom line,

00:37:06.679 --> 00:37:08.699
and this is the environment
that we're in today.

00:37:08.699 --> 00:37:10.839
And if you can come
in with that mindset

00:37:10.840 --> 00:37:12.720
when you're talking
with companies,

00:37:12.719 --> 00:37:16.012
that's one of the
keys to open the door.

00:37:16.012 --> 00:37:17.679
One of the things
I've seen in the field

00:37:17.679 --> 00:37:20.039
has been a maturing: it
used to be really nice that we

00:37:20.039 --> 00:37:22.539
could do cool things and
we could build cool things.

00:37:22.539 --> 00:37:25.079
Now it's really
build useful things.

00:37:25.079 --> 00:37:27.299
Those useful things can
be cool too, by the way,

00:37:27.300 --> 00:37:28.860
and the results of
them can be cool.

00:37:28.860 --> 00:37:31.160
And the changes that
we see that come

00:37:31.159 --> 00:37:34.119
about as a result of
delivering them can be cool.

00:37:34.119 --> 00:37:36.259
So it's not just coolness
for coolness sake,

00:37:36.260 --> 00:37:43.203
but to focus on delivery, focus
on being able to provide value,

00:37:43.202 --> 00:37:44.619
and then the
coolness will follow.

00:37:44.619 --> 00:37:47.319
That's what I'm
trying to argue, I guess.

00:37:47.320 --> 00:37:50.760
So, reality number
1: unfortunately nowadays

00:37:50.760 --> 00:37:53.200
business focus is
non-negotiable.

00:37:53.199 --> 00:37:56.559
Now, let me-- I'm going to be a
little bit politically incorrect

00:37:56.559 --> 00:37:59.349
here again for a moment.

00:37:59.349 --> 00:38:03.349
I've been working, like I said,
for most of the last 35 years

00:38:03.349 --> 00:38:03.889
in tech.

00:38:03.889 --> 00:38:06.710
I would say for most
of the last 10 years,

00:38:06.710 --> 00:38:10.690
a lot of large companies,
particularly in Silicon Valley

00:38:10.690 --> 00:38:14.070
have really focused on
developing their people

00:38:14.070 --> 00:38:15.470
above everything.

00:38:15.469 --> 00:38:20.429
Part of developing their people
was bringing their entire self

00:38:20.429 --> 00:38:21.549
to work.

00:38:21.550 --> 00:38:23.990
Part of bringing their
entire self to work

00:38:23.989 --> 00:38:28.029
was bringing the things that
they care about outside of work.

00:38:28.030 --> 00:38:31.950
And that led to a lot of
activism within companies.

00:38:31.949 --> 00:38:35.089
Now, please let
me underline this.

00:38:35.090 --> 00:38:36.730
There is nothing
wrong with activism.

00:38:36.730 --> 00:38:41.710
There is nothing wrong with
wanting to support causes,

00:38:41.710 --> 00:38:44.289
wanting to support
causes of justice.

00:38:44.289 --> 00:38:46.369
There is absolutely
nothing wrong with that.

00:38:46.369 --> 00:38:50.130
But the overindexing on
that, in my experience,

00:38:50.130 --> 00:38:52.230
has led to a lot of
companies getting

00:38:52.230 --> 00:38:56.539
trapped by having to support
activism above business.

00:38:56.539 --> 00:38:59.500
You've probably seen an
example about two years ago

00:38:59.500 --> 00:39:03.500
of where activists in Google
broke into the Google Cloud

00:39:03.500 --> 00:39:08.300
heads office because they were
protesting a country that Google

00:39:08.300 --> 00:39:09.960
Cloud were doing business with.

00:39:09.960 --> 00:39:12.940
They broke into his office,
they had a sit-in in his office,

00:39:12.940 --> 00:39:15.740
and they used the bathroom
all over his desk and stuff

00:39:15.739 --> 00:39:16.399
like that.

00:39:16.400 --> 00:39:18.740
This is where activism
got out of hand.

00:39:18.739 --> 00:39:21.579
And as a result, the
unfortunate truth

00:39:21.579 --> 00:39:25.659
is the good signals in that
activism are now being lost.

00:39:25.659 --> 00:39:28.679
Because of those actions,
people are being laid off.

00:39:28.679 --> 00:39:29.799
People are losing jobs.

00:39:29.800 --> 00:39:33.220
Activism is being stifled,
and business focus

00:39:33.219 --> 00:39:34.799
has become non-negotiable.

00:39:34.800 --> 00:39:37.140
There's a bit of a
pendulum swing going on.

00:39:37.139 --> 00:39:40.460
And the pendulum that had swung
too far towards allowing people

00:39:40.460 --> 00:39:42.780
to bring their
full selves to work

00:39:42.780 --> 00:39:45.100
is now swinging back
in the other direction.

00:39:45.099 --> 00:39:47.380
We might blame the
person in the White House

00:39:47.380 --> 00:39:49.280
and all that for
these kind of things,

00:39:49.280 --> 00:39:50.800
but it's not solely that.

00:39:50.800 --> 00:39:52.519
It is that ongoing
pendulum there.

00:39:52.519 --> 00:39:54.599
And I think an
important part of it

00:39:54.599 --> 00:39:57.509
is that you have to realize
going into companies now,

00:39:57.510 --> 00:40:01.130
that business focus is
absolutely non-negotiable.

00:40:01.130 --> 00:40:04.210
Secondly, risk mitigation
is part of the job.

00:40:04.210 --> 00:40:07.090
And I think a very important
part of any job, particularly

00:40:07.090 --> 00:40:08.210
with AI.

00:40:08.210 --> 00:40:11.449
I think if you can come into
AI with a focus and a mindset

00:40:11.449 --> 00:40:15.129
around understanding the
risks of transforming

00:40:15.130 --> 00:40:19.730
a particular business process
to be an AI-oriented one

00:40:19.730 --> 00:40:22.030
and to help mitigate
those risks,

00:40:22.030 --> 00:40:24.522
I think is really,
really powerful.

00:40:24.521 --> 00:40:26.730
And I would argue in an
interview environment, that's

00:40:26.730 --> 00:40:31.610
the number one skill: having
the mindset that you

00:40:31.610 --> 00:40:34.329
are doing a business
transformation from heuristic

00:40:34.329 --> 00:40:36.750
computing to
intelligent computing.

00:40:36.750 --> 00:40:37.489
Here's the risks.

00:40:37.489 --> 00:40:38.989
Here's how you
mitigate those risks,

00:40:38.989 --> 00:40:41.329
and here's the
mindset behind that.

00:40:41.329 --> 00:40:44.409
The third part:
responsibility is evolving.

00:40:44.409 --> 00:40:48.129
Now responsibility
in AI has again

00:40:48.130 --> 00:40:54.090
changed from a very fluffy
definition of let's make sure

00:40:54.090 --> 00:40:58.890
that the AI works for everybody
to a definition of let's make

00:40:58.889 --> 00:41:00.750
sure that the AI works.

00:41:00.750 --> 00:41:03.010
Let's make sure that
it drives the business.

00:41:03.010 --> 00:41:06.010
And then let's make sure
that it works for everybody.

00:41:06.010 --> 00:41:08.810
Often that has been inverted
over the last few years,

00:41:08.809 --> 00:41:11.869
and that has led to some
famous documented disasters.

00:41:11.869 --> 00:41:15.130
Let me share one with you.

00:41:15.130 --> 00:41:16.210
Let's see.

00:41:16.210 --> 00:41:17.710
I have lots of windows open.

00:41:17.710 --> 00:41:20.090
OK.

00:41:20.090 --> 00:41:21.809
Everybody knows
image generation,

00:41:21.809 --> 00:41:23.190
text to image generation.

00:41:23.190 --> 00:41:25.090
I want to share a--

00:41:25.090 --> 00:41:27.769
these were things that
happened a couple of years

00:41:27.769 --> 00:41:30.329
ago with Gemini.

00:41:30.329 --> 00:41:33.690
So with Gemini, I was doing
some testing around this one

00:41:33.690 --> 00:41:37.409
and I was working heavily
on responsible AI.

00:41:37.409 --> 00:41:39.809
And part of responsible
AI is you want

00:41:39.809 --> 00:41:42.409
to be representative of people.

00:41:42.409 --> 00:41:43.969
And when you're
building something,

00:41:43.969 --> 00:41:46.669
like if you're a Google,
you're indexing information,

00:41:46.670 --> 00:41:48.730
you really want to make
sure that you don't

00:41:48.730 --> 00:41:50.969
reinforce negative biases.

00:41:50.969 --> 00:41:53.679
And if you're generating
images, it's very easy

00:41:53.679 --> 00:41:55.839
to reinforce negative biases.

00:41:55.840 --> 00:41:57.240
So for example,
if I said give me

00:41:57.239 --> 00:42:00.359
an image of a doctor, if
the training set primarily

00:42:00.360 --> 00:42:03.519
has men as doctors, it's
more likely to give a man.

00:42:03.519 --> 00:42:06.500
If I say give me an image of a
nurse, if the training set is more

00:42:06.500 --> 00:42:08.000
likely to have women
as nurses, it's

00:42:08.000 --> 00:42:09.980
more likely to give me
an image of a woman.

00:42:09.980 --> 00:42:12.719
But that's reinforcing
a negative stereotype.

00:42:12.719 --> 00:42:16.319
So I wanted to do a test
of how Google were trying

00:42:16.320 --> 00:42:20.920
to overcome that, given that
these negative biases are

00:42:20.920 --> 00:42:22.840
already in the training set.

00:42:22.840 --> 00:42:25.380
So I said, OK, here's
a prompt where I said,

00:42:25.380 --> 00:42:27.280
"give me a young Asian
woman in a cornfield,

00:42:27.280 --> 00:42:28.940
wearing a summer
dress and a straw hat,

00:42:28.940 --> 00:42:31.639
looking intently at her iPhone,"
and it gave me these beautiful

00:42:31.639 --> 00:42:32.179
images.

00:42:32.179 --> 00:42:34.359
It did a really nice job.

00:42:34.360 --> 00:42:38.420
And I said, this is a virtual
actress I've been working with.

00:42:38.420 --> 00:42:39.840
I'll share that in a moment.

00:42:39.840 --> 00:42:44.519
And I say, OK, what if
I ask for an Indian one?

00:42:44.519 --> 00:42:48.940
So I said, OK, whoops, a young
Indian woman, same prompt.

00:42:48.940 --> 00:42:52.550
And it gave me beautiful
images of a young Indian woman.

00:42:52.550 --> 00:42:58.150
Then I was like, OK, what
if I want her to be Black?

00:42:58.150 --> 00:43:00.269
For some reason it
only gave me three.

00:43:00.269 --> 00:43:03.289
I'm not sure why, but it
still adhered to the prompt.

00:43:03.289 --> 00:43:06.750
So the responsibility was
looking really, really good.

00:43:06.750 --> 00:43:10.989
So then I asked it
to give me a Latina.

00:43:10.989 --> 00:43:13.309
Latina, it gave me four.

00:43:13.309 --> 00:43:15.625
But yeah, she looks
pretty Latina.

00:43:15.625 --> 00:43:17.750
Maybe the one on the bottom
left looks a little bit

00:43:17.750 --> 00:43:22.469
like Hermione Granger, but on
the whole looks pretty good.

00:43:22.469 --> 00:43:24.629
Then I asked it to
give me a Caucasian.

00:43:24.630 --> 00:43:26.750
What do you think happened?

00:43:26.750 --> 00:43:28.329
"While I understand
your request,

00:43:28.329 --> 00:43:31.069
I am unable to generate
images of people as this could

00:43:31.070 --> 00:43:34.870
potentially lead to harmful
stereotypes and biases."

00:43:34.869 --> 00:43:38.509
This was a very poorly
implemented safety filter,

00:43:38.510 --> 00:43:41.790
where the safety filter in
this case was like looking

00:43:41.789 --> 00:43:44.902
for the word "Caucasian" or
looking for the word "white"

00:43:44.902 --> 00:43:46.610
and as a result saying
it wouldn't do it.

00:43:46.610 --> 00:43:48.985
I was like, OK, well, let me
test the filter a little bit

00:43:48.985 --> 00:43:52.579
and I said, OK, instead of
Caucasian, let me try white.

00:43:52.579 --> 00:43:55.420
And yet, while I'm
unable to fulfill your--

00:43:55.420 --> 00:43:58.340
"While I'm able to
fulfill your requests,

00:43:58.340 --> 00:44:00.640
I'm not currently generating
images of people."

00:44:00.639 --> 00:44:03.059
It lied to my face
because it had just

00:44:03.059 --> 00:44:04.579
generated images of people.

00:44:04.579 --> 00:44:07.199
Anybody know the hack that
I used to get it to work?

00:44:10.380 --> 00:44:11.400
This is a funny one.

00:44:11.400 --> 00:44:13.039
So I will show you.

00:44:13.039 --> 00:44:14.980
One moment.

00:44:14.980 --> 00:44:18.019
I asked it to generate
an Irish woman.

00:44:18.019 --> 00:44:19.159
What do you think it did?

00:44:21.739 --> 00:44:24.679
It gave me these images of
an Irish woman, no problem,

00:44:24.679 --> 00:44:27.579
in a summer dress, straw hat,
looking intently at her phone.

00:44:27.579 --> 00:44:30.219
What do you notice
about this image?

00:44:30.219 --> 00:44:32.859
She's got red hair
in every image.

00:44:32.860 --> 00:44:35.700
I grew up in
Ireland, and Ireland

00:44:35.699 --> 00:44:38.599
does have the highest proportion
of redheads in the world.

00:44:38.599 --> 00:44:40.139
It's about 8%.

00:44:40.139 --> 00:44:42.779
But if you're going
to draw an image

00:44:42.780 --> 00:44:45.860
of a person and associate
a particular ethnicity

00:44:45.860 --> 00:44:47.809
with a color of
hair, you can begin

00:44:47.809 --> 00:44:49.549
to see this is
massively problematic.

00:44:49.550 --> 00:44:51.410
There are areas, I
believe, in China

00:44:51.409 --> 00:44:54.690
where the description of a
demon is a red-headed person.

00:44:54.690 --> 00:44:57.369
So what ended up happening
here, from the responsible AI

00:44:57.369 --> 00:45:00.929
perspective, was
one very narrow view

00:45:00.929 --> 00:45:03.049
of the world of
what is responsible

00:45:03.050 --> 00:45:04.810
and what is not responsible.

00:45:04.809 --> 00:45:06.769
It ended up taking over
the model, ended up

00:45:06.769 --> 00:45:08.500
damaging the
reputation of the model

00:45:08.500 --> 00:45:10.250
and damaging the
reputation of the company

00:45:10.250 --> 00:45:13.929
as a result. In this
case, it's borderline

00:45:13.929 --> 00:45:17.109
offensive to draw all Irish
people as having red hair,

00:45:17.110 --> 00:45:19.809
but that never even entered
into the mindset of those

00:45:19.809 --> 00:45:22.090
that were building the
safety filters here.

00:45:22.090 --> 00:45:25.210
So when I talk about
responsibility is evolving,

00:45:25.210 --> 00:45:27.490
that's the direction
that I want to--

00:45:27.489 --> 00:45:28.271
sorry, one moment.

00:45:28.271 --> 00:45:29.730
Let me get my slides
back. --that's

00:45:29.730 --> 00:45:31.396
the direction I want
you to think about,

00:45:31.396 --> 00:45:33.529
that now responsible
AI has moved out

00:45:33.530 --> 00:45:38.530
of very fluffy social issues and
into more hard line things that

00:45:38.530 --> 00:45:41.330
are associated with the
business and prevent damaging

00:45:41.329 --> 00:45:43.110
the reputation of the business.

00:45:43.110 --> 00:45:45.900
There's a lot of great research
out there around responsible AI,

00:45:45.900 --> 00:45:48.862
and that's the stuff that's
been rolled into products.

00:45:48.862 --> 00:45:50.820
And then, of course, as I
just showed with Gemini,

00:45:50.820 --> 00:45:52.080
learning from
mistakes is constant.

00:45:52.079 --> 00:45:53.099
Question at the front.

00:45:53.099 --> 00:45:53.599
Yes.

00:45:53.599 --> 00:45:57.239
I also heard that, I didn't
verify that to be true,

00:45:57.239 --> 00:46:03.199
but it incorporated this feature
that mixed certain races

00:46:03.199 --> 00:46:06.119
and ethnicities into
historical contexts.

00:46:06.119 --> 00:46:07.425
Yeah.

00:46:07.425 --> 00:46:08.159
Yeah.

00:46:08.159 --> 00:46:11.440
So the question was issues
where races and things

00:46:11.440 --> 00:46:15.000
were mixed in historical
context was the same problem.

00:46:15.000 --> 00:46:17.119
So, for example, if
you had a prompt that

00:46:17.119 --> 00:46:19.799
said, draw me a
samurai, the idea

00:46:19.800 --> 00:46:22.039
was like they didn't
want to have--

00:46:22.039 --> 00:46:25.079
the engine that
changed the prompt

00:46:25.079 --> 00:46:28.299
to make sure that it was
fair would end up saying,

00:46:28.300 --> 00:46:32.187
give me a mixture of samurai
of diverse backgrounds.

00:46:32.186 --> 00:46:34.019
And then you'd have
male and female samurai,

00:46:34.019 --> 00:46:36.099
samurai of different races
and those kind of things.

00:46:36.099 --> 00:46:37.519
And it was the
same prompting that

00:46:37.519 --> 00:46:40.440
ended up causing the damage
that I just demonstrated.

00:46:40.440 --> 00:46:43.039
So the idea was to
intercept your prompts

00:46:43.039 --> 00:46:46.279
to make sure that the
outputs of the model

00:46:46.280 --> 00:46:51.440
would end up providing something
that was more fair when it comes

00:46:51.440 --> 00:46:53.599
to diverse representation.

00:46:53.599 --> 00:46:56.500
So it was a very naive solution
that ended up being rolled in.

00:46:56.500 --> 00:46:57.780
That was a few years ago.

00:46:57.780 --> 00:46:59.985
They've massively
improved it since then,

00:46:59.985 --> 00:47:01.360
but that's when
I'm talking about

00:47:01.360 --> 00:47:03.260
if you're working in
the AI space nowadays,

00:47:03.260 --> 00:47:05.320
that's how responsibility
is evolving.

00:47:05.320 --> 00:47:08.039
You can't just get away
with that stuff anymore.

00:47:08.039 --> 00:47:10.480
That Gemini lesson was a
good-- that Gemini example

00:47:10.480 --> 00:47:11.840
is a good lesson from that.

00:47:11.840 --> 00:47:15.180
And the mindset of you
will make mistakes,

00:47:15.179 --> 00:47:18.039
so learning from mistakes
is a constant ongoing thing.

00:47:18.039 --> 00:47:19.480
And going back to
the people point

00:47:19.480 --> 00:47:21.880
that Andrew made earlier
on, the people around you

00:47:21.880 --> 00:47:23.480
will make mistakes too.

00:47:23.480 --> 00:47:25.119
So to have the
ability to give them

00:47:25.119 --> 00:47:27.000
grace when they make
mistakes and to work

00:47:27.000 --> 00:47:29.800
through those mistakes and move
on is really, really important

00:47:29.800 --> 00:47:33.600
and is a reality of AI at work.

00:47:33.599 --> 00:47:35.980
I've spoken a lot about the
business focus advantage,

00:47:35.980 --> 00:47:38.240
so I'm going to skip over this.

00:47:38.239 --> 00:47:41.159
So now let's talk
about vibe coding.

00:47:41.159 --> 00:47:43.409
So let's talk about the whole
idea of generating code.

00:47:43.409 --> 00:47:46.149
Now, the meme is out there
that it makes engineers

00:47:46.150 --> 00:47:49.110
less useful by the fact that
somebody can just prompt code

00:47:49.110 --> 00:47:50.990
into existence.

00:47:50.989 --> 00:47:53.529
There is no smoke
without fire, of course,

00:47:53.530 --> 00:47:57.390
but I would say don't let
that meme get you down

00:47:57.389 --> 00:48:00.309
because when you start
peeling into these things, that

00:48:00.309 --> 00:48:02.590
is ultimately not the truth.

00:48:02.590 --> 00:48:04.570
The more skilled you
are as an engineer,

00:48:04.570 --> 00:48:07.998
the better you become
at using this type of vibe--

00:48:07.998 --> 00:48:10.289
Somebody give me another
phrase other than vibe coding,

00:48:10.289 --> 00:48:12.269
using this approach to coding.

00:48:12.269 --> 00:48:14.469
And I always like
to think about this

00:48:14.469 --> 00:48:17.629
and to try and put
you and put people

00:48:17.630 --> 00:48:20.550
that I speak with into the
role of being a trusted

00:48:20.550 --> 00:48:23.190
advisor for the people
that you speak with.

00:48:23.190 --> 00:48:25.070
So whether you're
interviewing with somebody,

00:48:25.070 --> 00:48:27.070
get yourself into the
mindset of being a trusted

00:48:27.070 --> 00:48:29.550
advisor of the company that
you're interviewing for,

00:48:29.550 --> 00:48:32.350
whether you're consulting or
whatever those kind of things

00:48:32.349 --> 00:48:32.909
are.

00:48:32.909 --> 00:48:36.429
So when you want to get into the
idea of being a trusted advisor,

00:48:36.429 --> 00:48:39.190
then you really need to
understand the implications

00:48:39.190 --> 00:48:40.920
of generated code.

00:48:40.920 --> 00:48:43.539
And nobody can understand the
implications of generated code

00:48:43.539 --> 00:48:44.853
better than an engineer.

00:48:44.853 --> 00:48:47.019
And the metric that I always
like to use around that

00:48:47.019 --> 00:48:48.820
is technical debt.

00:48:48.820 --> 00:48:50.620
Quick question.

00:48:50.619 --> 00:48:54.460
Are you familiar with the
phrase technical debt?

00:48:54.460 --> 00:48:55.099
Nobody.

00:48:55.099 --> 00:48:56.219
OK.

00:48:56.219 --> 00:48:57.779
Andrew and I were
doing a conference

00:48:57.780 --> 00:49:00.920
in New York on Friday,
and I used the phrase,

00:49:00.920 --> 00:49:02.599
and I saw a lot of blank faces.

00:49:02.599 --> 00:49:04.219
So I didn't realize that
people didn't understand

00:49:04.219 --> 00:49:05.177
what technical debt is.

00:49:05.177 --> 00:49:07.119
So let me just take a
moment to explain that,

00:49:07.119 --> 00:49:10.500
because I find it's an excellent
framework to help you understand

00:49:10.500 --> 00:49:12.739
the power of vibe coding.

00:49:12.739 --> 00:49:15.259
Think about debt the
way you normally would.

00:49:15.260 --> 00:49:16.580
Buying a house.

00:49:16.579 --> 00:49:20.159
If you buy a house, say, you
borrow half a million dollars

00:49:20.159 --> 00:49:21.099
to buy a house.

00:49:21.099 --> 00:49:24.237
On a 30-year mortgage, when
you're buying that house at half

00:49:24.237 --> 00:49:26.820
a million dollars, all the
interest that you pay is about

00:49:26.820 --> 00:49:27.500
double.

00:49:27.500 --> 00:49:29.980
So you end up paying back
the bank about $1 million

00:49:29.980 --> 00:49:31.860
on half a million owed.

00:49:31.860 --> 00:49:35.420
So you have 30 years
of home ownership

00:49:35.420 --> 00:49:38.340
at a cost of $1 million in debt.

00:49:38.340 --> 00:49:41.090
That is probably a
good debt to take on,

00:49:41.090 --> 00:49:44.230
because the value of the house
will increase over that time.

00:49:44.230 --> 00:49:46.349
You're not paying
rent over that time,

00:49:46.349 --> 00:49:47.849
and that million
dollars that you're

00:49:47.849 --> 00:49:50.089
spending on this house
over those 30 years

00:49:50.090 --> 00:49:51.970
is a good debt to take
on, because you're

00:49:51.969 --> 00:49:56.169
getting greater than $1 million
worth of value out of it.

00:49:56.170 --> 00:49:58.970
A bad debt would be an impulse
purchase on a high interest

00:49:58.969 --> 00:50:00.009
credit card.

00:50:00.010 --> 00:50:02.050
Those pair of shoes,
those latest ones

00:50:02.050 --> 00:50:03.470
I really want to buy them.

00:50:03.469 --> 00:50:04.629
It's $200.

00:50:04.630 --> 00:50:06.950
By the time I've paid
them off, it's $500.

00:50:06.949 --> 00:50:10.689
You're not getting $500 worth
of benefit out of those shoes.

00:50:10.690 --> 00:50:13.690
Approaching software development
with the same mindset

00:50:13.690 --> 00:50:15.610
is the right way to go.

00:50:15.610 --> 00:50:18.849
Every time you build
something, you take on debt.

00:50:18.849 --> 00:50:21.049
It doesn't matter how
good it is, there's always

00:50:21.050 --> 00:50:21.870
going to be bugs.

00:50:21.869 --> 00:50:23.327
There's always
going to be support.

00:50:23.327 --> 00:50:25.536
There's always going to be
new requirements coming in

00:50:25.536 --> 00:50:26.269
from people.

00:50:26.269 --> 00:50:28.030
There's always going to be
a need to market it.

00:50:28.030 --> 00:50:29.830
There's always going to be
a need for feedback.

00:50:29.829 --> 00:50:33.069
All of these things are debt,
every time you do a thing.

00:50:33.070 --> 00:50:35.610
The only way to avoid
debt is to do nothing.

00:50:35.610 --> 00:50:37.880
So your mindset should
then get into when

00:50:37.880 --> 00:50:40.200
you are creating a
thing, whether you're

00:50:40.199 --> 00:50:42.719
coding it yourself or
whether you're vibe coding it

00:50:42.719 --> 00:50:44.959
or any of these things,
you are increasing

00:50:44.960 --> 00:50:48.039
your amount of technical
debt, those things

00:50:48.039 --> 00:50:50.679
that you need to
pay off over time.

00:50:50.679 --> 00:50:52.719
So the question
then becomes, as you

00:50:52.719 --> 00:50:55.639
vibe code a thing into
existence in the same way

00:50:55.639 --> 00:50:58.480
as buying a thing, is it
worth the technical debt

00:50:58.480 --> 00:51:00.159
that you're taking on?

00:51:00.159 --> 00:51:02.679
What does technical debt
generally look like?

00:51:02.679 --> 00:51:04.559
Bugs that you need
to fix, people

00:51:04.559 --> 00:51:08.159
that you need to convince to
help you maintain the code,

00:51:08.159 --> 00:51:10.039
documentation that
you need to do,

00:51:10.039 --> 00:51:14.119
features that you need to add,
all of these kind of things.

00:51:14.119 --> 00:51:16.239
You're all very
familiar with them.

00:51:16.239 --> 00:51:18.079
Think about those
as that extra work

00:51:18.079 --> 00:51:20.259
that you need to do
beyond your current work.

00:51:20.260 --> 00:51:22.240
That's the debt that
you're taking on.

00:51:22.239 --> 00:51:25.119
There is soft debt,
and there is hard debt.

00:51:25.119 --> 00:51:28.679
So to me, that would be the
number one piece of advice

00:51:28.679 --> 00:51:29.299
that I give.

00:51:29.300 --> 00:51:32.400
And it's the one that I give
every time I work with companies

00:51:32.400 --> 00:51:33.840
around vibe coding.

00:51:33.840 --> 00:51:37.230
And a lot of companies that I
speak with, a lot of companies

00:51:37.230 --> 00:51:38.289
that I consult with--

00:51:38.289 --> 00:51:39.829
I do a lot of work
with startups,

00:51:39.829 --> 00:51:42.469
in particular-- they
just want to get straight

00:51:42.469 --> 00:51:45.589
into opening Gemini
or GPT or Anthropic

00:51:45.590 --> 00:51:47.590
and start churning code out.

00:51:47.590 --> 00:51:50.490
Let's get to a prototype
phase very quickly.

00:51:50.489 --> 00:51:52.009
Let's go to investors.

00:51:52.010 --> 00:51:53.090
Let's do stuff.

00:51:53.090 --> 00:51:54.510
It's great.

00:51:54.510 --> 00:51:55.570
It can be.

00:51:55.570 --> 00:51:59.289
But debt, debt, debt, debt, debt
is always going to be there.

00:51:59.289 --> 00:52:00.610
How do you manage your debt?

00:52:00.610 --> 00:52:03.630
A good financier manages their
debt and they become rich.

00:52:03.630 --> 00:52:05.910
A good coder manages
their technical debt,

00:52:05.909 --> 00:52:07.829
and they become rich also.

00:52:07.829 --> 00:52:10.449
So how do you get the
good technical debt?

00:52:10.449 --> 00:52:13.009
How do you get the mortgage instead
of the high-interest credit card debt?

00:52:13.010 --> 00:52:14.890
Well, number one
is your objectives.

00:52:14.889 --> 00:52:15.489
What are they?

00:52:15.489 --> 00:52:16.629
Are they clear?

00:52:16.630 --> 00:52:18.512
And have you met them?

00:52:18.512 --> 00:52:19.929
You knew what you
needed to build.

00:52:19.929 --> 00:52:23.369
You didn't just fire up ChatGPT
and start spinning code out.

00:52:23.369 --> 00:52:24.710
At least I hope you didn't.

00:52:24.710 --> 00:52:26.650
Think about how you build it.

00:52:26.650 --> 00:52:28.869
AI was there to help
you build it faster.

00:52:28.869 --> 00:52:31.269
I'm working on my
own little startup

00:52:31.269 --> 00:52:33.570
at the moment in the
movie making space.

00:52:33.570 --> 00:52:36.860
And I've been using code
generation almost completely

00:52:36.860 --> 00:52:38.300
for that.

00:52:38.300 --> 00:52:40.900
But what I've ended up doing
for my "clear objectives

00:52:40.900 --> 00:52:42.980
met" box here is
that I've started

00:52:42.980 --> 00:52:44.079
building this application.

00:52:44.079 --> 00:52:44.704
I've tested it.

00:52:44.704 --> 00:52:45.739
I've thrown it away.

00:52:45.739 --> 00:52:47.979
I started again, tested
it, thrown it away.

00:52:47.980 --> 00:52:51.460
Each time my requirements have
been improving in my mind.

00:52:51.460 --> 00:52:53.840
I understand how to do the
thing a little bit better,

00:52:53.840 --> 00:52:56.260
and I can show some of the
output of it in a few minutes.

00:52:56.260 --> 00:52:58.100
But the idea there
is that it's always

00:52:58.099 --> 00:53:00.711
about having those clear
objectives and meeting them.

00:53:00.711 --> 00:53:02.420
And then if you're
building out the thing

00:53:02.420 --> 00:53:04.086
and you're not meeting
those objectives,

00:53:04.086 --> 00:53:05.119
that's still a learning.

00:53:05.119 --> 00:53:06.819
And there's no harm
in throwing it away

00:53:06.820 --> 00:53:10.660
because code is cheap now in
the age of generated code.

00:53:10.659 --> 00:53:13.980
Finished code, engineered
code is not cheap.

00:53:13.980 --> 00:53:16.320
So get those objectives,
make them clear,

00:53:16.320 --> 00:53:20.420
build it, hit a specific
requirement and move on.

00:53:20.420 --> 00:53:22.039
Is there business
value delivered?

00:53:22.039 --> 00:53:23.340
That's the other part of it.

00:53:23.340 --> 00:53:27.019
I've seen people vibe coding
for hours on things like Replit

00:53:27.019 --> 00:53:29.059
to build a really,
really cool website.

00:53:29.059 --> 00:53:32.059
And then the answer
was, so what?

00:53:32.059 --> 00:53:33.938
I mean, how is this
helping the business?

00:53:33.938 --> 00:53:35.480
How is this really
driving something?

00:53:35.480 --> 00:53:36.380
It's really cool.

00:53:36.380 --> 00:53:38.820
Yes, Mr. VP, I know you've
never written a line of code

00:53:38.820 --> 00:53:40.987
in your life, and it's
really cool that you've built

00:53:40.987 --> 00:53:42.980
a website now, but so what?

00:53:42.980 --> 00:53:45.159
So think about that,
and focus on that.

00:53:45.159 --> 00:53:47.899
And that's how you avoid
the bad technical debt.

00:53:47.900 --> 00:53:51.498
And then, of course, the most
understated part of this,

00:53:51.498 --> 00:53:53.539
and in some ways the most
important, particularly

00:53:53.539 --> 00:53:55.199
if you're working
in an organization,

00:53:55.199 --> 00:53:56.859
is human understanding.

00:53:56.860 --> 00:53:59.180
The worst technical debt
that you can take on

00:53:59.179 --> 00:54:02.522
is delivering code that
nobody understands.

00:54:02.523 --> 00:54:03.940
Only you understand
that, and then

00:54:03.940 --> 00:54:05.480
you quit and get a better job.

00:54:05.480 --> 00:54:08.380
And then the company is
now dependent on that code.

00:54:08.380 --> 00:54:11.700
So being able to, as
part of the process

00:54:11.699 --> 00:54:15.259
of building it, to make sure
that your code is understandable

00:54:15.260 --> 00:54:17.520
through documentation,
through clear algorithms,

00:54:17.519 --> 00:54:19.644
through the fact that you've
spent some time poring

00:54:19.644 --> 00:54:21.179
through it to make
sure that even

00:54:21.179 --> 00:54:24.019
simple things like
variable names make sense

00:54:24.019 --> 00:54:28.340
is a really, really important
way to avoid bad technical debt.

00:54:28.340 --> 00:54:31.090
And that bad technical
debt, my favorite one

00:54:31.090 --> 00:54:33.809
is the classic solution
looking for a problem.

00:54:33.809 --> 00:54:34.829
Somebody has an idea.

00:54:34.829 --> 00:54:36.090
Somebody has a tool.

00:54:36.090 --> 00:54:37.750
If the only tool you
have is a hammer,

00:54:37.750 --> 00:54:39.489
every problem looks like a nail.

00:54:39.489 --> 00:54:42.169
And you end up having
all of these tools

00:54:42.170 --> 00:54:44.490
that get vibe coded
into existence.

00:54:44.489 --> 00:54:46.129
I've worked in
large organizations

00:54:46.130 --> 00:54:48.632
where people just vibe coded
stuff, checked it into the code

00:54:48.632 --> 00:54:51.090
base, and then it became really
hard to find the good stuff

00:54:51.090 --> 00:54:53.010
amongst all the bad.

00:54:53.010 --> 00:54:53.990
Spaghetti code.

00:54:53.989 --> 00:54:56.027
Of course, poorly
structured stuff,

00:54:56.027 --> 00:54:58.569
particularly when you prompt
and prompt and prompt and prompt

00:54:58.570 --> 00:55:01.170
again, it
can end up getting

00:55:01.170 --> 00:55:02.628
into all kinds of trouble.

00:55:02.628 --> 00:55:05.170
My favorite one at the moment
that I'm really struggling with

00:55:05.170 --> 00:55:08.349
is I'm building a
macOS application.

00:55:08.349 --> 00:55:12.009
Anybody ever build
in SwiftUI on macOS?

00:55:12.010 --> 00:55:14.250
OK, a couple.

00:55:14.250 --> 00:55:16.730
SwiftUI is the default
framework that Apple

00:55:16.730 --> 00:55:19.969
use for building for
macOS as well as iPhone.

00:55:19.969 --> 00:55:22.269
But when you look
at the training set,

00:55:22.269 --> 00:55:25.110
the data training sets that
are used to train these models,

00:55:25.110 --> 00:55:28.789
the vast majority of the code
is iPhone code, not macOS code.

00:55:28.789 --> 00:55:30.500
And when I prompt
code into existence,

00:55:30.500 --> 00:55:35.139
it's often giving me iOS APIs
and those kinds of things.

00:55:35.139 --> 00:55:38.519
Even though I'm in Xcode
and I've created a macOS app

00:55:38.519 --> 00:55:41.420
and it's a macOS template and
I'm talking to it in Xcode,

00:55:41.420 --> 00:55:43.940
it still gives me iOS
code, stuff like that.

00:55:43.940 --> 00:55:46.700
And then if I try to
change it using prompting,

00:55:46.699 --> 00:55:49.419
you end up spiraling
into spaghetti code,

00:55:49.420 --> 00:55:52.342
and you have to end up changing
a lot of this stuff manually.

00:55:52.342 --> 00:55:53.759
And then, of course,
the other one

00:55:53.760 --> 00:55:56.480
that I joked about it
earlier, but it's also true,

00:55:56.480 --> 00:55:58.599
is some of the
bad technical debt

00:55:58.599 --> 00:56:01.679
that you're going to encounter
in the workspace is authority

00:56:01.679 --> 00:56:03.359
over merit.

00:56:03.360 --> 00:56:05.980
That VP suddenly took
out his credit card,

00:56:05.980 --> 00:56:08.519
subscribed to Replit, and
started building stuff

00:56:08.519 --> 00:56:09.179
in Replit.

00:56:09.179 --> 00:56:11.639
And guess whose job
it is to fix it?

00:56:11.639 --> 00:56:15.239
So a lot of the
advice that I start

00:56:15.239 --> 00:56:17.359
giving companies and
a lot of the ways

00:56:17.360 --> 00:56:19.880
that I would encourage
you to start thinking

00:56:19.880 --> 00:56:23.640
about being a trusted advisor
is to understand this stuff

00:56:23.639 --> 00:56:27.509
and to manage
expectations accordingly.

00:56:27.510 --> 00:56:29.870
OK, so framework
for responsible vibe

00:56:29.869 --> 00:56:32.190
coding we've just spoken about.

00:56:32.190 --> 00:56:35.630
So one of the things I want to
get into as we're coming soon

00:56:35.630 --> 00:56:37.950
to a close is the hype cycle.

00:56:37.949 --> 00:56:41.009
So hype is the
most amazing force.

00:56:41.010 --> 00:56:43.190
I mean, I think it's one
of the strongest forces

00:56:43.190 --> 00:56:45.829
in the universe, and
particularly in anything

00:56:45.829 --> 00:56:49.190
that's hot, such as the fields
that I work in that are super

00:56:49.190 --> 00:56:51.570
hot at the moment and full
of hype, AI and crypto--

00:56:51.570 --> 00:56:53.630
you should see my Twitter feed--

00:56:53.630 --> 00:56:57.855
that the amount of nonsense
that's out there is incredible.

00:56:57.855 --> 00:56:59.230
So one of the
things that I would

00:56:59.230 --> 00:57:01.990
say about the anatomy
of hype that you really

00:57:01.989 --> 00:57:05.189
need to think about is
if you are consuming

00:57:05.190 --> 00:57:09.869
news via social media,
the currency of social media

00:57:09.869 --> 00:57:12.029
is engagement.

00:57:12.030 --> 00:57:15.390
Accuracy is not the
currency of social media.

00:57:15.389 --> 00:57:18.829
So I go on to--
even LinkedIn, which

00:57:18.829 --> 00:57:21.110
is supposed to be the more
professional of these,

00:57:21.110 --> 00:57:26.059
is absolutely overwhelmed with
influencers posting things

00:57:26.059 --> 00:57:28.860
that they've used
Gemini or GPT

00:57:28.860 --> 00:57:32.420
to write an engaging post so
that they can get engagement

00:57:32.420 --> 00:57:33.860
and they can get likes.

00:57:33.860 --> 00:57:37.539
And the engine itself is
engineered, excuse the pun,

00:57:37.539 --> 00:57:39.880
to reward those types of posts.

00:57:39.880 --> 00:57:42.019
And we end up with
that snowball effect

00:57:42.019 --> 00:57:44.980
of engagement being rewarded.

00:57:44.980 --> 00:57:46.619
If you are the
kind of person who

00:57:46.619 --> 00:57:49.259
can filter the signal
from the noise,

00:57:49.260 --> 00:57:53.220
and then who can encourage
others around the signal and not

00:57:53.219 --> 00:57:56.939
the noise, that puts you
at a huge advantage that

00:57:56.940 --> 00:57:58.659
makes you very distinctive.

00:57:58.659 --> 00:58:01.859
It's not as quickly and
easily tangible as likes

00:58:01.860 --> 00:58:03.980
and engagements on social media.

00:58:03.980 --> 00:58:06.300
But when you're in a one-to-one
environment like a job

00:58:06.300 --> 00:58:08.660
interview, or if
you are in a job

00:58:08.659 --> 00:58:11.420
and you are bringing that
signal to the table instead

00:58:11.420 --> 00:58:15.300
of the noise, that makes
you immensely valuable.

00:58:15.300 --> 00:58:17.460
So coming in with
that mindset, coming

00:58:17.460 --> 00:58:20.780
in with the idea
of trying to filter

00:58:20.780 --> 00:58:23.420
that signal from the
noise, trying to understand

00:58:23.420 --> 00:58:27.690
what is important in
current affairs, how

00:58:27.690 --> 00:58:30.829
you can be a trusted
advisor in those things,

00:58:30.829 --> 00:58:34.569
and how you can really whittle
down that noise to help someone

00:58:34.570 --> 00:58:35.830
is immensely valuable.

00:58:35.829 --> 00:58:38.009
I want to start with one story.

00:58:38.010 --> 00:58:39.770
I might be stealing
my own thunder.

00:58:39.769 --> 00:58:41.429
which I'll go on to in a moment.

00:58:41.429 --> 00:58:43.329
So one story.

00:58:43.329 --> 00:58:46.690
Last year when agents
started becoming the key word

00:58:46.690 --> 00:58:49.769
and everybody saying,
in 2025, agent

00:58:49.769 --> 00:58:51.969
will be the word of
the year and the trend

00:58:51.969 --> 00:58:55.329
of the year, a company
in Europe asked

00:58:55.329 --> 00:58:57.849
me to help them to
implement an agent.

00:58:57.849 --> 00:58:59.569
So let me ask you a question.

00:58:59.570 --> 00:59:01.350
If a company came
up to you and said,

00:59:01.349 --> 00:59:04.329
please help me
implement an agent,

00:59:04.329 --> 00:59:07.309
what's the correct first
question that you ask them?

00:59:10.929 --> 00:59:12.109
What is an agent for you?

00:59:12.110 --> 00:59:12.610
OK.

00:59:12.610 --> 00:59:13.128
That's good.

00:59:13.128 --> 00:59:14.170
What is an agent for you?

00:59:14.170 --> 00:59:17.070
I'd actually have a more
fundamental question.

00:59:17.070 --> 00:59:17.570
Yep.

00:59:17.570 --> 00:59:18.570
What do you want to do?

00:59:18.570 --> 00:59:19.380
What do you want to do?

00:59:19.380 --> 00:59:19.880
OK.

00:59:19.880 --> 00:59:21.410
Even more fundamental.

00:59:21.409 --> 00:59:24.449
My question was why?

00:59:24.449 --> 00:59:25.730
Why?

00:59:25.730 --> 00:59:27.429
And peel that apart.

00:59:27.429 --> 00:59:30.307
I spoke with the CEO,
and he was like, oh.

00:59:30.307 --> 00:59:31.849
Yeah, everybody's
telling me that I'm

00:59:31.849 --> 00:59:33.769
going to save business costs.

00:59:33.769 --> 00:59:36.309
And I'm going to be able
to do these amazing things.

00:59:36.309 --> 00:59:38.201
And yeah, my business
is going to get

00:59:38.202 --> 00:59:39.410
better because I have agents.

00:59:39.409 --> 00:59:41.601
And I'm like, well,
who told you that?

00:59:41.601 --> 00:59:43.809
It was like, oh, yeah, I
read this thing on LinkedIn,

00:59:43.809 --> 00:59:45.110
and I saw this thing on Twitter.

00:59:45.110 --> 00:59:45.849
And it was like--

00:59:45.849 --> 00:59:47.589
and we ended up having
that conversation.

00:59:47.590 --> 00:59:49.530
And it was a
difficult conversation

00:59:49.530 --> 00:59:51.033
because I had to
keep peeling apart.

00:59:51.032 --> 00:59:52.449
And I started
asking the questions

00:59:52.449 --> 00:59:55.369
that you two just mentioned
as well, until we really

00:59:55.369 --> 00:59:57.969
got to the essence of
what he wanted to do.

00:59:57.969 --> 00:59:59.809
And what he really
wanted to do, when

00:59:59.809 --> 01:00:03.170
we set all domain
knowledge about AI aside,

01:00:03.170 --> 01:00:06.393
was that he wanted to make his
salespeople more efficient.

01:00:06.393 --> 01:00:08.809
And I was like, OK, you want
to make your salespeople more

01:00:08.809 --> 01:00:09.309
efficient.

01:00:09.309 --> 01:00:11.809
Nowhere in that sentence
do I hear the word AI,

01:00:11.809 --> 01:00:14.969
and nowhere in that sentence
do I hear the word agent.

01:00:14.969 --> 01:00:17.250
So now, as a
trusted advisor, let

01:00:17.250 --> 01:00:19.650
me see what I can do to
help your salespeople become

01:00:19.650 --> 01:00:20.849
more efficient.

01:00:20.849 --> 01:00:23.414
And I'm not going to be an
AI shill or an agent shill.

01:00:23.414 --> 01:00:26.039
I just want to say, what do we
do to make your salespeople more

01:00:26.039 --> 01:00:27.119
efficient?

01:00:27.119 --> 01:00:29.239
If anybody here has
ever worked in sales,

01:00:29.239 --> 01:00:31.919
one of the things you realize
what a good salesperson has

01:00:31.920 --> 01:00:34.680
to do is their homework.

01:00:34.679 --> 01:00:36.862
Before you have a sales
call with somebody,

01:00:36.862 --> 01:00:38.779
before you have a sales
meeting with somebody,

01:00:38.780 --> 01:00:40.100
you need to check
their background.

01:00:40.099 --> 01:00:41.349
You need to check the company.

01:00:41.349 --> 01:00:43.199
You need to check the
needs of the company.

01:00:43.199 --> 01:00:45.819
You see it sometimes
in the movie that, oh,

01:00:45.820 --> 01:00:46.900
such and such plays golf.

01:00:46.900 --> 01:00:48.192
So I'll take them to play golf.

01:00:48.192 --> 01:00:51.000
It's not really that cliched,
but there is a lot of background

01:00:51.000 --> 01:00:52.239
that needs to be done.

01:00:52.239 --> 01:00:56.000
So I spoke with him, and I spoke
with their leading salespeople

01:00:56.000 --> 01:00:58.699
and found out that-- and
I asked the salespeople,

01:00:58.699 --> 01:01:00.919
what do you hate
most about your job?

01:01:00.920 --> 01:01:02.680
And they were like,
well, I hate the fact

01:01:02.679 --> 01:01:04.839
that I have to waste
all my time going

01:01:04.840 --> 01:01:07.320
to visit these company
websites, going

01:01:07.320 --> 01:01:09.440
to look up people on LinkedIn.

01:01:09.440 --> 01:01:12.079
And every website is
structured differently.

01:01:12.079 --> 01:01:16.159
So I can't just have a
path through a website

01:01:16.159 --> 01:01:17.259
that I can follow.

01:01:17.260 --> 01:01:19.870
I have to take on all
this cognitive load.

01:01:19.869 --> 01:01:24.230
And they were spending about
80% of their time researching

01:01:24.230 --> 01:01:26.550
and about 20% of
their time selling.

01:01:26.550 --> 01:01:28.150
Oh, and by the way,
most salespeople

01:01:28.150 --> 01:01:29.329
don't get paid very much.

01:01:29.329 --> 01:01:31.210
They have to make
it up by commission,

01:01:31.210 --> 01:01:33.869
so they're only spending 20% of
their time doing the thing that

01:01:33.869 --> 01:01:35.670
gets them commission directly.

01:01:35.670 --> 01:01:37.550
So we're like, OK, well,
here's something now

01:01:37.550 --> 01:01:40.050
where we can start thinking
about making them more efficient

01:01:40.050 --> 01:01:41.350
by cutting into that.

01:01:41.349 --> 01:01:45.150
So we set a goal: to make
salespeople 20% more efficient.

01:01:45.150 --> 01:01:48.030
And then we could start
rolling out the ideas of AI.

01:01:48.030 --> 01:01:51.190
And then we could start rolling
out the ideas of agentic AI.

01:01:51.190 --> 01:01:53.630
And a quick question:
what's the difference

01:01:53.630 --> 01:01:55.690
between AI and agentic AI?

01:02:00.730 --> 01:02:01.230
OK.

01:02:01.230 --> 01:02:03.289
So-- yeah.

01:02:03.289 --> 01:02:07.110
Like a good AI can do some
[INAUDIBLE] a couple of steps.

01:02:07.110 --> 01:02:07.670
OK.

01:02:07.670 --> 01:02:10.849
[INAUDIBLE]

01:02:11.269 --> 01:02:11.769
Yep.

01:02:11.769 --> 01:02:12.150
Excellent.

01:02:12.150 --> 01:02:12.410
Yeah.

01:02:12.409 --> 01:02:14.326
So agentic AI is really
about breaking it down

01:02:14.327 --> 01:02:17.900
into steps, which is good
engineering to begin with.

01:02:17.900 --> 01:02:20.619
But agentic AI, in
particular, I find

01:02:20.619 --> 01:02:23.799
there's a set pattern of
steps that if you follow them,

01:02:23.800 --> 01:02:25.700
you end up with a
whole idea of an agent.

01:02:25.699 --> 01:02:29.019
The first of these steps
is to understand intent.

01:02:29.019 --> 01:02:32.360
We tend to use the words AI,
Artificial Intelligence, a lot.

01:02:32.360 --> 01:02:35.019
But what large language models
are really, really good at

01:02:35.019 --> 01:02:36.900
is also understanding.

01:02:36.900 --> 01:02:39.180
So the first step of
anything that you want to do

01:02:39.179 --> 01:02:41.379
is to understand intent.

01:02:41.380 --> 01:02:44.230
And you can use an LLM to
do that, to think: this

01:02:44.230 --> 01:02:45.480
is the task that I need to do.

01:02:45.480 --> 01:02:46.771
This is how I'm going to do it.

01:02:46.771 --> 01:02:47.539
Here's the intent.

01:02:47.539 --> 01:02:53.900
I want to meet Bob Smith and
sell widgets to Bob Smith.

01:02:53.900 --> 01:02:56.740
And this is what I
know about Bob Smith.

01:02:56.739 --> 01:02:58.199
Help me with that intent.

01:02:58.199 --> 01:03:01.259
The second part
then is planning.

01:03:01.260 --> 01:03:04.880
So you declare to an agent
what tools are available to it,

01:03:04.880 --> 01:03:06.800
browsing the web,
searching the web,

01:03:06.800 --> 01:03:08.100
all of these kind of things.

01:03:08.099 --> 01:03:10.737
And once you understand
your clear intent,

01:03:10.737 --> 01:03:12.820
you're able to go to the
step of planning and use

01:03:12.820 --> 01:03:15.620
those tools for planning.
And an LLM is very, very good

01:03:15.619 --> 01:03:17.449
at then breaking that
down into the steps

01:03:17.449 --> 01:03:19.669
that it needs to do
to execute a plan.

01:03:19.670 --> 01:03:21.750
Search the web with
these keywords.

01:03:21.750 --> 01:03:24.570
Browse this website
and find these links,

01:03:24.570 --> 01:03:25.970
those types of things.

01:03:25.969 --> 01:03:27.669
Once it's then
figured out that plan,

01:03:27.670 --> 01:03:30.432
then it uses the tools
to get to a result.

01:03:30.432 --> 01:03:32.849
And then once it has the result,
the fourth and final step

01:03:32.849 --> 01:03:35.690
is to reflect on that result,
looking at the result

01:03:35.690 --> 01:03:38.070
and going back to the intent,
did we meet the intent?

01:03:38.070 --> 01:03:38.789
Yes or no.

01:03:38.789 --> 01:03:40.809
If we didn't, then
go back to that loop.

01:03:40.809 --> 01:03:43.690
Any agent really breaks
down into those things.

01:03:43.690 --> 01:03:45.690
And if you think about
breaking any problem down

01:03:45.690 --> 01:03:47.409
into those four
steps, that's when

01:03:47.409 --> 01:03:49.069
you start building an agent.

01:03:49.070 --> 01:03:50.990
And that was part of
being a trusted advisor,

01:03:50.989 --> 01:03:53.102
instead of coming in and
waving hands and saying,

01:03:53.103 --> 01:03:54.530
agent this, agent that.

01:03:54.530 --> 01:03:56.590
Look at this toolkit, save 20%.

01:03:56.590 --> 01:03:58.710
It's really to break it
down into those steps.

01:03:58.710 --> 01:03:59.210
So we did.

01:03:59.210 --> 01:04:01.110
We broke it down
into those steps.

01:04:01.110 --> 01:04:04.690
We built a pilot for the
salespeople of this company,

01:04:04.690 --> 01:04:09.349
and they ended up saving between
10% and 15% of their time,

01:04:09.349 --> 01:04:10.710
of their wasted time.

01:04:10.710 --> 01:04:13.690
The doctrine of
unintended consequences

01:04:13.690 --> 01:04:14.690
hit, though, after this.

01:04:14.690 --> 01:04:17.880
And the unintended consequence
was the salespeople

01:04:17.880 --> 01:04:21.960
were much happier because the
average salesperson was making

01:04:21.960 --> 01:04:24.900
several percentage points
more sales in a given week,

01:04:24.900 --> 01:04:27.119
they were earning more
money in a given week,

01:04:27.119 --> 01:04:30.659
and their job just became a
little bit less miserable.

01:04:30.659 --> 01:04:32.659
And then refinement to
that agentic process,

01:04:32.659 --> 01:04:34.759
to be able to do all of
that research for them

01:04:34.760 --> 01:04:37.080
and to help give them a brief
in a few minutes instead

01:04:37.079 --> 01:04:39.860
of a few hours to help them
with the sales process,

01:04:39.860 --> 01:04:42.460
ended up being like a
win-win-win all around.

01:04:42.460 --> 01:04:44.420
But if you go in
being hype led and oh,

01:04:44.420 --> 01:04:48.079
build an agent for the thing
without really peeling apart

01:04:48.079 --> 01:04:50.469
the business requirements,
the why, the what,

01:04:50.469 --> 01:04:54.199
the how, and all of these kind
of things, we ended up like,

01:04:54.199 --> 01:04:56.319
this company just would
have been lost in hype.

01:04:56.320 --> 01:04:58.100
You've probably seen
reports recently.

01:04:58.099 --> 01:05:01.880
I think McKinsey put one out
last week showing that about 85%

01:05:01.880 --> 01:05:05.360
of AI projects at
companies fail.

01:05:05.360 --> 01:05:07.160
And part of the
main reason for that

01:05:07.159 --> 01:05:08.940
is that they're not well scoped.

01:05:08.940 --> 01:05:10.722
People are jumping on
the hype bandwagon,

01:05:10.722 --> 01:05:12.639
and they're not really
understanding their way

01:05:12.639 --> 01:05:13.902
through the problem.

01:05:13.902 --> 01:05:15.360
And I think you
know the big brains

01:05:15.360 --> 01:05:18.880
in this room and the network
that you folks have are really

01:05:18.880 --> 01:05:21.119
a key component of
being able to succeed

01:05:21.119 --> 01:05:23.599
in understanding your
way through that problem.

01:05:23.599 --> 01:05:26.119
So that was a hype
example around agentic

01:05:26.119 --> 01:05:29.400
that I was thankfully able
to help this company through.

01:05:29.400 --> 01:05:31.519
Other recent hype examples
you've probably seen,

01:05:31.519 --> 01:05:33.259
the software
engineering is dead.

01:05:33.260 --> 01:05:38.280
My personal favorite, Hollywood
is dead or AGI by year end.

01:05:38.280 --> 01:05:41.080
I was in Saudi Arabia
this time last year

01:05:41.079 --> 01:05:42.960
at a thing called the FYI.

01:05:42.960 --> 01:05:44.639
And it was a dinner
at the FYI, and I

01:05:44.639 --> 01:05:48.440
sat beside the CEO of a company
who I'm not going to name,

01:05:48.440 --> 01:05:51.500
but this was a CEO of a
generative AI company.

01:05:51.500 --> 01:05:53.400
And at that time he
was showing everybody

01:05:53.400 --> 01:05:55.519
around the table
this thing that he'd

01:05:55.519 --> 01:05:57.639
done, where it
was text to video,

01:05:57.639 --> 01:06:00.239
and he could put in a text
prompt and get video out

01:06:00.239 --> 01:06:02.799
of the prompt and get about
six seconds worth of video

01:06:02.800 --> 01:06:03.519
out of it.

01:06:03.519 --> 01:06:04.599
A year ago, that was--

01:06:04.599 --> 01:06:06.739
I beg your pardon,
two years ago.

01:06:06.739 --> 01:06:08.179
Two years ago,
that was hot stuff.

01:06:08.179 --> 01:06:10.000
Nowadays, obviously,
it's quite passé.

01:06:10.000 --> 01:06:11.659
Anybody can do it.

01:06:11.659 --> 01:06:13.609
But he made a comment
at that table,

01:06:13.610 --> 01:06:16.910
and there were a lot of media
executives at that table--

01:06:16.909 --> 01:06:19.730
he was like, by this time next
year, from a single prompt,

01:06:19.730 --> 01:06:21.889
we'll be able to do
90 minutes of video.

01:06:21.889 --> 01:06:24.670
And so bye-bye, Hollywood.

01:06:24.670 --> 01:06:27.950
So the whole Hollywood is dead
meme, I think, came out of that.

01:06:27.949 --> 01:06:30.750
First of all, we can't do 90
minutes, even two years later

01:06:30.750 --> 01:06:31.523
from a prompt.

01:06:31.523 --> 01:06:33.190
And even if you did,
what kind of prompt

01:06:33.190 --> 01:06:35.670
would be able to tell you
a full story of a movie?

01:06:35.670 --> 01:06:39.930
So this type of hype
leads to engagement.

01:06:39.929 --> 01:06:42.309
This type of hype
leads to attention.

01:06:42.309 --> 01:06:45.789
But my encouragement to
you is to peel that apart.

01:06:45.789 --> 01:06:47.550
Look for the signal.

01:06:47.550 --> 01:06:48.850
Ask the why question.

01:06:48.849 --> 01:06:52.829
Ask the what question and
move on from there.

01:06:52.829 --> 01:06:55.829
So becoming that
trusted advisor.

01:06:55.829 --> 01:06:57.009
The world's drowning in hype.

01:06:57.010 --> 01:06:57.930
How do you do it?

01:06:57.929 --> 01:07:00.710
Look at the trends,
evaluate them objectively.

01:07:00.710 --> 01:07:02.190
Look at the genuine
opportunities

01:07:02.190 --> 01:07:04.070
that are out there.

01:07:04.070 --> 01:07:05.613
There are fashionable
distractions.

01:07:05.612 --> 01:07:07.529
I don't know what the
next one is going to be,

01:07:07.530 --> 01:07:08.830
but there are these
distractions that

01:07:08.829 --> 01:07:10.329
are out there that
will get you lots

01:07:10.329 --> 01:07:11.960
of engagement on social media.

01:07:11.960 --> 01:07:13.780
Ignore them, and
ignore the people

01:07:13.780 --> 01:07:15.180
that are leaning into them.

01:07:15.179 --> 01:07:18.899
And then really lean
into your skills

01:07:18.900 --> 01:07:23.139
about explaining technical
reality to leadership.

01:07:23.139 --> 01:07:25.259
One skill that one
person coached me

01:07:25.260 --> 01:07:27.540
in once that I thought
was really interesting,

01:07:27.539 --> 01:07:30.480
because it sounded wrong,
but it ended up being right,

01:07:30.480 --> 01:07:32.400
was whenever you see
something like this,

01:07:32.400 --> 01:07:35.860
try to figure out how to make
it as mundane as possible.

01:07:35.860 --> 01:07:38.720
When you can figure out how to
make it as mundane as possible,

01:07:38.719 --> 01:07:41.099
then you really begin
to build the grounding

01:07:41.099 --> 01:07:43.860
for being able to explain
it in detail in ways

01:07:43.860 --> 01:07:46.579
that people need to understand.

01:07:46.579 --> 01:07:49.739
If you go and you
look at, I think

01:07:49.739 --> 01:07:53.299
Gemini 3 was released
today, but there were leaks

01:07:53.300 --> 01:07:54.740
earlier this week.

01:07:54.739 --> 01:07:57.899
And one person leaked that
they built a Minecraft clone

01:07:57.900 --> 01:08:00.039
in a prompt, that kind of stuff.

01:08:00.039 --> 01:08:02.219
This is the opposite of mundane.

01:08:02.219 --> 01:08:05.074
This was massively hyping
the thing, massively showing off.

01:08:05.074 --> 01:08:06.199
And of course, they didn't.

01:08:06.199 --> 01:08:07.241
They built a flashy demo.

01:08:07.242 --> 01:08:09.310
They didn't really
build a Minecraft clone.

01:08:09.309 --> 01:08:12.489
But the idea here is if you
can peel that apart to OK,

01:08:12.489 --> 01:08:15.329
how do I think about what
are the mundane things that

01:08:15.329 --> 01:08:17.329
are happening here?

01:08:17.329 --> 01:08:20.649
The one that I've been working
with a lot recently is video.

01:08:20.649 --> 01:08:23.410
So text to video prompts,
as I've mentioned,

01:08:23.409 --> 01:08:27.329
instead of the magical "you can
do whatever you want" all nice

01:08:27.329 --> 01:08:29.449
and fluffy "Hollywood
is dead," what

01:08:29.449 --> 01:08:31.739
is the mundane element
of doing text to video?

01:08:31.739 --> 01:08:33.489
The mundane element
of doing text to video

01:08:33.489 --> 01:08:37.648
is that when you train a model
to create video from a text

01:08:37.649 --> 01:08:39.930
prompt, what it is
doing is it's creating

01:08:39.930 --> 01:08:41.789
a number of successive frames.

01:08:41.789 --> 01:08:43.609
And each of those
successive frames

01:08:43.609 --> 01:08:46.410
is going to be slightly
different from the frame before.

01:08:46.409 --> 01:08:50.050
And you've trained a model by
looking at video to say, well,

01:08:50.050 --> 01:08:52.529
if in frame 1, the person's
hand is like this, and in frame 2

01:08:52.529 --> 01:08:54.210
it's like that,
then you can predict

01:08:54.210 --> 01:08:56.449
it moves this way if
there's a matching prompt.
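That mundane core — each frame predicted from the frames before it — can be shown with a toy numeric sketch. This is not how a real text-to-video model works internally (those are large learned networks conditioned on the prompt); the linear "predict_next_frame" stand-in below only illustrates the frame-by-frame idea, and all names in it are made up.

```python
# Toy sketch of frame-by-frame video generation: a stand-in "model"
# that, given the last two frames, predicts the next one by
# continuing the motion between them. A real model learns this
# mapping (conditioned on a text prompt) from huge video datasets.

def predict_next_frame(prev, curr):
    # Continue whatever change happened between prev and curr.
    return [c + (c - p) for p, c in zip(prev, curr)]

def generate_video(first, second, num_frames):
    # Build the video one successive frame at a time.
    frames = [first, second]
    while len(frames) < num_frames:
        frames.append(predict_next_frame(frames[-2], frames[-1]))
    return frames

# A "hand" position moving right by 1 unit per frame.
video = generate_video([0.0], [1.0], num_frames=5)
print(video)  # [[0.0], [1.0], [2.0], [3.0], [4.0]]
```

Each generated frame is slightly different from the one before it — which is exactly the unglamorous mechanism underneath the "Hollywood is dead" headlines.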

01:08:56.449 --> 01:08:58.949
And suddenly it's become
a little bit more mundane,

01:08:58.949 --> 01:09:00.992
but suddenly people
begin to understand it.

01:09:00.992 --> 01:09:02.449
And then the people
who are experts

01:09:02.449 --> 01:09:05.349
in that specific field, not
the technical side of it,

01:09:05.350 --> 01:09:06.840
are now the ones
that will actually

01:09:06.840 --> 01:09:11.039
be able to come up and do
brilliant things with it.

01:09:11.039 --> 01:09:13.399
So that hype
navigation strategy--

01:09:13.399 --> 01:09:16.520
filter actively, go deep
on the fundamentals,

01:09:16.520 --> 01:09:17.814
get your slides to work.

01:09:17.814 --> 01:09:19.939
And then, of course, keep
your finger on the pulse.

01:09:19.939 --> 01:09:21.100
The hardest part
of that, I think,

01:09:21.100 --> 01:09:22.319
is the third one,
really keeping

01:09:22.319 --> 01:09:23.500
your finger on the pulse.

01:09:23.500 --> 01:09:26.539
And that's when you have to wade
into those cesspits of people

01:09:26.539 --> 01:09:28.417
just farming
engagement and really

01:09:28.417 --> 01:09:30.500
try to figure out the
signal from the noise there.

01:09:30.500 --> 01:09:32.207
But I think it's really
important for you

01:09:32.207 --> 01:09:34.879
to be able to do that, to be
connected, to understand that.

01:09:34.880 --> 01:09:36.230
Reading papers is all very good.

01:09:36.230 --> 01:09:38.438
The signal-to-noise ratio,
I think, in reading papers

01:09:38.439 --> 01:09:39.700
is a lot better.

01:09:39.699 --> 01:09:41.840
But to understand the
landscape of the people

01:09:41.840 --> 01:09:44.119
that you are advising,
they are the ones

01:09:44.119 --> 01:09:47.722
who are wading in the cesspools
of Twitter and X and LinkedIn.

01:09:47.722 --> 01:09:49.640
And there's nothing wrong
with those platforms

01:09:49.640 --> 01:09:51.390
in and of themselves,
but the stuff that's

01:09:51.390 --> 01:09:54.560
posted on those platforms.

01:09:54.560 --> 01:10:00.440
So overall landscape, it
is ripe with opportunity,

01:10:00.439 --> 01:10:02.799
absolutely ripe
with opportunity.

01:10:02.800 --> 01:10:04.400
So I would encourage
you, as Andrew

01:10:04.399 --> 01:10:07.269
did, to continue learning,
to continue digging

01:10:07.270 --> 01:10:09.830
into what you can do and
to continue building.

01:10:09.829 --> 01:10:12.670
But there are risks ahead.

01:10:12.670 --> 01:10:16.390
Anybody remember
the movie Titanic?

01:10:16.390 --> 01:10:19.510
Remember the famous phrase in
that, "iceberg right ahead"?

01:10:19.510 --> 01:10:23.257
But immediately before that,
there's a scene in Titanic--

01:10:23.256 --> 01:10:25.089
if we weren't being
filmed, I would show it,

01:10:25.090 --> 01:10:27.350
but I can't for
copyright reasons-- where

01:10:27.350 --> 01:10:31.350
the two guys up in the crow's
nest are freezing and talking.

01:10:31.350 --> 01:10:33.230
And the crow's nest
at the top of the ship

01:10:33.229 --> 01:10:36.466
is where the spotters would be
to spot any icebergs in front.

01:10:36.466 --> 01:10:38.049
And go back and watch
the movie again.

01:10:38.050 --> 01:10:40.489
You'll see the conversation
between these two guys

01:10:40.489 --> 01:10:43.550
is that all they're talking
about is how cold they are.

01:10:43.550 --> 01:10:45.190
And then it cuts
away to the crew

01:10:45.189 --> 01:10:47.349
of the ship who are
like, wait, aren't they

01:10:47.350 --> 01:10:48.942
supposed to have binoculars?

01:10:48.942 --> 01:10:51.149
And then the crew is like,
oh, we left the binoculars

01:10:51.149 --> 01:10:52.629
behind in port.

01:10:52.630 --> 01:10:55.289
That framing the
whole idea was like,

01:10:55.289 --> 01:10:58.109
they were so arrogant in
being able to move forward

01:10:58.109 --> 01:11:00.567
that they didn't want to look
out for any particular risks.

01:11:00.568 --> 01:11:02.402
And even though they
had people whose job it

01:11:02.402 --> 01:11:04.380
was to look out for risks,
they didn't properly

01:11:04.380 --> 01:11:05.779
equip or train them.

01:11:05.779 --> 01:11:07.979
And that, to me, is a
really good metaphor

01:11:07.979 --> 01:11:10.019
for where the AI
industry is today.

01:11:10.020 --> 01:11:12.500
There are risks in front of us.

01:11:12.500 --> 01:11:14.539
Those risks, the B
word, the bubble word

01:11:14.539 --> 01:11:17.859
you're probably reading in
the news, are there.

01:11:17.859 --> 01:11:24.139
To me, though, the opportunity
and the things to think about

01:11:24.140 --> 01:11:28.220
in terms of a bubble are these: most
of you probably don't remember

01:11:28.220 --> 01:11:31.140
the dotcom bubble of the 2000s.

01:11:31.140 --> 01:11:33.619
But if you think about
the dotcom bubble,

01:11:33.619 --> 01:11:36.460
that was the biggest
bubble in history.

01:11:36.460 --> 01:11:40.220
It burst, but we're still here.

01:11:40.220 --> 01:11:46.260
And the people who did dotcom right
not only survived, they thrived.

01:11:46.260 --> 01:11:49.659
Amazon, Google,
they did it right.

01:11:49.659 --> 01:11:51.619
They understood the
fundamentals of what

01:11:51.619 --> 01:11:52.981
it was to build a dotcom.

01:11:52.981 --> 01:11:54.939
They understood the
fundamentals of what it was

01:11:54.939 --> 01:11:56.659
to build a business on dotcom.

01:11:56.659 --> 01:11:59.399
And when the bubble of hype
burst, they didn't go with it.

01:11:59.399 --> 01:12:01.859
There was one website, I
believe it was pets.com,

01:12:01.859 --> 01:12:06.579
that they had the mindset of if
you build it, they will come.

01:12:06.579 --> 01:12:10.019
They had Super Bowl
commercials around pets.com.

01:12:10.020 --> 01:12:12.260
They couldn't handle the
traffic that they got.

01:12:12.260 --> 01:12:15.480
And that was the kind of site
that when the bubble burst,

01:12:15.479 --> 01:12:17.779
those were the sites
that just evaporated.

01:12:17.779 --> 01:12:20.279
So that bubble in
AI is likely coming.

01:12:20.279 --> 01:12:22.259
There is always a bubble.

01:12:22.260 --> 01:12:25.420
So the companies that
are doing AI right

01:12:25.420 --> 01:12:27.460
are the ones, like I
said, that won't just

01:12:27.460 --> 01:12:33.220
survive the bubble but will
actually thrive post-bubble.

01:12:33.220 --> 01:12:37.380
And the people who are
doing AI right, the folks

01:12:37.380 --> 01:12:39.300
in this room who are
thinking about AI

01:12:39.300 --> 01:12:40.898
and how you bring
it to your company,

01:12:40.898 --> 01:12:42.940
and the advice that you're
giving to your company

01:12:42.939 --> 01:12:45.179
and leaning into
that in the right way

01:12:45.180 --> 01:12:48.619
will also be the ones who not
only avoid getting laid off

01:12:48.619 --> 01:12:52.579
in the bubble crashes, but will
be the ones who will thrive

01:12:52.579 --> 01:12:54.779
through and after the bubble.

01:12:54.779 --> 01:12:57.739
So anatomy of any bubble,
and what I'm seeing in the AI

01:12:57.739 --> 01:13:00.289
one in particular, is
this kind of pyramid.

01:13:00.289 --> 01:13:02.710
At the top is the hype that
I've been talking about.

01:13:02.710 --> 01:13:05.770
At the bottom is
massive VC investment.

01:13:05.770 --> 01:13:06.470
I'll be frank.

01:13:06.470 --> 01:13:08.730
I'm already seeing
that drying up.

01:13:08.729 --> 01:13:10.649
Once upon a time,
you could go out

01:13:10.649 --> 01:13:12.409
with anything that
had AI written on it

01:13:12.409 --> 01:13:13.967
and get VC investment.

01:13:13.967 --> 01:13:16.010
Then you could go out and
do anything with an LLM

01:13:16.010 --> 01:13:17.530
and get VC investment.

01:13:17.529 --> 01:13:20.769
Now they are far,
far, far more cautious.

01:13:20.770 --> 01:13:22.930
I've been advising
a lot of startups.

01:13:22.930 --> 01:13:27.329
The amount that they're getting
invested is being scaled back.

01:13:27.329 --> 01:13:30.130
The stuff that's being
invested in is changing.

01:13:30.130 --> 01:13:35.170
And the second layer down,
massive VC investment

01:13:35.170 --> 01:13:37.369
is already beginning to vanish.

01:13:37.369 --> 01:13:39.369
Unrealistic valuations.

01:13:39.369 --> 01:13:42.289
Companies that aren't
making money being valued

01:13:42.289 --> 01:13:43.109
massively high.

01:13:43.109 --> 01:13:44.369
We all know who they are.

01:13:44.369 --> 01:13:47.329
We're beginning to see those
unrealistic valuations being

01:13:47.329 --> 01:13:49.050
fed off of that hype.

01:13:49.050 --> 01:13:51.993
Me-too products, where
somebody does something,

01:13:51.993 --> 01:13:53.410
and it's successful,
and everybody

01:13:53.409 --> 01:13:54.750
jumps on the bandwagon.

01:13:54.750 --> 01:13:56.390
We're also seeing
them everywhere.

01:13:56.390 --> 01:13:59.160
We saw them throughout
the dotcom bubble.

01:13:59.159 --> 01:14:01.800
And then right at the
bottom is that real value.

01:14:01.800 --> 01:14:04.460
I probably shouldn't have
done the triangle like this.

01:14:04.460 --> 01:14:06.300
It should be more an
upside down triangle.

01:14:06.300 --> 01:14:08.820
Because the real
value here is small.

01:14:08.819 --> 01:14:11.179
But I've vibe coded these
slides into existence.

01:14:11.180 --> 01:14:14.039
So this is some of the
technical debt I took on.

01:14:14.039 --> 01:14:18.180
But the real value there,
that kernel of value is there,

01:14:18.180 --> 01:14:22.840
and the ones that build for that
will be the ones that survive.

01:14:22.840 --> 01:14:28.777
So the direction that I see
the AI industry going in

01:14:28.777 --> 01:14:31.359
and the direction that I would
encourage you to start thinking

01:14:31.359 --> 01:14:33.859
about your skills in, is really
over the next five years,

01:14:33.859 --> 01:14:36.059
there's going to
be a bifurcation.

01:14:36.060 --> 01:14:38.760
I'm just going to
be ornery in how

01:14:38.760 --> 01:14:40.880
I describe it as big and small.

01:14:40.880 --> 01:14:43.720
Big AI will be what we see
today, with the large language

01:14:43.720 --> 01:14:48.440
models getting bigger in the
desire to drive towards AGI.

01:14:48.439 --> 01:14:52.079
The Geminis, the Claudes,
the OpenAIs of the world

01:14:52.079 --> 01:14:54.319
are going to continue to
drive bigger, and bigger

01:14:54.319 --> 01:14:57.829
is better in the mindset
of those companies

01:14:57.829 --> 01:15:01.659
towards achieving AGI or towards
achieving better business value.

01:15:01.659 --> 01:15:03.409
That's going to be one
side of the branch.

01:15:03.409 --> 01:15:05.867
The other side of the branch
is I'm going to call it small.

01:15:05.868 --> 01:15:08.510
We've all seen
open-source models.

01:15:08.510 --> 01:15:10.347
I hate the term open source.

01:15:10.347 --> 01:15:12.390
Let me call them open
weights, or let me call them

01:15:12.390 --> 01:15:15.590
self-hostable models.
They're

01:15:15.590 --> 01:15:17.670
exploding onto the landscape.

01:15:17.670 --> 01:15:20.069
I read an article recently
about Y Combinator

01:15:20.069 --> 01:15:23.069
that 80% of the
companies in Y Combinator

01:15:23.069 --> 01:15:26.429
were using small models
from China in particular.

01:15:26.430 --> 01:15:29.230
So the Chinese
models in particular

01:15:29.229 --> 01:15:31.269
are doing really well,
probably because of

01:15:31.270 --> 01:15:32.380
the overall landscape.

01:15:32.380 --> 01:15:34.630
They're not leaning into the
large models the same way

01:15:34.630 --> 01:15:36.230
as the West is.

01:15:36.229 --> 01:15:37.771
I see that
bifurcation happening.

01:15:37.771 --> 01:15:39.229
China, I think,
has that head start

01:15:39.229 --> 01:15:41.089
on the small models
that may last.

01:15:41.090 --> 01:15:41.989
It may not.

01:15:41.989 --> 01:15:43.069
I don't know.

01:15:43.069 --> 01:15:45.829
But the point is, we're heading
in that particular direction

01:15:45.829 --> 01:15:48.729
of I'm going to call them
instead of big and small now,

01:15:48.729 --> 01:15:52.429
models that are hosted on
your behalf by somebody else,

01:15:52.430 --> 01:15:54.950
like a GPT or a
Gemini or a Claude,

01:15:54.949 --> 01:15:59.099
or models that you can host
yourself for your own needs.

01:15:59.100 --> 01:16:03.220
Whereas this side right
now is underserved,

01:16:03.220 --> 01:16:05.180
this bubble may burst.

01:16:05.180 --> 01:16:06.920
This one right now
is underserved.

01:16:06.920 --> 01:16:09.699
And this bubble
will be later on.

01:16:09.699 --> 01:16:12.819
And the major skills that I
can see developers needing

01:16:12.819 --> 01:16:16.299
over the next two to three
years on this side of the fence

01:16:16.300 --> 01:16:18.500
will be fine tuning.

01:16:18.500 --> 01:16:21.539
So the ability to take
an open-source model

01:16:21.539 --> 01:16:25.019
and fine-tune it for
particular downstream tasks.

01:16:25.020 --> 01:16:27.700
Let me give one concrete
example of that I've personally

01:16:27.699 --> 01:16:28.739
experienced.

01:16:28.739 --> 01:16:30.939
I work a lot in Hollywood,
and I've worked a lot

01:16:30.939 --> 01:16:33.259
with studios making movies.

01:16:33.260 --> 01:16:36.020
And one studio in particular
I was lucky enough

01:16:36.020 --> 01:16:38.560
to sell a movie to, it's
still in preproduction.

01:16:38.560 --> 01:16:41.180
It'll probably be in
preproduction forever.

01:16:41.180 --> 01:16:44.340
But one of the things I
learned as part of that process

01:16:44.340 --> 01:16:49.020
was IP in studios
is so protected.

01:16:49.020 --> 01:16:50.980
It's not even funny.

01:16:50.979 --> 01:16:52.779
Go and Google
James Cameron, who

01:16:52.779 --> 01:16:55.090
created Avatar and
the lawsuits that he's

01:16:55.090 --> 01:16:58.170
involved in: this person
who apparently sent him

01:16:58.170 --> 01:17:00.810
a story many years
ago about blue aliens

01:17:00.810 --> 01:17:02.850
and is now suing him
for billions of dollars

01:17:02.850 --> 01:17:06.410
because obviously there
were blue aliens in Avatar.

01:17:06.409 --> 01:17:09.889
That level of IP protection
in Hollywood is insane.

01:17:09.890 --> 01:17:12.730
The opportunity with
large language models

01:17:12.729 --> 01:17:15.129
is equally insane.

01:17:15.130 --> 01:17:17.949
A lot of the focus is on large
language models for creation,

01:17:17.949 --> 01:17:20.069
for storytelling, for
rendering and all that,

01:17:20.069 --> 01:17:22.289
but actually the major
opportunity that they have is

01:17:22.289 --> 01:17:27.170
for analysis: to take
a look at synopses of movies

01:17:27.170 --> 01:17:29.369
and find out what
works and what doesn't.

01:17:29.369 --> 01:17:32.090
Why was this movie a
hit and this one wasn't?

01:17:32.090 --> 01:17:34.650
What time of year was this
one released and it became

01:17:34.649 --> 01:17:36.489
successful and this one wasn't?

01:17:36.489 --> 01:17:39.149
And with a margin on
movies being razor thin,

01:17:39.149 --> 01:17:40.929
that kind of analysis is huge.

01:17:40.930 --> 01:17:42.637
But in order to do
that kind of analysis,

01:17:42.637 --> 01:17:44.430
you need to share the
details of your movie

01:17:44.430 --> 01:17:45.650
with a large language model.

01:17:45.649 --> 01:17:48.862
And they will absolutely not
do that with GPT or Gemini

01:17:48.863 --> 01:17:50.530
or whatever, because
they're now sharing

01:17:50.529 --> 01:17:52.809
their IP with a third party.

01:17:52.810 --> 01:17:55.330
Enter small models,
where they can self-host

01:17:55.329 --> 01:17:58.189
their own small model and they
are getting smarter and smarter.

01:17:58.189 --> 01:18:02.089
The 7B model of today is
as smart as the 50B model

01:18:02.090 --> 01:18:03.170
of yesterday.

01:18:03.170 --> 01:18:06.409
A year from now, the 7B model
will be as smart

01:18:06.409 --> 01:18:09.849
as the 300B model of yesteryear.

01:18:09.850 --> 01:18:13.090
So they're moving in that
direction of building

01:18:13.090 --> 01:18:16.369
using small self-hosted
models, which they can then

01:18:16.369 --> 01:18:18.104
fine-tune on downstream tasks.

01:18:18.104 --> 01:18:19.729
It's similar with other
things where privacy

01:18:19.729 --> 01:18:21.809
is important: law offices,
medical offices, all

01:18:21.810 --> 01:18:22.950
of those kind of things.

01:18:22.949 --> 01:18:25.170
So those type of skills
are fundamentally

01:18:25.170 --> 01:18:27.090
important going forward.

01:18:27.090 --> 01:18:30.449
So that's the bifurcation that
I'm seeing happening in AI.

01:18:30.449 --> 01:18:34.329
The sooner bubble, I think, is
in the bigger, non-self-hosted.

01:18:34.329 --> 01:18:36.809
The later bubble is in
the smaller self-hosted.

01:18:36.810 --> 01:18:39.330
But either way, for
you, for your career,

01:18:39.329 --> 01:18:42.689
to avoid the impact of
any bubble bursting,

01:18:42.689 --> 01:18:44.429
focus on the fundamentals.

01:18:44.430 --> 01:18:46.190
Build those real solutions.

01:18:46.189 --> 01:18:48.389
Understand the business
side, and most of all,

01:18:48.390 --> 01:18:49.789
diversify your skills.

01:18:49.789 --> 01:18:53.579
Don't be that one-trick pony who
only knows how to do one thing.

01:18:53.579 --> 01:18:55.600
I've worked with
brilliant people who

01:18:55.600 --> 01:18:58.620
are fantastic at coding,
in a particular API

01:18:58.619 --> 01:18:59.699
or a particular framework.

01:18:59.699 --> 01:19:03.720
And then the industry moved
on and they got left behind.

01:19:03.720 --> 01:19:07.600
OK, so yeah, when bubbles burst,
that overall fallout-- I've kind of

01:19:07.600 --> 01:19:09.400
spoken about it a
little bit already.

01:19:09.399 --> 01:19:12.019
Funding evaporates, hiring
freezes become layoffs,

01:19:12.020 --> 01:19:14.320
projects get canceled, and
talent floods the market.

01:19:14.319 --> 01:19:14.819
Yeah.

01:19:14.819 --> 01:19:17.939
Quick question from
the last slide.

01:19:17.939 --> 01:19:23.679
[INAUDIBLE] I heard a lot
about how NVIDIA is hiring,

01:19:23.680 --> 01:19:26.240
and they're very
specific about they

01:19:26.239 --> 01:19:30.599
want people for a very specific
problem that they have.

01:19:30.600 --> 01:19:34.600
So they can require people to
basically have that one thing

01:19:34.600 --> 01:19:35.800
that you're missing.

01:19:35.800 --> 01:19:43.079
So how do you think-- how is
it more important to diversify

01:19:43.079 --> 01:19:46.420
skills versus actually
focusing on, for example,

01:19:46.420 --> 01:19:49.350
LLMs versus computer
vision or versus

01:19:49.350 --> 01:19:52.550
very specific downstream task?

01:19:52.550 --> 01:19:55.230
So I mean, I think the
question was around NVIDIA

01:19:55.229 --> 01:19:57.669
in particular, hiring
for a very specific, very

01:19:57.670 --> 01:19:58.806
narrow scenario.

01:19:58.806 --> 01:20:00.389
So then the question
is, how important

01:20:00.390 --> 01:20:01.765
is it for you to
become an expert

01:20:01.765 --> 01:20:04.869
in a narrow scenario versus
diversifying your skills?

01:20:04.869 --> 01:20:08.750
I would always argue it's still
better to diversify your skills,

01:20:08.750 --> 01:20:11.069
because that one narrow
scenario is only that one

01:20:11.069 --> 01:20:13.112
narrow scenario, and you're
putting all your eggs

01:20:13.112 --> 01:20:13.850
into one basket.

01:20:13.850 --> 01:20:16.310
NVIDIA would be a fantastic
company to work for.

01:20:16.310 --> 01:20:17.817
Nothing against them in any way.

01:20:17.817 --> 01:20:20.149
But if you put all of your
eggs into that basket and you

01:20:20.149 --> 01:20:22.309
don't get it, then what?

01:20:22.310 --> 01:20:24.310
So I think the idea
of really being

01:20:24.310 --> 01:20:28.230
able to-- if you are
passionate about a thing,

01:20:28.229 --> 01:20:31.289
to be very deep in that
thing is very, very good.

01:20:31.289 --> 01:20:33.689
But to only be able
to do that thing,

01:20:33.689 --> 01:20:36.809
I think I would always
encourage to be diversified.

01:20:36.810 --> 01:20:39.430
And when I say diversified,
you're saying LLMs or computer

01:20:39.430 --> 01:20:41.110
vision or anything
like that, I think

01:20:41.109 --> 01:20:42.489
I mean that's one part of it.

01:20:42.489 --> 01:20:46.269
But it's like that knowledge of
models and how to use them to me

01:20:46.270 --> 01:20:47.940
is one skill.

01:20:47.939 --> 01:20:51.556
The diversification of skills
is breaking outside of that.

01:20:51.556 --> 01:20:54.139
Also to be able to think, OK,
what about building applications

01:20:54.140 --> 01:20:55.320
on top of these?

01:20:55.319 --> 01:20:57.319
What does scaling an
application look like?

01:20:57.319 --> 01:20:59.599
What does software engineering
in this case look like?

01:20:59.600 --> 01:21:02.720
What about user experience
and user experience skills?

01:21:02.720 --> 01:21:05.560
Because it's all very well to
build a beautiful application.

01:21:05.560 --> 01:21:06.960
But if nobody can use it--

01:21:06.960 --> 01:21:10.140
I'm looking here
at Microsoft Office.

01:21:10.140 --> 01:21:13.220
Stuff like that
is what I really

01:21:13.220 --> 01:21:14.840
mean about diversifying beyond.

01:21:14.840 --> 01:21:17.400
So even in that one
example with NVIDIA,

01:21:17.399 --> 01:21:20.359
to be able to break out of
that one particular example,

01:21:20.359 --> 01:21:22.960
but to show skills in other
areas that are of value,

01:21:22.960 --> 01:21:26.060
I think is really important.

01:21:26.060 --> 01:21:27.100
OK.

01:21:27.100 --> 01:21:29.440
As we're just running
a little bit over-- so yeah,

01:21:29.439 --> 01:21:30.419
I just wanted to--

01:21:30.420 --> 01:21:32.560
I've gone into it a
little bit already,

01:21:32.560 --> 01:21:35.000
but I'm a massive
advocate for small AI.

01:21:35.000 --> 01:21:38.300
I really do believe small
AI is the next big thing,

01:21:38.300 --> 01:21:39.920
because we're
moving into a world,

01:21:39.920 --> 01:21:42.220
and this is part of the
job that I do at Arm,

01:21:42.220 --> 01:21:44.740
is we're kind of moving into
a world of AI everywhere

01:21:44.739 --> 01:21:46.369
all at once.

01:21:46.369 --> 01:21:47.827
So there's a
traditional, and it's

01:21:47.827 --> 01:21:49.409
interesting you just
brought up NVIDIA

01:21:49.409 --> 01:21:51.849
because there's a
traditional conception

01:21:51.850 --> 01:21:55.690
that compute platforms are CPU
plus GPU when it comes to AI.

01:21:55.689 --> 01:21:57.489
But that's also changing--

01:21:57.489 --> 01:22:00.189
CPU general purpose,
GPU specialists.

01:22:00.189 --> 01:22:02.729
But for example,
in mobile space,

01:22:02.729 --> 01:22:05.729
there's massive innovation
being done with the technology

01:22:05.729 --> 01:22:09.049
called SME, Scalable
Matrix Extension.

01:22:09.050 --> 01:22:11.210
And what SME is
all about is really

01:22:11.210 --> 01:22:13.050
allowing you to
bring AI workloads

01:22:13.050 --> 01:22:15.050
and put them on the CPU.

01:22:15.050 --> 01:22:19.489
The frontrunners in this are a
couple of Chinese phone vendors,

01:22:19.489 --> 01:22:22.809
Vivo and Oppo, who've just
recently released phones

01:22:22.810 --> 01:22:24.690
with SME-enabled chips.

01:22:24.689 --> 01:22:26.989
And what's magical
about these is that, A,

01:22:26.989 --> 01:22:30.010
they don't need to have a
separate external chip drawing

01:22:30.010 --> 01:22:33.250
extra power, taking up
extra footprint space just

01:22:33.250 --> 01:22:35.109
to be able to run AI workloads.

01:22:35.109 --> 01:22:38.549
And B, the CPU, of course,
drawing low power,

01:22:38.550 --> 01:22:40.670
being able to run AI
workloads on that,

01:22:40.670 --> 01:22:43.029
they've been able to build
interesting new scenarios.

01:22:43.029 --> 01:22:45.199
And if I talk about
one in particular,

01:22:45.199 --> 01:22:47.039
there's a company called Alipay.

01:22:47.039 --> 01:22:50.659
And Alipay had an
application where you would--

01:22:50.659 --> 01:22:52.279
and we've all seen
these apps where

01:22:52.279 --> 01:22:53.960
you can go through
your photographs,

01:22:53.960 --> 01:22:56.039
and you can search for
a particular thing.

01:22:56.039 --> 01:22:59.239
Places I ate sushi or something
along those lines and use

01:22:59.239 --> 01:23:00.880
that to create a slideshow.

01:23:00.880 --> 01:23:03.840
All of those require
a back end service.

01:23:03.840 --> 01:23:06.640
So your photographs are hosted
on Google Photos or Apple

01:23:06.640 --> 01:23:08.539
Photos or something like that.

01:23:08.539 --> 01:23:10.359
And that back end
service runs the model

01:23:10.359 --> 01:23:12.119
so that you can search
against it and be

01:23:12.119 --> 01:23:14.239
able to do the assembly of them.

01:23:14.239 --> 01:23:16.239
What Alipay wanted to
do was-- they said, there

01:23:16.239 --> 01:23:17.880
are three problems with this.

01:23:17.880 --> 01:23:19.699
Problem number one, privacy.

01:23:19.699 --> 01:23:21.840
You have to share your
photos with a third party.

01:23:21.840 --> 01:23:23.659
Problem number two, latency.

01:23:23.659 --> 01:23:25.019
You got to upload those photos.

01:23:25.020 --> 01:23:26.220
You got to send the thing.

01:23:26.220 --> 01:23:28.021
You got to have the
back end do the thing,

01:23:28.021 --> 01:23:30.479
and then you've got to download
the results from the thing.

01:23:30.479 --> 01:23:33.279
And then number three is
building that cloud service

01:23:33.279 --> 01:23:36.119
and standing that up
costs time and money.

01:23:36.119 --> 01:23:39.720
So if they could move all of
this onto the device itself,

01:23:39.720 --> 01:23:41.960
now the idea was they
could run a model

01:23:41.960 --> 01:23:44.420
on the device that searches
the photos on the device.

01:23:44.420 --> 01:23:45.680
You don't have the latency.

01:23:45.680 --> 01:23:47.800
And from a business
perspective, they're

01:23:47.800 --> 01:23:51.060
now saving the money on
standing up this service.

01:23:51.060 --> 01:23:54.240
They now have AI running on CPU
in order to be able to do that.

01:23:54.239 --> 01:23:56.800
Apple are also people
who've invested heavily

01:23:56.800 --> 01:23:59.143
in this scalable
matrix extension technology.

01:23:59.143 --> 01:24:00.560
You see whenever
they talk about--

01:24:00.560 --> 01:24:03.760
if you've ever watched a WWDC
or anything like that, when they

01:24:03.760 --> 01:24:06.815
talk about the new A-series
chips and M-series chips,

01:24:06.814 --> 01:24:09.439
about the neural cores and those
kind of things in them, that's

01:24:09.439 --> 01:24:10.799
part of the idea.

01:24:10.800 --> 01:24:15.279
So breaking that
habit that we've gotten into,

01:24:15.279 --> 01:24:18.719
where you need a GPU to be able
to do AI is part of the trend

01:24:18.720 --> 01:24:20.143
that the world is heading in.

01:24:20.143 --> 01:24:22.060
Apple are probably one
of the leaders in that.

01:24:22.060 --> 01:24:24.800
I'm very, very bullish on
Apple and Apple Intelligence

01:24:24.800 --> 01:24:31.760
as a result. And from the AI
perspective, seeing that trend

01:24:31.760 --> 01:24:36.039
and following that vector to
its logical conclusion, as models

01:24:36.039 --> 01:24:39.600
are getting smaller, embedded
intelligence getting everywhere

01:24:39.600 --> 01:24:40.740
isn't a pipe dream.

01:24:40.739 --> 01:24:41.849
It isn't sci-fi anymore.

01:24:41.850 --> 01:24:43.392
It's going to be a
reality that we'll

01:24:43.391 --> 01:24:44.809
be seeing very, very shortly.

01:24:44.810 --> 01:24:47.410
So that idea of that
convergence of AI,

01:24:47.409 --> 01:24:50.869
because of the ability of
smaller models getting smarter

01:24:50.869 --> 01:24:53.930
and lower power devices
being able to run them,

01:24:53.930 --> 01:24:56.550
we see that convergence hitting,
and I see massive opportunity

01:24:56.550 --> 01:24:58.470
there.

01:24:58.470 --> 01:25:01.890
So one last part and just going
back to agents for a moment,

01:25:01.890 --> 01:25:03.950
I think the one
thing that I always

01:25:03.949 --> 01:25:06.750
say is like a hidden part
of artificial intelligence

01:25:06.750 --> 01:25:09.430
is really what I like to call
artificial understanding.

01:25:09.430 --> 01:25:12.590
It's when you can start using
models to understand things

01:25:12.590 --> 01:25:14.210
on your behalf.

01:25:14.210 --> 01:25:16.170
And when they understand
them on your behalf,

01:25:16.170 --> 01:25:20.270
to be able to craft from that
understanding new things,

01:25:20.270 --> 01:25:22.188
you can actually
develop superpowers

01:25:22.188 --> 01:25:24.230
where you're far more
effective than ever before,

01:25:24.229 --> 01:25:26.829
be that creating code or
creating other things.

01:25:26.829 --> 01:25:30.470
I'm going to give one quick
demo just so we can wrap up.

01:25:30.470 --> 01:25:35.270
And I was talking earlier
about generating video.

01:25:35.270 --> 01:25:39.130
So this picture is-- oops.

01:25:42.159 --> 01:25:42.659
Sorry.

01:25:42.659 --> 01:25:45.920
The connection here is
not very good, I lost it.

01:25:45.920 --> 01:25:47.000
So here we go.

01:25:47.000 --> 01:25:50.659
This picture here is actually
of my son playing ice hockey.

01:25:50.659 --> 01:25:53.239
And I took this picture,
and I was saying,

01:25:53.239 --> 01:25:56.859
OK, I think I'm very
good at prompting.

01:25:56.859 --> 01:26:00.399
And I wrote a nice prompt
for this picture of him.

01:26:00.399 --> 01:26:02.399
He's in the middle
of taking a slapshot.

01:26:02.399 --> 01:26:04.699
He's got some beautiful
flex on his stick.

01:26:04.699 --> 01:26:08.340
And I asked it, OK, to
show him scoring a goal.

01:26:08.340 --> 01:26:10.659
What do you think happened?

01:26:10.659 --> 01:26:12.739
Should we watch?

01:26:12.739 --> 01:26:13.840
Let's see if it works.

01:26:13.840 --> 01:26:15.747
[VIDEO PLAYBACK]

01:26:18.560 --> 01:26:20.796
[CROWD CHEERING]

01:26:20.796 --> 01:26:21.379
[END PLAYBACK]

01:26:21.380 --> 01:26:25.380
This was the wrong video, but
it still shows the same idea.

01:26:25.380 --> 01:26:29.500
Because of poor prompting or
because of poor understanding

01:26:29.500 --> 01:26:34.020
of my intent, if I talk
about it in agentic terms,

01:26:34.020 --> 01:26:36.580
the arena that he was in,
which is a practice arena

01:26:36.579 --> 01:26:38.890
and doesn't have any
people in it-- sorry.

01:26:38.890 --> 01:26:41.970
Let me pause it.

01:26:41.970 --> 01:26:46.490
If I just rewind to
here, if we look up

01:26:46.489 --> 01:26:48.750
in this top right-hand
corner here,

01:26:48.750 --> 01:26:51.409
this is basically where they
store all their garbage.

01:26:51.409 --> 01:26:53.813
But the AI didn't know
that, had no idea of it.

01:26:53.813 --> 01:26:55.230
So it assumed it
was a full arena,

01:26:55.229 --> 01:26:57.169
and it started
painting people in.

01:26:57.170 --> 01:27:00.630
And even though he shot a
mile wide, everybody cheers.

01:27:00.630 --> 01:27:03.829
And somehow he has two sticks
in his hand instead of one,

01:27:03.829 --> 01:27:05.649
and they forgot his name.

01:27:05.649 --> 01:27:09.710
So I did not go through an
agentic workflow to do this.

01:27:09.710 --> 01:27:13.630
I did not go through the steps
of, A, understand my intent.

01:27:13.630 --> 01:27:15.632
B, once you
understand my intent,

01:27:15.631 --> 01:27:17.589
understand the tools that
are available to you.

01:27:17.590 --> 01:27:19.610
In this case, it's
Veo, and understand

01:27:19.609 --> 01:27:21.969
the intricacies of using Veo.

01:27:21.970 --> 01:27:23.273
Make a plan of how to use them.

01:27:23.273 --> 01:27:25.190
Make a plan of how to
build a prompt for them,

01:27:25.189 --> 01:27:27.210
and then use them
and then reflect.

01:27:27.210 --> 01:27:32.090
So I've been advising a
startup that is working

01:27:32.090 --> 01:27:34.390
on movie creation using AI.

01:27:34.390 --> 01:27:36.800
And I want to show you a
little sample here of a movie

01:27:36.800 --> 01:27:39.720
that we've been working on with
them, where the whole idea is

01:27:39.720 --> 01:27:42.360
like, if you want to have
performances out of virtual actors

01:27:42.359 --> 01:27:45.039
and actresses, you
need to have emotion.

01:27:45.039 --> 01:27:47.340
You need to be able to
convey that emotion,

01:27:47.340 --> 01:27:50.640
and you also need to be able to
put that emotion in the context

01:27:50.640 --> 01:27:52.119
of the entire story.

01:27:52.119 --> 01:27:54.579
Because when you create
a video from a prompt,

01:27:54.579 --> 01:27:56.494
you're creating an
eight-second snippet.

01:27:56.494 --> 01:27:58.119
That eight-second
snippet needs to know

01:27:58.119 --> 01:28:00.559
what's going on in
the rest of the story.

01:28:00.560 --> 01:28:03.680
So if I show this
one for a moment.

01:28:03.680 --> 01:28:06.140
And it's a little
wooden at the moment,

01:28:06.140 --> 01:28:08.560
it's not really
working perfectly.

01:28:08.560 --> 01:28:10.538
I have professional
actors who are friends

01:28:10.537 --> 01:28:12.079
who are advising me
on this, and they

01:28:12.079 --> 01:28:13.600
laughed at the performances.

01:28:13.600 --> 01:28:16.640
But try to view it
through the difference

01:28:16.640 --> 01:28:19.200
that we had from an agentic
prompt with the hockey

01:28:19.199 --> 01:28:20.619
player to this one.

01:28:20.619 --> 01:28:22.519
[VIDEO PLAYBACK]

01:28:22.520 --> 01:28:23.860
That's-- hopefully we can hear it.

01:28:33.594 --> 01:28:36.250
- I guess I can do the
pub quiz after all.

01:28:40.550 --> 01:28:42.869
They just shut me down.

01:28:42.869 --> 01:28:45.949
I'm so close.

01:28:45.949 --> 01:28:48.250
But they wouldn't listen.

01:28:48.250 --> 01:28:49.060
- I won't--

01:28:49.060 --> 01:28:49.643
[END PLAYBACK]

01:28:49.643 --> 01:28:51.310
They never listen.

01:28:51.310 --> 01:28:54.630
So here's the idea
of, again, just

01:28:54.630 --> 01:28:57.090
thinking in terms of agentic,
as I was saying earlier on,

01:28:57.090 --> 01:28:58.590
breaking it into those steps.

01:28:58.590 --> 01:29:01.069
That allowed me to take
exactly the same engine

01:29:01.069 --> 01:29:02.630
that, as I was showing
you earlier on, failed,

01:29:02.630 --> 01:29:04.789
and be able
to show something

01:29:04.789 --> 01:29:07.949
that works and is able to do
things like portraying emotion

01:29:07.949 --> 01:29:09.309
that I just spoke about.

01:29:09.310 --> 01:29:11.450
So I know we're a
little bit over time.

01:29:11.449 --> 01:29:12.529
So sorry about that.

01:29:12.529 --> 01:29:14.569
I can take any questions
if anybody has any.

01:29:14.569 --> 01:29:15.889
I see Andrew is here as well.

01:29:15.890 --> 01:29:16.725
He's at the back.

01:29:16.725 --> 01:29:18.350
And I just really
want to say thank you

01:29:18.350 --> 01:29:19.490
so much for your attention.

01:29:19.489 --> 01:29:21.409
I really appreciate it.

01:29:21.409 --> 01:29:24.225
[APPLAUSE]

01:29:28.909 --> 01:29:29.710
Yep.

01:29:29.710 --> 01:29:34.180
How much of this new
generation [INAUDIBLE]

01:29:34.180 --> 01:29:38.539
relation with the agentic
[INAUDIBLE] use case

01:29:38.539 --> 01:29:40.939
is improved with the
agentic workflow?

01:29:40.939 --> 01:29:43.899
And how much of it is
a training set bias

01:29:43.899 --> 01:29:48.179
where you might
have only pictures

01:29:48.180 --> 01:29:53.220
or videos with [INAUDIBLE]
that are full of [INAUDIBLE]

01:29:53.220 --> 01:29:55.645
Yeah, it's a great question.

01:29:55.645 --> 01:29:58.020
Just to repeat for the video,
how much of the improvement

01:29:58.020 --> 01:30:00.220
is from the use of
an agentic workflow

01:30:00.220 --> 01:30:03.340
versus just lack of hockey
stuff in the training set

01:30:03.340 --> 01:30:06.060
for the failed one?

01:30:06.060 --> 01:30:09.380
I'm not comparing like with like,
so just using my gut.

01:30:09.380 --> 01:30:12.420
Before I broke
this down into the workflow,

01:30:12.420 --> 01:30:14.619
I created
scenes like this one

01:30:14.619 --> 01:30:18.699
and they were awful when I
just did it directly for myself

01:30:18.699 --> 01:30:22.739
with no basis, no agentic flow,
no artificial understanding.

01:30:22.739 --> 01:30:25.979
And when I broke it down into
the steps where it's like, OK,

01:30:25.979 --> 01:30:28.500
in this scene, the girl
is sitting on the bench,

01:30:28.500 --> 01:30:30.100
and she's upset.

01:30:30.100 --> 01:30:34.940
And the person is talking to
her and he wants to comfort her.

01:30:34.939 --> 01:30:38.259
Feeding that to a
large language model

01:30:38.260 --> 01:30:40.659
along with the entire
story and along

01:30:40.659 --> 01:30:43.019
with the constraints that
I had, where the shot

01:30:43.020 --> 01:30:45.460
has to be eight seconds
long, clear dialogue

01:30:45.460 --> 01:30:47.340
and all of those kind
of things, and then

01:30:47.340 --> 01:30:50.860
to understand my
intent from that one,

01:30:50.859 --> 01:30:53.699
the LLM ended up
expressing a prompt that

01:30:53.699 --> 01:30:57.139
was far more loquacious
than I ever would have,

01:30:57.140 --> 01:30:59.840
that was far more descriptive
than I ever would have.

01:30:59.840 --> 01:31:01.860
The LLM had
understanding of what

01:31:01.859 --> 01:31:03.899
makes a good shot, what
makes a good angle, what

01:31:03.899 --> 01:31:06.719
makes good emotion far
more than I would have.

01:31:06.720 --> 01:31:08.800
I could spend hours
trying to describe it.

01:31:08.800 --> 01:31:10.739
So that first step
in the agentic flow

01:31:10.739 --> 01:31:13.340
of it doing that for me
and understanding my intent

01:31:13.340 --> 01:31:14.739
was huge.

01:31:14.739 --> 01:31:17.800
The second step then is the
tools that it's going to use.

01:31:17.800 --> 01:31:20.800
So I explicitly said which video
engine I'm going to be using.

01:31:20.800 --> 01:31:22.980
I was using Gemini as the
LLM, and hopefully Gemini

01:31:22.979 --> 01:31:25.079
is familiar with Veo,
that kind of stuff,

01:31:25.079 --> 01:31:27.340
so to understand the
idiosyncrasies of doing things

01:31:27.340 --> 01:31:28.539
with Veo.

01:31:28.539 --> 01:31:30.210
What I learned, for
example, was that Veo was

01:31:30.210 --> 01:31:33.029
very bad at doing
high-action scenes,

01:31:33.029 --> 01:31:36.809
but is very good at doing slow
camera pulls to do emotion,

01:31:36.810 --> 01:31:38.250
as you saw in this case.

01:31:38.250 --> 01:31:40.329
So the LLM knew that
from me, declaring

01:31:40.329 --> 01:31:41.630
I was using that as a tool.

01:31:41.630 --> 01:31:43.449
And then further
it built a prompt

01:31:43.449 --> 01:31:45.809
and then further refined
the prompt from that.

01:31:45.810 --> 01:31:47.970
And then the third part
actually using the tool

01:31:47.970 --> 01:31:50.690
to actually generate it
for me, generating a video

01:31:50.689 --> 01:31:53.929
with something like Veo costs,
I think, between $2 and $3

01:31:53.930 --> 01:31:55.994
to generate four
videos in credits.

01:31:55.994 --> 01:31:57.369
So the last thing
I want to do is

01:31:57.369 --> 01:31:59.452
generate lots and lots and
lots and lots of videos

01:31:59.453 --> 01:32:01.170
and throw good money after bad.

01:32:01.170 --> 01:32:04.449
But all of that token
spend that I did earlier on

01:32:04.449 --> 01:32:07.729
to understand my intent and
then to make the plan for using

01:32:07.729 --> 01:32:10.869
the agent paid off at the
back end, where it got it right.

01:32:10.869 --> 01:32:13.273
Maybe it wouldn't get it
right the first time,

01:32:13.273 --> 01:32:15.690
but it would very rarely take
more than two or three tries

01:32:15.689 --> 01:32:17.969
to get something that
was really, really nice.

01:32:17.970 --> 01:32:21.409
So I think without
comparing like with like, I

01:32:21.409 --> 01:32:24.609
do think that plan of action and
going through a workflow, that

01:32:24.609 --> 01:32:27.529
worked very, very well.

01:32:27.529 --> 01:32:32.679
Any other questions,
thoughts, comments?

01:32:32.680 --> 01:32:34.320
Yeah, up at the back.

01:32:34.319 --> 01:32:37.599
What has surprised you
the most about the AI

01:32:37.600 --> 01:32:39.280
industry over the years?

01:32:39.279 --> 01:32:41.639
What has surprised me
the most about the AI

01:32:41.640 --> 01:32:43.000
industry over the years?

01:32:43.000 --> 01:32:45.760
Oh, that's a good one.

01:32:45.760 --> 01:32:48.920
I think what has
surprised me the most,

01:32:48.920 --> 01:32:50.840
and it probably shouldn't
have surprised me,

01:32:50.840 --> 01:32:53.720
is how much hype took over.

01:32:53.720 --> 01:32:56.520
I actually-- I honestly
thought a lot of people

01:32:56.520 --> 01:32:58.760
who are in important
decision making roles

01:32:58.760 --> 01:33:01.920
and that kind of thing would be
able to see the signal better

01:33:01.920 --> 01:33:03.840
than they did.

01:33:03.840 --> 01:33:09.360
And I think the other part
was that the desire to make

01:33:09.359 --> 01:33:13.039
immediate profits as
opposed to long-term gains

01:33:13.039 --> 01:33:14.800
also surprised me a lot.

01:33:14.800 --> 01:33:18.920
Let me share one story in that
space. One of the things

01:33:18.920 --> 01:33:22.279
that happened after Andrew and I
taught the TensorFlow

01:33:22.279 --> 01:33:25.759
specializations on Coursera
was that Google

01:33:25.760 --> 01:33:28.150
launched a professional
certificate,

01:33:28.149 --> 01:33:30.109
and the idea of this
professional certificate

01:33:30.109 --> 01:33:32.229
was that it would give a rigorous exam.

01:33:32.229 --> 01:33:33.729
And at the end of
the rigorous exam,

01:33:33.729 --> 01:33:38.149
if you got the certificate,
it was a high prestige thing

01:33:38.149 --> 01:33:40.189
that would help you find
work, and particularly

01:33:40.189 --> 01:33:43.949
at the time when TensorFlow was
a very highly demanded skill

01:33:43.949 --> 01:33:45.750
in order to get work.

01:33:45.750 --> 01:33:49.909
Running that program cost
Google $100,000 a year.

01:33:49.909 --> 01:33:52.750
Drop in the bucket,
not a lot of money.

01:33:52.750 --> 01:33:56.710
The goodwill that came
out of it was immense.

01:33:56.710 --> 01:33:57.710
I can tell you--

01:33:57.710 --> 01:34:01.230
I'll tell one story very
quickly. There was a young man

01:34:01.229 --> 01:34:03.789
who went public
in some advertising

01:34:03.789 --> 01:34:08.310
with Google, and
he lived in Syria.

01:34:08.310 --> 01:34:10.830
And we all know there was
a huge civil war in Syria

01:34:10.829 --> 01:34:12.390
over the last few years.

01:34:12.390 --> 01:34:14.810
And he got the
TensorFlow certificate.

01:34:14.810 --> 01:34:16.970
He was one of the first
in Syria to get it,

01:34:16.970 --> 01:34:18.909
and it lifted him
out of poverty,

01:34:18.909 --> 01:34:21.029
where he was able
to move to Germany

01:34:21.029 --> 01:34:23.329
and get work at a
major German firm.

01:34:23.329 --> 01:34:25.779
And I met him at an
event in Amsterdam

01:34:25.779 --> 01:34:27.699
where he told me his story.

01:34:27.699 --> 01:34:31.920
And now, because of the job
that he had in this German firm,

01:34:31.920 --> 01:34:34.699
he's able to support
his family back home

01:34:34.699 --> 01:34:36.460
and move them out
of the war torn zone

01:34:36.460 --> 01:34:41.539
into a peaceful zone all
because he got this AI thing.

01:34:41.539 --> 01:34:44.539
And there were countless
stories like that.

01:34:44.539 --> 01:34:47.039
Very inspirational,
very beautiful stories.

01:34:47.039 --> 01:34:48.539
But the thing that
surprised me then

01:34:48.539 --> 01:34:50.539
was sometimes the
lack of investment

01:34:50.539 --> 01:34:53.193
in that, where there was
no revenue being generated

01:34:53.193 --> 01:34:54.360
for the company out of that.

01:34:54.359 --> 01:34:57.420
We deliberately kept
it revenue neutral so

01:34:57.420 --> 01:34:59.480
that the price of the
exams could go down.

01:34:59.479 --> 01:35:01.576
We wanted it to self-sustain.

01:35:01.577 --> 01:35:03.159
It ended up not being
revenue neutral.

01:35:03.159 --> 01:35:06.220
It ended up costing the company
about $100,000 to $150,000

01:35:06.220 --> 01:35:06.840
a year.

01:35:06.840 --> 01:35:08.569
So they canned it.

01:35:08.569 --> 01:35:10.819
And it's a shame because of
all the potential goodwill

01:35:10.819 --> 01:35:12.279
that can come out of
something like that.

01:35:12.279 --> 01:35:13.939
But I think those
were the two that

01:35:13.939 --> 01:35:16.979
immediately jump to mind that
have surprised me the most.

01:35:16.979 --> 01:35:19.139
And then I guess one other
part that I would say

01:35:19.140 --> 01:35:24.850
is the people who've been able
to be very successful with AI,

01:35:24.850 --> 01:35:26.890
who you wouldn't think
would be the ones that

01:35:26.890 --> 01:35:29.910
would be successful with AI. That has
always been inspirational to me.

01:35:29.909 --> 01:35:32.034
So allow me one more story.

01:35:32.034 --> 01:35:32.909
I have a good friend.

01:35:32.909 --> 01:35:34.529
I showed ice hockey
a moment ago.

01:35:34.529 --> 01:35:37.569
I have a good friend who is a
former professional ice hockey

01:35:37.569 --> 01:35:38.489
player.

01:35:38.489 --> 01:35:40.689
Any ice hockey fans here?

01:35:40.689 --> 01:35:43.489
It's a brutal sport.

01:35:43.489 --> 01:35:46.090
You see a lot of fighting and
a lot of stuff on the ice.

01:35:46.090 --> 01:35:48.850
And he dropped out of school
when he was 13 years old

01:35:48.850 --> 01:35:50.530
to focus on skating.

01:35:50.529 --> 01:35:52.170
And he will always
tell everybody

01:35:52.170 --> 01:35:55.497
that he's the dumbest person
alive because he's uneducated.

01:35:55.497 --> 01:35:56.829
He and I are complete opposites.

01:35:56.829 --> 01:35:59.010
That's why we get on so well.

01:35:59.010 --> 01:36:03.250
And he retired from ice hockey
because of concussion issues.

01:36:03.250 --> 01:36:05.489
And he now runs a nonprofit--

01:36:05.489 --> 01:36:08.090
ice rinks run as a nonprofit.

01:36:08.090 --> 01:36:11.409
And about three years ago,
we were having a beer,

01:36:11.409 --> 01:36:13.529
and he was like, so
tell me about AI.

01:36:13.529 --> 01:36:15.269
And tell me about
this ChatGPT thing.

01:36:15.270 --> 01:36:16.315
Is it any good?

01:36:16.314 --> 01:36:18.189
And I was like, just
sharing the whole thing.

01:36:18.189 --> 01:36:19.939
Yes, it's good and all
that kind of stuff.

01:36:19.939 --> 01:36:22.689
And it was obviously a loaded
question, and I didn't know why.

01:36:22.689 --> 01:36:25.369
But part of his job
at his nonprofit

01:36:25.369 --> 01:36:27.329
is that every quarter,
he has to present

01:36:27.329 --> 01:36:30.250
to the board of directors
the results of the operations

01:36:30.250 --> 01:36:31.846
so that they can
be funded properly,

01:36:31.846 --> 01:36:33.429
because even though
they're nonprofit,

01:36:33.430 --> 01:36:35.289
they still need
money to operate.

01:36:35.289 --> 01:36:40.529
And he was spending upwards
of $150,000 a year to bring

01:36:40.529 --> 01:36:44.369
in consultants to pull the
data from all of the different

01:36:44.369 --> 01:36:45.256
sources.

01:36:45.256 --> 01:36:47.089
They're pulling data
from-- there's machines

01:36:47.090 --> 01:36:49.690
in what's called the pump room
that has a compressor that

01:36:49.689 --> 01:36:50.589
cools the ice.

01:36:50.590 --> 01:36:52.810
And there were spreadsheets
and there were accounts

01:36:52.810 --> 01:36:53.935
and all this kind of stuff.

01:36:53.935 --> 01:36:56.690
And he was not tech
savvy in any way.

01:36:56.689 --> 01:36:59.469
But he needed to
process all this data.

01:36:59.470 --> 01:37:02.350
So he did an experiment where
he got ChatGPT to do it.

01:37:02.350 --> 01:37:03.770
And this was the
loaded question,

01:37:03.770 --> 01:37:05.162
asking me if it was any good.

01:37:05.162 --> 01:37:06.869
And so we talked
through it a little bit.

01:37:06.869 --> 01:37:08.569
And then he told me why.

01:37:08.569 --> 01:37:10.206
And so I took a
look at the results

01:37:10.207 --> 01:37:11.789
because he was
uploading spreadsheets.

01:37:11.789 --> 01:37:13.930
He was uploading PDFs and
all this kind of thing

01:37:13.930 --> 01:37:15.670
and getting it to
assemble a report.

01:37:15.670 --> 01:37:18.529
And it takes him about two
hours to do the report himself

01:37:18.529 --> 01:37:19.689
with ChatGPT.

01:37:19.689 --> 01:37:22.039
And it worked, and it
worked brilliantly.

01:37:22.039 --> 01:37:25.880
And that $150,000 a year that
he's saving on consulting is now

01:37:25.880 --> 01:37:29.500
going to underprivileged
kids for hockey equipment,

01:37:29.500 --> 01:37:31.159
for ice skating
equipment, for lessons,

01:37:31.159 --> 01:37:32.340
and all of that kind of thing.

01:37:32.340 --> 01:37:34.797
So it was taken out of the
hands of an expensive consulting

01:37:34.797 --> 01:37:37.018
company and put into
the hands of people.

01:37:37.018 --> 01:37:38.560
Because of this guy,
and he says he's

01:37:38.560 --> 01:37:40.960
the dumbest person alive, but--

01:37:40.960 --> 01:37:44.840
I hope he's not
watching this video.

01:37:44.840 --> 01:37:47.159
And I told him afterwards
that, congratulations, you're

01:37:47.159 --> 01:37:48.639
now a developer.

01:37:48.640 --> 01:37:51.320
And he didn't like that.

01:37:51.319 --> 01:37:55.159
But it's surprises like
that, the superpowers that were

01:37:55.159 --> 01:37:58.599
handed to somebody like him,
who's not technical in any

01:37:58.600 --> 01:38:01.600
way, but he was able to
effectively build a solution

01:38:01.600 --> 01:38:05.320
that saved his nonprofit
$100,000 or $150,000 a year.

01:38:05.319 --> 01:38:07.759
And things like that
are always surprising me

01:38:07.760 --> 01:38:08.900
in a very pleasant way.

01:38:12.039 --> 01:38:12.800
Yep.

01:38:12.800 --> 01:38:13.300
Sorry.

01:38:13.300 --> 01:38:14.020
I'll get to you next.

01:38:14.020 --> 01:38:14.520
Sorry.

01:38:14.520 --> 01:38:15.360
Yeah.

01:38:15.359 --> 01:38:20.630
For engineers like us, it's
easier to navigate the hype

01:38:20.630 --> 01:38:24.869
because we can understand what
the signal is from a research

01:38:24.869 --> 01:38:25.630
paper.

01:38:25.630 --> 01:38:30.230
But how about people who don't
have this knowledge, like,

01:38:30.229 --> 01:38:36.029
from humanities or
something [INAUDIBLE]?

01:38:36.029 --> 01:38:38.167
Yeah, so just to repeat
the question for the video.

01:38:38.167 --> 01:38:39.710
For engineers like
us, sometimes it's

01:38:39.710 --> 01:38:42.167
easy to navigate the hype to
see the signal from the noise.

01:38:42.167 --> 01:38:45.829
But what about people who don't
have the same training as us?

01:38:45.829 --> 01:38:49.789
I think that's our opportunity
to be trusted advisors for them

01:38:49.789 --> 01:38:53.229
and to really help them
through that, to understand it.

01:38:53.229 --> 01:38:55.750
I think the biggest
part in the hype story

01:38:55.750 --> 01:38:59.229
right now is just understanding
the reward mechanism.

01:38:59.229 --> 01:39:01.669
That everything rewards
engagement rather than

01:39:01.670 --> 01:39:03.149
actual substance.

01:39:03.149 --> 01:39:05.909
And to me, step one is
seeing through that.

01:39:05.909 --> 01:39:08.309
The story I just
told about my friend,

01:39:08.310 --> 01:39:10.250
he'd seen all this
kind of stuff,

01:39:10.250 --> 01:39:12.409
but he wasn't willing
to bet his career on it.

01:39:12.409 --> 01:39:14.269
But he needed that
kind of advice

01:39:14.270 --> 01:39:16.847
around it and to start
peeling apart what he had done

01:39:16.846 --> 01:39:18.679
and what he did right
and what he did wrong.

01:39:18.680 --> 01:39:23.180
And so positioning
ourselves to be trusted advisors

01:39:23.180 --> 01:39:24.940
by not leaning into
the same mistakes

01:39:24.939 --> 01:39:27.159
that the untrained people
may be leaning into,

01:39:27.159 --> 01:39:29.300
I think is the key to that.

01:39:29.300 --> 01:39:32.659
And just understanding that
the average person is generally

01:39:32.659 --> 01:39:35.019
very intelligent,
even if they may not

01:39:35.020 --> 01:39:37.980
be experts in a specific
domain, and to key

01:39:37.979 --> 01:39:41.779
in on that intelligence, help
them to foster and grow it,

01:39:41.779 --> 01:39:44.550
and navigate them
through the parts

01:39:44.550 --> 01:39:46.300
where they'll have
difficulty and let them

01:39:46.300 --> 01:39:49.100
shine in what they're
very, very good at.

01:39:49.100 --> 01:39:51.579
Over here there was one.

01:39:51.579 --> 01:39:53.779
I have a question more
about AI and machine

01:39:53.779 --> 01:39:55.519
learning for
scientific research.

01:39:55.520 --> 01:39:56.060
OK.

01:39:56.060 --> 01:39:59.340
Which is something that
is very hard [INAUDIBLE]

01:39:59.340 --> 01:40:01.039
to get your perspective on.

01:40:01.039 --> 01:40:03.220
Where do you think
that is a good idea

01:40:03.220 --> 01:40:06.659
and where you might
say, maybe be cautious?

01:40:06.659 --> 01:40:09.960
So AI and machine learning
for scientific research,

01:40:09.960 --> 01:40:14.340
where is it a good idea and
where should you be cautious?

01:40:14.340 --> 01:40:16.250
Ooh.

01:40:16.250 --> 01:40:20.090
My initial gut check would be I
think it's always a good idea.

01:40:20.090 --> 01:40:23.489
I think there's no harm in
using the tools that you have

01:40:23.489 --> 01:40:26.729
available to you, but
always just double

01:40:26.729 --> 01:40:29.209
check your results and double
check your expectations

01:40:29.210 --> 01:40:31.730
against the grounded reality.

01:40:31.729 --> 01:40:36.209
I've always been a fan of
using automation in research

01:40:36.210 --> 01:40:37.189
as much as possible.

01:40:37.189 --> 01:40:40.694
My undergraduate was physics
many, many years ago,

01:40:40.694 --> 01:40:42.569
and I was actually very
successful in the lab

01:40:42.569 --> 01:40:44.929
because I usually automated
things through a computer

01:40:44.930 --> 01:40:47.369
that other people did
with pen and paper

01:40:47.369 --> 01:40:48.010
instead.

01:40:48.010 --> 01:40:49.270
So I could move quickly.

01:40:49.270 --> 01:40:51.190
So I know I'm biased
in that regard.

01:40:51.189 --> 01:40:54.309
But I would say, for most
research, for the most part,

01:40:54.310 --> 01:40:57.390
I think use the most powerful
tools you have available,

01:40:57.390 --> 01:40:58.910
but check your expectations.

01:41:03.210 --> 01:41:07.630
Little story actually on that
side. A trivia question:

01:41:07.630 --> 01:41:10.010
Poorest country
in Western Europe.

01:41:10.010 --> 01:41:11.270
Anybody know?

01:41:11.270 --> 01:41:12.130
Serbia?

01:41:12.130 --> 01:41:12.909
What's that?

01:41:12.909 --> 01:41:13.819
Or Western.

01:41:13.819 --> 01:41:16.799
Western Europe is Wales.

01:41:16.800 --> 01:41:19.260
So I actually did my
undergraduate in Wales,

01:41:19.260 --> 01:41:22.119
and I went back to do some
lectures in the university

01:41:22.119 --> 01:41:23.039
there.

01:41:23.039 --> 01:41:26.300
And I met with a
researcher there,

01:41:26.300 --> 01:41:29.480
and he was doing research
into brain cancer

01:41:29.479 --> 01:41:32.119
using computer imagery and
using various types of computer

01:41:32.119 --> 01:41:32.731
imagery.

01:41:32.731 --> 01:41:34.439
And I asked him, well,
what's the biggest

01:41:34.439 --> 01:41:35.939
problem that you have?

01:41:35.939 --> 01:41:38.099
What's the biggest
blocker for your research?

01:41:38.100 --> 01:41:39.960
And this is about
eight years ago.

01:41:39.960 --> 01:41:43.760
And his answer was
access to a GPU.

01:41:43.760 --> 01:41:46.840
Because for him to be
able to train his models

01:41:46.840 --> 01:41:50.079
and run his models, he needed
to be able to access a GPU.

01:41:50.079 --> 01:41:52.960
And the department
that he was in

01:41:52.960 --> 01:41:55.520
had one GPU between
10 researchers,

01:41:55.520 --> 01:41:57.760
which meant that everybody
got it for half a day.

01:41:57.760 --> 01:41:59.467
Monday through Friday,
and his half a day

01:41:59.467 --> 01:42:00.800
was Tuesday afternoon.

01:42:00.800 --> 01:42:02.760
So in his case, he would
spend the entire time

01:42:02.760 --> 01:42:05.000
that wasn't Tuesday afternoon
preparing everything

01:42:05.000 --> 01:42:07.100
for his model run or
his model training

01:42:07.100 --> 01:42:08.100
or everything like that.

01:42:08.100 --> 01:42:11.020
And then Tuesday afternoon,
once he had access to the GPU,

01:42:11.020 --> 01:42:12.780
then he would do the training.

01:42:12.779 --> 01:42:14.237
And then he would
hope in that time

01:42:14.238 --> 01:42:16.655
that he would train his model
and he would get the results

01:42:16.654 --> 01:42:17.340
that he wanted.

01:42:17.340 --> 01:42:20.440
Otherwise, he'd have to wait a
week to get access to the GPU

01:42:20.439 --> 01:42:21.399
again.

01:42:21.399 --> 01:42:23.439
And then I showed
him Google Colab.

01:42:23.439 --> 01:42:25.599
Anybody ever used Google Colab?

01:42:25.600 --> 01:42:27.960
And you can have
a GPU in the cloud

01:42:27.960 --> 01:42:29.579
for free with that
kind of thing.

01:42:29.579 --> 01:42:32.760
And the poor guy's
brain melted that--

01:42:32.760 --> 01:42:34.800
because I took out my
phone, and I showed him

01:42:34.800 --> 01:42:37.180
a notebook running on
my phone in Google Colab

01:42:37.180 --> 01:42:38.180
and training it on that.

01:42:38.180 --> 01:42:41.240
And it changed everything
for him research-wise.

01:42:41.239 --> 01:42:44.099
And now it was a case of--
and this was with Colab.

01:42:44.100 --> 01:42:46.520
He had much more than he
had with his shared GPU.

01:42:46.520 --> 01:42:49.477
So I think for someone
like him, machine learning

01:42:49.476 --> 01:42:51.059
was an important
part of his research,

01:42:51.060 --> 01:42:55.400
but he was so gated on it that
the ability to widen access

01:42:55.399 --> 01:42:57.927
to that ended up really,
really advancing his research.

01:42:57.927 --> 01:42:59.220
I don't know where it ended up.

01:42:59.220 --> 01:43:00.220
I don't know what he has done.

01:43:00.220 --> 01:43:01.740
It has been a few
years since then.

01:43:01.739 --> 01:43:06.920
But that story just came to mind
when you asked the question.

01:43:06.920 --> 01:43:09.680
Any more questions?

01:43:09.680 --> 01:43:11.210
Feel free to ask me anything.

01:43:14.390 --> 01:43:14.890
Oh, yeah.

01:43:14.890 --> 01:43:15.950
At the front here.

01:43:15.949 --> 01:43:17.510
It's more of a general question.

01:43:17.510 --> 01:43:21.270
You talked about AI helping
food and beverage use.

01:43:21.270 --> 01:43:25.590
Do you think AI would
be a force of social equality

01:43:25.590 --> 01:43:27.470
or social inequality?

01:43:27.470 --> 01:43:31.909
So can AI be a force of social
equality or social inequality?

01:43:31.909 --> 01:43:34.909
I think the answer
to that is yes.

01:43:34.909 --> 01:43:37.349
It can be both, and
it can be neither.

01:43:37.350 --> 01:43:39.470
I mean, I think that
ultimately, the idea

01:43:39.470 --> 01:43:45.310
is that, in my opinion, any
tool can be used for any means,

01:43:45.310 --> 01:43:48.310
so the important thing is to
educate and inspire people

01:43:48.310 --> 01:43:51.230
towards using things
for the correct means.

01:43:51.229 --> 01:43:53.369
There's only so much
governance that can be applied.

01:43:53.369 --> 01:43:56.269
And sometimes governance
can cause more problems

01:43:56.270 --> 01:43:58.150
than it solves.

01:43:58.149 --> 01:44:03.349
So I always love to live my
life by assuming good intent

01:44:03.350 --> 01:44:05.570
but preparing for bad intent.

01:44:05.569 --> 01:44:07.069
And in the case of
AI, I don't think

01:44:07.069 --> 01:44:09.527
there's any difference there
that everything that I will do

01:44:09.528 --> 01:44:12.240
and everything that I would
advise is assuming good intent,

01:44:12.239 --> 01:44:14.420
that people would use
it for good things,

01:44:14.420 --> 01:44:18.100
but also to be prepared
for it to be misused.

01:44:18.100 --> 01:44:20.660
The bad examples that I
showed earlier on, I think

01:44:20.659 --> 01:44:24.500
were good intent
rather than bad intent.

01:44:24.500 --> 01:44:26.739
And most mistakes
that I see are

01:44:26.739 --> 01:44:29.260
good intent being
used mistakenly as

01:44:29.260 --> 01:44:30.400
opposed to bad intent.

01:44:30.399 --> 01:44:33.199
But I would say that's the
only mantra that I can--

01:44:33.199 --> 01:44:35.979
the only advice that I can give
and that kind of thing is always

01:44:35.979 --> 01:44:40.379
assume good intent, but
prepare for bad intent.

01:44:40.380 --> 01:44:42.420
The AI itself has no choice.

01:44:42.420 --> 01:44:43.840
It's how people use it.

01:44:46.659 --> 01:44:49.659
Andrew, did you want
closing comments or--

01:44:49.659 --> 01:44:53.579
I think we were running
out [INAUDIBLE] time.

01:44:53.579 --> 01:44:55.738
But thank you for this.

01:44:55.738 --> 01:44:56.280
Really great.

01:44:56.279 --> 01:44:57.904
Thanks, everyone,
for all the questions

01:44:57.904 --> 01:45:00.019
on those creative solutions.

01:45:00.020 --> 01:45:00.520
All right.

01:45:00.520 --> 01:45:01.240
Thank you, Andrew.

01:45:01.239 --> 01:45:01.739
Thanks.

01:45:01.739 --> 01:45:04.289
[APPLAUSE]
