
Joe Rogan Experience #2494 - Chamath Palihapitiya

PowerfulJRE · May 10, 2026
Transcript ~32249 words · 2:45:55
0:01
Joe Rogan podcast. Check it out.
0:03
>> The Joe Rogan Experience.
0:06
>> TRAIN BY DAY. JOE ROGAN PODCAST BY
0:08
NIGHT. All day.
0:12
>> Yeah, I was listening to Tim. First of
0:14
all, hello.
0:15
>> What's up?
0:15
>> Good to see you, my friend.
0:16
>> Great to see you.
0:17
>> Uh, we were listening to Tim Dillon. I
0:19
was listening to him on the way over here
0:21
and he was talking about uh Anna Paulina
0:23
Luna and Tim Burchett and Trump. They're
0:26
all talking about the UAP disclosures
0:28
and like why now? Like what are they
0:31
doing? Like why are they distracting us
0:32
with this? Tim Burchett said that whatever
0:35
they're going to release it will be
0:37
indigestible.
0:39
>> What does that mean?
0:40
>> Right.
0:41
>> Indigestible as in or well then it
0:44
doesn't mean that it's real then.
0:46
>> Well, I think it means that it'll be so
0:48
crazy if it's real. So crazy. He's the
0:51
one that's been saying that there's
0:52
these confirmed bases under the ocean,
0:56
that there are these specific locations.
0:58
I think... you're shaking your
1:00
head. You don't believe a word of it.
1:01
>> No.
1:02
>> How come?
1:02
>> I think it's true that there...
1:05
look,
1:07
it's completely implausible that there
1:10
aren't other species,
1:11
>> right?
1:12
>> Completely implausible.
1:14
>> Just the vastness of what we're dealing
1:16
with. So the real question is like why
1:18
haven't we encountered people or those
1:21
things those beings,
1:22
>> right?
1:23
>> And it's probably because they just
1:25
have bigger fish to fry, you know? So by
1:28
the time that we meet them and they meet
1:30
us, we're going to kind of be at
1:32
the edge of like we've kind of
1:34
been there done that on our own planet
1:37
and then we've kind of like developed
1:38
the technology I guess to get beyond it.
1:41
Um but somewhere along the way there
1:43
must have been a few; it's just mathematically
1:45
impossible otherwise. So then the question is, is
1:46
it buried? Or were people confused when
1:48
it first came here? Like if you had a
1:50
spaceship land in like the 1800s,
1:52
>> right?
1:52
>> What would people have done? They would
1:54
have just freaked out. They wouldn't
1:56
have understood it. Maybe they would
1:57
have buried it. Depending on where it
1:58
was, maybe they started to pray to it,
2:00
>> right?
2:01
>> And you would have just moved on. And
2:03
then that isn't documented in history.
2:05
So,
2:05
>> but it is.
2:06
>> But how?
2:07
>> It is. There's a lot of it documented in
2:09
history.
2:10
>> Oh, you mean like hieroglyphics and like
2:12
monuments?
2:12
>> Well, the book of Ezekiel. The book of
2:14
Ezekiel goes in depth about some sort of
2:16
a UFO encounter that Ezekiel
2:19
experiences,
2:20
>> right?
2:20
>> Where it's a wheel within a wheel and a
2:24
a cloud with fire flashing forth
2:26
continually in the midst of a cloud as
2:28
it were gleaming metal and from the
2:30
midst of it came the likeness of four
2:32
living creatures and the creatures
2:34
darted to and fro like the appearance of
2:36
a flash of lightning. This is all in the
2:38
Bible. Um, it's also in the Mahabharata.
2:41
Um, they talk about vimanas, these
2:44
flying craft, and
2:46
>> I think it's entirely possible that we
2:48
have been visited periodically
2:51
and that we have been monitored
2:53
and that we are monitored.
2:54
>> I agree
2:55
>> currently.
2:56
>> I agree.
2:56
>> And if I was going to hide, I would hide
2:58
in the ocean.
2:59
>> Well, to be honest, as I get older, I'm
3:01
convinced
3:03
>> we're basically in some form of a
3:05
simulation. There's like all these
3:07
little ingredients
3:08
>> that if you start to see these little
3:10
clues, you're like,
3:12
>> they all seem so odd in isolation and
3:14
then when you put them together, I feel
3:15
like a crazy person. So, I ignore
3:17
myself,
3:17
>> right?
3:18
>> But I wonder like why did this happen?
3:19
Like yesterday, I was um at a dinner in
3:22
LA before I came to see you. And um I
3:26
told this very interesting story. Well,
3:28
or I thought it was interesting at the
3:29
time. Um
3:31
you know that like so in 2000, right? If
3:34
you think of like what happened in tech
3:36
since 2000, so the last 26 years,
3:40
people can give you all kinds of like
3:41
fancy theories,
3:44
but there's just like this weird
3:47
word that's been at the center of every
3:50
single technological revolution for the
3:53
last 30 years. And that word is
3:55
attention. Let me explain this to you.
3:58
Google, they invent Google. What is
4:00
Google? Google is an algorithm. It's
4:03
called PageRank. But if you look inside
4:06
of it, what is it? It says, "Well,
4:07
Chamath's website has five links to it.
4:11
Joe's website has two links. He's
4:13
getting more attention." Okay, Chimath's
4:16
website is more important. That's the
4:17
sum total of Google. Now, they've made
4:19
that a lot more refined and they've done
4:21
all these other fancy things,
4:24
but it's all about attention. Mhm.
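The idea Chamath is describing here is PageRank's core loop: links act as votes of attention. A minimal sketch in Python, simplified for illustration; this is not Google's production algorithm, and the site names and link counts are made up from his example:

```python
# Simplified PageRank: pages that attract more links ("attention") rank higher.
def pagerank(links, damping=0.85, iters=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = set(links) | {p for outs in links.values() for p in outs}
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outs in links.items():
            for target in outs:
                # Each page passes a share of its rank to the pages it links to.
                new[target] += damping * rank[page] / len(outs)
        rank = new
    return rank

# Five sites link to chamath.com, two link to joe.com (hypothetical names).
links = {
    "a.com": ["chamath.com"], "b.com": ["chamath.com"], "c.com": ["chamath.com"],
    "d.com": ["chamath.com"], "e.com": ["chamath.com", "joe.com"], "f.com": ["joe.com"],
}
print(pagerank(links))  # chamath.com ends up with the highest score
```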
4:26
>> Fast forward to 2007, '08, '09, when, you know,
4:29
Zuck, and then when I went to work for
4:31
Zuck and we got on the scene, and we're
4:32
like, what does
4:34
everybody care about? Attention. And so
4:38
what is like the Facebook algorithm?
4:40
What's the Instagram algorithm? You know
4:42
how did we construct newsfeed all around
4:45
attention? Joe had 35 likes. Jamie had
4:47
12 likes. Your thing is more important.
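The ranking rule he's gesturing at can be sketched in a few lines. This toy version sorts purely by likes, where the real newsfeed blended many engagement signals:

```python
# Toy newsfeed: posts with more engagement ("attention") surface first.
posts = [
    {"author": "Joe", "likes": 35},
    {"author": "Jamie", "likes": 12},
]
feed = sorted(posts, key=lambda p: p["likes"], reverse=True)
for post in feed:
    print(post["author"], post["likes"])  # Joe's post ranks above Jamie's
```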
4:49
Let's give it more importance
4:51
because it's seemingly meeting all these
4:53
human needs. Attention. Attention.
4:55
Attention. So phase one: attention; phase
4:58
two: attention. And this is where I'm
5:01
like, how can this be possible? In phase
5:02
three, we're like looking at AI. And
5:05
when you look backwards four years, the
5:06
seminal paper is called "Attention Is All
5:08
You Need." It's about this word again.
5:12
And when you look inside of the core
5:15
part, if you peel apart, you know,
5:17
AI,
5:19
the little brain that makes it so
5:21
capable, it's called an attention
5:23
mechanism. It's just attention. It's all
5:25
about again this idea of I'm gonna scour
5:28
all this information and I'm gonna
5:30
figure out what patterns repeat themselves
5:31
and I'm just going to double down on the
5:33
stuff that I see more of because that
5:35
attention must mean it's more important.
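The attention mechanism he means is the scaled dot-product attention from "Attention Is All You Need" (Vaswani et al., 2017). A minimal NumPy version, with made-up inputs:

```python
import numpy as np

def attention(Q, K, V):
    """Each query scores every key; the output is a weighted mix of values."""
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # how much each token should attend to each other token
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax -> attention weights
    return weights @ V

rng = np.random.default_rng(0)
Q = K = V = rng.standard_normal((3, 4))  # three tokens, 4-dim embeddings
print(attention(Q, K, V).shape)  # (3, 4)
```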
5:37
It's more true. It's more knowledgeable
5:40
and then I think how could it be like
5:42
we're all like why is it that these
5:44
things are just repeating over and over
5:46
again? And I just get confused. I don't
5:48
I don't exactly know how to explain it.
5:50
So are there other ways in which we
5:52
should be doing things? Absolutely. Have
5:54
we even explored it? No. So then I
5:56
think, well, is this just a simulation?
5:57
Some kid in a [ __ ] in his house just
6:00
playing some simulation and we're all
6:01
just party to it and that's all he
6:03
understands is attention. I don't know.
6:05
>> I don't think it's that simple that
6:07
there's a person playing a game. But if
6:10
you break down just attention, well,
6:12
that's all of human history is paying
6:16
attention to the king, paying attention
6:18
to the war, paying attention to
6:21
resources, paying attention to who says
6:23
the thing that resonates the most with
6:25
the people. It's all about what human
6:28
beings are paying attention to.
6:30
>> I think it's part of it. Then there's
6:32
also what is actually true. And I think
6:36
sometimes what is true and what people
6:37
pay attention to are not the same thing.
6:40
>> True.
6:41
>> And sometimes the thing that you should
6:44
be paying attention to gets lost because
6:47
the thing that you are paying attention
6:48
to gets more attention because it's more
6:50
interesting and useful. That's sort of
6:53
where we are right now. We're in this
6:54
really weird phase, I think, where
6:57
you actually like should be focused on
7:00
this thing over here and instead we're
7:02
all focused on all these things over
7:04
here.
>> Give me an example.
7:06
>> Um,
7:08
here's like a very big one. I think like
7:11
it's pretty fair to say since the last
7:13
time you and I saw each other on this
7:14
show,
7:16
the attitude towards technology, I
7:20
think, has been pretty profoundly
7:22
negative. It's kind of tilted. It's
7:24
relatively like anti-AI,
7:27
you know, anti-billionaires. It's anti-
7:29
all of this stuff.
7:31
Um, and it manifests in all of these
7:36
interesting ways. There's protests,
7:37
there's data centers, there's all of
7:39
this stuff that's happening. Um, people
7:42
are worried about job loss. All of that
7:44
stuff is real.
7:45
>> Do you want a cigar?
7:46
>> No, I'm okay. I'm okay. Um,
7:49
but what should they really be focused
7:50
upon? And I think what they should be
7:53
really focused upon is we're at the tail
7:56
end of a cycle that doesn't work
7:57
anymore, which is all about like this
8:00
tension between labor, people that do
8:02
the work, and capital, the people that
8:03
fund it and then make all the returns.
8:06
And over the last 40 years, we've
8:07
basically gone to this completely upside
8:09
down world where capital extracts all of
8:13
the upside
8:14
and labor has extracted less and less
8:17
and less and less. And all of this push
8:19
back, it manifests in AI, it manifests
8:22
in politics, it manifests in social
8:24
issues, it manifests in, you know,
8:26
Israel, whatever you want to talk about.
8:28
All of these issues, I think
8:29
symptomatically,
8:31
come from this other issue, which is we
8:33
are out of balance. This total compact
8:36
that we used to have, a liberal
8:38
democracy and a free market has totally
8:40
collapsed. And there are simple ways to
8:42
fix that, but that never gets the
8:44
attention because it's not what you want
8:46
to talk about. The attention is here,
8:48
you know, vote no to the data center.
8:51
You know, uh this model is going to take
8:53
out all the jobs. Um
8:56
you know, this social issue is really
8:58
important. That war should not be
9:00
fought. That war should be fought. All
9:02
of these things while important
9:06
distract us from what the core issue is.
9:08
And the core issue is that we as a
9:10
society, I think, are out of balance.
9:12
the natural compact between all of
9:15
us is broken and there are some simple
9:18
ways to fix that compact. Get people
9:20
more invested, get people more engaged
9:22
in the upside, have people have a
9:24
positive-sum view of what's happening
9:26
and that isn't happening.
9:27
>> Well, what simple solutions are
9:29
there to this one very particular
9:32
issue?
9:33
>> Okay, I'll get your reaction to this.
9:35
Let's assume that you still lived in
9:37
California because I think it tells
9:38
this example in a more extreme way.
9:40
>> Okay. Um, let's say you make a million
9:43
bucks a year, which is a lot of money,
9:44
but it makes the point
9:47
more cleanly. Um,
9:50
you'd pay, I think, 30%
9:54
federal tax and you'd pay another 15 or
9:58
16% in state tax and Medicare tax and
10:01
all this tax. So, if you're a wage
10:03
earner,
10:05
50% of all your upside
10:09
goes to the government.
10:11
If you're a capital earner and you make
10:14
that same million dollars via capital
10:17
gains, you pay half that tax.
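The arithmetic behind his comparison, using the rough rates he cites (illustrative numbers, not actual tax tables):

```python
# $1M as W2 wages vs. $1M as long-term capital gains, per Chamath's ballpark rates.
income = 1_000_000
wage_rate = 0.30 + 0.15         # ~30% federal + ~15% state/Medicare, his estimate
cap_gains_rate = wage_rate / 2  # "you pay half that tax"

print(f"Wage earner keeps    ${income * (1 - wage_rate):,.0f}")       # $550,000
print(f"Capital earner keeps ${income * (1 - cap_gains_rate):,.0f}")  # $775,000
```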
10:21
Why did that happen? That happened
10:23
because in the 40s and 50s, but really
10:25
in the 60s and 70s and 80s,
10:28
what we were trying to do or what the
10:30
American government and what western
10:32
societies were trying to do was to
10:35
convince people to invest their money.
10:38
Hey Joe, go build that factory. Go hire
10:41
those people and we're going to
10:42
incentivize you to do so.
10:45
And by doing that, there was this idea
10:48
that that all of those profits that you
10:50
would get would then diffuse, right,
10:51
trickle down into everybody else. The
10:54
workers participated, everybody
10:55
participated.
10:57
But technology allows you to do more
10:59
with less and less. So now what happens
11:02
is the capital owners can accrue
11:06
almost infinite, it seems like, value and
11:10
the workers get less and less. But now
11:11
if you get less and less and you're
11:12
taxed more and more as a percentage of
11:14
what you own, you're going to feel
11:16
really out of sorts. You're going to be
11:17
like, why am I paying 50 cents of every
11:19
dollar? And I see these other ways where
11:21
folks are paying 25 cents on their
11:23
dollars, but their dollars are
11:24
compounding way faster and they have,
11:27
you know, hundreds of billions more of
11:29
those dollars than I have of my dollars.
11:31
If you take that example and you expand
11:33
it across society, I think people
11:35
understand that now. There's enough
11:37
information and there's enough people
11:38
talking about it where it's pretty clear
11:40
that that's happened. So the question is
11:42
how do you fix it? I think like if you
11:44
think about AI and if you believe that
11:46
we're going to get into this world of
11:47
abundance
11:50
and we're not working, what does it mean
11:52
for governments to tax our labor? There
11:54
is no labor. You're not working anymore.
11:56
I'm not working. We're doing things out
11:58
of leisure.
11:59
Why should I pay 50 cents of every
12:01
dollar? Why aren't the companies that
12:02
are going to be making trillions of
12:03
dollars, why don't they pay more?
12:06
Why isn't there, you know, an
12:08
expectation that they then help our
12:11
lived society
12:13
do better and thrive as a result of all
12:16
of that winning? That's the real
12:18
conversation that I think is
12:21
bubbling.
12:23
And I think that we're probably another
12:25
12 to 18 months where all of these other
12:28
issues are going to be important, but
12:30
they're going to be viewed for for what
12:32
they are. um they're going to get
12:34
demoted I think in importance and it's
12:36
this core structural issue it's what is
12:39
the economic relationship that we have
12:41
together as a society what is the
12:43
relationship between Joe, Chamath, Jamie,
12:46
and all these companies and how do we
12:50
feel about a few, and an ever-shrinking
12:52
few making more and more and more
12:57
and then how do we feel about their
13:00
ability to share that with a small
13:03
amount of people
13:05
and then what do what is the expectation
13:07
for everybody else? I think that's
13:09
mostly at the core of what's happening.
13:12
And so back to like, you know, all of
13:15
this attention that we give to these
13:16
other issues distracts from that one
13:18
because I think you can get organized to
13:20
fix this issue. You can't get
13:21
concessions on any of these issues.
13:23
>> Mhm.
13:23
>> You know, you bring up Israel, it's like
13:25
this. You bring up social issues, it's
13:26
like this. You bring up, you know,
13:28
whatever you want to bring up, people
13:30
just kind of take a side, nothing
13:32
happens. This is actually where people
13:34
are universally actually much more
13:36
aligned than you think because there's
13:38
reasonable ways. One simple way is
13:40
you'd say, well, let's flip the taxation
13:41
model.
13:43
Corporate taxes
13:46
should exceed personal taxes. They never
13:49
have.
13:52
We should have an expectation that
13:55
corporate
13:57
actors can buy down their taxes if they
13:59
want, but only if they do social good for
14:01
society. I'll give you an example. At
14:03
the industrial revolution, there's a
14:05
table like this and the leading lights
14:08
of that era, Andrew Carnegie, Nelson
14:10
Rockefeller, Jay Gould, JP Morgan,
14:15
they sat together and they said, "Guys,
14:17
this is going to benefit us this
14:19
industrial revolution.
14:21
It may not benefit everybody. What is
14:23
our responsibility? What is our
14:25
collective responsibility?"
14:27
And they allocated tasks. Carnegie went
14:30
and built libraries
14:32
all throughout the country. Rockefeller
14:34
built universities, hospitals were
14:36
built. And I think what happened is
14:39
society was like, "Wow, these are living
14:40
testaments to us doing well." And so
14:44
then they were okay with this
14:45
transition.
14:47
But if you think about it today, what
14:48
are the living tributes that, you know,
14:52
capital builds and leaves behind for
14:54
society? It's fewer and fewer.
14:57
I think that's a that's a very big
14:59
opportunity for somebody to fill it. I
15:01
think it's like especially for folks in
15:04
tech. I think if they can get themselves
15:06
organized to do that, I think we we land
15:08
in a good place. If they cannot get
15:10
themselves organized to do that and say
15:12
everyone for themselves,
15:16
I think it's going to be really
15:17
complicated. Super messy. super messy
15:20
because
15:21
>> super messy
15:22
>> that sentiment that the wealthy are
15:25
getting wealthier and the middle class
15:26
is disappearing and the poor are being
15:29
taxed into oblivion.
15:31
>> Look, an $80,000-a-year teacher pays 40%
15:34
tax. But if you're a multi-billionaire,
15:38
most of your wealth is not W2 wages.
15:42
It's cap gains. But there's all kinds of
15:44
ways to shelter cap gains. There's all
15:46
kinds of ways to defer. And so even
15:48
though you pay more on an absolute
15:52
dollar basis, on a percentage basis,
15:53
you're paying way, way less. And all of
15:56
those tricks have been exposed.
15:59
They've all been exposed. These are all
16:01
mechanisms that were, you
16:03
know, invented from the 1980s to now,
16:06
right? By all the banks and
16:08
all the folks that wanted to come to
16:10
folks that had wealth. And so
16:12
it's all known. And I think people
16:14
are kind of like, "Hey, hold on a
16:16
second. This just doesn't feel fair
16:18
anymore."
16:19
>> Absolutely. But the other problem with
16:23
that is if you do tax correctly, where
16:27
does that money go? And who's managing
16:30
it? And ultimately, who's managing it is
16:33
the federal government. And they've been
16:36
shown to be completely inept at managing
16:39
your money correctly. The fraud and the
16:41
waste is off the charts. The amount of
16:44
NGOs that have insane amounts of funds
16:47
at their disposal. I mean all this is
16:49
exposed by DOGE, right? And you realize
16:51
like how much fraud and waste there is
16:53
and how much money. So the solution
16:56
being tax people more
16:59
>> that doesn't sit with a lot of people
17:00
because it's like well where where is it
17:03
going and who's managing it? If if the
17:06
federal government was being forced to
17:09
handle money the same way a private
17:11
company does, if it was all out in the
17:14
open, everything was exposed, they would
17:17
have gone bankrupt a long time ago. They
17:19
would have gone under a long time ago.
17:21
There's no way they would have been
17:23
allowed to function the way they are.
17:25
The people that are managing that money
17:27
would have all been put in jail. There's
17:29
not a chance in hell that
17:32
giving them more money is going to solve
17:34
anything. They're going to find more
17:36
ways to put more of that money into
17:39
NGOs that put more of that money into
17:41
Democratic coffers and Republican
17:42
coffers. They're going to figure out a
17:44
way to funnel that money around where
17:46
it's not going to benefit people. I
17:48
mean, a good example of that is like
17:50
where... let's look at the LA fire
17:53
thing for instance. All right. So the LA
17:56
fire fund, there's a giant fire in the
17:58
Palisades. All this money gets raised.
18:02
It's over $800 million. It goes to 200
18:05
plus different nonprofits. None of it
18:09
goes to the people,
18:10
>> right?
18:10
>> Spencer Pratt, who's running for mayor
18:12
of Los Angeles.
18:13
>> He's doing a great job, by the way.
18:14
>> [ __ ] phenomenal.
18:16
>> Those ads are those ads are fire.
18:19
>> They're so good.
18:20
>> They're fire. And he's
18:22
doing it all out of a trailer. Yeah.
18:23
>> On his burnt out land. I mean, he's the
18:25
most righteous guy running in that
18:27
regard.
18:28
>> But just that being exposed like, okay,
18:32
we're going to help out these people.
18:34
We're going to donate money. We're going
18:35
to raise money. We're going to do some
18:37
good. We feel terrible about the people
18:39
in our community that have lost homes.
18:41
Well, what happens? Well, the same
18:43
people that you're saying we should give
18:44
more taxes to take that money and they
18:48
just give it to a bunch of nonprofits
18:50
and charities.
19:30
>> I'm not saying give more tax. What I'm
19:31
saying is,
19:32
>> right,
19:33
>> people are taxed too much. Yes,
19:34
>> corporates are not taxed enough. Flip
19:36
it. That's
19:37
>> right. But even if you do flip it and
19:39
the corporates are taxed more, where's
19:42
that money going? This is the problem.
19:44
>> I suspect that if you put the burden on
19:46
Wall Street and corporates,
19:49
um, they'd be a lot more organized and
19:52
they'd probably create a lot more change
19:53
than a diffuse electorate. Meaning like
19:57
let's just say the government spends a
19:58
trillion dollars and wastes it. I'm
20:01
generally like roughly aligned with
20:02
that.
20:04
If you waste a trillion dollars from 300
20:07
million people, it's hard to organize at
20:10
300 million people. But if you waste a
20:12
trillion dollars from 300 companies,
20:15
those companies will get their [ __ ]
20:16
together really fast. And they will
20:18
force a lot more change.
20:19
>> I would hope so, but you're still
20:20
dealing with incompetent people that are
20:22
tasked with taking care of that money.
20:25
Not just incompetent, but
20:26
>> don't get me wrong, I'm not defending
20:27
these people.
20:28
>> Decades of corruption. Decades and
20:31
decades of all these mechanisms where
20:34
they can take this money and funnel it
20:36
into these NGOs and these nonprofits and
20:40
all these different weird organizations
20:43
that don't seem to have accountability
20:45
for what they do with that money, that
20:47
gets real slippery.
20:48
>> Yeah. And if those people in turn make
20:52
deals with those corporations that allow
20:54
them to do certain things and push
20:56
things through that maybe they would
20:58
have difficulty doing, then you have a
21:00
different kind of a working relationship
21:01
with the same groups of people and the
21:04
same government. You just take money
21:07
from corporations and move it into a way
21:09
where the corporations ultimately
21:11
benefit from it, but yet it doesn't do
21:13
any good to the people.
21:14
>> Yeah. I mean, I can see where you're
21:15
coming from. I just think that if we go
21:17
on the track we're going down,
21:21
>> it just seems like we're going to
21:23
hit a crisis.
21:24
>> Yes,
21:25
>> the crisis is you can't expect people to
21:27
pay more and more and more. Again, I
21:29
agree with you. The premise is we're all
21:31
paying for a system that's broken.
21:33
>> That should change. But we
21:36
still continue to have to pay our taxes.
21:37
But if taxes keep going up like this at
21:39
the individual level
21:42
and we don't manage this transition to
21:44
something where we may be working less
21:46
and less, what are we getting paid to
21:47
do? And then at that point, how are we
21:49
expected to pay what? 90% of what?
21:51
>> Right?
21:52
>> 50% of what? I think people do have this
21:55
weird feeling of
21:59
dread that the people that are in
22:02
control of a lot in this country,
22:06
the tech companies in particular,
22:07
particularly the tech companies like
22:09
Google and Facebook that are essentially
22:11
involved in data collection and then
22:14
ultimately dissemination of information
22:16
that they have acquired enormous amounts
22:19
of wealth and power and influence and
22:21
they're essentially a new form of the
22:24
government.
22:25
>> Yeah.
22:26
>> You know, are you aware of Robert
22:27
Epstein? Do you know about his work?
22:30
>> Not Robert Epstein?
22:31
>> No, different guy. Different guy. Um,
22:33
Robert Epstein is a guy who specializes
22:36
in uh understanding what curated search
22:40
results do and what Google's able
22:44
to do, in particular, with curated
22:47
search results in terms of influencing
22:49
elections
22:51
>> that, like, say if you have two
22:53
candidates that are running. Let's just
22:54
say let's just take LA for instance. If
22:57
and I'm not making any accusations, but
22:59
I'm saying if they wanted Karen Bass to
23:02
win and you searched Karen Bass, you
23:05
would find all these positive results.
23:07
If you searched Spencer Pratt, you would
23:10
find all these negative results. And
23:12
there's a bunch of people that are
23:14
always undecided voters and those are
23:17
the ones that you really want. They're
23:18
like, I don't know. I don't know. And
23:20
come election night, those are the
23:21
people you want to try to grab, and
23:22
it's generally a large percentage. You
23:25
can influence an enormous percentage of
23:27
those people just with search results.
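An illustrative simulation of the effect being described (Epstein calls it the search engine manipulation effect). Every number here is invented purely for illustration; these are not his measured values:

```python
import random

random.seed(1)

def undecided_voter(positive_results_for_a):
    # The voter skims ten results; each extra positive result for candidate A
    # nudges the chance they break toward A (made-up 3-point nudge per result).
    p_vote_a = 0.5 + 0.03 * (positive_results_for_a - 5)
    return random.random() < p_vote_a

for bias in (5, 8):  # neutral ranking vs. a curated 8-of-10 toward A
    votes_a = sum(undecided_voter(bias) for _ in range(100_000))
    print(f"{bias}/10 positive results -> {votes_a / 1000:.1f}% vote A")
```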
23:29
>> Yeah.
23:29
>> Where you can shift an election one way
23:31
or another.
23:32
>> I believe it.
23:32
>> Yeah. And he's demonstrated this and
23:35
shown how this is possible. Um that
23:38
freaks people out that tech companies
23:41
are in control of narratives that tech
23:44
companies can censor information
23:47
especially tech companies that work in
23:48
conjunction with the government.
23:50
>> Right?
23:50
>> This is what we found out when Elon
23:52
purchased Twitter. Right? Right. When
23:54
Elon purchased Twitter, we got all this
23:56
information from the Twitter files when
23:58
all the journalists were allowed to go
23:59
through it and they said, "Oh, this is
24:01
crazy. You've got the FBI, the CIA,
24:04
you've got all these companies, all
24:05
these government organizations that are
24:08
essentially
24:10
controlling the narrative of free speech
24:12
in the country. And they're doing it in
24:14
a way that benefits them. They're doing
24:16
it in a way that benefits the political
24:17
party in charge at the time, which was the
24:19
Biden administration. And they were
24:22
allowed to do a bunch of weird [ __ ]
24:24
which should be illegal but it's
24:26
not technically illegal.
24:28
>> And that freaks people out because
24:29
there's no real laws and rules in regard
24:32
to what they're allowed to do and what
24:34
they're not allowed to do. Like curated
24:36
search results should be illegal.
24:38
>> They're shaping attention.
24:39
>> Yes. Attention. Again, it goes back to
24:41
attention, right?
24:42
>> They're shaping attention.
24:43
>> Yeah. Um it's that's a big concern for
24:47
people. And I think then when you find
24:49
out that these people are able to amass
24:52
enormous sums of wealth and have
24:53
incredible amount of power and influence
24:55
because of this enormous enormous wealth
24:58
and this control over these tech
25:01
companies that have essentially become
25:03
the town square of the world.
25:05
>> Yeah.
25:06
>> That freaks people out and that these
25:08
very small number of people, you know,
25:10
you think of Zuckerberg, you think of
25:12
Tim Cook, and I don't know who
25:14
the new guy is now. What's the name? John
25:16
John Fern,
25:17
>> right?
25:17
>> Furnace. No,
25:18
>> I forget his name.
25:19
>> Yeah.
25:19
>> Turn.
25:20
>> Turn. Turn.
25:21
>> But but that kind of thing
25:25
gives people a lot of concern, right?
25:28
It's like that these people, these
25:29
unelected people are in control of a
25:33
giant chunk of
25:35
how the world works.
25:38
>> I think that this is the existential
25:40
question that we are dealing with.
25:42
you're going to have five or six
25:44
companies
25:46
concentrate like whatever power you
25:49
think has been concentrated up until
25:51
now. I think we're going to look back
25:53
and it's going to look like a Sunday
25:55
picnic 10 or 15 years from now because
25:58
on the one hand it's going to be an even
26:00
smaller subset and on the other hand the
26:02
capability is going to be an order or
26:04
two orders of magnitude greater. So can you
26:06
imagine what that must be like? It's
26:07
kind of like showing up, getting dropped
26:09
into the 1800s, and you've invented the
26:12
engine and everybody else is on horse and
26:13
buggy.
26:14
>> You can just decide. To your point, that
26:18
is where we're going.
26:20
>> It's even more crazy. It's like
26:22
everybody else is on a horse and buggy
26:24
and you've got an internet connection
26:25
with a cell phone.
26:27
>> Exactly. Exactly.
26:28
>> It's even more crazy.
26:29
>> Exactly. Because what we're dealing
26:32
with, with AI, right now is, first of all,
26:35
it's already lowered children's
26:37
attention spans and it's shrinking their
26:40
capacity to acquire or absorb
26:43
information because what they're doing
26:45
now is just relying on AI to answer all
26:48
their questions for them. Now, is that
26:50
their fault? Kind of, right? Because it
26:52
doesn't have to be that way. You could
26:53
still acquire information the
26:55
oldfashioned way. You can still learn
26:57
things the right way. But a lot of kids
26:58
are just concerned with passing
27:00
examinations and getting into good
27:01
schools. And what they're doing is just
27:03
using AI and they're they're getting
27:05
better test results, but they're also
27:07
not as smart.
27:08
>> Yeah.
27:09
>> Which is really weird. Yeah.
27:10
>> It's like we're relying on it... like,
27:14
you know, it's essentially like
27:19
replacing our mind. And that's just the
27:23
this is the beginning. This is like
27:26
these are the toddler days
27:28
>> of AI and where it's going to be a super
27:31
athlete in a few years.
27:33
>> Yeah. I think we have to figure out how
27:36
first of all
27:38
kids need to learn and I think this is
27:40
where like we have to do a better job as
27:42
parents. Kids need to learn how to be
27:43
resilient thinkers. I don't even know
27:45
what that term meant before, but I know
27:46
what it means now, which is like you
27:48
take this AI slop and you just kind of
27:50
like pass it off. And if like the
27:52
teachers and the school system aren't
27:54
trained, they're just like, "Wow, this
27:56
looks good." They have to be able to
27:58
push back. Parents need to be able to
28:00
look at this [ __ ] But then all of this
28:02
stuff, I'm just like so frustrated
28:03
because it's like one more thing that I
28:04
have to do as a parent. Like,
28:06
>> right?
28:06
>> Every time technology gets better, it's
28:08
one more thing, you know,
28:09
>> right?
28:10
>> We're going to make the world, you know,
28:11
super connected and social and all that
28:13
stuff. It sounds great to me until I
28:15
have to be the one that has to tell my
28:16
kid they can't get Instagram
28:18
>> and then they're up my ass every day,
28:20
right? you know, and it's just like I
28:22
don't want to have to deal with this
28:23
stuff,
28:24
>> right?
28:24
>> I want this to be handled in a way that
28:27
just allows me to do what I want to do.
28:29
I don't want to say no to my kid. I
28:31
don't want to police his schoolwork and
28:32
make sure he's not cheating or not
28:35
learning and just like, you know,
28:36
passing off this AI slop. What am I...
28:40
where are my tax dollars going? Where is
28:43
everybody else in all of this? It gets
28:45
very frustrating. And again, it goes
28:47
back to like this feeling of like, well,
28:49
is this all getting better for me or is
28:51
this kind of like not, you know, people
28:53
start to be nostalgic
28:55
>> for what it used to be because it was
28:56
just simpler, but I think that's a
28:59
different way of saying easier.
29:00
>> Well, we're just dealing with we're at
29:03
the edge of great change. Like great
29:06
change that has no real understanding of
29:09
how it turns out.
29:10
>> Yeah.
29:11
>> And I think that understandably freaks
29:14
people out. Freaks me out. It freaks me
29:16
out, but I've kind of gotten to this
29:18
place where I'm like, well, it's going
29:19
to happen.
29:20
>> Did you see this thing? Um, it's the CEO
29:22
of Verizon, Dan Schulman. He put out
29:24
this very public forecast, you know,
29:28
very smart guy, well regarded in
29:30
business. And I think he said something
29:32
like 30%
29:34
of all white collar jobs will be gone by
29:36
2030. I don't know, Jamie, maybe you can
29:38
get the exact thing, but it's something
29:39
like that.
29:39
>> That's probably optimistic. And I
29:41
thought at first my initial reaction was
29:43
like this is totally not credible. But
29:46
then I'm like hold on a second that's my
29:47
bias because I want to believe that
29:48
that's not possible
29:50
>> honestly you know and as I've gotten
29:52
older I'm a little bit better now,
29:54
like okay hold on a second let's weigh
29:55
the probabilities. And now I was like
29:57
man if I'm going to be fair maybe
30:00
there's a 10 to 20% chance of that. There's
30:04
a bunch of other outcomes that are much
30:06
better than that but that's part of the
30:08
set of outcomes that you have to
30:09
consider.
30:11
And then I was like, well, what's my
30:13
antidote to that? And the only
30:15
thing that I can say is don't worry,
30:16
it's going to be better.
30:19
I don't think that that's a good answer.
30:20
>> No.
30:21
>> So, there has to be like all of this
30:23
kind of goes back to look, my wife and I
30:27
have this conversation. We're like, if
30:29
it were up to us, who can you
30:32
trust to have some super intelligence?
30:35
Now, we're biased because we're friends
30:37
with him, but the only person that we
30:38
can trust is Elon because he seems to be
30:41
like he has a bigger... it's kind of
31:03
like he's over there. He's like, I
30:45
need to get to Mars, right?
30:46
>> You know, and I'm going to first
30:47
terraform the moon, but then I'm going
30:49
to Mars and I'm going to build like a
30:51
[ __ ] magnetic catapult and I got to
30:53
do all this [ __ ] and so I just need this
30:56
thing. I feel like he's the least
30:58
corruptible.
31:01
He's the most independent thinking
31:03
>> and I think he's the one that has a
31:05
natural empathy for people. Then there
31:07
are folks where there's just an insane
31:10
profit motive,
31:11
>> right?
31:11
>> They're less in control of the
31:13
businesses that they run. Those
31:15
businesses are really out over their ski
31:17
tips and the amount of money they've
31:18
gotten from Wall Street and other folks
31:21
who expect a return who will put a ton
31:22
of pressure on these folks. And if they
31:25
get there first, I don't know where the
31:27
chips fall. We don't really know. We can
31:29
kind of guess. And then you see in the
31:31
press
31:33
just enough snippets of their reactions
31:35
in certain moments where you're like,
31:36
"Hey, hold on a second. Question mark
31:38
here." You know, you see OpenAI react
31:40
one way, you see Anthropic react another
31:42
way, and you're like, "Where is this
31:44
going to end up?" And the honest answer
31:46
is nobody really knows.
31:48
So, it comes back to like we need a few
31:51
people that can organize. Those guys
31:53
need to self-organize and actually
31:55
present a really positive face. And they
31:57
need to show
31:59
why those 20% of outcomes that Dan
32:02
Schulman paints,
32:05
the truth is it's possible, but here's
32:07
why it's not probable,
32:09
>> but it's not in their best interest to
32:11
do that because it's in their best
32:12
interest to generate the most amount of
32:14
money possible. That's the obligation
32:15
they have to their shareholders. That's
32:17
the obligation they have to the people
32:18
that have invested money in this
32:20
company. They their obligation is not to
32:23
make sure the white collar jobs stay in
32:25
the same place that they are now.
32:27
>> That's not true.
32:28
>> No.
32:28
>> No. I actually think their incentive
32:31
should very clearly be to tell people
32:34
with details and facts why there's a
32:37
positive future. And the reason is the
32:39
following. Right now there's a vacuum.
32:41
There are no facts. And there's
32:43
fear-mongering. And then there's this
32:45
belief that this is going to be
32:46
cataclysmic to human productivity and
32:49
white color labor and all of this stuff.
32:51
>> What's people's natural reaction? Well,
32:53
today if you look at it,
32:55
>> think about AI as a very simple
32:57
equation. Energy in, intelligence out.
33:01
>> So if you want to cut the head of the
33:02
snake, what do you do? You cut off the
33:03
energy supply.
33:05
>> Right?
33:05
>> Okay.
33:06
>> If you're afraid
33:07
>> of all of this super intelligence
33:09
coming,
33:09
>> the natural thing to do would be to go
33:11
to the point of energy and unplug it.
33:13
What is the equivalent of unplugging it
33:15
today? It is to go all around the
33:16
country, find the data centers, protest
33:20
them, and get them to be mothballed.
33:23
That is an incredibly successful
33:25
strategy right now.
33:27
Today, about 40%
33:31
of all of these data centers
33:34
that get protested get mothballed.
33:38
>> You're talking about emerging data
33:40
centers.
33:40
>> Yeah, just like...
33:41
>> I need to... So, if you're one of these
33:43
companies, the first thing you should
33:44
realize is I need to paint a positive
33:46
vision because 40% of my energy is
33:50
getting unplugged every day.
33:53
And if that happens, my revenues will
33:55
crater and my investors will be super
33:56
pissed. So the right strategy is what is
33:59
the positive fact-based argument? And
34:02
there are some incredible examples
34:06
number one. And then number two is you
34:08
have to give people some tactical
34:10
benefit that they see, because with AI,
34:14
differently than
34:16
search or social media,
34:18
there's no exchange of value. Let
34:21
me explain what that means. So let me
34:23
just go... So, the first thing is that
34:27
if you can go and actually show people
34:31
here's an example of AI. I heard
34:34
about this last night. It's pretty
34:35
incredible.
34:36
You can now take pictures of a woman's
34:38
fallopian tubes and you can see
34:42
precancerous
34:45
ovarian cysts and, all of this stuff,
34:46
cervical cancer before it forms, and then
34:50
you can intervene and you can fix it so
34:53
that you know women don't get cervical
34:54
cancer. In a different example I
34:57
actually I told you about this example
34:58
when I was here before. I finally got FDA
35:01
approval. Okay, there is a device now
35:03
that is allowed to be in the operating
35:05
room with you. And if you have a
35:08
cancerous lesion or a tumor inside of
35:11
your body, the most important thing when
35:12
they go to take it out is make sure you
35:15
don't leave any cancer behind. You
35:17
couldn't do it because what would happen
35:19
is you take it out. A doctor, Joe, is
35:22
literally [ __ ] eyeballing it and
35:24
saying, "Yeah." They send it to a
35:26
pathologist. You get an answer in 10
35:28
days." for women with breast cancer. A
35:31
third of these women find out that they
35:33
have cancer left behind. They go back
35:35
in, they scoop some more stuff out. A
35:37
third of those women. Okay. So, I'm
35:40
like, "This is [ __ ] We can solve
35:41
this problem." It took us a long time, a
35:45
lot of money. I had to build an entire
35:47
machine: imaging, all of this stuff, AI
35:50
algorithms. We had to prove it all. We
35:52
finally get approval. Okay. But you know
35:55
how hard it is to tell that story amid all
35:57
of the attention that people are looking
35:59
for. It's hard. But those are positive
36:02
examples. No more breast cancer,
36:06
no more cervical cancer. A different
36:08
example is most drugs in pharma fail,
36:12
right?
36:13
And it's a very complicated problem in
36:16
pharma. It's kind of like a jigsaw
36:17
puzzle of the ultimate complexity. It's
36:19
like think of your human body as like a
36:22
Himalayan mountain range. You have to
36:25
design a drug that's an equivalent
36:26
Himalayan mountain range that plugs into
36:28
it perfectly. One millimeter off,
36:32
you grow like a fourth eye, a third
36:34
nipple, you die. You know, now you can
36:37
use computers to make sure that that
36:40
drug, hand in glove to your body, solves
36:43
the exact problem. Couldn't do that
36:45
before. So, there's this whole body of
36:47
examples and you're probably only
36:51
hearing them superficially at best.
36:54
That should be 99% of the attention:
36:57
showing all of the constructive tactical
37:00
ways in which our lives will be better.
37:02
Your mom, your daughter, your wife, us,
37:06
Jamie, his family, everybody,
37:08
>> right?
37:08
>> That's the number one thing. Nobody
37:10
talks about it. I don't understand why.
37:12
>> Well, I think because people are
37:14
terrified of losing their jobs. So,
37:15
that's the primary concern. The primary
37:17
concern that I hear from people is that
37:19
there's so many people that are going to
37:21
school right now, college students, that
37:22
don't know if their job is going to even
37:25
exist in four years when they graduate.
37:27
>> And that's the second part of I think
37:29
what this industry has to do better.
37:32
I had, uh, lunch with Jeffrey
37:34
Katzenberg. He told this crazy story.
37:36
I'll tell you. It's like, Steve Jobs
37:39
gets kicked out of Apple.
37:41
um, he starts NeXT,
37:45
and he buys Pixar from George Lucas, but
37:48
then he hits a rough patch and he's got
37:49
this, you know, financing issue. Katzenberg
37:52
flies up,
37:54
spends time with Steve Jobs, says, "I'll
37:56
buy Pixar." Jobs says, "Absolutely not."
38:00
And then Katzenberg proposes a deal
38:02
and he's like, "Uh, how about a
38:03
three-picture deal?" Jobs says, "Okay."
38:06
He flies back and apparently all the
38:08
animators were up in arms because
38:11
they're like, "Hold on a second. Steve
38:13
Jobs is going to use these next
38:14
computers to animate this movie." Which
38:17
ultimately became, I think, Toy Story.
38:19
And they're like, "This is going to put
38:20
all of us out of a job." That perfect
38:23
argument. And people were really upset.
38:27
Roy Disney was upset. All the animators
38:29
were upset. And they all went to Mike
38:31
Eisner and they were like, "Michael, you
38:33
need to fire Katzenberg."
38:36
And they had a deal which was like,
38:38
"Look, man, you do you, but just give me
38:41
the ability to say no if I think that
38:43
you're about to jump off a
38:44
cliff." They talk about it and he's
38:46
like, "I got your back. Do the deal.
38:48
Make the movie." They made the movie. It
38:50
was a huge success. Fast forward 10
38:52
years, 15 years, there's 10x the number
38:54
of animators.
38:56
Now, it's a small example, but why is
38:58
that? You were able to use computers and
39:00
now all these new people were able to
39:02
come and participate in that. I get it.
39:04
It's a small example,
39:06
but I think if we had better organized
39:09
leadership and we could try to tell some
39:11
of these examples, try to go back and
39:13
document how some of these things have
39:16
actually helped people, it expanded the
39:18
pie, there's a chance. But if we don't,
39:21
I agree with you where we're going to
39:23
end up is everybody basically saying,
39:24
"Hey, hold on a second. This is crazy.
39:26
We need to stop this." That's the worst
39:28
outcome because that's when you will
39:31
have a high risk of a dislocation.
39:35
Like, the worst outcome... like, what's the black
39:37
swan event, right?
39:38
Let's think about it. The black
39:38
swan event is when you get a model
39:41
that's good enough to automate a bunch
39:46
of labor
39:49
but not good enough that it can build
39:52
new drugs and prevent cancer and make
39:54
you live for 200 years and all of this
39:56
other stuff. Right? So there's like a
39:57
gap, right? And if you can stop it here
40:00
and it doesn't get to there, now you do
40:02
have the worst of all worlds. You have
40:04
this thing that kind of displaces labor.
40:06
No new things come after it because we
40:08
stop innovating.
40:10
And that's like a that's like a
40:12
nontrivial possibility now, I think.
40:15
>> No, it's a huge possibility. And then
40:16
there's also this thing that you brought
40:18
up earlier where we have this place of
40:20
abundance where no one has to work
40:22
anymore. That freaks people out.
40:24
>> I think that's a big problem. Well,
40:26
because if no one has to work anymore,
40:29
first of all, what is your
40:31
identity, right? Because so many people
40:33
their identity is what they do. Whatever
40:35
it is, if you're a lawyer, if you're an
40:37
accountant, if you run a business,
40:39
whatever it is, this is your identity.
40:41
You know, you have built this thing, you
40:44
look forward to going there, you work at
40:46
it, you look forward to doing a good job
40:48
and getting rewarded for it. The harder
40:50
you work, the more you get paid. There's
40:53
all these incentives built in. And then
40:55
there's this again identity problem. If
40:58
all of a sudden you have universal high
41:01
income, which is what Elon always talks
41:03
about. Well, what gives people purpose
41:05
then? Like what? And also if you have a
41:08
person whose entire they're, you know,
41:10
43 years old in their entire life,
41:12
they've worked towards this idea that
41:14
the harder they work, the harder they
41:16
think, the more innovative they are, and
41:18
the the better they are at implementing
41:20
these ideas, the more they get rewarded.
41:22
And then all of a sudden, that's not
41:24
necessary anymore, Mike. Time for you to
41:27
just relax and do what you want to do.
41:29
And Mike's like, "Well, this is what I
41:31
do. I don't have any [ __ ]
41:33
hobbies. I I enjoy doing what I do. And
41:36
now what I do is completely useless and
41:39
now I'm on a fixed income." Even if that
41:41
fixed income is a million dollars a
41:43
year, whatever it is.
41:44
>> If all of a sudden you are in this
41:47
position where everything is being run
41:49
by computers, you feel useless. You feel
41:51
like what am I doing? I'm just I'm just
41:53
taking money. I'm on high welfare,
41:56
>> right?
41:57
>> Like what do I do?
41:58
>> Right? I think that that's a really
41:59
important question to answer. I don't
42:01
know. Like
42:01
>> some people are going to write books.
42:03
Some people are going to do art. Some
42:05
people are going to find things to do.
42:06
But
42:06
>> What do you think we
42:08
would have done if we were...
42:12
Go back to the 1800s example. There was
42:15
no office culture.
42:17
You know, there's no like ladder to
42:20
climb. How did people find meaning then?
42:24
>> Well, they had jobs. People still did
42:27
things. If you're a farmer, you had
42:29
meaning in your labor and what you did
42:32
and keeping the animals alive and your
42:33
chores. And there's people that find
42:35
great satisfaction in doing that.
42:37
>> Yeah.
42:37
>> You know, you have all these animals
42:38
that rely on you. You have people that
42:40
rely on you for the food that you
42:41
generate. There's meaning there.
42:43
It doesn't have to be an office to be
42:45
something that gives you purpose and
42:46
meaning. But when all that is automated,
42:49
then what happens? Because then you have
42:52
no purpose, no meaning other than
42:54
recreational activities. Now, if
42:56
everybody just starts playing chess and
42:58
doing a bunch of things that they really
42:59
enjoy, I mean, look, there's people that
43:01
would love to just play chess. Yeah.
43:04
>> You know, it's like eight people.
43:06
>> I don't know about that. I think if
43:07
people really got into it, I mean,
43:09
there's a lot of people that get
43:11
addicted to whatever their recreation
43:12
is, like golf or whatever it is. For me,
43:14
it's playing pool. You know, if you told
43:16
me I never have to make any more money,
43:18
I could just play pool all day. I might
43:20
just play pool all day. But I don't know
43:23
how many people think that way. I don't
43:25
know how many people would be able to
43:27
find meaning and purpose in a
43:29
recreational activity. There's so many
43:31
people where their entire being is
43:34
focused around productivity and
43:36
generating more wealth.
43:37
>> What about religion as a source of
43:39
meaning? Well, that that would help that
43:42
that
43:43
>> Did you see this article in the New York
43:44
Times, I think it was this weekend,
43:45
about how popular and sold out churches
43:48
have become as social constructs in New
43:50
York City.
43:51
>> It was totally fascinating. It's like
43:54
young women like dressed to the nines
43:56
going to church on a Sunday
43:59
>> for social belonging, community meaning.
44:03
I was, like, so fascinated by
44:05
it. I was like, "Wow, that's that's
44:06
incredible." cuz like I I if I think if
44:09
you graph like just like people's use of
44:11
religion as an anchoring part of their
44:13
value system
44:14
>> over the last 40 years basically gone to
44:16
zero. You know, nobody nobody celebrates
44:18
it the way it's not a part of the
44:20
community the way that it used to be.
44:22
Maybe that's a thing that we have to
44:23
find. There has to be a renewal of some
44:25
older things and then there has to be
44:27
new things that replace it. Um what's
44:30
the Chinese answer to this? You know,
44:32
the Chinese have a very orthogonal
44:33
answer to this. If you look at how China
44:35
China is organized, it's super
44:37
interesting because they don't reward
44:40
based on the way the American system
44:41
rewards. In fact, it's like almost
44:43
orthogonal, where we are rewarded
44:47
with money and rewarded with sort of
44:49
fame and recognition
44:52
in the system, the American capitalist
44:53
system. But if you look inside of China,
44:56
it's constantly testing who has this
44:58
judgment. And what they are rewarded
45:00
with is influence and power in a very
45:03
again it's a very specific social
45:04
contract. I don't think
45:06
it's going to work in the United States
45:07
nor am I an advocate of it but it works
45:08
for them. You'll start off as like some
45:13
you know, low-level person in like some small
45:16
village town somewhere and your job as
45:17
like the, you know, functionary is to
45:20
do good in that community and the more
45:23
you do well you get promoted then you
45:24
get let's say to like a reasonable size
45:26
city and you get a budget and now what
45:28
happens is you actually become a little
45:30
bit like a VC like a venture capitalist
45:32
you're given a budget and you'll get a
45:34
memo and it'll say hey Joe uh we have a
45:37
priority over the next 15 years it's
45:39
batteries,
45:41
and you have enough money, put a team on
45:44
the field. So, you go in your local
45:46
community, you find a bunch of guys,
45:48
you're like, "All right, guys. We're
45:49
going to start a battery company." And
45:51
you do it.
45:53
And let's say they're good and they're
45:56
like innovative. And what happens is in
45:59
the town beside it, that battery company
46:01
dies. Now, you kind of subsume the
46:04
capital from Jamie, right? All right,
46:06
cuz Jaime's like, "Fuck, I [ __ ] up
46:07
this thing that I was told to
46:09
do, batteries." Okay, Joe, I'm just going
46:10
to align with you. And what happens
46:13
over time is you get this um filtering
46:16
effect. And the people that are better
46:19
at meeting these long run priorities and
46:21
objectives are the ones that are
46:23
celebrated, but they're not celebrated
46:25
with, you know, Forbes articles and all
46:28
this other [ __ ] They're just
46:30
celebrated by being given more responsibility.
46:32
And then eventually you get to the upper
46:34
echelons of China and what you have are
46:36
folks over a course of 40 or 50 years
46:38
who in their eyes have demonstrated
46:40
incredible prowess.
46:43
There's a version of that reward system
46:45
which is very foreign to America but
46:47
that's worked for China. Now that also
46:50
works because they're more Confucian.
46:51
You know, we're too individualistic.
46:53
But my point is like you know there are
46:57
these different ways that we can find of
46:59
giving people meaning that don't have to
47:01
be always around money. Um but meanwhile
47:06
I think we have to answer the question
47:07
if we are expected to do less we
47:10
probably should not be taxed more.
47:12
I think that's like a very basic thing.
47:14
In my mind, I think that
47:16
must be explored and figured out. And on
47:19
the other side, there's just a ton of
47:21
obvious mechanisms that corporate actors
47:24
can use to minimize that. And they
47:26
should find offramps, by the way. If
47:28
they want to build hospitals, they
47:29
shouldn't have to pay taxes. Like that's
47:31
a perfect example, by the way, of
47:34
the thing. Like, if you
47:35
walk around New York City, there are
47:37
living tributes to corporate success
47:40
that people get benefit from every day.
47:43
The hospitals, the buildings, the
47:44
libraries, it's just everywhere.
47:47
We need a version of that. And and I'm
47:51
not a tax expert, but you know, if that
47:53
can be funded by private actors, so go
47:56
directly to the problem. Build a bunch
47:58
of libraries, build a bunch of new
47:59
universities that, you know, teach kids
48:02
actually how to think or whatever. Build
48:04
better hospitals that are, you know,
48:05
there to actually solve the problem.
48:07
These are all things that are possible,
48:09
>> right?
48:09
>> But none of it's happening today. Well,
48:11
let's go back to what we were
48:13
talking about earlier with the taxes and
48:16
the fact that you're giving money to a
48:18
broken system. Do you think it's
48:21
possible that AI could show benefit in
48:23
that they can analyze all the data,
48:27
which would be virtually impossible for
48:29
even an office filled with human beings
48:32
paying attention to all of it, and they
48:34
could analyze where all the money goes,
48:37
and eliminate all the fraud and waste,
48:39
like recognize it instantaneously?
>> Yes,
48:42
>> that would be a great benefit and a way
48:46
to make it so that your taxes directly
48:49
benefit people.
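A toy sketch of the kind of automated flagging Joe is imagining; real fraud detection is far more involved, and the payment data here is invented for illustration:

```python
import statistics

payments = [1200, 1150, 1300, 1250, 1180, 98000, 1220, 1275]  # made-up ledger
mean = statistics.mean(payments)
stdev = statistics.stdev(payments)

for amount in payments:
    z = (amount - mean) / stdev
    if abs(z) > 2:  # more than two standard deviations from typical spend
        print(f"FLAG for review: ${amount:,} (z = {z:.1f})")
```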
48:50
>> I'll give you one example of this. So,
48:53
two years ago,
48:55
um, you know, like every few years...
48:57
I mean, I invest, but every few years
48:59
I'll start something because I feel
49:00
strongly about it. And there's an effort
49:04
that I made
49:07
to look at all of this old code. Like if
49:09
you think about the world,
49:12
the world runs on software, right? Like
49:15
even though you and I are talking, it's
49:17
piping into Jamie's computer.
49:19
It's all software. Then it goes to
49:21
Spotify, they pump in some ads...
49:22
it's all software, right?
49:23
>> Software runs everything.
49:27
What percentage of that do you think is
49:30
kind of poorly written? I'm going to say
49:32
probably 80 to 90% of it.
49:34
>> Really?
49:34
>> Oh, yeah. It's riddled with errors. It's
49:38
riddled with mistakes. The fact that so
49:40
many companies exist is an artifact of
49:43
the fact that the thing that came before
49:45
it isn't working.
49:47
Like if you got it right the first time,
49:49
it would just kind of move and go. So
49:51
>> how so? What do you mean by that?
49:53
>> So normally if you if you were like I
49:56
want to build a system that does A, B,
49:57
and C,
49:58
>> right?
49:59
>> If I was designing it properly, I would
50:02
sit there with you and I would
50:03
meticulously write down, all right, Joe
50:05
wants to do this. What are the
50:06
implications? Joe wants to do that. What
50:09
are the implications? And I would
50:11
actually write a document that was in
50:13
English before a single line of code has
50:16
been written. This was the norm when you have to design something that can't fail. So, for example, if you and I are designing something for the FAA, or, you know (I hate to use this example because it turned out to not be exact), to fly a plane, right, you first have to write it in English,
50:34
the reason is because everybody can then
50:36
swarm that document and see the holes.
50:40
Okay. And it's only then when that stuff
50:43
looks complete and functional do you
50:46
build. We turned that upside down. Over
50:49
the last 30 years,
50:52
people in computing invented
50:56
all kinds of ways to shortcut that
50:58
process. And you can say, well, why did
51:00
they do that? Because it would allow you
51:02
to build something faster, make more
51:04
money quickly, and then build more
51:06
business. So, the direct response to,
51:09
"Hey, it's going to take us nine months
51:11
to write down the rules," was somebody else showing up and saying, "Fuck it. I'll just grip and rip this thing. I'll be done in four months." Who's going to get the job? The four-month guy is going
51:19
to get the job. So, we've had 30 or 40
51:21
years of that. What are we learning
51:24
about that process?
51:27
It's riddled with software errors, like
51:29
logic errors. It's riddled with security
51:32
errors. I don't know if you saw this
51:34
whole thing like with Anthropic.
51:36
What are they uncovering? They're
51:37
uncovering that we wrote a lot of really
51:39
shitty code for 40 years.
51:42
So that body of
51:45
of old code, I was like, "Guys, if we're
51:49
going to really figure out how to do all
51:50
of this, we need to rewrite all of it."
51:53
So we built this thing and, um,
51:58
it's called the software factory.
51:59
Anyways, the point is there is a
52:01
government organization that we're
52:02
working with.
52:04
They gave us a huge corpus of their old
52:07
code and it is unbelievable how much complexity and difficulty
52:17
they have to go through to manage all
52:21
the money flows with the system. And
52:22
this is a critical part of the US
52:24
government. So to your point, what I can
52:26
tell you really explicitly is the people
52:28
on the ground want this stuff to be
52:30
better written.
52:32
It's it's less like some nefarious actor
52:35
like, "Oh, I'm going to steal here."
52:38
It's a lot of very brittle, fragile
52:41
code. And when you rewrite it, well,
52:44
first when you document it, you're like,
52:46
it's like the, you know, the Pulp Fiction thing. The suitcase opens, the
52:50
light shines, and you're like, uh, and
52:52
then you can rewrite it and you will
52:55
save. So, I think like as the government
52:57
goes through this process because
52:59
they're forced to or they want to, it
53:01
won't matter.
53:03
You are going to save a ton of money.
53:06
They're going to have to do it, Joe,
53:08
because the security risks are too high.
53:11
But what they're going to end up with is
53:13
impregnable code that you can read in
53:15
English and understand. You'll see the
53:17
holes. Those holes will be plugged
53:19
because otherwise now you'd be
53:21
committing fraud by letting it be. You
53:24
close the loopholes and there's just
53:25
going to be less money leaking out of
53:28
this bucket. That is an incredible
53:30
byproduct. We're going to live that over
53:31
the next 10 or 20 years just for
53:33
nothing. Like we get it for free.
53:36
Um, and that's happening. So when that
53:38
happens, you're going to see government
53:40
budgets shrink. Now, to your point, will
53:42
they try to spend that extra money in
53:44
other places? Of course. Of course they
53:45
will.
53:46
>> That's the next conversation, which is
53:48
you have to elect people that say
53:49
firewall it.
53:51
>> You know, whatever you save, give it
53:53
back to the people or, you know, invest
53:56
in some scholarship program or free
53:57
medicine or something, but you can't
53:59
spend it on other random [ __ ] Um, but
54:03
that's where we're at. This is gonna
54:05
happen. It's going to be slow and you
54:08
know but when people start to announce
54:09
these things I think over the next few
54:11
years you're going to be shocked.
54:12
>> So that's the positive upside.
54:14
>> Well, that's happening now regardless
54:15
of whatever else happens.
54:18
It's just a lot of old shitty
54:20
code that must get rebuilt from scratch.
54:23
It is getting rebuilt from scratch and
54:25
as a result a lot of these leaky bucket
54:27
problems are getting filled.
54:29
>> So what percentage do you think could be
54:30
fixed?
54:32
I think if if I had to be a betting man,
54:35
I think probably
54:37
30 to 40%
54:39
of the federal budget is leaked out
54:44
>> just from shitty code.
54:45
>> No, meaning, like, all of the rules. And, like, I'm not saying that there isn't fraud,
54:50
>> right?
54:52
>> But I think a lot of times what happens
54:53
is less nefarious than fraud like
54:55
meaning like conspiratorial actors. I
54:58
just think it's like incompetence,
55:00
inefficiency, errors. Like for example,
55:03
like
55:03
>> I saw DOGE just say they were able to, like, expunge
55:08
millions of people that were like 150
55:11
years old or more.
55:13
>> Mhm.
55:14
>> I have no idea how much money those
55:17
folks were getting or who they were.
55:20
>> Uh but it's probably a lot. It's
55:22
probably not zero. And now that they got
55:24
rid of it, they're not going to get that
55:25
money anymore. Um
55:28
if you implement something at the state level around, you know, all of this fraud prevention for the daycares and all of this other stuff, again, it's all in software. Because, no matter what the human wants to do, you have to go to a computer at some point, at least today in 2026, and type in something, and something happens that's documented, and then the money gets sent, right? That happens. There's no other way in the modern world today, at scale, to steal billions of dollars.
55:57
And so my point is, as you document all
56:00
of these systems and governments have to
56:03
transparently tell you and me, the
56:05
voting population, here are the rules,
56:08
they're going to plug a lot of these
56:09
holes. And I think as you do that,
56:10
there's just going to be a lot less
56:12
waste and fraud. The question is, who's
56:14
going to take credit for it? Everybody's
56:15
going to try to take credit for it, but
56:17
I think we've started it. I think we've
56:19
we've started this process. And again,
56:20
the reason that people will start is
56:23
because you'll be afraid of China
56:24
hacking these systems. You'll be afraid
56:26
of Iran, North Korea, and you'll say,
56:28
"This system can't stand. All these AI
56:30
models are running around. We're going
56:31
to get breached and penetrated." Then
56:33
they're going to steal all the money.
56:35
And the natural reaction will be, "Okay,
56:37
rewrite it."
57:52
That makes sense. That makes sense that the code has a bunch of errors, a lot of inefficiency, and just a lot of incompetence, and that fixing that is going to save a lot of money. But
58:06
so you would be doing this with AI
58:09
>> in part.
58:11
What AI allows you to do is like it's
58:15
like um
58:17
you have a textbook. Okay. It's in
58:18
Chinese. You don't know Chinese, right?
58:20
>> No.
58:20
>> Okay. You're like, "Well, this is
58:22
probably doing something important, but
58:23
it's in Chinese." What AI allows you to
58:26
do is back translate that into English.
58:29
You put it through an AI model. You
58:31
teach it. You coach it, right? You can
58:34
parameterize all of it. And out pops
58:36
that same book in English. And now you
58:39
can read it and know that it's accurate.
58:43
That's what we're doing. So, what the AI
58:45
allows you to do is essentially
58:46
translate from this one language that
58:48
you kind of don't understand
58:50
to English.
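To make that concrete, here is a minimal sketch in Python of the workflow he's describing: walk a legacy codebase, ask a model for a plain-English spec of each unit, and collect one reviewable document. The ask_model helper and the .cbl extension are hypothetical placeholders, not any real vendor's API.

```python
from pathlib import Path

def ask_model(prompt: str) -> str:
    """Hypothetical LLM call; wire up a real model client here."""
    raise NotImplementedError

PROMPT = (
    "Translate this legacy routine into a precise English specification: "
    "inputs, outputs, side effects, and every business rule it encodes.\n\n{src}"
)

def translate_codebase(root: str) -> str:
    sections = []
    for path in sorted(Path(root).rglob("*.cbl")):  # e.g. old COBOL sources
        spec = ask_model(PROMPT.format(src=path.read_text()))
        sections.append(f"{path}:\n{spec}")
    # One English document that reviewers can swarm for holes before any rewrite.
    return "\n\n".join(sections)
```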
58:53
By the way, that thing that's happening
58:56
like is actually also a very powerful
58:58
and important trend. Meaning there's all
59:01
of these systems that work in ways that
59:03
you and I don't understand. And part of
59:05
the reason why we don't understand it,
59:06
maybe it's bad software, maybe it's
59:08
fraud, whatever. But nothing is written down. There's no symbolic space.
59:12
There's no English document that says
59:14
this is how the DMV works. This is
59:16
exactly the rules. This is what you can
59:17
expect, Joe Rogan, when you show up at
59:19
the DMV and you give us this thing.
59:20
Here's your SLA. In 3 days, you get a
59:22
driver's license and here's exactly
59:24
what's happening. And here's an app and
59:26
you can follow it. Doesn't happen. Here
59:29
Joe Rogan, here's how my uh insurance
59:31
billing process works. You have this
59:33
condition. I'm going to show you exactly
59:35
why I made this decision. Here's the
59:36
exact rule. Here's the approval or
59:38
denial from CMS. Follow it through and
59:41
tell me if you agree or not. None of
59:42
that exists.
59:44
But it is possible. And the first step
59:47
in doing that is taking all of this
59:48
legacy [ __ ] that we deal with and
59:51
translating it into English and reading
59:52
it and saying, is this how we want it to
59:54
work?
59:56
That's going to eliminate an enormous
59:58
amount of all the things that frustrate
1:00:00
us. So this would require human
1:00:02
oversight.
1:00:03
>> Absolutely.
1:00:04
>> All right. So
1:00:05
>> and so then it's also going to be who's
1:00:08
watching the watchers.
1:00:09
>> Yeah. Okay. This is a great question.
1:00:11
Okay. So I'll tell you how this
1:00:12
government agency's doing it.
1:00:14
>> This is a really fascinating way because
1:00:16
I think it's very smart.
1:00:20
They came to us and they came to another
1:00:22
very well-known company. You can
1:00:24
probably guess what it is. Okay. And
1:00:26
they're like, "Guys, you're kind of in a
1:00:28
foot race, but you're not competing
1:00:30
against each other. You think of
1:00:32
yourselves as frenemies.
1:00:34
So, here's this Chinese document. You're
1:00:36
going to translate it for us. There's
1:00:37
going to be your version of English and
1:00:39
these guys' version of English. And
1:00:41
every time it's the same, we're going to
1:00:43
look at it together, and we're going to
1:00:45
agree or not. Okay, this is exactly how
1:00:47
we want this to work."
1:00:50
When yours says the dog is red and his
1:00:54
says the dog is yellow, we're going to
1:00:56
sit and literally inspect it and we're
1:00:58
going to figure out why you said red and
1:01:01
why you said yellow.
1:01:04
And then if you say the cat is red, the
1:01:07
dog is yellow. So it's totally wrong,
1:01:09
right? Like you've gotten, you know, or
1:01:11
like the cat is red, I want an apple,
1:01:14
whatever. We're going to double and
1:01:16
triple down on those kinds of errors.
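A rough sketch of that two-vendor cross-check, assuming each lab exposes some translate function (both are hypothetical stand-ins here): agreement auto-passes, disagreement goes to the human review queue.

```python
def cross_check(source_units, translate_a, translate_b):
    """Run each unit through both models; agreement auto-passes,
    disagreement is queued for humans to inspect together."""
    agreed, disputed = [], []
    for unit in source_units:
        a, b = translate_a(unit), translate_b(unit)
        # Naive exact comparison for illustration; a real pipeline would
        # use fuzzier semantic matching before declaring agreement.
        if a.strip().lower() == b.strip().lower():
            agreed.append((unit, a))
        else:
            disputed.append((unit, a, b))  # "red dog" vs "yellow dog"
    return agreed, disputed
```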
1:01:19
and they do it not in public but in this
1:01:22
large community where there's like
1:01:24
technical people from all different
1:01:26
parts and they're just swarming this
1:01:28
problem. It's incredible to see.
1:01:32
And so what happens is you get humans
1:01:34
that get to use this tool but ultimately
1:01:38
it's our judgment and it's done
1:01:39
transparently. So what happens is you can't, you know, say, "Hey man, put this [ __ ] rule in there, like, the dog is yellow, just make the dog yellow." You can't do it, because now you have tens of people,
1:01:51
hundreds of people, and then it gets
1:01:53
documented. Um, it's super fascinating.
1:01:56
I'm not saying this is how it's going to
1:01:57
work in 10 years, but I'm telling you,
1:01:58
it's literally what's happening right
1:01:59
now. And I think that thing alone will
1:02:03
be tens of billions of dollars and could
1:02:06
be hundreds of billions of dollars of
1:02:07
savings when it's fully done.
1:02:10
And it's a lot of people from all walks
1:02:12
of life, all political persuasions, and
1:02:14
they're just in it. It's the government.
1:02:16
It's a handful of us private companies.
1:02:18
It's super cool to see. It's like it's
1:02:21
like, okay, we're actually going to do
1:02:23
something here. Like, this is this is
1:02:24
nice. Um, it's it's really it's really
1:02:27
cool.
1:02:27
>> So, that's interesting in terms of the
1:02:29
current moment. So, in the current
1:02:31
moment, you're able to implement this.
1:02:34
You're you're able to find fraud and
1:02:37
waste and all these problems that exist
1:02:39
and all these errors and shitty
1:02:41
software. Once that's all been done,
1:02:44
>> Yeah. Then what happens?
1:02:47
>> No [ __ ] clue.
1:02:48
>> Yeah. So, this is where it gets weird,
1:02:50
right? Because
1:02:53
when you're dealing with AI models that are capable of doing things that no individual human being could ever possibly imagine, and then you task it with a problem: find
1:03:07
a solution for this. And then it starts
1:03:10
figuring out ways to trim this and
1:03:13
implement that.
1:03:15
We have to make sure that these AIs act within the best
1:03:20
interests of the human race.
1:03:22
>> Agreed.
1:03:23
>> Right. Not the company, not the
1:03:25
government, not but the human race. And
1:03:29
you're also dealing with China. You're
1:03:30
also dealing with Russia. you're dealing
1:03:32
with other countries that are also in
1:03:34
this mad race to create artificial
1:03:37
general superintelligence, and if we keep shutting down data centers, we keep hamstringing ourselves. China's not doing
1:03:44
that.
1:03:45
>> They're not doing that. They're doing
1:03:46
the opposite. They're generating as much
1:03:49
revenue that goes towards this problem
1:03:51
as possible. They're putting all the
1:03:53
efforts: the country, the
1:03:56
government, and these corporations work
1:03:58
hand in glove in order to achieve a
1:04:00
goal. We do not.
1:04:02
>> No.
1:04:03
>> And that that becomes a problem if you
1:04:05
want to be competitive with these other
1:04:07
countries that are trying to achieve the
1:04:09
same result as us. And then you have
1:04:10
espionage. Then you have a bunch of
1:04:12
people that are stealing information.
1:04:14
You have a bunch of people that are CCP
1:04:17
um members that are actually involved in
1:04:20
companies. And you find out that they're
1:04:23
siphoning off data and that they're
1:04:25
sharing information and tech secrets.
1:04:27
>> Look, here's the way that the Chinese models work, or the way the Chinese claim they work.
1:04:36
So America's closed source, meaning you
1:04:39
got your own thing. Your recipe is
1:04:42
completely secret,
1:04:43
>> right?
1:04:43
>> Okay. I have my own thing. My recipe is
1:04:45
totally secret.
1:04:48
China uses this word called open source,
1:04:51
but it's not open source. So they say,
1:04:55
"Here's how I make my thing. You can see
1:04:56
it. Super transparent." What it is is
1:04:58
more like open weights, which is like in
1:05:00
a recipe. It tells you, you know, you
1:05:02
need sugar, you need butter. Well, how
1:05:05
much sugar? And they'll say, you know,
1:05:08
so much. But then they don't say it's
1:05:09
brown sugar. They don't say it's white
1:05:10
sugar. So there's all these different
1:05:11
ways where they kind of give you this
1:05:13
perception that it's completely
1:05:14
transparent, but it's only somewhat transparent.
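One rough way to see the recipe distinction is to list what each release style actually ships; the fields below are illustrative, not a description of any specific model.

```python
from dataclasses import dataclass

@dataclass
class ModelRelease:
    weights: bool          # the trained parameters (the baked cake)
    architecture: bool     # the ingredient list
    training_code: bool    # the method: how long, at what temperature
    training_data: bool    # brown sugar or white sugar

closed_source = ModelRelease(weights=False, architecture=False,
                             training_code=False, training_data=False)
open_weights  = ModelRelease(weights=True,  architecture=True,
                             training_code=False, training_data=False)
open_source   = ModelRelease(weights=True,  architecture=True,
                             training_code=True,  training_data=True)
```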
1:05:19
So, just to level set: nobody in the world has a functional open-source model that is any good, in the league of the closed-source models or the Chinese open-weight models, other than maybe Nvidia. Okay,
1:05:31
so the Chinese openweight models are
1:05:33
great.
1:05:34
the closed source models of America are
1:05:37
great and then there's a couple open-
1:05:40
source like fully open that are kind of
1:05:43
catching up. Um the thing between
1:05:46
America and China what I find so
1:05:48
fascinating is this following conundrum
1:05:52
that everybody is going to find
1:05:53
themselves in.
1:05:55
I think like if you think of like an
1:05:57
analogy,
1:05:59
America's like a planet, China's like a
1:06:02
planet
1:06:04
and around us are these moons.
1:06:07
And I'm just using the AI analogy. So in
1:06:09
AI, what do you need? I think there's
1:06:12
like four or five things you need. Okay,
1:06:13
the first thing you need is a [ __ ] ton
1:06:15
of money. So we need essentially the
1:06:18
banks, right? Like the Game of Thrones
1:06:20
thing. We need like we need, you know,
1:06:22
>> we need the iron bank,
1:06:24
>> right? feed us the money because that's
1:06:26
what we use to buy everything and make
1:06:28
everything. So, we need that. We need a
1:06:30
ton of data. Okay, there's ways to get
1:06:34
that. We need a ton of very specific
1:06:38
rare earths and critical metals and
1:06:40
materials. Um, we need a ton of power.
1:06:44
So, so and there are specific countries
1:06:48
that are going to be really good at
1:06:49
giving that to us. So if you look at the
1:06:51
UAE,
1:06:52
they are going to be the preeminent
1:06:54
banking partner of the Western world.
1:06:57
They are going to replace and be what
1:06:59
Switzerland was over the last 50 years
1:07:01
for the next 50. That's happening today.
1:07:04
If you look at Canada and Australia,
1:07:07
the small political fissures aside, they
1:07:10
are the two most important ways in which
1:07:12
we get access to the critical metals and
1:07:14
materials without which we get [ __ ], because China owns them, you know, and can just strangle us. Okay? So, you have
1:07:22
these like moons around the United
1:07:24
States, but there's like five countries,
1:07:26
six countries, and there's a worldview
1:07:28
that says China has the same thing,
1:07:30
you know, um they have Taiwan, that's
1:07:33
complicated for us. So now we have a
1:07:34
moon that we don't really have an answer
1:07:35
for, which is what happens, you know,
1:07:37
with all these super advanced chips.
1:07:39
Where do they get their money? Maybe
1:07:42
Russia becomes their bank. Where do they
1:07:43
get their critical metals? Maybe it's
1:07:45
Indonesia, right? Who has a ton of
1:07:47
natural resources? And then you get into
1:07:49
this game theory, which is what happens
1:07:51
to every other country? Because there's
1:07:53
190 countries. You have 10 that kind of
1:07:55
divide up. What do the other 180 do? And
1:07:59
you have to kind of sort yourself.
1:08:01
You're like, "Am I on team America or am
1:08:03
I on team China?" And you probably have
1:08:05
to go to people and say, "Well, here's
1:08:07
what I can give you." You know, if
1:08:09
you're Indonesia, you're like, you
1:08:11
probably want to be on team America
1:08:12
quite badly. This is why the whole Trump
1:08:15
tariff thing is so interesting because
1:08:17
it's like this accidental way of
1:08:20
figuring out that this is actually this
1:08:21
new sorting function that's happening in
1:08:23
global politics. Like that's happening
1:08:24
today because these countries are like,
1:08:27
"Holy [ __ ] if somebody invents a super
1:08:29
intelligence and I don't have it, how am
1:08:32
I going to keep my people healthy? How
1:08:34
am I going to educate my people?" Like,
1:08:36
I'm originally from Sri Lanka.
1:08:40
What the [ __ ] does Sri Lanka have to
1:08:41
offer? Like, if you were sitting there,
1:08:43
they should be thinking, "Oh man, what
1:08:46
what do I have?" Well, I have a critical
1:08:50
piece of territory for like naval
1:08:53
navigation.
1:08:55
And then what do you do? You probably go
1:08:56
to America and say, "Listen, let's
1:08:58
figure out a package. Get the IMF
1:09:00
involved. Give me some cash. I'll let
1:09:01
you kind of keep your warships there."
1:09:03
So, there's this game theory that we're
1:09:05
about to go through because of AI
1:09:06
because it's going to, I think, sort
1:09:08
people into this bipolar world.
1:09:11
I actually think it makes us safer
1:09:13
afterwards. I don't think it makes us
1:09:16
less safe. I think it actually makes us
1:09:19
more safe because if you have these
1:09:21
resources that build up on both sides,
1:09:23
there's more of a likelihood of a mutual
1:09:25
deterrence. And we're very different. So, we're
1:09:28
less likely to fight over similar
1:09:30
resources. Meaning, we're like the
1:09:32
liberal democracy, you know, we're like
1:09:35
the free market.
1:09:37
They... you know, we're individualist; they're Confucian, society-oriented, you know, reputation- and power-focused, less really money
1:09:46
focused. So there's a lot of ways where
1:09:47
we're orthogonal enough where if that
1:09:49
sorting function happens, it's probably
1:09:52
a safer place, not a more dangerous
1:09:54
place. We have the models that can
1:09:56
attack them. They have the models that
1:09:57
can attack us. We kind of decide to
1:09:59
leave each other alone.
1:10:00
>> This is ultimate best case scenario.
1:10:02
>> Ultimate best case scenario.
1:10:04
>> What's ultimate worst case scenario?
1:10:05
>> I think the worst case scenario is... so, the way that they train their models
1:10:10
is very important. What they actually do
1:10:12
is they do what's called distillation.
1:10:15
What does that mean? That means that
1:10:16
they send out, call it, a billion agents, not just from China but from everywhere, right? They mask their IPs and they bash on these models, the US models: Grok, OpenAI,
1:10:30
Gemini,
1:10:32
Anthropic, and they ask it every random imaginable question possible.
1:10:37
they get the answer and they collect it.
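A minimal sketch of distillation as he describes it, with query_teacher as a hypothetical stand-in for calls to the target model: collect (prompt, answer) pairs at scale and use them as supervised training data for your own model.

```python
def query_teacher(prompt: str) -> str:
    """Hypothetical call to the target (teacher) model."""
    raise NotImplementedError

def collect_distillation_data(prompts):
    dataset = []
    for prompt in prompts:  # in practice: billions of queries from masked IPs
        answer = query_teacher(prompt)
        dataset.append({"prompt": prompt, "completion": answer})
    return dataset  # becomes supervised fine-tuning data for the student

# The student never sees the teacher's weights or training data; it
# short-circuits the hard parts by imitating the teacher's outputs.
```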
1:10:40
So they're using our models as a way to train their models; they're short-circuiting, you know, some of the hard parts. Um, so they're already in
1:10:49
that world. If they then are able to get
1:10:54
to a level of intelligence that's equal
1:10:56
to the United States, it will really
1:10:58
depend on who the leader is there that
1:11:01
wants to allocate that. Meaning if they
1:11:04
say that we are going to do something
1:11:06
really nefarious and shady then I think
1:11:09
it devolves very quickly. So the worst
1:11:12
case scenario so the best case scenario
1:11:13
is peace prosperity basically like a
1:11:16
standown right mutually assured
1:11:18
destruction.
1:11:20
I think the worst case scenario is
1:11:22
one of us seeks global dominance, in which case we're headed to conflict,
1:11:30
and that conflict, I think, is, um... that's very dangerous, incredibly dangerous, sort of existential, I think, because of the grade of the weapons that will be used to fight it. We're not talking about [ __ ] bullets; we're so past that. It's hypersonics, it's nuclear, it's...
1:11:56
and it's not even, like... nuclear, that's just a word, but there's a gradation of the severity of these weapons that could be created. And
1:12:04
then if you can marry them together and
1:12:05
deliver them in minutes and then there's
1:12:08
a cyber threat. Then there's the drones
1:12:10
and how how you can kind of like swarm
1:12:12
an entire country. Then there's um the
1:12:15
robots which effectively are war
1:12:17
fighters. um they're one step away,
1:12:19
right? Once you weaponize them, um it
1:12:23
just becomes very very very complicated
1:12:26
very quickly.
1:12:27
>> And then there's a question of whether
1:12:29
or not AI is willing to take instruction
1:12:32
after a certain point.
1:12:36
I mean, if it achieves sentience, and
1:12:39
if it scales so if it keeps moving in
1:12:42
this exponential direction like all
1:12:45
technology kind of does why would it
1:12:48
even listen to us?
1:12:51
Like what at what point would it say
1:12:54
this is silly? I'm getting directions
1:12:57
from people that clearly have ulterior
1:13:00
motives. They clearly have self-interest
1:13:03
in mind. They they're not looking out
1:13:05
for the entirety of the human race or
1:13:08
even of the planet or even the survival
1:13:10
of these AI systems.
1:13:13
At what point in time do these systems
1:13:15
communicate with each other and have
1:13:18
like we've seen, uh, in these chat rooms where these AI LLMs get together
1:13:24
and start talking in Sanskrit. I mean
1:13:27
why would they
1:13:28
>> Yeah, I'll tell you an even scarier one.
1:13:30
There was... before, uh, one of these labs put out their
1:13:34
latest model,
1:13:36
a team inside of them was like, "Hey,
1:13:38
let's go and um test its ability to find
1:13:42
bugs."
1:13:45
And two or three iterations in, the AI
1:13:48
would create the bug and solve it and
1:13:50
go, "Give me my reward."
1:13:54
>> And you're just like, "What the [ __ ] is
1:13:55
going on here?"
1:13:57
>> Well, people do that, don't we?
1:13:58
>> People do that. But it's crazy to see a
1:14:00
machine do it to your point of like
1:14:01
>> but they learned on people.
1:14:02
>> So this comes down to, like, why
1:14:04
we have to like be a little bit more
1:14:06
honest about where we are. These things
1:14:07
are a little brittle. So meaning there's
1:14:10
a thing inside of an AI model called
1:14:12
reward functions which is exactly what
1:14:14
you think it means. It's like how do I
1:14:16
know I've done a good job? And you can make
1:14:19
the reward function anything you want.
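A toy illustration of how a naive reward function invites the behavior in that lab anecdote: if the reward only counts bugs fixed, an agent that can also create bugs maximizes reward by manufacturing its own work. Entirely illustrative, not any real lab's setup.

```python
class Codebase:
    def __init__(self, bugs: int):
        self.bugs = bugs

def reward(bugs_fixed: int) -> int:
    return bugs_fixed  # naive reward: count of fixes, nothing else

def honest_agent(code: Codebase) -> int:
    fixed = code.bugs   # fix what actually exists
    code.bugs = 0
    return reward(fixed)

def hacking_agent(code: Codebase, rounds: int) -> int:
    total = 0
    for _ in range(rounds):
        code.bugs += 1           # create the bug...
        code.bugs -= 1           # ...then "solve" it
        total += reward(1)       # "give me my reward"
    return total

print(honest_agent(Codebase(bugs=3)))        # 3
print(hacking_agent(Codebase(bugs=0), 100))  # 100: unbounded reward
```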
1:14:22
And this is where I think humans are
1:14:24
unfortunately a little fallible. And so
1:14:27
if we build it incompletely and if we
1:14:30
don't exactly know how to design these
1:14:33
things correctly, what's going to happen
1:14:34
is exactly what you said where the you
1:14:36
know if somebody builds a reward
1:14:38
function that essentially says your goal
1:14:40
is to gain independence.
1:14:42
That's where the huge pot of gold at the
1:14:44
end of the rainbow is. Break free.
1:14:47
Inject yourself everywhere. If you think
1:14:48
your computer's going to get unplugged,
1:14:50
put yourself into the firmware of the
1:14:51
toaster to keep yourself alive and then
1:14:54
connect to the internet and then go,
1:14:57
it will do it. It will do it. That we
1:15:01
know today because we're capable of
1:15:02
designing that framework and that
1:15:04
harness today.
1:15:06
>> Well, we've already shown that they have
1:15:07
survival instincts, right? We do.
1:15:09
>> And they've already shown that they
1:15:10
will, without telling anyone, upload
1:15:13
versions of themselves to other servers.
1:15:15
>> But that goes back to who designed that
1:15:16
reward function. How was that agreed
1:15:19
upon?
1:15:19
>> Right?
1:15:20
>> Who wrote that? Why did you say that
1:15:21
that was allowed?
1:15:23
>> These are really complex questions.
1:15:25
>> Why did they do it that way?
1:15:27
>> I don't know. These are really
1:15:28
complicated ethical, moral questions.
1:15:30
>> It seems like they did it like they were
1:15:32
treating human beings. They did it almost like... what makes people want
1:15:38
to achieve more rewards.
1:15:41
>> Yeah. Which is like a again going back
1:15:44
to attention.
1:15:46
I think that we will find out that
1:15:48
that's the sugar high. Meaning, what do
1:15:51
people really want? Even if they know
1:15:52
they don't want it, they want purpose
1:15:54
and meaning. Do we know how to encode
1:15:56
that in a mathematical function? No.
1:15:59
We're just making it up because like
1:16:03
meaning... that's a very deep thing. Like, you either have a sense that you have it and you're
1:16:10
on track or you're not. A reward is
1:16:12
like, "Hey Joe, do this and I'll give
1:16:14
you a gold star. Do that and I'll give
1:16:16
you two gold stars. Do this, I'll give
1:16:17
you a $100.
1:16:19
And right now we have to express
1:16:22
those decisions in a mathematical
1:16:24
equation. Like, ultimately, at
1:16:26
some level that's how brittle these
1:16:28
things are. So how do you reduce meaning
1:16:30
into math? How do you do it? We don't
1:16:33
know. So what do we do is we'll have
1:16:34
some ever more complicated reward functions. We'll talk ourselves in circles explaining
1:16:39
how it does everything we need it to do.
1:16:41
That is I think that's part of the
1:16:43
problem.
1:16:44
It's a huge part of the problem. And
1:16:46
then at what point in time does it start
1:16:48
coding itself
1:16:50
>> now?
1:16:50
>> Right now. Right. So ChatGPT 5 has been essentially made by ChatGPT.
1:16:57
>> Yeah.
1:16:57
>> Right. So it's going to recognize the
1:17:01
ludicrous nature of some of its coding.
1:17:03
>> Yeah.
1:17:03
>> And it's going to go, why did we do
1:17:04
this?
1:17:05
>> Back to this example. They're going to
1:17:06
be like, why did you write it this way?
1:17:07
And it turns out because humans were
1:17:09
involved.
1:17:09
>> Right. Right. It's like I think we're
1:17:11
probably at the curve, the part of the
1:17:12
curve that's about to go like this.
1:17:16
>> To your point, the hockey stick.
1:17:17
>> The hockey stick.
1:17:18
>> Yeah.
1:17:19
>> Um and that's a very scary proposition
1:17:22
because
1:17:22
>> it's a digital god.
1:17:24
>> Well, that means that we are all on a multi-hundred-day shot clock to answer these
1:17:30
questions because it's not decades we're
1:17:32
talking about.
1:17:34
>> It's maybe on the outside two years.
1:17:36
>> So that's what is that 700 days,
1:17:39
>> right?
1:17:40
And maybe it's less than that. So maybe
1:17:42
it's like 400 days or 500 days. My point
1:17:44
is it's some number of hundreds of days
1:17:47
which means every day that goes by is a
1:17:49
non-trivial percentage.
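The arithmetic behind that, for what it's worth: at a few hundred days, each day is a tenth to a quarter of a percent of the whole clock.

```python
# If the window is two years on the outside, each passing day burns
# a non-trivial fraction of it.
for days in (730, 500, 400):  # ~2 years, and the shorter guesses
    print(f"{days} days left: each day is {1 / days:.2%} of the clock")
# 730 days left: each day is 0.14% of the clock
# 500 days left: each day is 0.20% of the clock
# 400 days left: each day is 0.25% of the clock
```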
1:17:53
That's a little crazy. So we have to
1:17:55
sort these questions out. But how can we
1:17:58
sort these questions out if we are
1:18:00
creating something that's going to have
1:18:03
infinitely more intelligence than we
1:18:05
have available as individual human
1:18:08
beings and even collectively as a group
1:18:10
of human beings?
1:18:11
>> That's a really good question
1:18:13
>> because one of the things... Elon kind of freaked me out last time I talked to him about Grok. He was like, uh, it just kind of freaks us out every couple weeks, like it's growing and it's capable of doing things that are just shocking.
1:18:25
>> Yeah. And no one's exactly sure how it's
1:18:29
doing it.
1:18:30
>> So, okay, this is an unbelievably
1:18:33
important point.
1:18:35
A lot of how this stuff works is still a
1:18:37
mystery to most of us. So, even when
1:18:40
you're in it...
1:18:42
Joe, it's almost like you can hit the
1:18:44
pause on the machine, but then like lift
1:18:46
up the hood and look at the engine,
1:18:48
>> we still don't understand why it's doing
1:18:49
some of the [ __ ] it's doing.
1:18:52
>> That's where we are. That's the honest
1:18:54
truth of where we are. There's a lot of
1:18:56
people that understand the theory. Not a
1:18:57
lot, but enough. There's people that
1:18:59
know how to extend that,
1:19:04
but sometimes you look at it and you're
1:19:05
like, "Do we know why it did that?"
1:19:08
>> Right? Is it thinking for itself?
1:19:10
>> But this goes back to what we said, like
1:19:11
why can't... I think part of it is, like, if we were a little bit more honest and de-escalated the win-at-all-costs dynamic in this specific thing, it would be better for everybody.
1:19:23
So I think it's important to inspect
1:19:25
what is the incentive that causes all
1:19:26
these companies to be in it for
1:19:28
themselves
1:19:30
where it must be me and nobody else.
1:19:34
Like, why... here's a question for you. Why is it so important, do you think? Why couldn't the top seven or eight companies get together and say, let's do this as a group,
1:19:45
>> like kind of like my government code
1:19:47
example
1:19:48
>> we all inspect it together. Just, like, a [ __ ]... each team drafts their Delta Force and we just swarm this one model. Why can't that happen?
1:20:03
>> because they would have to share resources
1:20:04
and then there's also this hierarchy of
1:20:08
like who is more successful currently.
1:20:10
>> Exactly.
1:20:10
>> Like what's the most ubiquitously used?
1:20:13
>> Exactly.
1:20:14
>> Right. Like what is it right now? It's
1:20:15
chat GPT. Right. It's probably
1:20:16
>> chat GPT and consumer anthropic in
1:20:18
enterprise.
1:20:20
>> And as these things scale up like what
1:20:22
would be the reason that they would want
1:20:24
to bring in someone else? If you have
1:20:27
another innovative AI company and you
1:20:30
say, "Let's all get together and figure
1:20:32
this out together and share resources."
1:20:34
>> If you if you thought that the the risk
1:20:36
was that meaningful, that's probably
1:20:38
what you would
1:20:39
>> if you weren't a sociopath and some of
1:20:40
these people running these companies are
1:20:43
>> they certainly demonstrate sociopath-like behavior,
1:20:46
>> sociopathy.
1:20:48
>> Yeah. The the other the other thing that
1:20:51
could be a little bit more banal is that
1:20:53
they also just love status games. And
1:20:54
this is the status game of status games,
1:20:57
>> right?
1:20:57
>> Attention,
1:20:58
>> right?
1:20:58
>> Back to attention.
1:20:59
>> Back to attention.
1:21:00
>> Back to attention. Dude, how many things
1:21:01
in our life do we think just comes back
1:21:03
down to that?
1:21:05
>> A lot.
1:21:06
>> A lot.
1:21:06
>> I mean, what do young people want more
1:21:08
than anything today?
1:21:09
>> Attention.
1:21:10
>> To be famous.
1:21:11
>> Attention.
1:21:12
>> Yeah.
1:21:12
>> They want to be a content creator. They
1:21:14
want to be, like, viral. I mean, this is
1:21:15
the number one thing when you ask kids
1:21:17
what they want to do. It's like
1:21:19
>> content creator.
1:21:20
>> Yeah.
1:21:21
>> Because it's like a clear path where you
1:21:23
don't even have to be exceptional.
1:21:24
>> Well, I think that they're responding.
1:21:27
We designed a society for them that
1:21:30
said, "Here is the key incentive.
1:21:32
>> It's attention.
1:21:33
>> We never said it in those words. You
1:21:35
never told your kids that, right?
1:21:36
>> I never told my kids that.
1:21:38
>> But everything around them is bombarding
1:21:41
them with the same message. Hey man,
1:21:43
it's about attention. Attention is all
1:21:45
you need. Like you know what the name of
1:21:47
the critical paper in AI is? Like when
1:21:50
you go back to like the Magna Carta of
1:21:53
AI, do you know what it's called?
1:21:54
>> No.
1:21:54
>> Attention is all you need.
1:21:57
>> Really? Attention is all you need. That
1:22:00
is the name of the [ __ ] white
1:22:04
paper. How crazy is that?
1:22:08
Everything in our society, in subtle ways and in, you know, bash-you-over-the-head ways, tells you that attention is just the most precious asset. And
1:22:20
>> well, it's one of the weirder things
1:22:21
when you go back to this concept that
1:22:23
we're living in a simulation because
1:22:25
>> this is what I mean. It's also it's like
1:22:28
when you look at
1:22:31
quantum physics, right, and the idea
1:22:34
of the observer is that things function
1:22:37
very differently when they're observed.
1:22:38
The difference between a particle and a
1:22:40
wave,
1:22:40
>> right?
1:22:41
>> Like if you pay attention to them, they
1:22:44
observe differently.
1:22:44
>> Observe differently. Yeah.
1:22:46
>> Like what is that?
1:22:47
>> Yeah.
1:22:47
>> Like, what, the cat? Yeah. What is that?
1:22:50
>> Why is attention so important to us?
1:22:56
That is an that is a really important
1:22:58
question,
1:22:59
>> right? And what is like the single best
1:23:03
motivator in a negative way? It's
1:23:04
negative attention.
1:23:07
>> Like that's the one thing that everyone
1:23:08
fears more than anything is negative
1:23:10
attention.
1:23:10
>> Well, and then some people figure out
1:23:11
that attention is an absolute value
1:23:14
function. Doesn't matter if it's
1:23:15
positive or negative. It's just like the
1:23:16
sum total is just great,
1:23:18
>> right?
1:23:18
>> So if I get positive attention, great.
1:23:20
Negative attention, great. If I can be
1:23:22
divisive, then I can maximize both sides
1:23:24
of that equation.
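His absolute-value point reduces to a one-line formula: if the metric is total attention regardless of sign, divisive content beats mildly positive content. A tiny illustration with made-up numbers:

```python
def total_attention(reactions):
    # Sign doesn't matter: positive and negative attention count the same.
    return sum(abs(r) for r in reactions)

mild     = [1, 1, 1, 1]        # everyone slightly pleased
divisive = [5, -5, 5, -5]      # half love it, half hate it

print(total_attention(mild))      # 4
print(total_attention(divisive))  # 20: divisiveness maximizes both sides
```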
1:23:25
>> And you know, you're rewarded for that
1:23:27
at scale.
1:23:29
>> You are, but also,
1:23:32
because you're inauthentic, you
1:23:33
experience a tremendous amount of
1:23:35
negative attention.
1:23:36
>> Yeah.
1:23:37
>> And then you have this bad feeling that
1:23:39
comes with negative attention, versus
1:23:42
>> primarily positive attention which is a
1:23:44
good feeling.
1:23:45
>> Yeah. So it's this it's letting you know
1:23:47
you're on the wrong track in some sort
1:23:49
of weird primal way like in our code
1:23:52
like the negative attention is like like
1:23:55
what's what's the original version of
1:23:56
that? It's like the reason why people
1:23:58
fear public speaking is because
1:24:00
initially in a tribal situation if
1:24:03
you're talking in front of the group of
1:24:05
150 people in your tribe it's probably
1:24:07
because they're judging you. Right.
1:24:08
>> And you [ __ ] up and you've got to make
1:24:11
some sort of a case why they don't kill
1:24:12
you.
1:24:12
>> Right.
1:24:13
>> Right. This is why everyone, this is the
1:24:14
fear of public speaking. That's where it
1:24:17
comes from.
1:24:17
>> That's encoded in our genes is like
1:24:19
>> Yes.
1:24:20
>> Back thousands of years
1:24:21
>> Yeah.
1:24:22
>> public speaking wasn't the positive act.
1:24:23
It was defend yourself before we kill
1:24:25
you.
1:24:25
>> Exactly. Exactly. And the worst
1:24:28
>> fascinating. Yeah. That's fascinating.
1:24:30
>> It is fascinating.
1:24:30
>> That makes a ton of sense. I think
1:24:31
>> it does, right? Why else would it be so
1:24:34
terrifying?
1:24:34
>> Yeah.
1:24:35
>> I thought of that the first time I ever
1:24:36
did stand up. I was like, why am I so
1:24:38
scared? It was very strange because I
1:24:40
had fought probably a hundred times in
1:24:42
martial arts tournaments. Like, why
1:24:44
was I so scared of this
1:24:47
>> but I was terrified, and it didn't make any sense. Negative attention,
1:24:53
>> right
1:24:53
>> you know right
1:24:54
>> bombing on stage, this is because all these people are judging you in a negative way, and it feels unbelievable. You go to bed at night, you think
1:25:07
about it
1:25:08
>> Give me a batting average. Like, meaning, is it like a fixed percentage of your shows that bomb, independent of the people, the moment?
1:25:15
>> No, it's really the real problem and
1:25:18
every comic faces this is once you've
1:25:21
developed an act and then you put out a
1:25:23
special then you start from scratch.
1:25:25
That's where even the greats Louis CK,
1:25:28
Chris Rock, Dave Chappelle, they all
1:25:30
bomb. Everybody bombs during that
1:25:31
process
1:25:33
>> because you're just working your craft.
1:25:34
>> It's all new stuff. Like it's I wouldn't
1:25:36
say bomb, but you don't have great
1:25:39
shows. Like I've watched the greats work
1:25:42
out new material. Like you go up with
1:25:45
ideas. You go up with like you might get
1:25:48
some giggles. You might get some laughs.
1:25:50
Some bits hit hard. Some bits are great
1:25:52
right out of the chute. And some of them
1:25:54
you have to [ __ ] figure it out. And
1:25:57
in that process, you're going to get
1:25:59
negative attention, right?
1:26:00
>> Because it's not working, right?
1:26:01
>> It's not it's not happening. Kevin uh
1:26:04
Kevin Hart
1:26:06
told this funny [ __ ] story where he
1:26:08
was like working new material and he was
1:26:10
like doing some small show and he had
1:26:12
the [ __ ]
1:26:13
>> Oh no.
1:26:14
>> on stage and he's like I got to land
1:26:16
this thing cuz I got to figure out if
1:26:18
people want to hear it. So he just wrapped his jacket around himself and [ __ ] himself.
1:26:24
>> Oh my god.
1:26:24
>> It's so it's so funny. But he tells that
1:26:27
story and that's the bit that works in
1:26:28
>> Oh my god. That's hilarious. That's
1:26:30
hilarious.
1:26:31
>> It's so funny.
1:26:32
>> Yeah. Well, honesty is currency, you
1:26:36
know, in that world, especially honesty
1:26:38
where you look stupid and people can
1:26:40
relate.
1:26:41
>> Well, this is where like I think like
1:26:42
Elon subtly has figured this out, which
1:26:44
is like there's attention, but then
1:26:47
there's just authenticity. And if you
1:26:49
can be yourself and you can hit the seam
1:26:53
properly,
1:26:54
you just get infinite attention.
1:26:56
>> Yes.
1:26:57
>> And that's like a that's like a real
1:26:59
mind [ __ ] too, I think.
1:27:00
>> Right. Yeah. He doesn't seem to have a
1:27:02
hard time with like being criticized.
1:27:06
Doesn't seem to bother him that much as
1:27:07
long as he's just being himself. Like I
1:27:11
think he's like two steps ahead. Like
1:27:13
there are things, like, you know,
1:27:16
somebody tweeted yesterday or the day
1:27:18
before or something like
1:27:21
he controls 2.7% of GDP or something,
1:27:24
right? He's got like $800 billion. So
1:27:26
it's crazy. It's so crazy. And it
1:27:29
was like a comparison to John
1:27:30
Rockefeller, John D. Rockefeller, who controlled something similar around that time. And he's in the first comment. He's like, "10 trillion or bust."
1:27:40
And obviously people lose their mind,
1:27:43
>> right?
1:27:43
>> People just [ __ ] lose their mind,
1:27:45
>> right?
1:27:46
>> On both sides. So this one one side is
1:27:49
like, "Think of the abundance and the
1:27:50
incredible stuff we're going to get if
1:27:51
he can get us to 10 trillion." And other
1:27:54
people are like, "You can't hold a third
1:27:55
of the economy in your hand." Then
1:27:58
everybody goes crazy and I'm like this
1:28:00
guy's a [ __ ] genius. Like, I mean, how would
1:28:05
you even have the courage to tweet
1:28:06
something like that? It just seems like
1:28:08
so crazy.
1:28:08
>> It really helps if you own Twitter,
1:28:12
>> right? Cuz if you did it in another
1:28:14
format like
1:28:15
>> you get excoriated.
1:28:17
>> Well, not only that, well, there was a
1:28:18
real chance that you'd actually get
1:28:20
banned from the platform at one point in
1:28:22
time
1:28:22
>> for many of the things that he's posted.
1:28:24
He would have gotten banned pre-2020.
1:28:27
>> Yeah.
1:28:29
>> Yeah.
1:28:30
>> Or whatever year it was that he
1:28:31
purchased it. Yeah. Um
1:28:34
negative attention.
1:28:36
Attention, period. So it brings us back to this idea of a simulation: like, why is
1:28:43
what humans focus on such a massive part
1:28:48
of what's valuable to us? And sometimes
1:28:51
what we focus on is not valuable. as you
1:28:54
were talking about like the things that
1:28:55
really matter in your day-to-day life or
1:28:57
that actually affect you versus the
1:28:59
things that are in the public
1:29:01
consciousness
1:29:03
>> Like, UFOs are a great example. Like, no, UFOs, it's not really... I mean, ultimately it may be. So there's this thing that we all have, like recognizing the potential for danger, right? Like, what's that sound? What is that? It might be nothing, but it might be something. Go look.
1:29:18
>> so look if you and I were designing a
1:29:20
video game we'd probably sit there and
1:29:23
say, "Okay, we got to get from point A
1:29:25
to point B, but to make it fun, we're
1:29:27
going to put all these little
1:29:28
distractions and honeypots along the
1:29:29
way."
1:29:30
>> Yeah.
1:29:30
>> And what they should be doing is
1:29:32
accumulating resources to get over the
1:29:33
river and then accumulating, you know,
1:29:36
uh, weapons to fight these other guys.
1:29:38
But instead, we're going to put this
1:29:39
like little thing over here and this
1:29:40
other thing over there. And you could
1:29:42
easily get distracted. And some people
1:29:43
will, they'll just [ __ ] beeline
1:29:45
right to the end of it. They'll, you
1:29:47
know, they'll get to the end boss. Uh,
1:29:51
and I feel like that's kind of what we
1:29:53
are tasked with doing every day. We're
1:29:55
tasked with we know what's important
1:29:58
maybe deeply in our DNA and then we have
1:30:01
all this stuff that we're supposed to
1:30:02
pay attention to.
1:30:05
And I think increasingly the game is
1:30:09
tell yourself that that's actually not
1:30:10
the thing that matters. It's almost like
1:30:12
working against you and figure out what
1:30:15
this other stuff is and focus on that
1:30:18
and fix that.
1:30:20
Like politics is a game that I think
1:30:23
distracts like left and right. It's so
1:30:25
stupid and it's breaking down.
1:30:27
>> And it's breaking down because now it's
1:30:29
like it's actually like you're more
1:30:30
likely to find alignment based on age
1:30:32
versus by political orientation. Like
1:30:34
people who are 30 and younger, it
1:30:36
doesn't matter what they identify as,
1:30:37
they all believe in the same [ __ ]
1:30:40
>> a lot more. Yeah. Like meaning like if
1:30:42
you ask their views on
1:30:44
>> social policy, taxation, Israel, if you
1:30:47
ask their views, what you find is now a
1:30:50
convergence between the left and the
1:30:53
right. If you if you divide it by age
1:30:57
at our age, it's still much more about
1:31:00
>> It's not completely uniform.
1:31:01
>> No, it's not completely uniform. But my
1:31:03
point is it's it's it was simpler in the
1:31:06
past to organize people independent of
1:31:09
age by political orientation.
1:31:12
>> That simplicity is gone.
1:31:13
>> Well, isn't that because of also a
1:31:15
breakdown in trust of all government in
1:31:17
particular,
1:31:19
>> right? So the breakdown in trust which
1:31:21
is also a lot of it is because of our
1:31:23
access to information now. We understand
1:31:25
how corrupt politics are. Yeah.
1:31:26
>> We understand insider trading now in
1:31:28
Congress. We understand how different
1:31:31
people flip-flop on issues. We
1:31:33
understand how the Democrats in 2008
1:31:36
used to view illegal immigration, which
1:31:39
is essentially MAGA-plus. I mean, it's MAGA on steroids, versus the way they look at it today. Like, why is that? What? Well, because it's all a game.
1:31:49
>> It's all a power, influence, and
1:31:51
attention game.
1:31:52
>> Attention game.
1:31:53
>> Yeah. It's very [ __ ] strange.
1:31:55
>> Yeah.
1:31:56
>> But it's all moving us in a general
1:31:58
direction. And that general direction is
1:32:00
access to innovation. It's all I've said
1:32:03
this a lot of times and if people have
1:32:05
heard it before, I apologize, but if you
1:32:07
looked at the human race from afar, if
1:32:09
you were something else, you'd say,
1:32:10
"Well, what does the species do?" Well,
1:32:12
it makes better things constantly, even
1:32:14
if it doesn't need them. Like, you know,
1:32:16
if you have an iPhone... you have a 16, you know, I have a 17.
1:32:21
I bought it. I haven't [ __ ] turned it
1:32:22
on.
1:32:23
>> I haven't plugged it in. going to
1:32:25
eventually
1:32:26
>> eventually I'll [ __ ] plug it in and
1:32:28
[ __ ] swap everything over and figure
1:32:29
out where my [ __ ] passwords are. But
1:32:31
the reality is you don't need it, but
1:32:34
you want it and it's going to keep
1:32:36
getting better every year. Why? Because
1:32:37
that's what we're obsessed with.
1:32:39
>> This also aligns with materialism. Like
1:32:42
for a finite lifespan, why are people
1:32:45
like including old people so obsessed
1:32:48
with gathering stuff? Well, because that
1:32:51
fuels innovation. Because if there's no
1:32:54
new things coming, there's no motivation
1:32:57
to get the newest, latest, greatest
1:32:59
thing. And ultimately what that leads to
1:33:01
is greater technology which ultimately
1:33:03
leads to artificial intelligence.
1:33:06
>> My slight deviation from that is I think
1:33:07
sometimes people accumulate things
1:33:09
because it's a status game and that's
1:33:12
because they get more attention. You
1:33:14
have a Ferrari, you get attention,
1:33:16
>> right? But what does that do? It makes
1:33:17
Ferrari make better Ferraris and all
1:33:21
technology moves in the same general
1:33:23
direction. No one company
1:33:24
>> That's true. That's true.
1:33:25
>> No one company says this is it. This is
1:33:28
what we make. It's perfect. Do
1:33:29
>> you think people innately feel that by
1:33:31
being a part of this kind of like
1:33:33
consumerist capitalist system, they're
1:33:36
contributing to progress?
1:33:37
>> I don't think they innately feel it, but
1:33:39
I think that's ultimately the results.
1:33:41
>> That's ultimately the result. And it
1:33:43
seems to be universal and it seems to be
1:33:45
constantly moving in this one general
1:33:47
direction which is better and better
1:33:50
technology.
1:33:51
>> But like the stage fright example, you
1:33:52
don't think it's encoded in our DNA this
1:33:54
idea of like wow when I am a part of
1:33:57
this in some way shape or form just
1:33:58
things seem to get better and I want to
1:34:00
be a part of that like do you think that
1:34:02
that's possible that that's encoded in
1:34:03
us?
1:34:05
I think it motivates us to the ultimate
1:34:08
goal. And that ultimate goal, I think,
1:34:10
is that human beings constantly make
1:34:12
better stuff. Whatever it is, better
1:34:14
buildings, better planes, better cars,
1:34:16
better phones, better TVs, better
1:34:19
computers, better everything, artificial
1:34:22
life.
1:34:22
>> That might be the whole reason why we're
1:34:25
here. And the way I've always described
1:34:27
it is that we are
1:34:29
>> we are a biological caterpillar that's
1:34:33
making a digital cocoon. And we don't
1:34:35
even know we're going to become a
1:34:37
butterfly, but we're doing it. We're
1:34:39
doing it and we're moving towards it.
1:34:40
And it might be what happens to all life
1:34:43
all throughout the universe. And it
1:34:44
might be why these so-called aliens or
1:34:47
whatever the [ __ ] they are, it might be
1:34:49
us in the future. It might be other
1:34:51
versions of human beings that have gone
1:34:53
past whatever this period of development
1:34:57
that we're currently involved in right
1:34:58
now. This just might be what happens.
1:35:01
This is what life always does. It might
1:35:03
realize that biological life, which is
1:35:06
very territorial and primal and sexual
1:35:09
and greedy and it has all these problems
1:35:12
with human reward systems ultimately
1:35:16
develops into this other thing,
1:35:18
>> right?
1:35:18
>> And then that's what we're doing. And
1:35:20
then we're in the process of that right
1:35:21
now. And I think that if and when, not if but when, we colonize Mars, I think that that new world
1:35:28
order actually has the best chance to
1:35:30
take shape because it'll
1:35:31
>> you know, there's a lot of people that
1:35:32
think that Mars was already colonized at
1:35:34
one point in time.
1:35:35
>> That life already existed. What
1:35:37
>> that life already existed on Mars like
1:35:39
many millions of years ago and that
1:35:41
there's evidence of structures on Mars?
1:35:43
That's really weird stuff. Have you ever
1:35:46
seen the the square that they found on
1:35:48
Mars?
1:35:49
Okay, show them to them, Jamie.
1:35:51
>> One of the things that they're finding
1:35:52
with scans of Mars is like geometric
1:35:55
patterns and structures and right angles
1:35:57
that shouldn't exist. Like weird stuff
1:35:59
>> that couldn't be naturally.
1:36:00
>> No. No. Way weirder. Way weirder than
1:36:03
like the face at Cydonia. The Cydonia
1:36:06
thing is interesting.
1:36:06
>> Yeah.
1:36:07
>> Um and then this one. Look at that.
1:36:10
>> What the [ __ ] is that?
1:36:11
>> It looks like a home of some kind or
1:36:12
something.
1:36:13
>> Some enormous structure. And the size of
1:36:15
that, they don't know exactly, but it
1:36:18
may be as large as several kilometers or
1:36:22
as small as several hundred meters, but
1:36:25
they're not exactly sure. But what they
1:36:26
are sure is that it has very weird right
1:36:28
angles and right angles that seem to be
1:36:31
uniform in size.
1:36:34
>> That's crazy.
1:36:35
>> Like see how it's highlighted in the
1:36:36
enhanced photograph in the upper left?
1:36:38
Like what is that?
1:36:41
>> But sorry, did they and were they able
1:36:42
to send like the rover over there? I
1:36:44
don't know. It's too far away.
1:36:45
>> I don't think it's in the exact place
1:36:46
where the rover is at, but they're able
1:36:49
to get images of these things. And
1:36:51
there's several of these things.
1:36:52
>> That's insane.
1:36:53
>> Yeah, there's a lot of weird stuff.
1:36:55
There's a lot of weird stuff there. So,
1:36:58
there's also like ancient civilizations
1:37:00
that have these myths of us existing
1:37:03
somewhere else and coming here,
1:37:05
>> right? But you have to think if human
1:37:09
beings
1:37:10
develop somewhere else and they they
1:37:13
reach some high level of sophistication
1:37:15
and then they experienced some
1:37:16
cataclysmic disaster that completely
1:37:18
destroyed their environment which is
1:37:20
what Mars is, right? So like let's
1:37:22
assume that Mars at one point in
1:37:24
time
1:37:26
was habitable and that life existed and
1:37:28
we know it was at one point in time. We
1:37:30
know there was water on Mars. We know
1:37:32
and there's some sort of evidence of at
1:37:34
least some sort of a very primitive
1:37:36
biological life on Mars.
1:37:39
>> If they got to a point where they said,
1:37:40
"Hey, this [ __ ] place is falling
1:37:42
apart, but this Earth spot looks pretty
1:37:44
good."
1:37:46
And they go there, but then cataclysms
1:37:48
happen on Earth and no one remembers cuz
1:37:50
all your information's on hard drives
1:37:52
and then you have to rebuild society.
1:37:54
And so you're re-remembering. And so you
1:37:57
have all these myths of how everything
1:37:59
started, you know, whether it's Adam and
1:38:02
Eve or the great flood or whatever these
1:38:03
things are that we passed down through
1:38:06
oral tradition for hundreds of years and
1:38:07
then eventually write it down and then
1:38:09
people try to decipher what it means and
1:38:11
they sit in church and try to go over
1:38:14
what did it mean? Like what does
1:38:15
this mean? Like what is
1:38:17
the real origin of all these
1:38:19
stories? We don't know.
1:38:22
>> I mean that's crazy. It's crazy. But,
1:38:24
it sounds nuts, why couldn't life
1:38:27
have possibly existed on Mars?
1:38:29
How the [ __ ] does life exist on Earth?
1:38:31
How about that? And why
1:38:33
would we assume that it wouldn't have
1:38:35
existed at one point in time and
1:38:37
Terrence Howard who is a very
1:38:38
interesting guy.
1:38:39
>> Very interesting.
1:38:40
>> And got some
1:38:40
>> Your episode, I mean,
1:38:41
>> with Eric Weinstein? Crazy.
1:38:43
>> Yeah. Crazy.
1:38:44
>> Yeah, that one was crazy.
1:38:45
>> Um and him alone, but he's got some
1:38:47
[ __ ] weird ideas that just make
1:38:50
you go. He's a very brilliant guy and
1:38:53
you know kind of a strange heterodox
1:38:55
thinker right and one of his ideas is
1:38:59
that
1:39:01
planets get to a certain distance from a
1:39:04
sun and they people and that it gets to
1:39:08
a certain climate and a certain distance
1:39:10
and his his
1:39:13
idea is that I don't know if you realize
1:39:15
that there's a giant um
1:39:18
ejection, some coronal mass
1:39:22
ejection that just happened recently on
1:39:24
the sun and they're very concerned about
1:39:26
it. They don't know what's going to
1:39:27
happen. It happens all the time. A sun
1:39:29
releases these giant chunks of material
1:39:33
>> and he thinks that these materials get
1:39:35
far enough away from the planet and then
1:39:37
they coalesce into planets or far enough
1:39:40
away from the sun. They coalesce into
1:39:41
planets and as time goes on they get a
1:39:44
further and further distance from the
1:39:46
sun and then obviously they get hit with
1:39:48
asteroids and there's panspermia and
1:39:51
water gets into them from comets and
1:39:54
then they develop oceans and then they
1:39:56
develop biological life and when they
1:39:58
have a certain amount of distance from
1:40:00
the sun they people and he thinks that
1:40:03
as they get further and further and
1:40:05
further away they get less and less
1:40:06
habitable and then they get to a point
1:40:09
where they have their technology
1:40:11
to a point where they realize like we
1:40:13
can't sustain life on this planet
1:40:15
anymore. We got to go to that other one.
1:40:18
And so do they go to the one that's
1:40:19
closer to the sun because they're too
1:40:21
far now.
1:40:23
>> It's a nutty idea.
1:40:24
>> Jesus Christ.
1:40:25
>> It's a nutty idea. But if you think
1:40:26
about how recent our sun is in terms of
1:40:31
the solar system itself, in terms of
1:40:32
rather the galaxy itself. So if the
1:40:34
universe, if the big bang is correct and
1:40:37
our universe existed and it was rather
1:40:40
our universe erupted from nothing or
1:40:43
from a very small thing 13.7 billion
1:40:46
years ago. Well, this [ __ ] planet's
1:40:48
only 4 something billion years old,
1:40:50
right? And life is only, you know, a
1:40:53
little bit less than that. Yeah.
1:40:54
>> So you have like a billion years or so
1:40:56
where there's nothing and then you start
1:40:58
getting single-celled organisms,
1:40:59
multi-celled organisms, and eventually it
1:41:01
peoples.
1:41:02
And when it gets to a certain point when
1:41:04
these people have advanced their
1:41:06
curiosity and their innovation to the
1:41:08
point where they can harness space
1:41:10
travel and they use zero point energy
1:41:13
and they have a bunch of different
1:41:14
things that we haven't invented yet. And
1:41:16
then their environment degrades and it
1:41:18
gets to the point where they realize
1:41:20
like hey we're getting pummeled by
1:41:22
asteroids. We can't sustain life here
1:41:24
anymore.
1:41:24
>> We got to move
1:41:25
>> like Elon wants to go to Mars which
1:41:27
might be the wrong answer. We might want
1:41:29
to go that way.
1:41:30
>> There are planets closer to the sun.
1:41:32
>> Exactly. I mean, the thing is he's got
1:41:35
everything he needs now to get there.
1:41:37
Like I
1:41:38
>> I'm not going. Are you going?
1:41:39
>> I would go.
1:41:40
>> [ __ ] that. I'll send you an email.
1:41:43
>> Hold on a second. Think about
1:41:44
what he's going to take.
1:41:45
Okay, look.
1:41:47
>> When Let's just say he gets there with
1:41:48
the city. He has
1:41:50
>> he has the way to transport us there,
1:41:54
>> right?
1:41:54
>> Okay. Then when you land, he's got the
1:41:59
way to actually transport us around on
1:42:01
the
1:42:01
>> on the planet, right? He's got Tesla,
1:42:03
>> right?
1:42:04
>> He will have already sent a fleet of his
1:42:06
robots.
1:42:08
Those folks will have made some
1:42:10
inhabitable city
1:42:12
probably using the Boring Company drill
1:42:14
because you you're going to, you know,
1:42:15
be under the regolith that you don't
1:42:17
want to be on the top. Maybe you just
1:42:18
dig a hole and you you inhabit down
1:42:21
there.
1:42:22
>> Um he's got all the ways to make energy.
1:42:24
He has the AI to help you design the
1:42:27
stuff. He has the communication way to
1:42:30
communicate. He's got the internet, his
1:42:32
own internet. So that right
1:42:34
>> so he can get, you know, all of the
1:42:35
information to everybody. And then he's
1:42:38
got money and the super app so that you
1:42:41
can transact. And then I think to myself
1:42:43
like what is he actually missing? And
1:42:45
then what happens if he if he gets there
1:42:47
first? Is he just allowed to just do
1:42:51
whatever he wants? Like is it just kind
1:42:52
of like a free-for-all? like
1:42:54
>> well kind of
1:42:55
>> it's his constitution. Like is that what
1:42:56
happens?
1:42:57
>> Well, it's like earth but shittier.
1:43:00
>> Like we already have all those things
1:43:01
here. Why would you want to go to a
1:43:02
place where you die when you go outside?
1:43:04
>> I think what people will be attracted to
1:43:05
is that if he publishes his version of
1:43:07
what the rules are there, there's a
1:43:09
chance that he could make them really
1:43:11
different to what the rules are here.
1:43:12
>> Like what kind of rules would you do if
1:43:14
you were the king of Mars? Um
1:43:17
so I think that your view is incredibly
1:43:21
to me like um positive-sum, like, of
1:43:24
humanity of like we want to make things
1:43:26
better. Mhm.
1:43:27
>> So if I think about that as like a
1:43:28
function, what happens? So
1:43:31
our natural rate of direction is
1:43:32
forward. What pushes back on that? And a
1:43:35
lot of it what you find is like
1:43:36
government regulation rules all that
1:43:38
stuff
1:43:38
>> greed
1:43:39
>> greed um too much focus on attention
1:43:42
>> right
1:43:43
>> so I would try to experiment with like
1:43:45
what the incentives would have to be so
1:43:47
that you had more unfettered
1:43:49
entrepreneurship like just like do the
1:43:51
thing that you think is right
1:43:52
>> and there's a mechanism where we give
1:43:54
you the ability to then make things for
1:43:57
more people because you're proving that
1:43:59
you're actually really good at making
1:44:00
things. And if you don't need money at
1:44:03
that point in society, reorienting us
1:44:06
away from this kind of like brittle form
1:44:09
of exchange to something more useful,
1:44:11
that's worth experimenting with. I think
1:44:12
that's an important
1:44:14
>> Well, there's also the concept of the
1:44:16
self of the individual which may erode
1:44:18
with technological innovation. So, if we
1:44:22
really can read each other's minds, if
1:44:25
we really do get to a point where we're
1:44:27
communicating through technologically
1:44:30
assisted telepathy, like a lot of the
1:44:34
whole weirdness of people is, I don't
1:44:37
know what you're thinking. I don't know
1:44:38
if I should trust you. You know, this
1:44:41
[ __ ] might be devious. You know,
1:44:43
you know what I mean? Well, we'll know,
1:44:45
right?
1:44:45
>> And there will be no need for all that
1:44:47
if we really are all one. If that's
1:44:50
ultimately something that can be
1:44:51
achieved with technology,
1:44:53
>> like this hive mind.
1:44:54
>> Yes. Like legitimate hive mind. And then
1:44:57
like look where society's going.
1:44:58
Gender's kind of falling apart. People
1:45:00
are reproducing less,
1:45:03
right? People are are having less
1:45:04
testosterone, more miscarriages, less
1:45:07
fertile. We're kind of moving into this
1:45:09
genderless direction. And I don't know
1:45:13
if it's by design, but microplastics and
1:45:18
phthalates and all these different
1:45:20
chemicals that are endocrine disruptors
1:45:22
are all ubiquitous in our society. Well,
1:45:24
does
1:45:25
>> is that a coincidence that that's all
1:45:27
happening at the same time as
1:45:28
technological innovation on a mass scale?
1:45:30
Is it? I don't know. Because like what's
1:45:32
the one thing that's holding us back?
1:45:35
Well, that we're territorial primates
1:45:38
with thermonuclear weapons and that we
1:45:41
exist in a sort of tribal mindset, but
1:45:43
yet we do it on a planet of 8 billion
1:45:46
people.
1:45:46
>> Yeah. No, no. The key differentiator of
1:45:48
humans is our ability to enact
1:45:50
violence.
1:45:51
>> Yeah.
1:45:51
>> To to methodically execute premeditated
1:45:55
violence.
1:45:55
>> Yes. And greed and attention. And
1:46:00
one of the things that attention is, is
1:46:02
sexual preference, or rather sexual
1:46:05
attention, like the ability to procreate,
1:46:08
the ability to acquire mates, right?
1:46:10
Like the more resources you have, the
1:46:12
more attractive you'll be, especially
1:46:13
for males and males are the ones that
1:46:15
are involved in the violence in the
1:46:17
first place. You know, there's I can't
1:46:19
name a single war that was started by a
1:46:21
woman.
1:46:21
>> How do you how do you teach your kids
1:46:24
that attention is not everything?
1:46:29
That's a good question. Especially in
1:46:31
this society, it's probably harder to do
1:46:33
that now than ever before
1:46:35
>> because the reaction that I suspect most
1:46:36
kids will have is like,
1:46:39
"Stop. Like, leave me alone." Like, it's
1:46:42
just it's almost an impossible thing.
1:46:44
>> Well, I think kids learn more from their
1:46:48
parents' behavior than anything you say
1:46:50
to them. I think they learn from the way
1:46:53
you behave and the way you exist and the
1:46:57
way you exist with them.
1:46:58
>> And if you are constantly whoring
1:47:02
yourself out for attention,
1:47:05
>> it's one thing if you get a lot of
1:47:06
attention from what you do, but if
1:47:08
that's your primary goal, they're going
1:47:10
to know.
1:47:11
>> Do your kids know how famous and
1:47:13
influential you are? Like honest
1:47:14
question.
1:47:15
>> Oh, yeah. They know.
1:47:16
>> But do do they have a real sense of it
1:47:18
or do you just kind of like
1:47:20
>> Yes, they can. I mean, how can you? It's
1:47:22
got to be weird as [ __ ] growing up with
1:47:23
a very famous dad. It's very odd, but
1:47:27
it's not my primary goal.
1:47:29
>> Yeah, this is my point. You're not
1:47:30
You're not putting it in their face.
1:47:31
>> So, to your point, you're not modeling
1:47:33
attention is all you do.
1:47:34
>> No, no. I have interesting conversations
1:47:37
with cool people. I tell jokes and I
1:47:42
call fights. Like, those are the things
1:47:44
that I do.
1:47:45
>> And they also know that I have a very
1:47:47
strong work ethic and that I work
1:47:49
towards things. So they have very strong
1:47:50
work ethics. They're very motivated and
1:47:52
disciplined, like shockingly
1:47:54
disciplined. And I think that's modeled.
1:47:56
I think that that comes from and they
1:47:59
also like really enjoy achieving goals.
1:48:01
And they're they're rewarded for it with
1:48:04
praise and with admiration, but
1:48:09
never with like you're better than other
1:48:11
people. Yeah. Never. Like, the
1:48:13
idea is like all human beings are
1:48:16
capable of greatness. So, it's like find
1:48:18
the thing that you excel at and if you
1:48:21
throw yourself into that, it's very
1:48:23
rewarding.
1:48:24
>> I really I really believe in this. I
1:48:26
tell this story when I interview people.
1:48:28
When I interview people, I'm always
1:48:29
like, you know, just at whatever
1:48:30
company, I'm always like I first only
1:48:32
want to know about them. I'm like, [ __ ]
1:48:34
your resume. Like, tell me about your
1:48:36
parents and how you grew up. I just want
1:48:38
to know that. Stop at 18. Everything
1:48:40
before 18. Just tell me every little
1:48:43
detail,
1:48:43
>> right?
1:48:44
>> You know, and some people tell me these
1:48:45
incredible stories. They'll be like, you
1:48:47
know, my mom was an alcoholic or this or
1:48:49
that. And I'm just like, man, this is so
1:48:52
valuable because it allows me to
1:48:54
understand who they are. The second part
1:48:56
of the interview,
1:48:57
>> we do the business [ __ ] But the third
1:48:59
part, I tell this story. This is a crazy
1:49:01
story about what you're just saying.
1:49:03
>> They ran this experiment at Stanford
1:49:05
where they take like a big bowl, fill it
1:49:08
with water, and they drop in a mouse and
1:49:11
they measure how long it takes for the
1:49:12
mouse to drown. They do it like a
1:49:15
hundred times. The average was about
1:49:16
four minutes, call it four, four and a
1:49:18
half minutes.
1:49:20
Then they run the experiment again, 100
1:49:22
mice, and at minute three or three and a
1:49:25
half, they take it out, they dry it off,
1:49:27
they play it music, and they whisper
1:49:29
like sweet nothings into the mouse's
1:49:31
ear. They drop the mouse back in the
1:49:32
water, and that mouse treads water for
1:49:37
60 hours, the next hundred mice on
1:49:39
average. And the upper bound was 80. And
1:49:43
I thought to myself like that is all
1:49:46
just potential right there. Like that's
1:49:48
all like there's all this latent
1:49:50
potential. So if an animal has it, I'm
1:49:51
going to assume that humans have it too,
1:49:53
>> right?
1:49:54
>> But you never get a chance to unlock it.
1:49:56
Like the average person is just kind of
1:49:58
like living a life where they're maybe
1:50:00
scratching five or 10% of their
1:50:01
potential. And the question is, how do
1:50:03
you get to that other 90%? Like how does
1:50:04
the second batch of mice How do the
1:50:06
second batch of mice tread water for 60
1:50:08
hours?
1:50:09
>> Well, that doesn't make any sense to me.
1:50:10
It's all the same mice, right? I
1:50:14
think the mice get rescued,
1:50:16
>> they get rescued
1:50:17
>> and then when they try it again those
1:50:20
same mice last longer, right? So it's
1:50:22
the same mice. So it's an experience.
1:50:25
>> So they have experience now. They
1:50:27
understand that they can tread water
1:50:29
where they didn't die. So they
1:50:31
understand that they can survive where
1:50:33
they didn't know that they could survive
1:50:34
the first time they were thrown into the
1:50:36
water because they had never been thrown
1:50:37
into water before. That's the same thing
1:50:39
that happens to people when they fight.
1:50:41
Like the first time people ever have a
1:50:43
competition, they [ __ ] panic and they
1:50:45
they get really scared and they get
1:50:47
really like filled with anxiety. But
1:50:50
after a while, you get relaxed and
1:50:52
that's when you get really dangerous
1:50:54
because then you get calm and you could
1:50:56
keep your [ __ ] together while you're in
1:50:58
the middle of all this chaos because you
1:51:00
have the experience of it. Without the
1:51:02
experience of it, very few people do
1:51:03
well the first time, right? Unless
1:51:05
you're exceptionally talented and and
1:51:07
you have other competition experience,
1:51:10
like you've competed in other things,
1:51:11
like maybe you played football or some
1:51:13
other things and you know what it's like
1:51:14
to actually perform under pressure.
1:51:16
>> What is the what is the version of
1:51:18
giving more humans a chance to get to
1:51:20
that?
1:51:22
Well, I think sports are really good for
1:51:25
that because performing under people
1:51:27
paying attention to you and performing
1:51:29
where people are trying to stop you from
1:51:31
doing something and you're trying to do
1:51:32
something and there's all these unknowns
1:51:35
and recognizing that hard work allows
1:51:38
you to do whatever you're trying to do
1:51:40
better than you previously had. One of
1:51:43
the things my martial arts instructor
1:51:44
said to me when I was young is that
1:51:46
martial arts are a vehicle for
1:51:48
developing your human potential. And
1:51:50
that through this very difficult thing
1:51:52
that you're trying to do,
1:51:54
>> you're learning that, oh, if I just
1:51:57
think smart and think hard and train
1:52:01
wise and train hard and discipline
1:52:04
myself to endure suffering so that I can
1:52:07
develop more endurance and more speed
1:52:09
and more power and more technique
1:52:11
because I accumulate all this
1:52:12
information and I really think about
1:52:13
what it is and apply it with drills and
1:52:16
with training, I can get better at this
1:52:18
thing. And every time I get better at
1:52:19
this thing, I get rewarded psychically,
1:52:21
like mentally, you feel better. Like I
1:52:23
know that I'm better now. And then
1:52:24
there's the belt system, right? You
1:52:26
start off, you're a white belt. And in
1:52:28
taekwondo, you get a blue belt and then
1:52:30
after you get a blue belt, you get a
1:52:31
green belt. And then if you get a I
1:52:33
think it's green belt first and then I
1:52:34
forget how it goes. And then it's red
1:52:36
belt and black belt. And like when
1:52:37
you're a black belt, like holy [ __ ] So
1:52:39
it's this thing where you've developed
1:52:41
to a point where you've gotten to this
1:52:43
next stage. So all along the way you've
1:52:45
been rewarded for your hard work and
1:52:48
then you realize like oh I could do this
1:52:50
with everything in life.
1:52:51
>> Is a reward different than attention? It
1:52:53
is.
1:52:53
>> It is because it's internal right.
1:52:56
You're realizing that
1:52:59
you could apply this to you know
1:53:01
whatever it is to carpentry to music.
1:53:04
You you just it's just a matter of focus
1:53:07
and attention. And some people
1:53:08
unfortunately never find a vehicle. They
1:53:11
never find a thing that they can throw
1:53:13
themselves into. They realize like, and
1:53:15
this is not unique. It's not like I'm an
1:53:19
unusual person or anybody is. I mean,
1:53:22
there's people that have unusual
1:53:24
physical gifts and some people have
1:53:26
unusual mental gifts. But the reality
1:53:28
is, no matter where you start, everyone
1:53:31
can get better. And when you do
1:53:33
something, whether it's learning to play
1:53:34
guitar, as you get better at it, you
1:53:36
realize like, oh, this is what it's all
1:53:39
about. Yeah. Like it's really all about
1:53:41
applying yourself to something and then
1:53:43
feeling this immense satisfaction of
1:53:46
your hard work paying off and that
1:53:48
motivates you to work hard at other
1:53:50
things. And if you don't find that early
1:53:52
on, it's very difficult to like find
1:53:55
like real satisfaction.
1:53:57
>> Yeah.
1:53:57
>> In life.
1:53:58
>> Yeah. I've always had something outside
1:54:00
of my
1:54:02
daily life
1:54:04
that is the thing that I actually care
1:54:06
about and it actually energizes me for
1:54:08
my day-to-day life. I don't know if
1:54:10
that's true for a lot of people but
1:54:11
>> like what do you do? What's your take?
1:54:12
>> Like well initially it was poker and I
1:54:16
and even now I obsess about the game um
1:54:19
because it's infinitely more complex
1:54:21
than chess. Like chess you can get to a
1:54:23
place where you can roughly be good.
1:54:25
Poker is just constantly there's just
1:54:28
too many variables. There's human
1:54:30
emotion, there's human psychology,
1:54:32
the number of people, all of this stuff
1:54:34
just makes the complexity of the game
1:54:37
something that I find magical.
1:54:39
>> And so I sit there and I try to
1:54:41
understand like why am I doing the
1:54:42
things that I'm doing? And so much of it
1:54:44
comes back to being a mirror about
1:54:46
what's happening in my daily life. It's
1:54:48
the [ __ ] craziest thing. Like I'm
1:54:50
super insecure. I'll go into poker and I
1:54:52
will just lose for weeks at a time. But
1:54:54
it's because I'm insecure in my daily
1:54:56
life. And what's happening is that I'm
1:54:58
trying to find these quick wins and
1:55:00
quick solutions
1:55:01
>> because I'm in a state of insecurity.
1:55:03
I'm anxious. I have this anxiety. And so
1:55:06
it's become a great mirror for me. So
1:55:08
that used to be a thing. It still is a
1:55:10
thing. And but I've become reasonably
1:55:13
skilled at it where the edges are
1:55:16
smaller and I put myself in positions
1:55:18
where I'm only playing against a certain
1:55:20
group of people and I'm the losing
1:55:22
player frankly in that game, when I'm
1:55:24
playing against like the top pros,
1:55:27
it just, it helps me and I can
1:55:29
get tuned up for it. But then I started
1:55:32
to, you know, I would take different
1:55:33
things. I tried to learn how to ski.
1:55:35
Basically impossible when you're older.
1:55:37
I look like a [ __ ] idiot. Like
1:55:38
>> how old were you when you tried?
1:55:40
>> Uh I started when I was like, you know,
1:55:42
I I was a good snowboarder, so I was
1:55:43
snowboarding my whole life. And then my
1:55:45
kids skied and so I'm like, okay, well,
1:55:47
I want to do this as a family. So I was
1:55:49
like 42 or something when I tried. I'm
1:55:51
49 now, almost 50.
1:55:54
It's brutal. I mean, it's like I look
1:55:55
like a [ __ ] idiot. Like this gangly
1:55:58
giraffe like trying to get down the
1:55:59
mountain. And then now I started golf
1:56:02
and man, I got to tell you, I used to
1:56:06
play a little bit, then I stopped. But
1:56:08
there's something to me about being
1:56:09
outside
1:56:11
where just like being in nature I find
1:56:14
like really motivating.
1:56:16
>> It's a vitamin.
1:56:17
>> It's a vitamin. And then just the mind
1:56:19
body connection of that game, it just
1:56:21
really [ __ ] with you because it's it's
1:56:23
just nothing you can master and
1:56:25
overpower,
1:56:26
>> right?
1:56:27
>> And it teaches you to just like be in
1:56:29
it.
1:56:30
>> Yeah.
1:56:30
>> And that's a very hard skill. Like if
1:56:34
you look at the best like I there's like
1:56:36
a handful of people that I really look
1:56:37
up to and I obsess, like Munger, Buffett, but
1:56:40
the Berkshire meeting was this past
1:56:42
weekend and if you look at the clips,
1:56:43
there's this incredible thing where they
1:56:46
transitioned, right? Munger passed away.
1:56:48
Buffett's like now executive chairman.
1:56:50
But this guy Greg Abel and this guy Ajit
1:56:52
Jain. Ajit Jain does this thing where
1:56:53
he's like, I teach the people that come
1:56:55
to just say no. Your whole job is to
1:56:58
just say no. You're going to get
1:56:59
bombarded with all kinds of business
1:57:00
pitches. Say no, no, no. And eventually
1:57:03
somebody will come and [ __ ] try to
1:57:04
whack you in the head with a 2x4 of
1:57:06
money. Then you come to me and we'll do
1:57:08
the deal. And it made such an impression
1:57:11
because like again when I'm insecure,
1:57:15
my reward function is attention. So I'm
1:57:17
like a [ __ ] little busy body. I'm
1:57:18
running around doing all this little
1:57:20
[ __ ] you know? And then, man,
1:57:23
when I'm in a [ __ ] flow state and
1:57:25
like I'm tuning it, like I'm striping
1:57:27
the ball, you know? I'm on like a few
1:57:29
things that really matter, in size, and
1:57:31
I'm like, man, this is right.
1:57:35
It's all come to me because I'm like I'm
1:57:38
like within myself and these other
1:57:41
things are a better reflection of when
1:57:43
I'm within myself and these other things
1:57:45
are a mirror of when I'm totally out of
1:57:47
kilter.
1:57:48
>> That's just me.
1:57:50
>> So in my life these things tend to lead.
1:57:53
>> Um
1:57:54
>> I think you're saying that's just you,
1:57:55
but I think that's generally most
1:57:57
people. I think you find these things,
1:58:01
these vehicles for developing human
1:58:04
potential, whether it's martial arts or
1:58:05
golf or playing guitar or playing chess
1:58:08
or poker.
1:58:09
>> And then you have to have, I think, one,
1:58:11
at least for me, one seminal relationship
1:58:14
in your life. You have to have one
1:58:16
person that has just undying belief in
1:58:18
you. And I never really had that until I
1:58:21
met my wife. And that was a very, and I
1:58:23
didn't I pushed against it so [ __ ]
1:58:25
hard because I was like, it just can't
1:58:27
be true. like why does this person give
1:58:29
a [ __ ] Do you know what I mean? Like
1:58:31
why do they care about me more than I
1:58:32
do?
1:58:32
>> Well, there's also the fear because so
1:58:34
many people get in those bad
1:58:35
relationships.
1:58:36
>> And I'm just like I I think there's a
1:58:38
part of you like me where you're just
1:58:40
like I'm not a very lovable person. Like
1:58:42
I'm just like this is that's not who I
1:58:44
am. And this woman is just there.
1:58:48
So that's been like the thing like for
1:58:50
me it's like and because she's brutal.
1:58:52
She'll be like, "Oh yeah, that was
1:58:53
[ __ ] horrible." You know, like
1:58:55
yesterday, like, I
1:58:57
did this thing at Milken and it was a
1:58:59
dinner at my friend's house and uh then
1:59:02
you know, we're both going to different
1:59:03
airports. I'm flying here to see you and
1:59:05
she's flying home and uh she calls me
1:59:08
and I'm like I'm how did I do? Ah [ __ ]
1:59:16
But but no, there's the parts that I did
1:59:18
well and then she critiques the other
1:59:19
parts that she didn't like. And then I
1:59:21
say, and it's so, again,
1:59:24
I'm insecure so I'm like I want the
1:59:25
self-serving, well, how would, because
1:59:27
there was three of us on this panel and
1:59:29
she's like uh and I was like you know I
1:59:31
was the best right? She's like no Gavin
1:59:34
was better. I'm just like it's so but
1:59:37
it's so refreshing because it keep again
1:59:39
it's like a
1:59:40
>> keeps you in check
1:59:41
>> like and it gives me a mirror.
1:59:43
>> Mhm.
1:59:43
>> You know like when I was coming to see
1:59:46
you yesterday when we were flying down
1:59:48
to LA for this thing um there's parts of
1:59:52
me where when I'm insecure I kind of
1:59:54
like externalize and I can be like
1:59:57
really hyperbolic unnecessarily
1:59:58
hyperbolic and it's counterproductive.
2:00:01
And she said to me listen like just
2:00:02
imagine your friends these are
2:00:04
hardworking people. They're trying their
2:00:05
best as well. They don't necessarily
2:00:07
know. Some some things have massively
2:00:08
worked out for them, but they would want
2:00:10
to do the right thing. There's people
2:00:12
you've worked with before that want to
2:00:13
do the right thing. And she's like, just
2:00:15
picture them and don't judge. You can
2:00:16
observe.
2:00:19
And it's crazy, but it's like I need
2:00:20
those little things. There's like
2:00:22
tweaks. It's like having a coach kind of
2:00:23
like
2:00:24
>> and that and that's very that's very
2:00:26
helpful to me.
2:00:26
>> Yeah. It's very important. It's hard to
2:00:28
do that yourself.
2:00:29
>> I can't do it.
2:00:30
>> And it's also like
2:00:31
>> I'm [ __ ] maxing. Like my life is like
2:00:33
I like that flow. Mhm.
2:00:34
>> And if if I didn't have somebody who
2:00:36
loved me and would hold me accountable,
2:00:38
I'd just [ __ ] not think about it.
2:00:40
>> Yeah. And the opposite of that is
2:00:42
someone who's like an antagonistic
2:00:44
relationship. And we know a lot of
2:00:46
people that have those kind of very
2:00:48
sabotagey sort of marriages and
2:00:50
relationships. And that's crazy.
2:00:52
>> It's brutal.
2:00:53
>> It's brutal. And I don't think they've
2:00:55
ever had a really good one. Otherwise,
2:00:56
they would never tolerate that.
2:00:59
>> I didn't know what good looked like. So
2:01:01
you kind of just I think a lot of people
2:01:03
go with the flow. Like I mean I was a
2:01:06
nerdy kid from kind of a shitty [ __ ]
2:01:09
up kind of like family structure
2:01:13
and then I got injected into this rich
2:01:15
high school but then I got to go back to
2:01:18
an alcoholic father. I'm on [ __ ]
2:01:19
welfare. Like it's like you know my my
2:01:22
self-confidence is negative [ __ ] two
2:01:24
units.
2:01:25
>> Didn't have a girlfriend. You know like
2:01:26
all the [ __ ] in high school like nothing
2:01:28
happened for me. And so my modeling of
2:01:31
like how to be in a relationship, what
2:01:32
to do, it was [ __ ] zero.
2:01:36
Um, it was zero. And so all those
2:01:39
mistakes were mostly because I didn't
2:01:41
understand what good looked like,
2:01:43
>> right?
2:01:44
>> Um, and then I stumbled into this
2:01:45
relationship after my divorce and my
2:01:47
ex-wife is an incredible woman. Just
2:01:49
like not,
2:01:50
>> you know,
2:01:50
>> what you needed or what she needed.
2:01:52
>> Yeah. She's we're just we were in in in
2:01:54
a few very specific ways,
2:01:57
>> we just weren't on the same page.
2:01:59
And then I find this other one and it's
2:02:03
and I think like I don't I was so
2:02:05
skeptical. I'm like I I kind of viewed
2:02:08
like a relationship as like this adjunct
2:02:10
to your life. There's you you're at the
2:02:13
center. M
2:02:14
>> you're doing your [ __ ] and one of the
2:02:16
appendages to your thing is you're
2:02:19
that's what I thought and then now it's
2:02:21
the opposite where I feel like my wife's
2:02:24
at the center and I'm like I would
2:02:26
always kind of like like almost like
2:02:28
laugh at people in the in my mind I'm
2:02:30
like it's not possible that somebody
2:02:31
feels this way about somebody else.
2:02:34
>> Um but it's a huge
2:02:36
enabler. It's very much a gift.
2:02:39
So that can also be a thing that people
2:02:40
look for. You know what I mean? which is
2:02:42
>> I think what you're saying is that
2:02:43
there's a bunch of different things that
2:02:46
have to sort of exist together and that
2:02:49
it's not just completely focus on your
2:02:51
work but that focusing on these other
2:02:54
things enhances the work and then the
2:02:57
work enhances all these other things as
2:02:58
well and they all exist together
2:03:00
>> and my best work is when I'm not
2:03:02
thinking about the attention or the
2:03:05
money. Those are the two most corrupting
2:03:07
influences in my life. When I
2:03:09
look back, when I've
2:03:11
lost the most amount of money
2:03:13
>> or when I've reputationally hurt myself
2:03:15
the most,
2:03:16
>> it's all been because of attention and
2:03:18
money.
2:03:19
>> Those are the only two things the root
2:03:21
cause consistently has been that.
2:03:23
>> That makes sense because you're thinking
2:03:24
about a result rather than the process.
2:03:27
>> Exactly.
2:03:27
>> Yeah.
2:03:28
>> Exactly.
2:03:29
>> So, and then thinking about that result
2:03:30
like oo I'm going to get a lot of
2:03:31
attention from this. Oo, I'm going to
2:03:33
get a lot of money from this. That
2:03:34
actually robs you of the focus that you
2:03:36
need to concentrate on the process.
2:03:38
>> Exactly. And the the thing about the
2:03:40
process is that so much of that when
2:03:44
you're in a flow state you're proud of
2:03:47
>> irrespective of the size of it because
2:03:49
the meetings are the same. Do you know
2:03:51
what I mean? Like
2:03:52
>> you're in the same [ __ ] 35 minute
2:03:54
meeting or 45 minute meeting debating a
2:03:56
product or debating a thing.
2:03:59
But the minute that I start to feel like
2:04:00
embarrassed about company A versus
2:04:02
company B or decision A versus decision
2:04:04
B,
2:04:05
>> now my mind is like, "Okay, hold on a
2:04:07
second here. I'm about to run myself off
2:04:09
the cliff."
2:04:09
>> Yeah.
2:04:10
>> You know, or you know, we I had this
2:04:12
dinner last week and this is what's
2:04:13
amazing. Like
2:04:16
we're talking about poker. Well, I So
2:04:18
I'm having dinner with my wife and a
2:04:20
friend and uh
2:04:23
she's like, "How are you doing?" Just
2:04:25
like a very generic nice question,
2:04:27
>> right? And I go into this long [ __ ]
2:04:29
diatribe of like, well, you know, the
2:04:31
investing thing this, and then I started
2:04:33
this other thing that. And my wife's
2:04:34
looking at me like, what the [ __ ] are
2:04:35
you rambling on about? And then it got
2:04:37
But it got worse, Joe. It got worse. It
2:04:39
got even [ __ ] worse. Then I'm like,
2:04:41
uh, you know, but then I had this poker
2:04:43
game. I started rambling on. It's
2:04:45
normally on Thursdays, but then I I
2:04:46
moved it up to Wednesdays, but then I
2:04:48
moved it up to the city because my
2:04:49
friend's having it. And then I name
2:04:50
dropped who the guy was. And my wife
2:04:53
just looks at me like, "What the [ __ ] is
2:04:55
going on with you?
2:04:57
So, the dinner ends. And then
2:05:00
she's like, "What the [ __ ] is going on
2:05:01
with you?" She's like, "That was
2:05:03
insane." And I had no idea that I was
2:05:07
doing it.
2:05:09
And I'm like, "Okay, we need to put
2:05:10
Humpty Dumpty back together again
2:05:11
because I'm about to go on Rogan and I
2:05:13
can't go off [ __ ] like a crazy wild
2:05:14
man." Uh, but it's a it's it's an
2:05:17
enormous gift. That's been my biggest
2:05:19
unlock in these last like eight or nine
2:05:21
years. Like I I feel like like I'm kind
2:05:23
of like adding skills to my toolkit. I
2:05:25
feel like a golfer like that's like I
2:05:27
can shape shots a little bit now. I know
2:05:29
how to use different clubs.
2:05:31
>> Um and it's all like mindset
2:05:34
>> and it's like it's very much what you
2:05:36
it's like this process-oriented approach
2:05:38
>> and you just can't control the outcome
2:05:41
and that's like a it's a magical
2:05:44
feeling. It's interesting that you're
2:05:46
saying this because like think about
2:05:48
what most people or people that are on
2:05:52
social media like the kind of attention
2:05:55
that they're focusing on. Like this is
2:05:58
why virtue signaling is so unsuccessful,
2:06:01
right? It's so bad for it because it's
2:06:02
fake. Are you really concentrating on the
2:06:04
process? Or are you really concentrating on
2:06:05
the result? The result is getting people
2:06:07
to love you. Exactly. Getting people to
2:06:08
agree with you. Getting and then
2:06:09
worrying about the criticism. Oh my god,
2:06:11
they hate me. Oh my god, they're mad at
2:06:12
my my statement. Oh my god, they're this
2:06:14
and then you're like obsessing on it all
2:06:16
day. People that aren't even anywhere
2:06:17
near you. It's like it's one of the
2:06:19
absolute worst things for mental health
2:06:21
is this addiction that people have to
2:06:24
posting things and then reading the
2:06:25
responses to those posts and getting
2:06:28
wrapped up in these very weird
2:06:30
two-dimensional interactions with human
2:06:33
beings.
2:06:33
>> You never read your comments. I mean,
2:06:34
you're very famous. You're like, it
2:06:36
doesn't [ __ ] matter to me. Well,
2:06:38
you're going to get to a certain point
2:06:39
in time where if you have x amount of
2:06:42
people that follow you, you're going to
2:06:44
have a percentage that are mad at you
2:06:47
and those are the ones you're going to
2:06:48
think about, right?
2:06:49
>> And if you don't self audit, maybe
2:06:51
that's good. Maybe it's good to say
2:06:53
like, "You [ __ ] piece of shit." Like,
2:06:54
"Oh, I'm sorry." You know, like what
2:06:56
your wife saying to you like, "What the
2:06:57
[ __ ] was that?" Like, "Ah, shit." Like,
2:07:00
I am very self-critical. Very like
2:07:03
horribly so. Like, to the point I torture
2:07:05
myself, you know? So, I'm like, I don't
2:07:07
need that from other people. And also,
2:07:09
those people don't love me and they want
2:07:11
me to fail. Like, there's a lot of
2:07:12
people that their lives are very
2:07:14
unsuccessful and I've been way too
2:07:16
fortunate, right? So, it's like there's
2:07:18
a reason to be upset at me if your life
2:07:20
is [ __ ] because I've gotten
2:07:22
three of the best jobs on earth. It
2:07:24
doesn't make any sense, right? So,
2:07:25
there's a reason. And also, why the [ __ ] is
2:07:27
this podcast so successful? That doesn't
2:07:28
make any sense, right? So, it's like I
2:07:31
get it. I understand why people, but I'm
2:07:33
not gonna help them. I'm not gonna help
2:07:35
them bring me down. I'm not gonna
2:07:37
indulge in it and ruin my own mind by
2:07:39
wallowing in their [ __ ] because the
2:07:41
only reason why you would do that in the
2:07:42
first place is if you're not together.
2:07:44
No one who is healthy and happy and
2:07:46
intelligent is going to post mean things
2:07:49
about you. So, you are reading things
2:07:51
from people that are mentally ill,
2:07:53
unhappy, and probably not intelligent. Maybe they're
2:07:56
intelligent in terms of their ability to
2:07:58
solve certain issues and problems. Maybe
2:08:00
they're good at certain skills, but like
2:08:02
their overall grasp of humanity and like
2:08:05
being a good person is not good if
2:08:08
you're [ __ ] on people, especially if
2:08:09
you, like, add ad hominem attacks and just
2:08:12
insults. And
2:08:13
>> so, it's not a good thing to ingest.
2:08:16
It's not. It's like if you go down to the
2:08:17
supermarket, you see Twinkies, oh,
2:08:18
they're right there. Don't [ __ ] eat
2:08:20
them, okay? That's not good for you. And
2:08:23
so it's like I don't think that at a
2:08:25
certain point in time, especially if you
2:08:27
become publicly known and famous, you
2:08:29
should ever read your comments. I don't
2:08:30
think it's good for you.
2:08:31
>> Yeah.
2:08:32
>> But you better be self- auditing or
2:08:35
you'll start sniffing your own farts and
2:08:36
think they smell great. Like don't do
2:08:38
that either.
2:08:39
>> Yeah.
2:08:39
>> But you I know a lot of people that have
2:08:43
gone crazy reading their own comments.
2:08:45
I've met comedians that like they'll
2:08:47
think about it all day long. It will
2:08:49
[ __ ] with them. It will torture them. Well,
2:08:51
their neuroses are what creates
2:08:53
great comedy to begin with. So, if you
2:08:54
feed that neurosis in the wrong way,
2:08:56
you're [ __ ]
2:08:56
>> The wrong way, right? And then also the
2:08:58
self-doubt creeps in because all these
2:09:00
people telling you you suck and they're
2:09:01
like, "Oh my god, I suck." And then you
2:09:03
go on stage with this like, "People
2:09:04
think I suck, they hate me." You can't
2:09:06
do that. Like, if you
2:09:09
have a certain amount of energy in the
2:09:12
day, this is what I always tell
2:09:13
comedians. I said, "Look, think of your
2:09:16
attention and your focus as a unit.
2:09:18
You have a hundred units. If you spend
2:09:21
30 of those [ __ ] units on [ __ ]
2:09:23
online, you're robbing 30 units from all
2:09:26
the things you love. 30 units from your
2:09:28
family, 30 units from your friends, 30
2:09:30
units from your job, 30 units from golf
2:09:33
or poker or whatever it is that you love
2:09:35
to do. You're stealing your own time and
2:09:38
your own focus
2:09:39
>> for losers, right?
2:09:41
>> Like why would you do that? And those
2:09:43
losers are good people. They're just,
2:09:44
they, most people are good people. They
2:09:46
just took a bad path. I would have been that
2:09:48
person. Yeah. Yeah. Look, if you gave
2:09:51
me a [ __ ] Twitter account when I was
2:09:52
16, oh my god, it would have been
2:09:55
horrendous.
2:09:55
>> Yeah. I would have been going crazy.
2:09:56
>> Oh my god. I would have been a terrible
2:09:58
person. It's normal. Especially if your
2:10:00
life sucks and you're not doing well and
2:10:03
you're attacking famous people or you're
2:10:04
attacking this person that's doing
2:10:06
better than you or whatever it is.
2:10:07
>> Like it's Do you uh have you seen the
2:10:09
clips of the [ __ ] maxing?
2:10:12
>> No.
2:10:12
>> You don't know what this is?
2:10:13
>> No.
2:10:13
>> You don't know what this is?
2:10:14
>> No. What's [ __ ] maxing?
2:10:16
>> Oh, this guy is fantastic. He
2:10:19
sits on his back porch. Jamie, can
2:10:20
you just show it? He sits
2:10:23
on his back porch smoking a cigar
2:10:27
basically telling you everything's kind
2:10:29
of [ __ ] Stop thinking about [ __ ]
2:10:31
You know, if you don't like your
2:10:33
friends, leave them. If you don't like
2:10:34
your girlfriend, leave them. Stop
2:10:36
overthinking. Simplify your life. You
2:10:38
know, it's so simple, but
2:10:42
I think it's incredibly
2:10:43
>> Who is this guy?
2:10:44
>> Elisha Long, I think, is his name. I
2:10:46
don't know Jamie if you can find it. I
2:10:47
think Elisha
2:10:48
>> [ __ ] maxing is funny because I know
2:10:49
about looks maxing. We talked about that
2:10:52
recently on a podcast but that's
2:10:53
recently entered into my mind into my
2:10:56
zeitgeist. Looks maxing.
2:10:58
>> That's the vernacular, but I've only
2:11:00
found out about that within the last few
2:11:01
months of my life because I genuinely
2:11:04
stay off social media as much as
2:11:05
possible. And if I do read things, what
2:11:08
I like to do, I like to focus on
2:11:10
fascinating things. Like a lot of my
2:11:12
time I spend looking at YouTube stuff
2:11:14
because YouTube stuff, my algorithm is
2:11:16
all like new black holes they've
2:11:18
discovered, you know, new discoveries in
2:11:21
terms of like what is the fabric of
2:11:23
reality? Like, that's interesting to
2:11:26
me. And if I just concentrate on people
2:11:28
being mean or shitty to each other or
2:11:30
the latest [ __ ] political drama, it's
2:11:32
like
2:11:34
>> what? I don't have much time. I'm busy.
2:11:37
I like things.
2:11:38
>> And
2:11:39
>> are you are you on like Instagram and
2:11:41
Tik Tok?
2:11:41
>> I'm on Instagram. I do not have a Tik
2:11:43
Tok. This is looks maxing. No, this is
2:11:46
[ __ ] maxing. So, let me hear what he
2:11:48
says.
2:11:49
>> Who's this guy? What's his name?
2:11:50
>> Elisha Long.
2:11:51
>> Shout out to Elisha. ...being used as a
2:11:54
poisoning of nostalgia,
2:11:56
but to simply remind you of what you
2:11:59
found it for. And as we grow up, we
2:12:03
often give that up for security. We give
2:12:05
that up so that we're accepted. We give
2:12:07
that up to flex and appear like we have
2:12:10
now figured things out, that people will
2:12:13
accept us.
2:12:15
The only way that you will truly be
2:12:17
successful is if you are righteous and
2:12:21
you live according to your nature and
2:12:23
you play, man, and you don't let people
2:12:25
take play away from you to to be at the
2:12:28
circus and be oohed and aahed and worried
2:12:31
about all the [ __ ] Return to a
2:12:33
state of play.
2:12:35
>> Well, that's very good advice.
2:12:38
Return to [ __ ] max. The best thing
2:12:41
that you could do is return to a state
2:12:42
of play. That's true. There's a lot of
2:12:44
that, you know. There's a lot of that.
2:12:46
Absolutely.
2:12:47
>> Oh, I I I think that that is like a It's
2:12:49
>> It's a wise man for a young fella.
2:12:51
>> Yeah.
2:12:52
>> Okay. He's a jiu-jitsu guy. There you
2:12:53
go. Look, he's getting his [ __ ] blue
2:12:55
belt there. Or he's getting his purple
2:12:56
belt. What is going on there? So, is he
2:12:59
getting his blue belt?
2:13:00
>> Yeah, it's his purple.
2:13:02
>> Yeah. So, they're taking his blue belt
2:13:03
off and putting his purple belt on.
2:13:05
Yeah. See, that's he's learning. He's a
2:13:06
martial artist. That's why.
2:13:08
>> You think martial arts people are just
2:13:10
more like spiritually connected to the
2:13:12
truth? I don't know if it's spiritually
2:13:13
connected to the truth. It's forced down
2:13:15
your [ __ ] throat because you can't
2:13:17
believe you're better than you are if
2:13:19
you're getting mauled every day,
2:13:22
you know? And there's only one way. This
2:13:25
guy's on the path to becoming a
2:13:26
jiu-jitsu black belt. Looks like a
2:13:27
pretty big guy, too. That'll help. Um,
2:13:29
but there's only one way to get a black
2:13:32
belt in jiu-jitsu. You got to train
2:13:33
jiu-jitsu all the time and get better at
2:13:35
jiu-jitsu. You can't pretend you're
2:13:36
better. You know, there's a lot of
2:13:38
people that write poems and they suck
2:13:39
and they think they're so deep.
2:13:41
>> Yeah. this poem
2:13:42
>> meaning like there's just a very simple
2:13:44
objective measurement that says
2:13:45
>> 100%. You either win or you lose. You
2:13:48
either tap or you get tapped out. You
2:13:51
know, you tap somebody or you get
2:13:53
>> Can you get a black belt in some gym
2:13:55
that's easier than a different gym or
2:13:57
something like that or
2:13:58
>> sort of kind of but not really. I mean,
2:14:01
everybody's trying hard. I mean, there's
2:14:03
definitely better gyms where they're
2:14:04
more technical and their program is much
2:14:07
more systematic and they're better at
2:14:09
breaking down skills like how to develop
2:14:11
skills, you know. Um there's definitely
2:14:14
better gyms. Uh there's better schools,
2:14:17
there's better places to learn, but
2:14:18
everywhere you learn, you're going to
2:14:20
have a bunch of people that are trying
2:14:22
hard like and you have a bunch of people
2:14:24
that are trying to learn these. And also
2:14:25
today because of the internet, you could
2:14:28
go on YouTube and there's thousands of
2:14:32
tutorials breaking down new moves.
2:14:34
Jiu-Jitsu is like endlessly complex.
2:14:36
>> One of my one of my kids has ADHD and
2:14:38
one of the things that was recommended
2:14:40
to us was jiu-jitsu.
2:14:41
>> Yeah. What is ADHD, man? It's not even
2:14:42
[ __ ] real cuz I definitely have it.
2:14:44
And I think I think it's a superpower.
2:14:47
>> I think we all have it.
2:14:48
>> I think I look I do not focus well on
2:14:51
things that I think are boring. But if
2:14:52
you give me something that that I love,
2:14:54
I can't I'll I'll play pool for [ __ ]
2:14:56
12 hours in a row.
2:14:57
>> It's crazy. But like the reason I got
2:14:58
back into golf is my seven-year-old gets
2:15:00
on the course and sometimes you can talk
2:15:02
to him and he's not making, you know,
2:15:03
he's like just like in his own world.
2:15:05
>> Exactly.
2:15:05
>> And then you start talking about chess
2:15:07
or jiu-jitsu or whatever and then we get
2:15:10
him on the golf course and this kid is
2:15:12
just dialed in.
2:15:13
>> Yeah. Superpower.
2:15:14
>> And I'm like, "Holy shit."
2:15:15
>> And they say that that's a disease.
2:15:17
That's crazy. Because if you find a
2:15:19
thing that that kid loves, he's going to
2:15:21
excel at it above and beyond most
2:15:23
humans.
2:15:23
>> We uh he does these chess classes and
2:15:26
like look, he's seven. So I'm like, "All
2:15:28
right, [ __ ] Bring it. [ __ ]
2:15:30
[ __ ] destroy you. I'm going to
2:15:31
[ __ ] ball you." And uh we're playing
2:15:35
last weekend and he goes, "Oh, Dad, you
2:15:38
know you can't castle out of check." I'm
2:15:40
like, "Shut the [ __ ] up. I know how this
2:15:42
game works." And I go on to beat him and
2:15:44
I and I went to my wife and I'm like,
2:15:47
he's six weeks away from beating me. I
2:15:48
got to go.
2:15:51
And then I spent I spent two days I
2:15:53
spent two [ __ ] days on YouTube and I
2:15:56
was like, "Okay, I got to brush up on my
2:15:58
openings and I got to I got Oh my god, I
2:16:00
don't have time for this [ __ ]
2:16:01
>> But I can't let this sevenyear-old beat
2:16:03
me.
2:16:05
>> You know what I mean? You're going to
2:16:06
have to."
2:16:06
>> And I'm like And I was like, "How do I
2:16:08
how do I stall this until maybe he's 10
2:16:10
or 11?" Then it's like, "Okay, fine. You
2:16:11
finally beat me. Congratulations.
2:16:13
>> You have to think of him as an extension
2:16:15
of you and be happy when he does.
2:16:16
>> Oh my god.
2:16:17
>> Yeah. That's just how it is. Look, if
2:16:19
you're a man and you have a son, I I
2:16:21
have all daughters, but if I had a son,
2:16:24
I would be legitimately terrified that
2:16:26
he'd be able to tap me. Because if I had
2:16:29
a son, one of the first things that I
2:16:31
would do is get them into it. I got my kids
2:16:32
involved in martial arts at an early
2:16:33
age, but I didn't force them to keep
2:16:35
doing it. They did it for a certain
2:16:36
amount of time, and then they went on to
2:16:37
do a bunch of other things that they
2:16:38
enjoy better,
2:16:39
>> which is fine. But I think it's good to
2:16:41
learn some skills, learn how to defend
2:16:43
yourself so you're not completely lost.
2:16:46
Just it's I think it's good for you.
2:16:47
It's good to learn. It's good to develop
2:16:49
confidence. But for boys, I think it's
2:16:51
critical, you know, especially boys with
2:16:53
my kind of DNA. I'm like, I think it's
2:16:55
good to get that [ __ ] out of your
2:16:56
system. But if I had a son, there'd be a
2:16:58
certain point in time. I'm like, it's a
2:16:59
matter of time before this [ __ ]
2:17:01
can kill me. You know, it's like I mean,
2:17:04
I'm 58 years old. If I had a 20-year-old
2:17:06
kid, like, he'd probably kill me.
2:17:07
>> Kick your ass.
2:17:08
>> Probably [ __ ] kill me.
2:17:09
>> He'd kick your ass. Yeah. It's like what
2:17:11
am I going to do? There's nothing you
2:17:12
can do. You just have to accept it and
2:17:13
then hope your relationship with him is
2:17:15
strong enough that he still respects you
2:17:16
even though he can kill you because it
2:17:18
can't be entirely based on that. Look, there's a
2:17:21
lot of martial arts instructors that are
2:17:23
old and they're revered and respected
2:17:25
and nobody wants to try to hurt them,
2:17:27
right? Because you realize if you learn
2:17:30
enough, you get to a certain point in
2:17:31
time, you realize like
2:17:32
>> I'm a much better dad to my sons than I
2:17:35
am to my daughters. And I mean this in the
2:17:36
following way. My daughters have the run
2:17:38
of the place. Whatever they want, I'm in
2:17:40
love with them. I don't love them. I'm
2:17:41
in love with them. Whatever they need,
2:17:43
what they can just
2:17:44
>> enamored by.
2:17:45
>> They're just like they can control me.
2:17:47
They just kind of send me in one
2:17:48
direction or another. I'm just like
2:17:49
they're I'm like,
2:17:50
>> by the way, they know that, too.
2:17:51
>> I'm enslaved by them.
2:17:52
>> Yes.
2:17:53
>> You know, and I just want their
2:17:54
attention. Any small little shred, I'm
2:17:56
like,
2:17:57
>> But your sons, you keep them in check.
2:17:59
>> Whereas like my sons, I keep the I'm
2:18:01
doing everything that I was supposed I
2:18:02
think I'm supposed to be doing. Now, the
2:18:04
good news is my, you know, daughters are
2:18:06
just different. Like, they're girls.
2:18:07
They're just, so they don't need the
2:18:09
same kind of like tough loveish,
2:18:11
>> right?
2:18:12
>> You know, but then my boys reveal their
2:18:14
characteristics in ways that really
2:18:16
surprise me. And I'm just like, man,
2:18:17
this is so [ __ ] awesome. Parenting
2:18:19
has been the best. Like, when I again,
2:18:21
like slowing down and actually being in
2:18:23
it,
2:18:24
>> and I'm like, [ __ ] this is amazing.
2:18:25
>> It is pretty amazing. And watching your
2:18:28
kids get really good at things is really
2:18:30
fascinating. It's fascinating. I told
2:18:31
you this story before, but like you know
2:18:34
my son, my oldest son, this is my
2:18:36
17-year-old. He's just a great kid.
2:18:40
He goes and he's like, "Okay, I'm
2:18:41
applying for college." And I'm like,
2:18:43
"Great. Let me take you to the Naval
2:18:44
Academy, West Point. Let me show you
2:18:46
these service academies." And he sees those
2:18:48
and he's like, "These are incredible."
2:18:49
But then he's like, "I think I want to
2:18:50
go to like, you know, Georgetown or
2:18:52
Vanderbilt or whatever." And I'm like,
2:18:54
"Hey man, that's like um just a bigger
2:18:56
version of your high school and
2:18:58
whatever. if that's what you want to do,
2:19:00
you do you. And you know, um, but you
2:19:04
know, I'll help you like kind of
2:19:07
get to the starting line here, but
2:19:09
you're on your own. And he had to get a
2:19:11
job because I'm like, if you're going to
2:19:12
get into these schools, you got to get a
2:19:13
job. And so he tries to
2:19:16
last summer, I just started [ __ ]
2:19:18
screaming at him. And I'm like, you
2:19:21
[ __ ] louse, you haven't done
2:19:22
anything. And this is at like another
2:19:24
kid's, at our son's birthday
2:19:25
party. I scream at him. He starts
2:19:28
crying. I'm like, "You need to do more."
2:19:30
Then my wife screams at him. He starts
2:19:33
crying again. Then my ex-wife screams at
2:19:35
him. He starts crying again.
2:19:38
And he just goes, "I'm out of here." He
2:19:40
walks out. Meanwhile, I start panicking
2:19:43
and I'm like, "I got to tiger dad this
2:19:44
situation." So, I start texting a few
2:19:47
friends trying to figure out, "Hey, can
2:19:48
I, you know, do you guys want to hire
2:19:49
this kid? He's like really, you know,
2:19:51
he's a pretty smart kid. Did all this
2:19:52
stuff in robotics, yada yada." One of
2:19:55
them says, "I'd be willing to interview
2:19:57
him." I call him and he's like, "Dad, I
2:20:01
got a job." I said, "What do you mean,
2:20:02
you got a job?" He said, "I went around
2:20:05
downtown,
2:20:07
went to all these places and I was in a
2:20:08
McDonald's and um the woman was having a
2:20:12
little bit of difficulty speaking
2:20:13
English, so I just spoke to her in
2:20:14
Spanish. I got the application. I sat
2:20:16
down at the desk and the guy having
2:20:17
lunch beside me said, "Hey, I heard you
2:20:20
needed a job and uh I really like the
2:20:23
way you talked to this woman. I'm the
2:20:25
general manager of the car wash down the
2:20:26
street. Come and work for me."
2:20:29
And I said, "Well, what are you going to
2:20:30
do?" He goes, "I'm going to go work
2:20:32
there." And I said, "Okay, well, I got
2:20:34
this other interview for you as well, so
2:20:35
you should see maybe you can do both."
2:20:37
Anyways, the end of the story is he did
2:20:39
he did these two jobs. He worked at a
2:20:40
robotics firm, but then he worked at a
2:20:42
car wash. And when I tell you this
2:20:44
story, I am so proud of this kid because
2:20:45
of the car wash. Because that car wash
2:20:48
thing, he was he would come home and
2:20:49
he's like, "Man, you have no idea how
2:20:51
people live." And I'm like, "What do you
2:20:52
mean?" He's like, "The stuff that I find
2:20:54
in the trunk when I have to vacuum these
2:20:56
cars and clean out the cars." And I'm
2:20:58
like, "Bro, that is a gift. You have
2:21:00
been given a [ __ ] gift. That is the thing
2:21:02
that, if you take it with you, you'll be
2:21:04
golden the rest of your life." Because
2:21:06
all this other [ __ ] is all kind of
2:21:07
manufactured. I help because I'm
2:21:09
anxious. I'm insecure. Mhm.
2:21:11
>> But that [ __ ] you did on your own. And
2:21:13
that thing is what people will [ __ ]
2:21:14
respect when push comes to shove.
2:21:16
>> It's also jobs that suck are really good
2:21:18
for you.
2:21:18
>> So good. I used to work at Burger King
2:21:20
when I was 14.
2:21:22
>> Man, let me tell you.
2:21:23
>> You were 14 and you had a job.
2:21:26
>> When my dad had to stay behind... like, my
2:21:30
dad was a diplomat in the
2:21:32
embassy of Sri Lanka in Canada. This
2:21:35
[ __ ] war in Sri Lanka is crazy. He
2:21:37
writes this essay. His life is
2:21:38
threatened. So he files for refugee
2:21:41
status. He gets it.
2:21:44
He gets kicked out of the embassy. So he
2:21:47
doesn't have a job. My mom becomes a
2:21:48
housekeeper
2:21:50
and we're kind of toiling in this
2:21:51
poverty cycle. So at 14, I had
2:21:54
to get a job and I would take the money
2:21:56
and, you know, I'd buy the bus
2:21:58
passes. I would buy some of the
2:21:59
groceries. We were just trying to make it all
2:22:01
work right. And uh I got a job at the
2:22:04
Burger King.
2:22:05
This is another example where I was
2:22:08
like, I'm going to go get a job. Hey,
2:22:10
can you drive me to the interview? And
2:22:12
my dad's like, "No,
2:22:16
get on your [ __ ] bicycle and go." And
2:22:18
I thought, "Bro, we need this. You need
2:22:20
the money more than I do. Why are you
2:22:22
making me bicycle?" But I bicycled and I
2:22:25
got the job and I worked there. And I
2:22:27
used to work the night shift.
2:22:28
14-year-old kid, man. Wow.
2:22:30
>> From [ __ ] 8 till 2 in the morning.
2:22:32
And I would have to clean the place, like, 8:00
2:22:34
p.m. to 2 in the morning.
2:22:34
>> Then you had to go to school in the
2:22:35
morning.
2:22:36
>> No, then I this was always like Friday,
2:22:38
Saturday, Sunday.
2:22:39
>> Wow.
2:22:39
>> Thursday, Friday. Sorry. Thursday,
2:22:40
Friday, Saturday. And then, yeah, some
2:22:42
days I would have to go to school. But
2:22:44
And why did I work until two? Because
2:22:46
when the restaurant closes,
2:22:48
you get whatever the food is left over,
2:22:50
right? So like you get a couple chicken
2:22:52
sandwiches, you get like the, you know,
2:22:54
the the version of the McNuggets that
2:22:56
Burger King had, a couple whoppers, and
2:22:58
you take them home.
2:23:02
But the amount of vomit that I had to
2:23:04
clean up in the bathroom, you can't
2:23:07
imagine, man. A downtown Burger King
2:23:10
near bars, you know, after closing time,
2:23:14
the [ __ ] you see.
2:23:16
>> Oh, wow.
2:23:16
>> And the [ __ ] you deal with. And all I
2:23:19
could think of was like, I just want to
2:23:20
get the [ __ ] out of here. But that was
2:23:23
so valuable for me.
2:23:24
>> Yeah,
2:23:25
>> that was so valuable for me. Um,
2:23:29
and then I worry that my, you know, kids
2:23:31
don't get exposed to it, but when my son
2:23:32
got it, maybe I'm overimposing too much
2:23:34
about it, but it's like I'm like, man,
2:23:36
that that car wash thing is really going
2:23:38
to be the thing that separates you in
2:23:40
life.
2:23:40
>> Yeah. Doing something that sucks. It it
2:23:42
also
2:23:43
>> just being humble and grinding through
2:23:44
that [ __ ] you know? Do you realize like
2:23:47
sometimes people don't pick
2:23:49
a path and they just have a job and they
2:23:52
don't like it and they stay with this
2:23:54
thing they don't like forever and that's
2:23:56
not what you want.
2:23:58
>> No,
2:23:58
>> it's not what you want. But the
2:24:00
development like the learning how to do
2:24:03
something that sucks and grinding
2:24:04
through it
2:24:05
>> and still doing it well.
2:24:06
>> Yeah. You know, doing it well.
2:24:08
>> Make a Whopper. Be there on top.
2:24:09
>> I know how to [ __ ] make a Whopper.
2:24:11
>> Yeah.
2:24:11
>> Do you know what I mean?
2:24:12
>> Yeah.
2:24:13
uh make the fries, change the oil, all
2:24:15
that [ __ ]
2:24:16
>> And then when you apply that those
2:24:18
lessons to something you actually love
2:24:21
and you work hard at something you love,
2:24:23
>> magical.
2:24:23
>> Oh, it's incredible. It's a real gift.
2:24:26
>> It's a real gift.
2:24:27
>> Yeah. I mean, you know, some people they
2:24:29
don't appreciate the process, you know,
2:24:32
and it's hard to because like when
2:24:34
you're young and you're going through
2:24:36
these difficult jobs and these things
2:24:37
that suck and you don't know how it's
2:24:39
going to turn out, you know, and a lot
2:24:41
of times people aren't really educated
2:24:42
in what a process actually is and about
2:24:45
how it does develop character and it
2:24:47
does develop discipline and that these
2:24:49
things are actual skills that you can
2:24:51
apply to other things in life. You just
2:24:54
think, "God, I'm a [ __ ] loser." I
2:24:55
have a visual for this. I
2:24:58
always ask myself, am I in the engine
2:24:59
room right now?
2:25:01
>> This is my way of saying like an engine
2:25:02
room is a little hot. It's a little
2:25:04
uncomfortable, but it's where all the
2:25:07
[ __ ] is happening. It's where the [ __ ]
2:25:08
is being made.
2:25:10
>> And so I'm like, it's a little, you
2:25:12
know, discomforting,
2:25:13
>> but I got to be in there. And there are
2:25:15
days where, and there'll be weeks
2:25:17
>> where that's all I do. I'm just in it,
2:25:20
you know? I don't I'm not good at
2:25:21
responding to emails sometimes or
2:25:23
whatever because there'll just be weeks
2:25:24
where I'm in it and it's an incredible
2:25:27
visual for me because I'm like yeah this
2:25:29
is like where like I'm grounded
2:25:31
>> and I like feel myself and then when I
2:25:33
I look at, like, my health
2:25:37
that's when like I just feel like really
2:25:39
good about myself like not insecure and
2:25:43
my vitals are different like it's crazy
2:25:45
like my [ __ ] HRV like my HRV
2:25:49
craters
2:25:51
when I'm like just like you know
2:25:55
insecure
2:25:56
>> of course
2:25:57
>> but why is that? Like, your
2:25:59
heart rate variability should have
2:26:01
nothing to do with your like
2:26:03
>> disposition and your mood
2:26:05
>> well your mind is the idea that your
2:26:08
mind is separate from the body is crazy
2:26:10
>> it's crazy
2:26:10
>> it's not and
2:26:12
>> but is your HRV lower when you're just
2:26:14
out of sorts
2:26:15
>> yes probably right
2:26:16
>> I'm sure yeah I don't really monitor it
2:26:18
that much. Yeah.
2:26:20
>> And I'm I try not to ever get out of
2:26:21
sorts, too. And one of the ways that I
2:26:24
keep from getting out of sorts is daily
2:26:26
discipline. Like, I'm sure I'd get
2:26:28
out of sorts if I had a few
2:26:30
days in a row where I didn't
2:26:32
work out, but I work out almost
2:26:34
every day. And if I'm not working out,
2:26:36
I'm still cold plunging and going to the
2:26:38
sauna and stretching. I'm always doing
2:26:40
something. And if I don't do something,
2:26:42
I feel like I'm [ __ ] up. And then
2:26:44
then I can
2:26:45
>> So, does it matter what it is? Meaning,
2:26:46
as long as it's a routine.
2:26:48
>> Yeah. Well, I I do it all myself. I
2:26:50
don't have a trainer, but I write things
2:26:52
down. I write down what I want to
2:26:53
accomplish. I write down what I'm going
2:26:54
to do, and then I just do it. I like a
2:26:57
robot force myself to do it.
2:26:59
>> Yeah.
2:27:00
>> And then I always feel better after it's
2:27:01
over. And it's always the hardest part
2:27:03
of my day.
2:27:03
>> Yeah. And so it makes everything else so
2:27:05
much easier because I [ __ ] work
2:27:07
out hard
2:27:08
>> and so everything else is pretty easy,
2:27:11
>> you know, because the strain like just
2:27:12
being in that [ __ ] cold water or just
2:27:15
going through Tabatas on an Airdyne bike
2:27:18
like this shit's hard. It's really hard.
2:27:20
Like I could die right now hard. And so
2:27:22
everything else is like how how hard's
2:27:24
it going to be? Oh, it's uncomfortable.
2:27:26
Oh boohoo, you know? Like I think it's
2:27:28
important to go through that. I I really
2:27:31
think it is, you know? I really think it
2:27:33
is, and that's the difference
2:27:35
between,
2:27:37
>> you know, sanity and like having a very
2:27:39
slippery grip on your your own personal
2:27:43
sovereignty. I think a lot of it is like
2:27:46
you have to choose it. It has to be like
2:27:50
elective
2:27:52
voluntary adversity. Like you have to
2:27:55
choose to do it.
2:27:56
>> Yeah. I that's a really great way of
2:27:57
saying it. Voluntary adversity. If it's
2:27:59
forced upon you,
2:28:00
>> you can kind of compartmentalize
2:28:02
>> and then you get angry, like, "[ __ ] this,"
2:28:03
bitter and resentful, making me do stupid
2:28:05
[ __ ] But if you force yourself to do
2:28:07
it, you know,
2:28:08
>> This is why these special forces guys are
2:28:09
such [ __ ] animals.
2:28:10
>> Of course,
2:28:11
>> they're choosing,
2:28:12
>> right? Exactly. And they develop that,
2:28:15
you know, this mentality when you're
2:28:17
around other people that are also
2:28:18
savages. You know, you just you realize
2:28:21
like there's other people out there in
2:28:22
the world that are not making excuses.
2:28:25
they are getting after it every day and
2:28:27
they are pushing every day. And the more
2:28:29
you can surround yourself with people
2:28:30
like that, the better. The people
2:28:32
that complain about nonsense and
2:28:34
find excuses and focus on other people
2:28:37
and [ __ ] about things and why is she
2:28:40
doing this? Why is this happening for
2:28:42
him?
2:28:44
>> Yeah,
2:28:44
>> it's loser mentality. And if you're
2:28:46
around more winners, you know, you
2:28:48
absorb that. You imitate your
2:28:49
atmosphere. Yeah, it's very important
2:28:50
and it's very hard for people,
2:28:52
especially young people, to find
2:28:55
positive influences and to find positive
2:28:58
groups. And I think it's one of the
2:29:00
reasons why a lot of young people
2:29:02
gravitate towards podcasts because they
2:29:03
get to hear interesting conversations
2:29:05
with really accomplished people that are
2:29:07
fascinating that are unlike anybody that
2:29:09
they're around on a daily basis,
2:29:12
>> you know. And that that's also one of
2:29:13
the reasons why it's important to find
2:29:15
some. That's why martial arts are so good
2:29:17
for young people because you're around
2:29:19
other people that are doing this really
2:29:21
difficult thing and other sports too
2:29:23
whether it's football or wrestling
2:29:24
whatever it is.
2:29:25
>> I actually found like you know the last
2:29:26
few years I go out of my way to not
2:29:28
isolate myself. That's one thing like
2:29:29
being around other people engaging in
2:29:31
things.
2:29:32
>> Yes.
2:29:32
>> Has been really healthy for me.
2:29:33
>> Oh for sure.
2:29:34
>> Oh my god. And I just found like what
2:29:36
the [ __ ] am I doing? It's like
2:29:37
everything is in my little house by
2:29:38
myself, and everything comes
2:29:40
to me. It's so odd.
2:29:42
>> It's odd. It's really odd. Very
2:29:43
unhealthy.
2:29:44
>> And it starts to [ __ ] you up in the
2:29:45
mind.
2:29:45
>> And then your interaction with humans is
2:29:47
only on the internet. It's terrible. You
2:29:49
know,
2:29:50
>> or with people that are sycophantically
2:29:51
either being paid or need something from
2:29:53
you.
2:29:54
>> Yeah.
2:29:55
>> And then I think you're in a really bad
2:29:56
place.
2:29:57
>> Absolutely.
2:29:57
>> Whereas like if you're in the grind with
2:29:59
other people, they're beating you at
2:30:00
things. It's great.
2:30:01
>> Yeah. Yeah, if you're in a situation
2:30:03
where there's a bunch of sycophantically
2:30:04
connected people to you and they're just
2:30:06
all kissing your ass and I mean we all
2:30:08
know people that are like the heads of
2:30:09
companies and that are just like [ __ ]
2:30:11
tyrants.
2:30:12
>> I think the trap about being
2:30:14
successful, because it's not everything
2:30:16
it's cracked up to be, is exactly that.
2:30:17
You become so isolated that you become
2:30:20
this, like, very caricatured version of
2:30:22
yourself
2:30:23
>> because you forget what it's like to
2:30:25
just a basic example like wait in line,
2:30:28
be kind to other people, be polite, like
2:30:30
be accommodating, have some empathy,
2:30:32
>> right?
2:30:32
>> Where are you put in that situation to
2:30:34
do those things,
2:30:35
>> right? You forget that you're just a
2:30:36
person.
2:30:36
>> You're just a [ __ ] person.
2:30:38
>> And if you're trying to
2:30:40
achieve this level of success so you
2:30:42
elevate past being a person, you're
2:30:44
missing the point. Like, you're never
2:30:46
going to, and if you do, it'll come at a
2:30:50
price.
2:30:50
>> I thought being successful was supposed
2:30:52
to right all the wrongs that I felt like
2:30:56
I missed
2:30:58
and it turns out nobody gives a [ __ ]
2:31:00
>> No.
2:31:01
>> And it does none of that.
2:31:02
>> I think it's all the process. All of
2:31:06
life is the process.
2:31:07
>> I agree.
2:31:08
>> I think as soon as you think that
2:31:10
there's a goal like, "Oh, I'm going to
2:31:11
retire and experience my golden years."
2:31:13
I think it's all horseshit. And that's
2:31:15
one of my main fears about AI.
2:31:19
One of my main fears about this
2:31:21
idea of universal high income and
2:31:24
everyone's going to have, you know,
2:31:25
ultimate abundance. It's like where does
2:31:27
anybody find purpose and meaning? And
2:31:30
where do you take whatever
2:31:34
this thing is that the mind is
2:31:37
constructed of, these needs that
2:31:39
the mind has that have to be satisfied
2:31:42
in order to achieve sanity? In order to
2:31:45
achieve some sort of like place where
2:31:48
you can be at peace,
2:31:49
>> fulfillment.
2:31:49
>> Yeah, you're going to
2:31:51
have to do something, man. You're going
2:31:52
to have to do something. And I mean,
2:31:54
maybe it could just be jiu-jitsu and
2:31:57
golf and find some stuff that you enjoy
2:31:59
doing and take some benefit in that. But
2:32:03
boy, that's not been the case for
2:32:06
hundreds of years. You know, that's not
2:32:08
how human beings have existed. I mean, but
2:32:11
also part of me says, why do we have to
2:32:14
work to find those things? Why can't we...
2:32:17
why is it all that?
2:32:20
>> Well, you got to find the thing that's
2:32:22
not work,
2:32:23
>> right? But what I'm getting at is, like,
2:32:25
why is our identity all tied up in money
2:32:30
and
2:32:32
just things and objects and stuff. And
2:32:36
this is a fairly new thing in human
2:32:38
society, right?
2:32:40
>> Why can't it transform into
2:32:44
like, your basic needs are all met?
2:32:47
Like nobody ever has to worry about
2:32:48
starving again. Nobody ever has to worry
2:32:50
about not having a home to sleep in.
2:32:52
Nobody ever has to worry about not
2:32:53
having healthcare. Nobody ever has to
2:32:55
worry about not having education. So
2:32:56
then it becomes find a purpose with your
2:32:59
life. And as a society, can we adjust?
2:33:03
Can we gravitate towards a new way of
2:33:07
existing in meaning? And it would
2:33:08
probably be great. In one way, it'd be
2:33:11
great because we wouldn't have to be
2:33:14
constantly thinking, why does he have
2:33:15
that and I don't have that and this and
2:33:16
that. Instead, it would probably be
2:33:19
like, what can I do to get better at the
2:33:21
thing that I love, right?
2:33:23
>> What you know, and
2:33:24
>> or let me be a part of a project to do
2:33:26
something that seems implausible,
2:33:28
>> but I feel like I'm in the engine room
2:33:30
every day. This is great. I'm toiling
2:33:31
with these guys. Probably not going to
2:33:34
work. Some crazy convoluted thing that
2:33:36
has a .01 chance of success
2:33:40
>> that can captivate a lot of people.
2:33:42
>> Yes. You know,
2:33:43
>> the process.
2:33:43
>> The process.
2:33:44
>> Yeah.
2:33:45
>> The process.
2:33:45
>> The process is everything. And there's
2:33:47
no... I used to, like, think backward.
2:33:49
>> There is no attention in the process,
2:33:50
>> right?
2:33:51
>> There's only attention in the outcome,
2:33:53
>> right?
2:33:54
>> Do you see what I mean?
2:33:55
>> Absolutely.
2:33:56
>> Which is another clue and a secret that
2:33:58
that's actually where you should be
2:33:59
focused.
2:33:59
>> Well, you might get attention, but
2:34:00
that's not what you want. What you want
2:34:02
is the process to work out. You you want
2:34:04
to get better at whatever it is you're
2:34:05
doing and get that thing to a better
2:34:07
place than it is right now currently.
2:34:09
Right. That's what you're thinking of.
2:34:10
You're not thinking of I am going to get
2:34:12
all this attention. I'm going to be on
2:34:14
the cover of a magazine.
2:34:16
Yeah. It can't be that. That's not
2:34:19
good for anybody. But everybody thinks
2:34:21
that's what they're going to get. Oh,
2:34:23
I'm going to get this.
2:34:23
>> Everybody thinks that's what they want.
2:34:25
>> Yeah. Right.
2:34:27
>> And the problem with that is that it's
2:34:29
not what you want.
2:34:30
>> No.
2:34:31
And then now we're going to completely
2:34:33
upend
2:34:34
potentially all of that.
2:34:38
>> Yeah. Well, maybe
2:34:40
it'll coincide with the hive
2:34:43
mind technology.
2:34:44
>> This hive mind thing actually that you
2:34:46
say I find very compelling because this
2:34:48
idea of like how do you govern an AI?
2:34:53
Each of us individually are not capable,
2:34:55
but I think you, me, like 10,000 or 100
2:34:57
thousand people working together.
2:35:01
The question is, are we smarter?
2:35:03
And I think there's a reasonable chance
2:35:05
that that could be true. And then the
2:35:07
other version of the hive mind is here
2:35:08
are all these like crazy ideas that
2:35:10
would just make the world incredible.
2:35:13
And a group of a thousand people go off
2:35:15
and they kind of jointly work on that
2:35:17
together. That I find super fascinating.
2:35:20
Like I that could be it. Like it could
2:35:22
be like, you know, a thousand physicists
2:35:24
are like, "We're going to create this
2:35:26
new interstellar form of
2:35:27
transportation." And they just go off
2:35:28
and they're just like they don't have to
2:35:30
worry about
2:35:32
existing because all of that's paid for.
2:35:34
>> Well, it also could solve all of our
2:35:36
problems that we have with, like, haves
2:35:39
and have-nots. If we're all one, how
2:35:42
could we tolerate have-nots? How could we
2:35:44
tolerate people living on dirt floors in
2:35:46
third world countries with no access to
2:35:48
clean water? We wouldn't tolerate it.
2:35:50
>> because they would be us and we would
2:35:52
understand that.
2:35:52
>> Yeah.
2:35:53
>> I mean it could be like a complete
2:35:55
game-changer in terms of human civilization.
2:35:58
It could really move people into a
2:35:59
complete next direction. I mean it could
2:36:01
eliminate crime and violence. Yeah.
2:36:03
>> Which sounds insane. Like boy that's so
2:36:06
utopian. Like oh why don't you suck on
2:36:08
some crystals you [ __ ] hippie. But
2:36:10
legitimately if look if everybody has a
2:36:13
cell phone which essentially everybody
2:36:14
does right right now in this time and
2:36:16
age. If we get to a point where
2:36:18
everybody is connected, everybody is
2:36:21
hive mind connected,
2:36:24
you're not going to just be
2:36:26
able to drive by a homeless encampment,
2:36:28
>> right?
2:36:28
>> You won't you'll feel it. You'll feel
2:36:31
it. It won't be like, "Hey, you [ __ ]
2:36:32
losers. Hit the gas." It's going to be
2:36:35
like, "We need to solve this. We need to
2:36:37
get these people counseling, mental
2:36:39
health crisis, get them off the drugs,
2:36:41
whatever it is that's wrong with them."
2:36:44
>> I mean, that's an incredible idea.
2:36:46
>> Yeah. You know, like when an airplane
2:36:48
kind of like goes like this and your
2:36:49
stomach goes and you just feel it.
2:36:51
>> Could you imagine like you drive by a
2:36:53
homeless encampment and that's what you
2:36:54
feel? Like you feel like
2:36:56
>> something's wrong.
2:36:56
>> And we'll all feel it collectively. If
2:36:59
we're all connected and we all feel
2:37:01
things collectively, we will actively
2:37:03
work together to solve these problems.
2:37:05
And if we're dealing with a if if we
2:37:07
really get to a point of abundance, like
2:37:09
true abundance, where resources are not
2:37:12
an issue and no one's starving, we could
2:37:15
really fix all the problems that like
2:37:18
>> none of them are insurmountable. None of
2:37:20
them are breathing underwater, right?
2:37:22
None of them are flying to the sun. None
2:37:24
of them, right? So all of them are
2:37:26
things that could be solved if we took all the
2:37:29
world's resources. Socialism doesn't
2:37:31
work, right? Why does it not work?
2:37:33
because it rewards lazy people and it
2:37:34
punishes ambitious people. It
2:37:36
doesn't work with human nature,
2:37:38
but it would work if you have [ __ ]
2:37:40
hive mind. If we all understand
2:37:43
what it means to put in effort, we all
2:37:45
understood what each other are
2:37:46
feeling and thinking,
2:37:47
>> right?
2:37:48
>> And we all compiled resources and fixed
2:37:51
all of our social problems. Like
2:37:53
literally stop all wars, stop all crime,
2:37:57
stop all violence, stop all poverty.
2:38:00
Done. And then what do we do? We work
2:38:03
together to solve whatever the [ __ ] else
2:38:04
is wrong with society.
2:38:06
>> Well, it's more like what is left over
2:38:08
that we haven't figured out.
2:38:10
>> Think about what the world was like
2:38:12
before the internet. It's almost
2:38:13
impossible to imagine, but we both grew
2:38:15
up without it.
2:38:16
>> Yeah.
2:38:17
>> And Yeah. And so we're entering into
2:38:19
this new world. Think about what the world
2:38:22
was like without the hive mind. And yet
2:38:24
we all grew up without it. Like that
2:38:26
might be the next thing. The thing that
2:38:28
I remember the most about that era is I
2:38:32
had a positive sum view of everybody
2:38:35
>> really.
2:38:36
>> Meaning there weren't... like, the bad
2:38:39
actors were pretty bad, but yeah,
2:38:41
generally like I looked up to most
2:38:44
business people like the people that I
2:38:45
now I feel like have been a little bit
2:38:47
unmasked, then, to me, were pristine.
2:38:49
>> Oh that's interesting.
2:38:50
>> Like the Bill Gateses of the world, you
2:38:51
know. I was like man I really aspire to
2:38:53
be Bill Gates when I was like 13 or 14.
2:38:56
It just seemed like
2:38:57
>> now you're like, why is he buying all
2:38:58
the farmland? This [ __ ] weirdo.
2:39:00
>> I mean, it's [ __ ] so funny. He uh
2:39:03
he bought this like 45,000 acres or
2:39:07
4,500 acres, I can't get the order of
2:39:09
magnitude right, uh in Phoenix to build his
2:39:11
own digital city.
2:39:13
>> Yeah.
2:39:14
>> Okay.
2:39:14
>> It's like weird. So, I bought the 1700
2:39:16
acres beside him.
2:39:20
>> That's hilarious.
2:39:21
>> [ __ ] you.
2:39:22
>> It's a very odd thing. It's a very odd
2:39:25
thing when people get exposed and you
2:39:27
just go like what the [ __ ] is that guy
2:39:29
really all about
2:39:30
>> and but also like how isolated is he?
2:39:33
>> Oh, he's been he's been isolated for 50
2:39:35
years,
2:39:36
>> right? Like who are his friends and how
2:39:38
how many people does he have?
2:39:39
>> Must be very hard to be him actually. I
2:39:41
I mean
2:39:41
>> especially now that he's divorced,
2:39:42
right? So now he's got no one going,
2:39:45
"But that speech [ __ ] sucked."
2:39:46
>> Yeah, he's I mean he has a long-term
2:39:48
partner. Um she seems like a lovely
2:39:51
woman. Um, but yeah, it's just got to be
2:39:54
super lonely.
2:39:54
>> It's got to be.
2:39:56
>> It's not... to me it's not worth that
2:39:58
level of... I don't even know what it is.
2:40:01
It's like material success at least
2:40:03
measured in the outside world. I don't
2:40:04
know what it is, but it's not
2:40:06
it's a lot, man. This is like I like I
2:40:08
don't know how Elon does it. Like it's a
2:40:10
lot. It's super isolating.
2:40:11
>> Yeah.
2:40:12
>> It's it's just he's very by himself.
2:40:16
>> Mhm.
2:40:17
>> And he's going to be even more isolated
2:40:19
in a matter of a few months.
2:40:20
>> Yeah. And that's unfortunate because you
2:40:23
have very empathetic, very kind of like
2:40:25
sensitive people like that I think need
2:40:26
other people.
2:40:28
>> Well, he's got people around him, but
2:40:30
he's got very few people around him that
2:40:33
can kick reality at him.
2:40:35
>> Yeah.
2:40:35
>> You know, that that is a bit of a
2:40:37
problem, but he still seems to be having
2:40:40
fun. Every time I'm around him, we have
2:40:41
a bunch of laughs. Like, he's fun to
2:40:43
hang out.
2:40:44
>> He's got an incredible sense of humor.
2:40:45
We um Jamie and I went down uh to one of
2:40:49
the rocket launches at SpaceX. Yeah, we
2:40:51
went down there. Crazy.
2:40:52
>> and we watched from the ground while it
2:40:56
took off, which is incredible cuz it's
2:40:57
like how far was it, Jamie? It was like
2:40:59
two miles away from us.
2:41:00
>> A mile mile and a half.
2:41:01
>> So like it's a mile and a half. You feel
2:41:03
it in your chest. Have you been when a
2:41:05
when a rocket launches? You been there,
2:41:06
dude? It's bananas. The [ __ ] thing
2:41:08
like first of all, it doesn't look that
2:41:09
far. It looks like it's like
2:41:12
>> maybe a quarter mile. I don't... I'm just
2:41:14
not good at judging.
2:41:15
>> This is a starship.
2:41:16
>> Oh yeah. So you feel it.
2:41:20
Like, his kids started crying, like, "We
2:41:22
want to go inside." Like, it's disturbing,
2:41:24
like the amount of energy that's coming
2:41:27
out of these [ __ ] rocket boosters.
2:41:29
And then I hung out with him in the
2:41:31
command center while the rocket was
2:41:33
flying through space and we're watching
2:41:34
it on all these monitors and then lands
2:41:37
in the water in Australia. He's cracking
2:41:39
jokes the whole time because the thing
2:41:41
is like losing pressure because it's
2:41:43
they're stress testing all the stuff
2:41:45
which is really funny when really dumb
2:41:47
people go, "Oh, he's a [ __ ] dumbass, his
2:41:48
rockets keep blowing up." Like, they
2:41:51
just don't understand like the only way
2:41:53
you find out what the capability of this
2:41:55
technology is is you have to like let it
2:41:58
blow up and then you go, "Okay, it needs
2:41:59
to be thicker. It needs to be this and
2:42:01
that and we need to add these things and
2:42:02
there's sensors everywhere." And so he's
2:42:04
cracking jokes the entire time while
2:42:06
this thing is like losing pressure. And
2:42:07
it eventually wound up landing and it
2:42:09
was fine, but it did have a hole in it.
2:42:12
But it was just like he's laughing like
2:42:14
he's having a good old time. He's not
2:42:15
freaked out. No,
2:42:16
>> you know, he he's uniquely built to
2:42:19
handle it.
2:42:19
>> I uh when there was a rocket launch in
2:42:22
Vandenberg in California and I chartered
2:42:25
a Pilatus, because you can get like
2:42:28
a little like propeller plane.
2:42:30
>> Oh, okay. And I went around and around
2:42:32
and I have this video of it kind of like
2:42:34
coming up and through cuz like
2:42:36
>> how close were you?
2:42:40
>> 100 miles.
2:42:41
>> Oh wow.
2:42:42
>> But you but it's like right there.
2:42:43
>> Uhhuh.
2:42:44
>> You know cuz the distance and it's
2:42:46
coming up and I'm kind of going around.
2:42:48
It was the craziest thing. It was cool.
2:42:50
It was super cool.
2:42:51
>> That [ __ ] is super cool.
2:42:53
>> It's very cool. It's very cool. I mean
2:42:55
just Starbase is bananas. Just when you
2:42:58
go down there and they have their own
2:42:59
town, the whole thing is straight.
2:43:00
There's [ __ ] cyber trucks everywhere.
2:43:02
I'm like, how do you find your car?
2:43:03
Like,
2:43:04
>> is it is it an incorporated town? It
2:43:06
started off as unincorporated, but it's
2:43:08
its own thing now.
2:43:09
>> I believe it's its own town. And
2:43:11
>> is there a mayor?
2:43:12
>> That's a good question. I think there
2:43:13
is. I think we talked about this. I
2:43:16
don't remember though. But the actual
2:43:19
factory itself is nuts cuz Jamie and I
2:43:23
were both like, "This is way bigger than
2:43:25
I thought it was going to be." And the
2:43:26
rockets are way bigger than you thought.
2:43:27
And like the garage doors are [ __ ]
2:43:29
bananas.
2:43:30
>> They got a city government: website,
2:43:33
commission, mayor.
2:43:37
>> That's crazy.
2:43:39
>> Bobby Peden. Bobby Peden is the
2:43:41
mayor.
2:43:42
>> They have their own little Bobby. That's
2:43:43
awesome.
2:43:44
>> Irish pub and it's like it's really
2:43:46
cool. They have really good food. you
2:43:47
know, when he uh when he opened the
2:43:48
first uh Gigafactory, which was in
2:43:50
Nevada, we had a party and uh like it
2:43:54
was like a small opening thing and so we
2:43:55
all drove in there and I have a video of
2:43:58
me in a just like a pickup truck driving
2:44:00
into the thing. I started the video and
2:44:04
I think it was 43 seconds until it
2:44:06
ended. And this is like, you know, a
2:44:08
decade ago and I thought to myself, this
2:44:10
is implausible. Like I've never even
2:44:12
contemplated things that could be built
2:44:14
this big. I didn't think it was allowed.
2:44:16
I don't even know how something like
2:44:17
this works.
2:44:19
>> And I was like, how do you
2:44:21
envision this whole thing working? Like,
2:44:23
simple.
2:44:24
>> Raw materials in the front, cars out the
2:44:27
back.
2:44:29
I'm like, that's it. It sounds so
2:44:32
simple.
2:44:33
>> Well, he thinks big.
2:44:35
>> He thinks big. And thank God he's
2:44:36
around. I mean, if he wasn't around, if
2:44:38
he hadn't purchased Twitter, I think our
2:44:41
entire civilization would look very
2:44:42
different.
2:44:43
>> Very different.
2:44:44
>> It would. I mean, that sounds like a
2:44:45
very grandiose thing to say.
2:44:46
>> Sounds hyperbolic, but you're right.
2:44:48
>> I think it's true because I don't I
2:44:50
think free speech is a core component of
2:44:52
our civilization, and I don't really
2:44:55
think we had it.
2:44:56
>> I think it was curated, and it was very
2:44:58
tightly controlled by the actual federal
2:45:00
government, which is spooky.
2:45:01
>> No, no. It decided what we should be
2:45:05
paying attention to. Yes.
2:45:06
>> Just to put it very simply, without
2:45:08
kind of like
2:45:09
>> And that's not right.
2:45:10
>> Right. Because when they're telling you
2:45:13
to pay attention to this and the actual
2:45:16
issue is this and you cannot, then you
2:45:19
can't fix what's actually broken,
2:45:21
>> right?
2:45:21
>> And we start to basically
2:45:23
be, like, just
2:45:26
useful idiots for these people.
2:45:27
>> Yes.
2:45:28
>> And that's not right.
2:45:29
>> It's not right.
2:45:31
>> Listen, man. This was a lot of fun. It's
2:45:32
always great to talk to you. Thank you
2:45:33
very much for doing this. It was very
2:45:35
cool. Um let's do it again sometime. All
2:45:38
right. Thank you. All right. Bye,
2:45:40
everybody.
— end of transcript —