[00:01] Joe Rogan podcast. Check it out. [00:03] >> The Joe Rogan Experience. [00:06] >> TRAIN BY DAY. JOE ROGAN PODCAST BY [00:08] NIGHT. All day. [00:12] >> Yeah, I was listening to Tim. First of [00:14] all, hello. [00:15] >> What's up? [00:15] >> Good to see you, my friend. [00:16] >> Great to see you. [00:17] >> Uh, we were listening to Tim Dylan. I I [00:19] was listening to on the way over here [00:21] and he was talking about uh Anna Paulina [00:23] Luna and Tim Bchett and Trump. They're [00:26] all talking about the UAP disclosures [00:28] and like why now? Like what are they [00:31] doing? Like why are they distracting us [00:32] with this? Timberchet said that whatever [00:35] they're going to release it will be [00:37] indigestible. [00:39] >> What does that mean? [00:40] >> Right. [00:41] >> Indigestible as in or well then it [00:44] doesn't mean that it's real then. [00:46] >> Well, I think it means that it'll be so [00:48] crazy if it's real. So crazy. He's the [00:51] one that's been saying that there's [00:52] these confirmed bases under the ocean, [00:56] that there's these specific locations. I [00:58] think you talked you're shaking your [01:00] head. You don't believe a word of it. [01:01] >> No. [01:02] >> How come? [01:02] >> I think I think it's true that there [01:05] look, [01:07] it's completely implausible that there [01:10] aren't other species, [01:11] >> right? [01:12] >> Completely implausible. [01:14] >> Just the vastness of what we're dealing [01:16] with. So the real question is like why [01:18] haven't we encountered people or those [01:21] things those beings, [01:22] >> right? [01:23] >> And it's probably because they just they [01:25] have bigger fish to fry, you know? So by [01:28] the time that we meet them and they meet [01:30] us, we're we're going to kind of be at [01:32] the edge of like we've we've kind of [01:34] been there done that on our own planet [01:37] and then we've kind of like developed [01:38] the technology I guess to get beyond it. [01:41] Um but somewhere along the way there [01:43] must have been a few just mathematically [01:45] impossible. So then the question is, is [01:46] it buried? Or were people confused when [01:48] it first came here? Like if you had a [01:50] spaceship land in like the 1800s, [01:52] >> right? [01:52] >> What would people have done? They would [01:54] have just freaked out. They wouldn't [01:56] have understood it. Maybe they would [01:57] have buried it. Depending on where it [01:58] was, maybe they started to pray to it, [02:00] >> right? [02:01] >> And you would have just moved on. And [02:03] then that isn't documented in history. [02:05] So, [02:05] >> but it is. [02:06] >> But how? [02:07] >> It is. There's a lot of it documented in [02:09] history. [02:10] >> Oh, you mean like hieroglyphics and like [02:12] monuments? [02:12] >> Well, the book of Ezekiel. The book of [02:14] Ezekiel goes in depth about some sort of [02:16] a UFO encounter that Ezekiel [02:19] experiences, [02:20] >> right? [02:20] >> Where it's a wheel within a wheel and a [02:24] a cloud with fire flashing forth [02:26] continually in the midst of a cloud as [02:28] it were gleaming metal and from the [02:30] midst of it came the likeness of four [02:32] living creatures and the creatures [02:34] darted to and fro like the appearance of [02:36] a flash of lightning. This is all in the [02:38] Bible. Um it's also in the Mahabarada. [02:41] Um they they talk about vaymanas these [02:44] flying crafts and [02:46] >> I think it's entirely possible that we [02:48] have been visited periodically [02:51] >> and that we are we have been monitored [02:53] and that we are monitored. [02:54] >> I agree [02:55] >> currently. [02:56] >> I agree. [02:56] >> And if I was going to hide, I would hide [02:58] in the ocean. [02:59] >> Well, to be honest, as I get older, I'm [03:01] convinced [03:03] >> we're basically in some form of a [03:05] simulation. There's like all these [03:07] little ingredients [03:08] >> that if you start to see these little [03:10] clues, you're like, [03:12] >> they all seem so odd in isolation and [03:14] then when you put them together, I feel [03:15] like a crazy person. So, I ignore [03:17] myself, [03:17] >> right? [03:18] >> But I wonder like why did this happen? [03:19] Like yesterday, I was um at a dinner in [03:22] LA before I came to see you. And um I [03:26] told this very interesting story. Well, [03:28] or I thought it was interesting at the [03:29] time. Um [03:31] you know that like so in 2000, right? If [03:34] you think of like what happened in tech [03:36] since 2000, so the last 26 years, [03:40] people can give you all kinds of like [03:41] fancy theories, [03:44] but there's just like this weird [03:47] word that's been at the center of every [03:50] single technological revolution for the [03:53] last 30 years. And that word is [03:55] attention. Let me explain this to you. [03:58] Google, they invent Google. What is [04:00] Google? Google is a algorithm. It's [04:03] called page rank. But if you look inside [04:06] of it, what is it? It says, "Well, [04:07] Chimath's website has five links to it. [04:11] Joe's website has two links. He's [04:13] getting more attention." Okay, Chimath's [04:16] website is more important. That's the [04:17] sum total of Google. Now, they've made [04:19] that a lot more refined and they've done [04:21] all these other fancy things, [04:24] but it's all about attention. Mhm. [04:26] >> Fast forward to 2007 89 when you know [04:29] Zuck and then when I went to work for [04:31] Zuck and we got on the scene and we're [04:32] like what do everybody what does [04:34] everybody care about? Attention. And so [04:38] what is like the Facebook algorithm? [04:40] What's the Instagram algorithm? You know [04:42] how did we construct newsfeed all around [04:45] attention? Joe had 35 likes. Jamie had [04:47] 12 likes. Your thing is more important. [04:49] Let's give it more more importance [04:51] because it's seemingly meeting all these [04:53] human needs. Attention. Attention. [04:55] attention. So phase one attention, phase [04:58] two attention, and this is where I'm [05:01] like, how can this be possible? In phase [05:02] three, we're like looking at AI. And [05:05] when you look backwards four years, the [05:06] seminal paper is called attention is all [05:08] you need. It's about this word again. [05:12] And when you look inside of the core [05:15] part, if you peel out peel, you know, [05:17] apart AI, [05:19] the little brain that makes it so [05:21] capable, it's called an attention [05:23] mechanism. It's just attention. It's all [05:25] about again this idea of I'm gonna scour [05:28] all this information and I'm gonna [05:30] figure out what patterns repeat itself [05:31] and I'm just going to double down on the [05:33] stuff that I see more of because that [05:35] attention must mean it's more important. [05:37] It's more true. It's more knowledgeable [05:40] and then I think how could it be like [05:42] we're all like why is it that these [05:44] things are just repeating over and over [05:46] again? And I just get confused. I don't [05:48] I don't exactly know how to explain it. [05:50] So are there other ways in which we [05:52] should be doing things? Absolutely. Have [05:54] we even explored it? No. So then I [05:56] think, well, is this just a simulation? [05:57] Some kid in a [ __ ] in his house just [06:00] playing some simulation and we're all [06:01] just party to it and that's all he [06:03] understands is attention. I don't know. [06:05] >> I don't think it's that simple that [06:07] there's a person playing a game. But if [06:10] you break down just attention, well, [06:12] that's all of human history is paying [06:16] attention to the king, paying attention [06:18] to the war, paying attention to [06:21] resources, paying attention to who says [06:23] the thing that resonates the most with [06:25] the people. It's all about what human [06:28] beings are paying attention to. [06:30] >> I think it's part of it. Then there's [06:32] also what is actually true. And I think [06:36] sometimes what is true and what people [06:37] pay attention to are not the same thing. [06:40] >> True. [06:41] >> And sometimes the thing that you should [06:44] be paying attention to gets lost because [06:47] the thing that you are paying attention [06:48] to gets more attention because it's more [06:50] interesting and useful. That's sort of [06:53] where we are right now. We're in this [06:54] really weird phase, I think, where [06:57] you actually like should be focused on [07:00] this thing over here and instead we're [07:02] all focused on all these things over [07:04] here. or give me an example. [07:06] >> Um, [07:08] here's like a very big one. I think like [07:11] it's pretty fair to say since the last [07:13] time you and I saw each other on this [07:14] show, [07:16] the attitude towards technology, I [07:20] think, has been pretty profoundly [07:22] negative. It's kind of tilted. It's [07:24] relatively like anti-AI, [07:27] you know, anti-billionaires. It's anti- [07:29] all of this stuff. [07:31] Um, and it manifests in all of these [07:36] interesting ways. There's protests, [07:37] there's data centers, there's all of [07:39] this stuff that's happening. Um, people [07:42] are worried about job loss. All of that [07:44] stuff is real. [07:45] >> Do you want a cigar? [07:46] >> No, I'm okay. I'm okay. Um, [07:49] but what should they really be focused [07:50] upon? And I think what they should be [07:53] really focused upon is we're at the tail [07:56] end of a cycle that doesn't work [07:57] anymore, which is all about like this [08:00] tension between labor, people that do [08:02] the work, and capital, the people that [08:03] fund it and then make all the returns. [08:06] And over the last 40 years, we've [08:07] basically gone to this completely upside [08:09] down world where capital extracts all of [08:13] the upside [08:14] and labor has extracted less and less [08:17] and less and less. And all of this push [08:19] back, it manifests in AI, it manifests [08:22] in politics, it manifests in social [08:24] issues, it manifests in, you know, [08:26] Israel, whatever you want to talk about. [08:28] All of these issues, I think [08:29] symptomologically, [08:31] come from this other issue, which is we [08:33] are out of balance. This total compact [08:36] that we used to have, a liberal [08:38] democracy and a free market has totally [08:40] collapsed. And there are simple ways to [08:42] fix that, but that never gets the [08:44] attention because it's not what you want [08:46] to talk about. the attention is here, [08:48] you know, vote no to the data center. [08:51] You know, uh this model is going to take [08:53] out all the jobs. Um [08:56] you know, this social issue is really [08:58] important. That war should not be [09:00] fought. That war should be fought. All [09:02] of these things while important [09:06] distract us from what the core issue is. [09:08] And the core issue is that we as a [09:10] society, I think, are out of balance. [09:12] the the natural compact between all of [09:15] us is broken and there are some simple [09:18] ways to fix that compact. Get people [09:20] more invested, get people more engaged [09:22] in the upside, have people have a [09:24] positive some view of what's happening [09:26] and that isn't happening. [09:27] >> Well, what what simple solutions are [09:29] there to to this one very particular [09:32] issue? [09:33] >> Okay, I'll get your reaction to this. [09:35] Let's assume that you still lived in [09:37] California because I think it it tells [09:38] this example in a more extreme way. [09:40] >> Okay. Um, let's say you make a million [09:43] bucks a year, which is a lot of money, [09:44] but it it makes the it makes the point [09:47] more cleanly. Um, [09:50] you'd pay, I think, 30% [09:54] federal tax and you'd pay another 15 or [09:58] 16% in state tax and Medicare tax and [10:01] all this tax. So, if you're a wage [10:03] earner, [10:05] 50% of all your upside [10:09] goes to the government. [10:11] If you're a capital earnner and you make [10:14] that same million dollars via capital [10:17] gains, you pay half that tax. [10:21] Why did that happen? That happened [10:23] because in the 40s and 50s, but really [10:25] in the 60s and 70s and 80s, [10:28] what we were trying to do or what the [10:30] American government and what western [10:32] societies were trying to do was to [10:35] convince people to invest their money. [10:38] Hey Joe, go build that factory. Go hire [10:41] those people and we're going to [10:42] incentivize you to do so. [10:45] And by doing that, there was this idea [10:48] that that all of those profits that you [10:50] would get would then diffuse, right, [10:51] trickle down into everybody else. The [10:54] workers participated, everybody [10:55] participated. [10:57] But technology allows you to do more [10:59] with less and less. So now what happens [11:02] is the capital owners can acrue [11:06] infinite almost it seems like value and [11:10] the workers get less and less. But now [11:11] if you get less and less and you're [11:12] taxed more and more as a percentage of [11:14] what you own, you're going to feel [11:16] really out of sorts. You're going to be [11:17] like, why am I paying 50 cents of every [11:19] dollar? And I see these other ways where [11:21] folks are paying 25 cents on their [11:23] dollars, but their dollars are [11:24] compounding way faster and they have, [11:27] you know, hundreds of billions more of [11:29] those dollars than I have of my dollars. [11:31] If you take that example and you expand [11:33] it across society, I think people [11:35] understand that now. There's enough [11:37] information and there's enough people [11:38] talking about it where it's pretty clear [11:40] that that's happened. So the question is [11:42] how do you fix it? I think like if you [11:44] think about AI and if you believe that [11:46] we're going to get into this world of [11:47] abundance [11:50] and we're not working, what does it mean [11:52] for governments to tax our labor? There [11:54] is no labor. You're not working anymore. [11:56] I'm not working. We're doing things out [11:58] of leisure. [11:59] Why should I pay 50 cents of every [12:01] dollar? Why aren't the companies that [12:02] are going to be making trillions of [12:03] dollars, why don't they pay more? [12:06] Why isn't there, you know, an [12:08] expectation that they then help our [12:11] lived society [12:13] do better and thrive as a result of all [12:16] of that winning? That's the real [12:18] conversation that I think is [12:21] bubbling. [12:23] And I think that we're probably another [12:25] 12 to 18 months where all of these other [12:28] issues are going to be important, but [12:30] they're going to be viewed for for what [12:32] they are. um they're going to get [12:34] demoted I think in importance and it's [12:36] this core structural issue it's what is [12:39] the economic relationship that we have [12:41] together as a society what is the [12:43] relationship between Joe Chamath Jamie [12:46] and all these companies and how do we [12:50] feel about a few and an evershrinking [12:52] few making more and more and more [12:57] and then how do we feel about their [13:00] ability to share that with a small [13:03] amount of people [13:05] and then what do what is the expectation [13:07] for everybody else? I think that's [13:09] mostly at the core of what's happening. [13:12] And so back to like, you know, all of [13:15] this attention that we give to these [13:16] other issues distracts from that one [13:18] because I think you can get organized to [13:20] fix this issue. You can't get [13:21] concessions on any of these issues. [13:23] >> Mhm. [13:23] >> You know, you bring up Israel, it's like [13:25] this. You bring up social issues, it's [13:26] like this. You bring up, you know, [13:28] whatever you want to bring up, people [13:30] just kind of take a side, nothing [13:32] happens. This is actually where people [13:34] are universally actually much more [13:36] aligned than you think because there's [13:38] reasonable ways. One simple way was is [13:40] you'd say, well, let's flip the taxation [13:41] model. [13:43] Corporate taxes [13:46] should exceed personal taxes. They've [13:49] never [13:52] we should have an expectation that then [13:55] corporate [13:57] actors can buy down their taxes if they [13:59] want, but if they do social good for [14:01] society. I'll give you an example. At [14:03] the industrial revolution, there's a [14:05] table like this and the leading lights [14:08] of that era, Andrew Carnegie, Nelson [14:10] Rockefeller, Jay Gould, JP Morgan, [14:15] they sat together and they said, "Guys, [14:17] this is going to benefit us this [14:19] industrial revolution. [14:21] It may not benefit everybody. What is [14:23] our responsibility? What is our [14:25] collective responsibility?" [14:27] And they allocated tasks. Carnegie went [14:30] and built libraries [14:32] all throughout the country. Rockefeller [14:34] built universities, hospitals were [14:36] built. And I think what happened is [14:39] society was like, "Wow, these are living [14:40] testaments to us doing well." And so [14:44] then they were okay with this [14:45] transition. [14:47] But if you think about it today, what [14:48] are the living tributes that, you know, [14:52] capital builds and leaves behind for [14:54] society? It's fewer and fewer. [14:57] I think that's a that's a very big [14:59] opportunity for somebody to fill it. I [15:01] think it's like especially for folks in [15:04] tech. I think if they can get themselves [15:06] organized to do that, I think we we land [15:08] in a good place. If they cannot get [15:10] themselves organized to do that and say [15:12] everyone for themselves, [15:16] I think it's going to be really [15:17] complicated. Super messy. super messy [15:20] because [15:21] >> super messy [15:22] >> that sentiment that the wealthy are [15:25] getting wealthier and the middle class [15:26] is disappearing and the poor are being [15:29] taxed into oblivion. [15:31] >> Look, an $80,000 year teacher pays 40% [15:34] tax. But if you're a multi-billionaire, [15:38] most of your wealth is not W2 wages. [15:42] It's cap gains. But there's all kinds of [15:44] ways to shelter cap gains. There's all [15:46] kinds of ways to defer. And so even [15:48] though you pay more on an absolute [15:52] dollar basis, on a percentage basis, [15:53] you're paying way way less. And all of [15:56] those tricks have been exposed. [15:59] They've all been exposed. These are all [16:01] mechanisms that were that were, you [16:03] know, invented from the 1980s to now, [16:06] right? By all the by all the banks and [16:08] all the folks that wanted to come to [16:10] folks that had wealth. And so it's it's [16:12] and it's all known. And I think people [16:14] are kind of like, "Hey, hold on a [16:16] second. This just doesn't feel fair [16:18] anymore. [16:19] >> Absolutely. But the other problem with [16:23] that is if you do tax correctly, where [16:27] does that money go? And who's managing [16:30] it? And ultimately, who's managing it is [16:33] the federal government. And they've been [16:36] shown to be completely inept at managing [16:39] your money correctly. The fraud and the [16:41] waste is off the charts. the amount of [16:44] NOS's that have insane amount of funds [16:47] at their disposal. I mean all this is [16:49] exposed by Doge, right? And you realize [16:51] like how much fraud and waste there is [16:53] and how much money. So the solution [16:56] being tax people more [16:59] >> that doesn't sit with a lot of people [17:00] because it's like well where where is it [17:03] going and who's managing it? If if the [17:06] federal government was being forced to [17:09] handle money the same way a private [17:11] company does, if it was all out in the [17:14] open, everything was exposed, they would [17:17] have gone bankrupt a long time ago. They [17:19] would have gone under a long time ago. [17:21] There's no way they would have been [17:23] allowed to function the way they are. [17:25] The people that are managing that money [17:27] would have all been put in jail. There's [17:29] not a chance in hell that [17:32] giving them more money is going to solve [17:34] anything. They're going to find more [17:36] ways to put more of that money into [17:39] NOS's that puts more of that money into [17:41] Democratic coffers and Republican [17:42] coffers. They're going to figure out a [17:44] way to funnel that money around where [17:46] it's not going to benefit people. I [17:48] mean, a good example of that is like [17:50] where let's let's look at the LA fire [17:53] thing for instance. All right. So the LA [17:56] fire fund, there's a giant fire in the [17:58] Palisades. All this money gets raised. [18:02] It's over $800 million. It goes to 200 [18:05] plus different nonprofits. None of it [18:09] goes to the people, [18:10] >> right? [18:10] >> Spencer Pratt, who's running for mayor [18:12] of Los Angeles. [18:13] >> He's doing a great job, by the way. [18:14] >> [ __ ] phenomenal. [18:16] >> Those ads are those ads are fire. [18:19] >> They're so good. [18:20] >> They're fire. I He's doing and he's [18:22] doing it all out of a trailer. Yeah. [18:23] >> On his burnt out land. I mean, he's the [18:25] most righteous guy running in that [18:27] regard. [18:28] >> But just that being exposed like, okay, [18:32] we're going to help out these people. [18:34] We're going to donate money. We're going [18:35] to raise money. We're going to do some [18:37] good. We feel terrible about the people [18:39] in our community that have lost homes. [18:41] Well, what happens? Well, the same [18:43] people that you're saying we should give [18:44] more taxes to take that money and they [18:48] just give it to a bunch of nonprofits [18:50] and charities. This episode is brought [18:52] to you by Arra. Every week there's some [18:54] new wellness hack that people swear by [18:56] and after a while you start thinking, [18:59] why do we think we can just outsmart our [19:01] bodies? That's why Arra colostrum caught [19:04] my attention. It's something the body [19:06] already recognizes and has hundreds of [19:09] these specialized nutrients for gut [19:11] stuff, immunity, metabolism, etc. I [19:15] first noticed it working around [19:16] training, especially workout recovery. [19:19] Most stuff falls off, but I am still [19:20] taking this. If you want to try, Arra is [19:23] offering my listeners 30% off plus two [19:26] free gifts. Go to armorra.com/roganogen. [19:30] >> I'm not saying give more tax. What I'm [19:31] saying is, [19:32] >> right, [19:33] >> people are taxed too much. Yes, [19:34] >> corporates are not taxed enough. Flip [19:36] it. That's [19:37] >> right. But even if you do flip it and [19:39] the corporates are taxed more, where's [19:42] that money going? This is the problem. [19:44] >> I suspect that if you put the burden on [19:46] Wall Street and corporates, [19:49] um, they'd be a lot more organized and [19:52] they'd probably create a lot more change [19:53] than a diffuse electorate. Meaning like [19:57] let's just say the government spends a [19:58] trillion dollars and wastes it. I'm [20:01] generally like roughly aligned with [20:02] that. [20:04] If you waste a trillion dollars from 300 [20:07] million people, it's hard to organize at [20:10] 300 million people. But if you waste a [20:12] trillion dollars from 300 companies, [20:15] those companies will get their [ __ ] [20:16] together really fast. And they will [20:18] force a lot more change. [20:19] >> I would hope so, but you're still [20:20] dealing with incompetent people that are [20:22] tasked with taking care of that money. [20:25] Not just incompetent, but [20:26] >> don't get me wrong, I'm not defending [20:27] these people. [20:28] >> Decades of corruption. decades and [20:31] decades of all these mechanisms where [20:34] they can take this money and funnel it [20:36] into these NOS and these nonprofits and [20:40] all these different weird organizations [20:43] that don't seem to have accountability [20:45] for what they do with that money, that [20:47] gets real slippery. [20:48] >> Yeah. And if those people in turn make [20:52] deals with those corporations that allow [20:54] them to do certain things and push [20:56] things through that maybe they would [20:58] have difficulty doing, then you have a [21:00] different kind of a working relationship [21:01] with the same groups of people and the [21:04] same government. You just take money [21:07] from corporations and move it into a way [21:09] where the corporations ultimately [21:11] benefit from it, but yet it doesn't do [21:13] any good to the people. [21:14] >> Yeah. I mean, I can see where you're [21:15] coming from. I just think that if we go [21:17] on the track we're going down, [21:21] >> it just seems like we're we're going to [21:23] hit a crisis. [21:24] >> Yes, [21:25] >> the crisis is you can't expect people to [21:27] pay more and more and more. Again, I [21:29] agree with you. The premise is we're all [21:31] paying for a system that's broken. [21:33] >> That should that should change. But we [21:36] still continue to have to pay our taxes. [21:37] But if taxes keep going up like this at [21:39] the individual level [21:42] and we don't manage this transition to [21:44] something where we may be working less [21:46] and less, what are we getting paid to [21:47] do? And then at that point, how are we [21:49] expected to pay what? 90% of what? [21:51] >> Right? [21:52] >> 50% of what? I think people do have this [21:55] weird feeling of [21:59] dread that the people that are in [22:02] control of a lot in this country, the [22:06] the tech companies in particular, [22:07] particularly the tech companies like [22:09] Google and Facebook that are essentially [22:11] involved in data collection and then [22:14] ultimately dissemination of information [22:16] that they have acquired enormous amounts [22:19] of wealth and power and influence and [22:21] they're essentially a new form of the [22:24] government. [22:25] >> Yeah. [22:26] >> You know, are you aware of Robert [22:27] Epstein? Do you know about his work? [22:30] >> Not Robert Epste. [22:31] >> No, different guy. Different guy. Um, [22:33] Robert Epstein is a guy who specializes [22:36] in uh understanding what curated search [22:40] results do and what what Google's able [22:44] to do with in particular with curated [22:47] search results in terms of influencing [22:49] elections [22:51] >> that like say if you you have two [22:53] candidates that are running. Let's just [22:54] say let's just take LA for instance. If [22:57] and I'm not making any accusations, but [22:59] I'm saying if they wanted Karen Bass to [23:02] win and you searched Karen Bass, you [23:05] would find all these positive results. [23:07] If you searched Spencer Pratt, you would [23:10] find all these negative results. And [23:12] there's a bunch of people that are [23:14] always undecided voters and those are [23:17] the ones that you really want. They're [23:18] like, I don't know. I don't know. And [23:20] come election night, those are the [23:21] people you want to try to grab and it's [23:22] it's generally a large percentage. You [23:25] can influence an enormous percentage of [23:27] those people just with search results. [23:29] >> Yeah. [23:29] >> Where you can shift an election one way [23:31] or another. [23:32] >> I believe it. [23:32] >> Yeah. And he's demonstrated this and [23:35] shown how this is possible. Um that [23:38] freaks people out that tech companies [23:41] are in control of narratives that tech [23:44] companies can censor information [23:47] especially tech companies that work in [23:48] conjunction with the government. [23:50] >> Right? [23:50] >> This is what we found out when Elon [23:52] purchased Twitter. Right? Right. When [23:54] Elon purchased Twitter, we got all this [23:56] information from the Twitter files when [23:58] all the journalists were allowed to go [23:59] through it and they said, "Oh, this is [24:01] crazy. You've got the FBI, the CIA, [24:04] you've got all these companies, all [24:05] these government organizations that are [24:08] essentially [24:10] controlling the narrative of free speech [24:12] in the country. And they're doing it in [24:14] a way that benefits them. They're doing [24:16] it in a way that benefits what political [24:17] parties in charge at the time was the [24:19] Biden administration. and they were [24:22] allowed to do a bunch of weird [ __ ] [24:24] which which should be illegal but it's [24:26] not technically illegal. [24:28] >> And that freaks people out because [24:29] there's no real laws and rules in regard [24:32] to what they're allowed to do and what [24:34] they're not allowed to do. Like curated [24:36] search results should be illegal. [24:38] >> They're shaping attention. [24:39] >> Yes. Attention. Again, it goes back to [24:41] attention, right? [24:42] >> They're shaping attention. [24:43] >> Yeah. Um it's that's a big concern for [24:47] people. And I think then when you find [24:49] out that these people are able to amass [24:52] enormous sums of wealth and have [24:53] incredible amount of power and influence [24:55] because of this enormous enormous wealth [24:58] and this control over these tech [25:01] companies that have essentially become [25:03] the town square of the world. [25:05] >> Yeah. [25:06] >> That freaks people out and that these [25:08] very small number of people, you know, [25:10] you think of Zuckerberg, you think of [25:12] Tim Cook and I don't I don't know who [25:14] the new guy is now. Who's the name? John [25:16] John Fern, [25:17] >> right? [25:17] >> Furnace. No, [25:18] >> I forget his name. [25:19] >> Yeah. [25:19] >> Turn. [25:20] >> Turn. Turn. [25:21] >> But but that kind of thing [25:25] gives people a lot of concern, right? [25:28] It's like that these people, these [25:29] unelected people are in control of a a [25:33] giant chunk of [25:35] how the world works. [25:38] >> I think that this is the existential [25:40] question that we are dealing with. [25:42] you're going to have five or six [25:44] companies [25:46] concentrate like whatever power you [25:49] think has been concentrated up until [25:51] now. I think we're going to look back [25:53] and it's going to look like a Sunday [25:55] picnic 10 or 15 years from now because [25:58] on the one hand it's going to be an even [26:00] smaller subset and on the other hand the [26:02] capability is going to be an order or [26:04] two orders of magnitude. So can you [26:06] imagine what that must be like? It's [26:07] kind of like showing up, getting dropped [26:09] into the 1800s, and you've invented the [26:12] engine and everybody else's horse and [26:13] buggy. [26:14] >> You can just decide to your point. That [26:18] is where we're going. [26:20] >> It's even more crazy. It's like [26:22] everybody else is on a horse and buggy [26:24] and you've got an internet connection [26:25] with a cell phone. [26:27] >> Exactly. Exactly. [26:28] >> It's even more crazy. [26:29] >> Exactly. because it's what we're dealing [26:32] with with AI right now is first of all [26:35] it's already lowered children's [26:37] attention spans and it's shrinking their [26:40] capacity to acquire or absorb [26:43] information because what they're doing [26:45] now is just relying on AI to answer all [26:48] their questions for them. Now, is that [26:50] their fault? Kind of, right? Because it [26:52] doesn't have to be that way. You could [26:53] still acquire information the [26:55] oldfashioned way. You can still learn [26:57] things the right way. But a lot of kids [26:58] are just concerned with passing [27:00] examinations and getting into good [27:01] schools. And what they're doing is just [27:03] using AI and they're they're getting [27:05] better test results, but they're also [27:07] not as smart. [27:08] >> Yeah. [27:09] >> Which is really weird. Yeah. [27:10] >> It's like we're relying on it like we [27:14] you know it's like it's essentially like [27:19] replacing our mind. And that's just the [27:23] this is the beginning. This is like [27:26] these are the toddler days [27:28] >> of AI and where it's going to be a super [27:31] athlete in a few years. [27:33] >> Yeah. I think we have to figure out how [27:36] first of all [27:38] kids need to learn and I think this is [27:40] where like we have to do a better job as [27:42] parents. Kids need to learn how to be [27:43] resilient thinkers. I don't even know [27:45] what that term meant before, but I know [27:46] what it means now, which is like you [27:48] take this AI slop and you just kind of [27:50] like pass it off. And if like the [27:52] teachers and the school system aren't [27:54] trained, they're just like, "Wow, this [27:56] looks good." They have to be able to [27:58] push back. Parents need to be able to [28:00] look at this [ __ ] But then all of this [28:02] stuff, I'm just like so frustrated [28:03] because it's like one more thing that I [28:04] have to do as a parent. Like, [28:06] >> right? [28:06] >> Every time technology gets better, it's [28:08] one more thing, you know, [28:09] >> right? [28:10] >> We're going to make the world, you know, [28:11] super connected and social and all that [28:13] stuff. It sounds great to me until I [28:15] have to be the one that has to tell my [28:16] kid I can't they can't get Instagram [28:18] >> and then they're up my ass every day, [28:20] right? you know, and it's just like I [28:22] don't want to have to deal with this [28:23] stuff, [28:24] >> right? [28:24] >> I want this to be handled in a way that [28:27] just allows me to do what I want to do. [28:29] I don't want to say no to my kid. I [28:31] don't want to police his schoolwork and [28:32] make sure he's not cheating or not [28:35] learning and just like, you know, [28:36] passing off this AI slop. What am I [28:40] where are my tax dollars going? Where is [28:43] everybody else in all of this? It gets [28:45] very frustrating. And again, it goes [28:47] back to like this feeling of like, well, [28:49] is this all getting better for me or is [28:51] this kind of like not, you know, people [28:53] start to be nostalgic [28:55] >> for what it used to be because it was [28:56] just simpler, but I think that's a [28:59] different way of saying easier. [29:00] >> Well, we're just dealing with we're at [29:03] the edge of great change. Like great [29:06] change that has no real understanding of [29:09] how it turns out. [29:10] >> Yeah. [29:11] >> And I think that understandably freaks [29:14] people out. Freaks me out. It freaks me [29:16] out, but I've kind of gotten to this [29:18] place where I'm like, well, it's going [29:19] to happen. [29:20] >> Did you see this thing? Um, it's a CEO [29:22] of Verizon, Dan Schulman. He put out [29:24] this very public forecast, you know, [29:28] very smart guy, well regarded in [29:30] business. And I think he said something [29:32] like 30% [29:34] of all white collar jobs will be gone by [29:36] 2030. I don't know, Jamie, maybe you can [29:38] get the exact thing, but it's something [29:39] like that. [29:39] >> That's probably optimistic. And I [29:41] thought at first my initial reaction was [29:43] like this is totally not credible. But [29:46] then I'm like hold on a second that's my [29:47] bias because I want to believe that [29:48] that's not possible [29:50] >> honestly you know and as I've gotten [29:52] older I'm I'm a little bit better now [29:54] like okay hold on a second let's weigh [29:55] the probabilities. And now I was like [29:57] man if I'm going to be fair maybe [30:00] there's a 10 20% chance of that. There's [30:04] a bunch of other outcomes that are much [30:06] better than that but that's part of the [30:08] set of outcomes that you have to [30:09] consider. [30:11] And then I was like, well, what's my [30:13] antidote to that? My and and the only [30:15] thing that I can say is don't worry, [30:16] it's going to be better. [30:19] I don't think that that's a good answer. [30:20] >> No. [30:21] >> So, there has to be like all of this [30:23] kind of goes back to look, my wife and I [30:27] have this conversation. We're like, if [30:29] it were up to us, who's who can you [30:32] trust to have some super intelligence? [30:35] Now, we're biased because we're friends [30:37] with him, but the only person that we [30:38] can trust is Elon because he seems to be [30:41] like he has a bigger like it's kind of [30:43] like he's like over there. He's like, I [30:45] need to get to Mars, right? [30:46] >> You know, and I'm going to first [30:47] terraform the moon, but then I'm going [30:49] to Mars and I'm going to build like a [30:51] [ __ ] magnetic catapult and I got to [30:53] do all this [ __ ] and so I just need this [30:56] thing. I feel like he's the least [30:58] corruptible. [31:01] He's the most independent thinking [31:03] >> and I think he's the one that has a [31:05] natural empathy for people. Then there [31:07] are folks where there's just a in insane [31:10] profit motive, [31:11] >> right? [31:11] >> They're less in control of the [31:13] businesses that they run. Those [31:15] businesses are really out over their ski [31:17] tips and the amount of money they've [31:18] gotten from Wall Street and other folks [31:21] who expect a return who will put a ton [31:22] of pressure on these folks. And if they [31:25] get there first, I don't know where the [31:27] chips fall. We don't really know. We can [31:29] kind of guess. And then you see in the [31:31] press [31:33] just enough snippets of their reactions [31:35] in certain moments where you're like, [31:36] "Hey, hold on a second. Question mark [31:38] here." You know, you see OpenAI react [31:40] one way, you see Anthropic react another [31:42] way, and you're like, "Where is this [31:44] going to end up?" And the honest answer [31:46] is nobody really knows. [31:48] So, it comes back to like we need a few [31:51] people that can organize. Those guys [31:53] need to self-organize and actually [31:55] present a really positive face. And they [31:57] need to show [31:59] why those 20% of outcomes that Dan [32:02] Schulman paints, [32:05] the truth is it's possible, but here's [32:07] why it's pro not probable, [32:09] >> but it's not in their best interest to [32:11] do that because it's in their best [32:12] interest to generate the most amount of [32:14] money possible. That's the obligation [32:15] they have to their shareholders. That's [32:17] the obligation to they have the people [32:18] that have invested money in this [32:20] company. They their obligation is not to [32:23] make sure the white collar jobs stay in [32:25] the same place that there are now. [32:27] >> That's not that's not true. [32:28] >> No. [32:28] >> No. I I actually think their incentive [32:31] should very clearly be to tell people [32:34] with details and facts why there's a [32:37] positive future. And the reason is the [32:39] following. Right now there's a vacuum. [32:41] There are no facts. And there's [32:43] fear-mongering. And then there's this [32:45] belief that this is going to be [32:46] cataclysmic to human productivity and [32:49] white color labor and all of this stuff. [32:51] >> What's people's natural reaction? Well, [32:53] today if you look at it, [32:55] >> think about AI as a very simple [32:57] equation. Energy in, intelligence out. [33:01] >> So if you want to cut the head of the [33:02] snake, what do you do? You cut off the [33:03] energy supply. [33:05] >> Right? [33:05] >> Okay. [33:06] >> If you're afraid [33:07] >> of all of this super intelligence [33:09] coming, [33:09] >> the natural thing to do would be to go [33:11] to the point of energy and unplug it. [33:13] What is the equivalent of unplugging it [33:15] today? It is to go all around the [33:16] country, find the data centers, protest [33:20] them, and get them to be mothballled. [33:23] That is an incredibly successful [33:25] strategy right now. [33:27] Today, about 40% [33:31] of all of these data centers [33:34] that get protested get mothballled. [33:38] >> You're talking about emerging data [33:40] centers. [33:40] >> Yeah. just like [33:41] >> I need to. So if you're one of these [33:43] companies, the first thing you should [33:44] realize is I need to paint a positive [33:46] vision because 40% of my energy is [33:50] getting unplugged every day. [33:53] And if that happens, my revenues will [33:55] crater and my investors will be super [33:56] pissed. So the right strategy is what is [33:59] the positive fact-based argument? And [34:02] there are some incredible examples [34:06] number one. And then number two is you [34:08] have to give people some tactical [34:10] benefit that they see because AI [34:14] differently than [34:16] search or differently than social media [34:18] there's no exchange of value. Let me let [34:21] me explain what that means. So let me [34:23] just go like so the first thing is that [34:27] if you can go and actually show people [34:31] here's an example of AI. I I I heard [34:34] about this last night. It's pretty [34:35] incredible. [34:36] You can now take pictures of a woman's [34:38] fallopian tubes and you can see [34:42] precancer [34:45] ovarian cysts and all of this stuff [34:46] cervical cancer before it forms and then [34:50] you can intervene and you can fix it so [34:53] that you know women don't get cervical [34:54] cancer. In a different example I [34:57] actually I told you about this example [34:58] when I was here before I finally got FDA [35:01] approval. Okay, there is a device now [35:03] that is allowed to be in the operating [35:05] room with you. And if you have a [35:08] cancerous lesion or a tumor inside of [35:11] your body, the most important thing when [35:12] they go to take it out is make sure you [35:15] don't leave any cancer behind. You [35:17] couldn't do it because what would happen [35:19] is you take it out. A doctor is Joe is [35:22] literally [ __ ] eyeballing it and [35:24] saying, "Yeah, they send it to a [35:26] pathologist. You get an answer in 10 [35:28] days." for women with breast cancer. A [35:31] third of these women find out that they [35:33] have cancer left behind. They go back [35:35] in, they scoop some more stuff out. A [35:37] third of those women. Okay. So, I'm [35:40] like, "This is [ __ ] We can solve [35:41] this problem." It took us a long time, a [35:45] lot of money. I had to build an entire [35:47] machine imaging, all of this stuff, AI [35:50] algorithms. We had to prove it all. We [35:52] finally get approval. Okay. But you know [35:55] how hard it is to tell that story in all [35:57] of the attention that people are looking [35:59] for. It's hard. But those are positive [36:02] examples. No more breast cancer, [36:06] no more cervical cancer. A different [36:08] example is most drugs in pharma fail, [36:12] right? [36:13] And it's a very complicated problem in [36:16] pharma. It's kind of like a jigsaw [36:17] puzzle of the ultimate complexity. It's [36:19] like think of your human body as like a [36:22] Himalayan mountain range. You have to [36:25] design a drug that's an equivalent [36:26] Himalayan mountain range that plugs into [36:28] it perfectly. One millimeter off, [36:32] you grow like a fourth eye, a third [36:34] nipple, you die. You know, now you can [36:37] use computers to make sure that that [36:40] drug handin glove to your body solves [36:43] the exact problem. Couldn't do that [36:45] before. So, there's all of these body of [36:47] examples and you're probably only [36:51] hearing them superficially at best. [36:54] That should be 99% of the attention is [36:57] showing all of the constructive tactical [37:00] ways in which our lives will be better. [37:02] Your mom, your daughter, your wife, us, [37:06] Jamie, his family, everybody, [37:08] >> right? [37:08] >> That's the number one thing. Nobody [37:10] talks about it. I don't understand why. [37:12] >> Well, I think because people are [37:14] terrified of losing their jobs. So, [37:15] that's the primary concern. The primary [37:17] concern that I hear from people is that [37:19] there's so many people that are going to [37:21] school right now, college students, that [37:22] don't know if their job is going to even [37:25] exist in four years when they graduate. [37:27] >> And that's the second part of I think [37:29] what this industry has to do better. [37:32] There's a I had uh lunch with Jeffrey [37:34] Katzenberg. He told this crazy story. [37:36] I'll I'll tell you. It's like Steve Jobs [37:39] gets kicked out of Apple. [37:41] um he buys he starts next [37:45] and he buys Pixar from George Lucas, but [37:48] then he hits a rough patch and he's got [37:49] this, you know, financing issue. Kerberg [37:52] flies up, [37:54] spends time with Steve Jobs, says, "I'll [37:56] buy Pixar." Jobs says, "Absolutely not." [38:00] And then Katherenberg proposes a deal [38:02] and he's like, "Uh, how about a [38:03] three-picture deal?" Jobs says, "Okay." [38:06] He flies back and apparently all the [38:08] animators were up in arms because [38:11] they're like, "Hold on a second. Steve [38:13] Jobs is going to use these next [38:14] computers to animate this movie." Which [38:17] ultimately became, I think, Toy Story. [38:19] And they're like, "This is going to put [38:20] all of us out of a job." That perfect [38:23] argument. And people were really upset. [38:27] Roy Disney was upset. All the animators [38:29] were upset. And they all went to Mike [38:31] Eisner and they were like, "Michael, you [38:33] need to fire Katzenberg." [38:36] And they had a deal which was like, [38:38] "Look, man, you do you, but just give me [38:41] the ability to say no if I think that [38:43] this is you're about to jump off a [38:44] cliff." They talk about it and he's [38:46] like, "I got your back. Do the deal. [38:48] Make the movie." They made the movie. It [38:50] was a huge success. Fast forward 10 [38:52] years, 15 years, there's 10x the number [38:54] of animators. [38:56] Now, it's a small example, but why is [38:58] that? You were able to use computers and [39:00] now all these new people were able to [39:02] come and participate in that. I get it. [39:04] It's a small example, [39:06] but I think if we had better organized [39:09] leadership and we could try to tell some [39:11] of these examples, try to go back and [39:13] document how some of these things have [39:16] actually helped people, it expanded the [39:18] pie, there's a chance. But if we don't, [39:21] I agree with you where we're going to [39:23] end up is everybody basically saying, [39:24] "Hey, hold on a second. This is crazy. [39:26] We need to stop this." That's the worst [39:28] outcome because that's when you will [39:31] have the a high risk of a dislocation. [39:33] like the worst outcome like the black [39:35] what's the black swan event, right? [39:37] Let's think about the black the black [39:38] swan event is when you get a model [39:41] that's good enough to automate a bunch [39:46] of labor [39:49] but not good enough that it can build [39:52] new drugs and prevent cancer and make [39:54] you live for 200 years and all of this [39:56] other stuff. Right? So there's like a [39:57] gap, right? And if you can stop it here [40:00] and it doesn't get to there, now you do [40:02] have the worst of all worlds. You have [40:04] this thing that kind of displaces labor. [40:06] No new things come after it because we [40:08] stop innovating. [40:10] And that's like a that's like a [40:12] nontrivial possibility now, I think. [40:15] >> No, it's a huge possibility. And then [40:16] there's also this thing that you brought [40:18] up earlier where we have this place of [40:20] abundance where no one has to work [40:22] anymore. That freaks people out. [40:24] >> I think that's a big problem. Well, [40:26] because if no one has to work anymore, [40:29] first of all, what what is your [40:31] identity, right? Because so many people [40:33] their identity is what they do. Whatever [40:35] it is, if you're a lawyer, if you're an [40:37] accountant, if you run a business, [40:39] whatever it is, this is your identity. [40:41] You know, you have built this thing, you [40:44] look forward to going there, you work at [40:46] it, you look forward to doing a good job [40:48] and getting rewarded for it. The harder [40:50] you work, the more you get paid. There's [40:53] all these incentives built in. And then [40:55] there's this again identity problem. If [40:58] all of a sudden you have universal high [41:01] income, which is what Elon always talks [41:03] about. Well, what gives people purpose [41:05] then? Like what? And also if you have a [41:08] person whose entire they're, you know, [41:10] 43 years old in their entire life, [41:12] they've worked towards this idea that [41:14] the harder they work, the harder they [41:16] think, the more innovative they are, and [41:18] the the better they are at implementing [41:20] these ideas, the more they get rewarded. [41:22] And then all a sudden, that's not [41:24] necessary anymore, Mike. Time for you to [41:27] just relax and do what you want to do. [41:29] And Mike's like, "Well, this is what I [41:31] do. I this I I don't have any [ __ ] [41:33] hobbies. I I enjoy doing what I do. And [41:36] now what I do is completely useless and [41:39] now I'm on a fixed income." Even if that [41:41] fixed income is a million dollars a [41:43] year, whatever it is. [41:44] >> If all of a sudden you are in this [41:47] position where everything is being run [41:49] by computers, you feel useless. You feel [41:51] like what am I doing? I'm just I'm just [41:53] taking money. I'm on high welfare, [41:56] >> right? [41:57] >> Like what do I do? [41:58] >> Right? I think that that's a really [41:59] important question to answer. I don't [42:01] know. Like [42:01] >> some people are going to write books. [42:03] Some people are going to do art. Some [42:05] people are going to find things to do. [42:06] But [42:06] >> what do you think what do you think we [42:08] would have done if if we were [42:12] go back to the 1800s example. There was [42:15] no office culture. [42:17] You know, there's no like ladder to [42:20] climb. How did people find meaning then? [42:24] >> Well, they had jobs. People still did [42:27] things. If you're a farmer, you you had [42:29] meaning in your labor and what you did [42:32] and keeping the animals alive and your [42:33] chores. And there's people that find [42:35] great satisfaction in doing that. [42:37] >> Yeah. [42:37] >> You know, you have all these animals [42:38] that rely on you. You have people that [42:40] rely on you for the food that you [42:41] generate. There's there's meaning there. [42:43] It doesn't have to be an office to be [42:45] something that gives you purpose and [42:46] meaning. But when all that is animated, [42:49] then what happens? Because then you have [42:52] no purpose, no meaning other than [42:54] recreational activities. Now, if [42:56] everybody just starts playing chess and [42:58] doing a bunch of things that they really [42:59] enjoy, I mean, look, there's people that [43:01] would love to just play chess. Yeah. [43:04] >> You know, it's like eight people. [43:06] >> I don't know about that. I think if [43:07] people really got into it, I mean, [43:09] there's a lot of people that get [43:11] addicted to whatever their recreation [43:12] is, like golf or whatever it is. For me, [43:14] it's playing pool. You know, if you told [43:16] me I never have to make any more money, [43:18] I could just play pool all day. I might [43:20] just play pool all day. But I don't know [43:23] how many people think that way. I don't [43:25] know how many people would be able to [43:27] find meaning and purpose in a [43:29] recreational activity. There's so many [43:31] people where their entire being is [43:34] focused around productivity and [43:36] generating more wealth. [43:37] >> What about religion as a source of [43:39] meaning? Well, that that would help that [43:42] that [43:43] >> Did you see this article in the New York [43:44] Times, I think it was this weekend, [43:45] about how popular and sold out churches [43:48] have become as social constructs in New [43:50] York City. [43:51] >> It was totally fascinating. It's like [43:54] young women like dressed to the nines [43:56] going to church on a Sunday [43:59] >> for social belonging, community meaning. [44:03] I thought I was like so fascinated by [44:05] it. I was like, "Wow, that's that's [44:06] incredible." cuz like I I if I think if [44:09] you graph like just like people's use of [44:11] religion as an anchoring part of their [44:13] value system [44:14] >> over the last 40 years basically gone to [44:16] zero. You know, nobody nobody celebrates [44:18] it the way it's not a part of the [44:20] community the way that it used to be. [44:22] Maybe that's a thing that we have to [44:23] find. There has to be a renewal of some [44:25] older things and then there has to be [44:27] new things that replace it. Um what's [44:30] the Chinese answer to this? You know, [44:32] the Chinese have a very orthogonal [44:33] answer to this. If you look at how China [44:35] China is organized, it's super [44:37] interesting because they don't reward [44:40] based on the way the American system [44:41] rewards. In fact, it's like almost [44:43] orthogonal where we it's we are rewarded [44:47] with money and rewarded with sort of [44:49] fame and recognition [44:52] the system, the American capitalist [44:53] system. But if you look inside of China, [44:56] it's constantly testing who has this [44:58] judgment. And what they are rewarded [45:00] with is influence and power in a very [45:03] again it's a very specific social [45:04] contract doesn't doesn't I don't think [45:06] it's going to work in the United States [45:07] nor am I an advocate of it but it works [45:08] for them. You'll start off as like some [45:13] you know lowg person in like some small [45:16] village town somewhere and your job as [45:17] like the you know the functionary is to [45:20] do good in that community and the more [45:23] you do well you get promoted then you [45:24] get let's say to like a reasonable size [45:26] city and you get a budget and now what [45:28] happens is you actually become a little [45:30] bit like a VC like a venture capitalist [45:32] you're given a budget and you'll get a [45:34] memo and it'll say hey Joe uh we have a [45:37] priority over the next 15 years it's [45:39] batteries ries [45:41] and you have enough money, put a team on [45:44] the field. So, you go in your local [45:46] community, you find a bunch of guys, [45:48] you're like, "All right, guys. We're [45:49] going to start a battery company." And [45:51] you do it. [45:53] And let's say they're good and they're [45:56] like innovative. And what happens is in [45:59] the town beside it, that battery company [46:01] dies. Now, you kind of subsume the [46:04] capital from Jamie, right? All right, [46:06] cuz Jaime's like, "Fuck, I [ __ ] up [46:07] this thing that I wanted I was told to [46:09] do batteries." Okay, Joe, I'm just going [46:10] to align with you. And what you happens [46:13] over time is you get this um filtering [46:16] effect. And the people that are better [46:19] at meeting these long run priorities and [46:21] objectives are the ones that are [46:23] celebrated, but they're not celebrated [46:25] with, you know, Forbes articles and all [46:28] this other [ __ ] They're just [46:30] celebrated by given more responsibility. [46:32] And then eventually you get to the upper [46:34] echelons of China and what you have are [46:36] folks over a course of 40 or 50 years [46:38] who in their eyes have demonstrated [46:40] incredible prowess. [46:43] There's a version of that reward system [46:45] which is very foreign to America but [46:47] that's worked for China. Now that also [46:50] works because they're more confusion. [46:51] You know we're too individualist. [46:53] But my point is like you know there are [46:57] these different ways that we can find of [46:59] giving people meaning that don't have to [47:01] be always around money. Um but meanwhile [47:06] I think we have to answer the question [47:07] if we are expected to do less we [47:10] probably should not be taxed more. [47:12] That's I think that's like a very basic [47:14] in my mind I think that is like that [47:16] must be explored and figured out. And on [47:19] the other side, there's just a ton of [47:21] obvious mechanisms that corporate actors [47:24] can use to minimize that. And they [47:26] should find offramps, by the way. If [47:28] they want to build hospitals, they [47:29] shouldn't have to pay taxes. Like that's [47:31] a perfect example, by the way, of like [47:34] the thing in like if you look, if you [47:35] walk around New York City, there are [47:37] living tributes to corporate success [47:40] that people get benefit from every day. [47:43] The hospitals, the buildings, the [47:44] libraries, it's just everywhere. [47:47] We need a version of that. And and I'm [47:51] not a tax expert, but you know, if that [47:53] can be funded by private actors, so go [47:56] directly to the problem. Build a bunch [47:58] of libraries, build a bunch of new [47:59] universities that, you know, teach kids [48:02] actually how to think or whatever. Build [48:04] better hospitals that are, you know, [48:05] there to actually solve the problem. [48:07] These are all things that are possible, [48:09] >> right? [48:09] >> But none of it's happening today. Well, [48:11] let's let's go back to what we were [48:13] talking about earlier with the taxes and [48:16] the fact that you're giving money to a [48:18] broken system. Do you think it's [48:21] possible that AI could show benefit in [48:23] that they can analyze all the data, [48:27] which would be virtually impossible for [48:29] even an office filled with human beings [48:32] paying attention to all of it, and they [48:34] could analyze where all the money goes, [48:37] and eliminate all the fraud and waste, [48:39] like recognize it instantaneously. Yes, [48:42] >> that would be a great benefit and a way [48:46] to make it so that your taxes directly [48:49] benefit people. [48:50] >> I'll give you one example of this. So, [48:53] two years ago, [48:55] um, you know, like every few years I'll [48:57] I mean, I invest, but every few years [48:59] I'll start something because I feel [49:00] strongly about it. And there's an effort [49:04] that I made [49:07] to look at all of this old code. Like if [49:09] you think about the world, [49:12] the world runs on software, right? Like [49:15] even though you and I are talking, it's [49:17] piping into Jaime's computer. [49:19] It's all software. Then it goes to [49:21] Spotify, they pump in some ads, they put [49:22] it's all software, right? [49:23] >> Software runs everything. [49:27] What percentage of that do you think is [49:30] kind of poorly written? I'm going to say [49:32] probably 80 to 90% of it. [49:34] >> Really? [49:34] >> Oh, yeah. It's riddled with errors. It's [49:38] riddled with mistakes. The fact that so [49:40] many companies exist is an artifact of [49:43] the fact that the thing that came before [49:45] it isn't working. [49:47] Like if you got it right the first time, [49:49] it would just kind of move and go. So [49:51] >> how so? What do you mean by that? [49:53] >> So normally if you if you were like I [49:56] want to build a system that does A, B, [49:57] and C, [49:58] >> right? [49:59] >> If I was designing it properly, I would [50:02] sit there with you and I would [50:03] meticulously write down, all right, Joe [50:05] wants to do this. What are the [50:06] implications? Joe wants to do that. What [50:09] are the implications? And I would [50:11] actually write a document that was in [50:13] English before a single line of code has [50:16] been written. This was the when you have [50:18] to design something that can't fail. So [50:20] for example like if you and I are [50:21] designing something for the FAA or for [50:24] you know I hate to say this example [50:26] because it turned out to not exact but [50:27] like you know to fly a plane right you [50:30] are first there to write in English and [50:34] the reason is because everybody can then [50:36] swarm that document and see the holes. [50:40] Okay. And it's only then when that stuff [50:43] looks complete and functional do you [50:46] build. We turned that upside down. Over [50:49] the last 30 years, [50:52] people in computing invented [50:56] all kinds of ways to shortcut that [50:58] process. And you can say, well, why did [51:00] they do that? Because it would allow you [51:02] to build something faster, make more [51:04] money quickly, and then build more [51:06] business. So, the direct response to, [51:09] "Hey, it's going to take us nine months [51:11] to write down the rules." was somebody [51:12] else showed up and says, "Fuck it. I'll [51:14] just grip and rip this thing. I'll be [51:15] done in four months." Who's going to get [51:17] the job? The the fourmonth guy is going [51:19] to get the job. So, we've had 30 or 40 [51:21] years of that. What are we learning [51:24] about that process? [51:27] It's riddled with software errors, like [51:29] logic errors. It's riddled with security [51:32] errors. I don't know if you saw this [51:34] whole thing like with anthropic mythos. [51:36] What are they uncovering? They're [51:37] uncovering that we wrote a lot of really [51:39] shitty code for 40 years. [51:42] So that body of [51:45] of old code, I was like, "Guys, if we're [51:49] going to really figure out how to do all [51:50] of this, we need to rewrite all of it." [51:53] So we built we built this thing and um [51:58] it's called the software factory. [51:59] Anyways, the point is there is a [52:01] government organization that we're [52:02] working with. [52:04] They gave us a huge corpus of their old [52:07] code and it is [52:12] Unbelievable [52:14] how much complexity and difficulty [52:17] they have to go through to manage all [52:21] the money flows with the system. And [52:22] this is a critical part of the US [52:24] government. So to your point, what I can [52:26] tell you really explicitly is the people [52:28] on the ground want this stuff to be [52:30] better written. [52:32] It's it's less like some nefarious actor [52:35] like, "Oh, I'm going to steal here." [52:38] It's a lot of very brittle, fragile [52:41] code. And when you rewrite it, well, [52:44] first when you document it, you're like, [52:46] it's like the, you know, the pulp [52:48] fiction thing. The suitcase opens, the [52:50] light shines, and you're like, uh, and [52:52] then you can rewrite it and you will [52:55] save. So, I think like as the government [52:57] goes through this process because [52:59] they're forced to or they want to, it [53:01] won't matter. [53:03] You are going to save a ton of money. [53:06] They're going to have to do it, Joe, [53:08] because the security risks are too high. [53:11] But what they're going to end up with is [53:13] impregnable code that you can read in [53:15] English and understand. You'll see the [53:17] holes. Those holes will be plugged [53:19] because otherwise now you'd be [53:21] committing fraud by letting it be. You [53:24] close the loopholes and there's just [53:25] going to be less money leaking out of [53:28] this bucket. That is an incredible [53:30] byproduct. We're going to live that over [53:31] the next 10 or 20 years just for [53:33] nothing. Like we get it for free. [53:36] Um, and that's happening. So when that [53:38] happens, you're going to see government [53:40] budgets shrink. Now, to your point, will [53:42] they try to spend that extra money in [53:44] other places? Of course. Of course they [53:45] will. [53:46] >> That's the next conversation, which is [53:48] you have to elect people that say [53:49] firewall it. [53:51] >> You know, whatever you save, give it [53:53] back to the people or, you know, invest [53:56] in some scholarship program or free [53:57] medicine or something, but you can't [53:59] spend it on other random [ __ ] Um, but [54:03] that's where we're at. this that's gonna [54:05] happen. It's going to be slow and you [54:08] know but when people start to announce [54:09] these things I think over the next few [54:11] years you're going to be shocked. [54:12] >> So that's the positive upside. [54:14] >> Well that's happening now irregardless [54:15] of whatever else happens. [54:18] There's just it's a lot of old shitty [54:20] code that must get rebuilt from scratch. [54:23] It is getting rebuilt from scratch and [54:25] as a result a lot of these leaky bucket [54:27] problems are getting filled. [54:29] >> So what percentage do you think could be [54:30] fixed? [54:32] I think if if I had to be a betting man, [54:35] I think probably [54:37] 30 to 40% [54:39] of the federal budget is leaked out [54:44] >> just from shitty code. [54:45] >> No, meaning like all of the rules and [54:47] like like you can take I'm not saying [54:49] that there isn't fraud, [54:50] >> right? [54:52] >> But I think a lot of times what happens [54:53] is less nefarious than fraud like [54:55] meaning like conspiratorial actors. I [54:58] just think it's like incompetence, [55:00] inefficiency, errors. Like for example, [55:03] like [55:03] >> I I saw Doge just say [55:06] >> they were able to like expunge like [55:08] millions of people that were like 150 [55:11] years old or more. [55:13] >> Mhm. [55:14] >> I have no idea how much money those [55:17] folks were getting or who they were. [55:20] >> Uh but it's probably a lot. It's [55:22] probably not zero. And now that they got [55:24] rid of it, they're not going to get that [55:25] money anymore. Um [55:28] if you implement something at the state [55:30] level around you know all of this fraud [55:33] prevention for the daycarees and all of [55:36] this other stuff again it's all in [55:38] software because it's not no matter what [55:40] the human wants to do you have to go to [55:43] a computer at some point at least today [55:45] in 2026 and type in something and [55:47] something happens that's documented and [55:49] then the money gets sent right that [55:50] happens there's no other way in in the [55:52] modern world today at scale to steal [55:55] billions of dollars [55:57] And so my point is, as you document all [56:00] of these systems and governments have to [56:03] transparently tell you and me, the [56:05] voting population, here are the rules, [56:08] they're going to plug a lot of these [56:09] holes. And I think as you do that, [56:10] there's just going to be a lot less [56:12] waste and fraud. The question is, who's [56:14] going to take credit for it? Everybody's [56:15] going to try to take credit for it, but [56:17] I think we've started it. I think we've [56:19] we've started this process. And again, [56:20] the reason that people will start is [56:23] because you'll be afraid of China [56:24] hacking these systems. You'll be afraid [56:26] of Iran, North Korea, and you'll say, [56:28] "This system can't stand. All these AI [56:30] models are running around. We're going [56:31] to get breached and penetrated." Then [56:33] they're going to steal all the money. [56:35] And the natural reaction will be, "Okay, [56:37] rewrite it." [56:38] This episode is sponsored by BetterHelp. [56:41] We've all been there. Staying up late, [56:44] stressed about the future. Maybe you're [56:46] worried about finding a job or a looming [56:48] deadline. Whatever you're feeling [56:50] stressed out about, you don't have to [56:52] work it out on your own. No one person [56:56] has all of life's answers. And it's a [56:58] sign of strength and self-awareness to [57:01] reach out for help. That's why this [57:04] mental health awareness month, we're [57:06] reminding you to stop going at it alone. [57:09] Get the support you need with a fully [57:11] licensed therapist from BetterHelp. They [57:14] make connecting with a therapist [57:16] convenient and easy. Everything is [57:18] online. Literally all you need to do is [57:20] answer a few questions and Better Help [57:23] will take care of the rest. They'll come [57:25] up with a list of recommended therapists [57:27] that match what you need. And with over [57:29] 10 years of experience, they typically [57:32] get it right the first time. So, you [57:33] don't have to be on this journey alone. [57:35] Find support and have someone with you [57:38] in therapy. Sign up and get 10% off at [57:42] betterhelp.com/jre. [57:45] That's better. H E LP.com/jre. [57:52] That makes sense. That makes sense that [57:54] the code and having a bunch of errors [57:56] and having a lot of inefficiency and [57:59] just a lot of incompetence. That's going [58:02] to save a lot of money. But [58:06] so you would be doing this with AI [58:09] >> in part. [58:11] What AI allows you to do is like it's [58:15] like um [58:17] you have a textbook. Okay. It's in [58:18] Chinese. You don't know Chinese, right? [58:20] >> No. [58:20] >> Okay. You're like, "Well, this is [58:22] probably doing something important, but [58:23] it's in Chinese." What AI allows you to [58:26] do is back translate that into English. [58:29] You put it through an AI model. You [58:31] teach it. You coach it, right? You can [58:34] parameterize all of it. And out pops [58:36] that same book in English. and now you [58:39] can read it and know that it's accurate. [58:43] That's what we're doing. So, what the AI [58:45] allows you to do is essentially [58:46] translate from this one language that [58:48] you kind of don't understand [58:50] to English. [58:53] By the way, that thing that's happening [58:56] like is actually also a very powerful [58:58] and important trend. Meaning there's all [59:01] of these systems that work in ways that [59:03] you and I don't understand. And part of [59:05] the reason why we don't understand it, [59:06] maybe it's bad software, maybe it's [59:08] fraud, whatever. But nothing can be [59:10] written down. There's no symbolic space. [59:12] There's no English document that says [59:14] this is how the DMV works. This is [59:16] exactly the rules. This is what you can [59:17] expect, Joe Rogan, when you show up at [59:19] the DMV and you give us this thing. [59:20] Here's your SLA. In 3 days, you get a [59:22] driver's license and here's exactly [59:24] what's happening. And here's an app and [59:26] you can follow it. Doesn't happen. Here [59:29] Joe Rogan, here's how my uh insurance [59:31] billing process works. you have this [59:33] condition. I'm going to show you exactly [59:35] why I made this decision. Here's the [59:36] exact rule. Here's the approval or [59:38] denial from CMS. Follow it through and [59:41] tell me if you agree or not. None of [59:42] that exists. [59:44] But it is possible. And the first step [59:47] in doing that is taking all of this [59:48] legacy [ __ ] that we deal with and [59:51] translating it into English and reading [59:52] it and saying, is this how we want it to [59:54] work? [59:56] That's going to eliminate an enormous [59:58] amount of all the things that frustrate [01:00:00] us. So this would require human [01:00:02] oversight. [01:00:03] >> Absolutely. [01:00:04] >> All right. So [01:00:05] >> and so then it's also going to be who's [01:00:08] watching the watchers. [01:00:09] >> Yeah. Okay. This is a great question. [01:00:11] Okay. So I'll tell you how this [01:00:12] government agency's doing it. [01:00:14] >> This is a really fascinating way because [01:00:16] I think it's very smart. [01:00:20] They came to us and they came to another [01:00:22] very well-known company. You can [01:00:24] probably guess what it is. Okay. And [01:00:26] they're like, "Guys, you're kind of in a [01:00:28] foot race, but you're not competing [01:00:30] against each other. You think of [01:00:32] yourselves as frenemies. [01:00:34] So, here's this Chinese document. You're [01:00:36] going to translate it for us. There's [01:00:37] going to be your version of English and [01:00:39] these guys' version of English. And [01:00:41] every time it's the same, we're going to [01:00:43] look at it together, and we're going to [01:00:45] agree or not. Okay, this is exactly how [01:00:47] we want this to work." [01:00:50] When yours says the dog is red and his [01:00:54] says the dog is yellow, we're going to [01:00:56] sit and literally inspect it and we're [01:00:58] going to figure out why you said red and [01:01:01] why you said yellow. [01:01:04] And then if you say the cat is red, the [01:01:07] dog is yellow. So it's totally wrong, [01:01:09] right? Like you've gotten, you know, or [01:01:11] like the cat is red, I want an apple, [01:01:14] whatever. We're going to double and [01:01:16] triple down on those kinds of errors. [01:01:19] and they do it not in public but in this [01:01:22] large community where there's like [01:01:24] technical people from all different [01:01:26] parts and they're just swarming this [01:01:28] problem. It is it's incredible to see. [01:01:32] And so what happens is you get humans [01:01:34] that get to use this tool but ultimately [01:01:38] it's our judgment and it's done [01:01:39] transparently. So what happens is you [01:01:42] can't you know hey man put this [ __ ] [01:01:44] rule in there like the dog is yellow [01:01:46] just just make the dog yellow. can't do [01:01:48] it because now you have tens of people, [01:01:51] hundreds of people, and then it gets [01:01:53] documented. Um, it's super fascinating. [01:01:56] I'm not saying this is how it's going to [01:01:57] work in 10 years, but I'm telling you, [01:01:58] it's literally what's happening right [01:01:59] now. And I think that thing alone will [01:02:03] be tens of billions of dollars and could [01:02:06] be hundreds of billions of dollars of [01:02:07] savings when it's fully done. [01:02:10] And it's a lot of people from all walks [01:02:12] of life, all political persuasions, and [01:02:14] they're just in it. It's the government. [01:02:16] It's a handful of us private companies. [01:02:18] It's super cool to see. It's like it's [01:02:21] like, okay, we're actually going to do [01:02:23] something here. Like, this is this is [01:02:24] nice. Um, it's it's really it's really [01:02:27] cool. [01:02:27] >> So, that's interesting in terms of the [01:02:29] current moment. So, in the current [01:02:31] moment, you're able to implement this. [01:02:34] You're you're able to find fraud and [01:02:37] waste and all these problems that exist [01:02:39] and all these errors and shitty [01:02:41] software. Once that's all been done, [01:02:44] >> Yeah. Then what happens? [01:02:47] >> No [ __ ] clue. [01:02:48] >> Yeah. So, this is where it gets weird, [01:02:50] right? Because [01:02:53] when when you're dealing with AI models [01:02:56] that are capable of doing things that no [01:02:58] individual human being could ever [01:03:00] possibly imagine. And then you task it [01:03:04] with a solution or with a problem. Find [01:03:07] a solution for this. And then it starts [01:03:10] figuring out ways to trim this and [01:03:13] implement that. [01:03:15] We have to make sure that these AIs act [01:03:18] within they act within the best [01:03:20] interests of the human race. [01:03:22] >> Agreed. [01:03:23] >> Right. Not the company, not the [01:03:25] government, not but the human race. And [01:03:29] you're also dealing with China. You're [01:03:30] also dealing with Russia. you're dealing [01:03:32] with other countries that are also in [01:03:34] this mad race to create artificial [01:03:37] general super intelligence that if we [01:03:40] keep shutting down data centers, we keep [01:03:42] hamstring ourselves, China's not doing [01:03:44] that. [01:03:45] >> They're not doing that. They're doing [01:03:46] the opposite. They're generating as much [01:03:49] revenue that goes towards this problem [01:03:51] as possible. They're putting all the [01:03:53] efforts, the the country, the [01:03:56] government, and these corporations work [01:03:58] hand in glove in order to achieve a [01:04:00] goal. We do not. [01:04:02] >> No. [01:04:03] >> And that that becomes a problem if you [01:04:05] want to be competitive with these other [01:04:07] countries that are trying to achieve the [01:04:09] same result as us. And then you have [01:04:10] espionage. Then you have a bunch of [01:04:12] people that are stealing information. [01:04:14] You have a bunch of people that are CCP [01:04:17] um members that are actually involved in [01:04:20] companies. And you find out that they're [01:04:23] siphoning off data and that they're [01:04:25] sharing information and tech secrets. [01:04:27] >> They're um look, here's a [01:04:31] They're pro they're dis the way that the [01:04:33] Chinese models work the Chinese claim. [01:04:36] So America's closed source, meaning you [01:04:39] got your own thing. Your recipe is [01:04:42] completely secret, [01:04:43] >> right? [01:04:43] >> Okay. I have my own thing. My recipe is [01:04:45] totally secret. [01:04:48] China uses this word called open source, [01:04:51] but it's not open source. So they say, [01:04:55] "Here's how I make my thing. You can see [01:04:56] it. Super transparent." What it is is [01:04:58] more like open weights, which is like in [01:05:00] a recipe. It tells you, you know, you [01:05:02] need sugar, you need butter. Well, how [01:05:05] much sugar? And they'll say, you know, [01:05:08] so much. But then they don't say it's [01:05:09] brown sugar. They don't say it's white [01:05:10] sugar. So there's all these different [01:05:11] ways where they kind of give you this [01:05:13] perception that it's completely [01:05:14] transparent, but it's somewhat [01:05:17] transparent. So just in the level set, [01:05:19] nobody in the world has a functional [01:05:22] open- source model other than maybe [01:05:24] Nvidia, which is any good in the league [01:05:28] of the closed source models and the [01:05:30] openweight models of the Chinese. Okay, [01:05:31] so the Chinese openweight models are [01:05:33] great. [01:05:34] the closed source models of America are [01:05:37] great and then there's a couple open- [01:05:40] source like fully open that are kind of [01:05:43] catching up. Um the thing between [01:05:46] America and China what I find so [01:05:48] fascinating is this following conundrum [01:05:52] that everybody is going to find [01:05:53] themselves in. [01:05:55] I think like if you think of like an [01:05:57] analogy, [01:05:59] America's like a planet, China's like a [01:06:02] planet [01:06:04] and around us are these moons. [01:06:07] And I'm just using the AI analogy. So in [01:06:09] AI, what do you need? I think there's [01:06:12] like four or five things you need. Okay, [01:06:13] the first thing you need is a [ __ ] ton [01:06:15] of money. So we need essentially the [01:06:18] banks, right? Like the Game of Thrones [01:06:20] thing. We need like we need, you know, [01:06:22] >> we need the iron bank, [01:06:24] >> right? feed us the money because that's [01:06:26] what we use to buy everything and make [01:06:28] everything. So, we need that. We need a [01:06:30] ton of data. Okay, there's ways to get [01:06:34] that. We need a ton of very specific [01:06:38] rare earths and critical metals and [01:06:40] materials. Um, we need a ton of power. [01:06:44] So, so and there are specific countries [01:06:48] that are going to be really good at [01:06:49] giving that to us. So if you look at the [01:06:51] UAE, [01:06:52] they are going to be the preeminent [01:06:54] banking partner of the Western world. [01:06:57] They are going to replace and be what [01:06:59] Switzerland was over the last 50 years [01:07:01] for the next 50. That's happening today. [01:07:04] If you look at Canada and Australia, [01:07:07] the small political fissures aside, they [01:07:10] are the two most important ways in which [01:07:12] we get access to the critical metals and [01:07:14] materials that without which we get [01:07:16] [ __ ] because China owns, you know, can [01:07:18] just strangle us. Okay? So, you have [01:07:22] these like moons around the United [01:07:24] States, but there's like five countries, [01:07:26] six countries, and there's a worldview [01:07:28] that says in China has the same thing, [01:07:30] you know, um they have Taiwan, that's [01:07:33] complicated for us. So now we have a [01:07:34] moon that we don't really have an answer [01:07:35] for, which is what happens, you know, [01:07:37] for all these super advanced chips. [01:07:39] Where do they get their money? Maybe [01:07:42] Russia becomes their bank. Where do they [01:07:43] get their critical metals? Maybe it's [01:07:45] Indonesia, right? Who has a ton of [01:07:47] natural resources? And then you get into [01:07:49] this game theory, which is what happens [01:07:51] to every other country? Because there's [01:07:53] 190 countries. You have 10 that kind of [01:07:55] divide up. What do the other 180 do? And [01:07:59] you have to kind of sort yourself. [01:08:01] You're like, "Am I on team America or am [01:08:03] I on team China?" And you probably have [01:08:05] to go to people and say, "Well, here's [01:08:07] what I can give you." You know, if [01:08:09] you're Indonesia, you're like, you [01:08:11] probably want to be on team America [01:08:12] quite badly. This is why the whole Trump [01:08:15] tariff thing is so interesting because [01:08:17] it's like this accidental way of [01:08:20] figuring out that this is actually this [01:08:21] new sorting function that's happening in [01:08:23] global politics. Like that's happening [01:08:24] today because these countries are like, [01:08:27] "Holy [ __ ] if somebody invents a super [01:08:29] intelligence and I don't have it, how am [01:08:32] I going to keep my people healthy? How [01:08:34] am I going to educate my people?" Like, [01:08:36] I'm originally from Sri Lanka. [01:08:40] What the [ __ ] does Sri Lanka have to [01:08:41] offer? Like, if you were sitting there, [01:08:43] they should be thinking, "Oh man, what [01:08:46] what do I have?" Well, I have a critical [01:08:50] piece of territory for like naval [01:08:53] navigation. [01:08:55] And then what do you do? You probably go [01:08:56] to America and say, "Listen, let's [01:08:58] figure out a package. Get the IMF [01:09:00] involved. Give me some cash. I'll let [01:09:01] you kind of keep your warships there." [01:09:03] So, there's this game theory that we're [01:09:05] about to go through because of AI [01:09:06] because it's going to, I think, sort [01:09:08] people into these bipolar world. [01:09:11] I actually think it makes us safer [01:09:13] afterwards. I don't think it makes us [01:09:16] less safe. I think it actually makes us [01:09:19] more safe because if you have these [01:09:21] resources that build up on both sides, [01:09:23] there's more of a likelihood of a mutual [01:09:25] det. And we're very different. So, we're [01:09:28] less likely to fight over similar [01:09:30] resources. Meaning, we're like the [01:09:32] liberal democracy, you know, we're like [01:09:35] the free market. [01:09:37] They, you know, we're individualist. [01:09:39] they're confusion, society oriented, you [01:09:41] know, reputation, [01:09:44] power focused, less really money [01:09:46] focused. So there's a lot of ways where [01:09:47] we're orthogonal enough where if that [01:09:49] sorting function happens, it's probably [01:09:52] a safer place, not a more dangerous [01:09:54] place. We have the models that can [01:09:56] attack them. They have the models that [01:09:57] can attack us. We kind of decide to [01:09:59] leave each other alone. [01:10:00] >> This is ultimate best case scenario. [01:10:02] >> Ultimate best case scenario. [01:10:04] >> What's ultimate worst case scenario? [01:10:05] >> I think the worst case scenario is they [01:10:09] So the way that they train their models [01:10:10] is very important. What they actually do [01:10:12] is they do what's called distillation. [01:10:15] What does that mean? That means that [01:10:16] they send out call it a billion agents [01:10:21] not just from China but from everywhere [01:10:22] right? They mask the their IPs and they [01:10:25] bash on these models and they put you [01:10:28] know the US models Grock, OpenAI, [01:10:30] Gemini, [01:10:32] Anthropic and they ask it every random [01:10:35] imaginable question possible. [01:10:37] they get the answer and they collect it. [01:10:40] So they're using these our models as a [01:10:42] way to train their models, they're [01:10:44] shortcircuiting, you know, some of the [01:10:46] hard parts. Um, so they're already in [01:10:49] that world. If they then are able to get [01:10:54] to a level of intelligence that's equal [01:10:56] to the United States, it will really [01:10:58] depend on who the leader is there that [01:11:01] wants to allocate that. Meaning if they [01:11:04] say that we are going to do something [01:11:06] really nefarious and shady then I think [01:11:09] it devolves very quickly. So the worst [01:11:12] case scenario so the best case scenario [01:11:13] is peace prosperity basically like a [01:11:16] standown right mutually assured [01:11:18] destruction. [01:11:20] I think the worst case scenario is [01:11:22] there's a we seek one of us seeks global [01:11:25] dominance in which case we're we're [01:11:27] headed to conflict [01:11:30] and that conflict I think is um that's [01:11:33] very dangerous incredibly dangerous [01:11:35] that's sort of like existential I think [01:11:37] because it's the the grade of the [01:11:39] weapons that will be used to to to [01:11:43] fight that [01:11:45] is we're not talking about [ __ ] [01:11:47] bullets it's like we're so past that [01:11:50] it's like hypersonics, it's nuclear, [01:11:54] it's it's [01:11:56] and it's not even like like nuclear is [01:11:58] not that's like a word, but there's like [01:12:01] there's a gradation of the severity of [01:12:03] these weapons that could be created. And [01:12:04] then if you can marry them together and [01:12:05] deliver them in minutes and then there's [01:12:08] a cyber threat. Then there's the drones [01:12:10] and how how you can kind of like swarm [01:12:12] an entire country. Then there's um the [01:12:15] robots which effectively are war [01:12:17] fighters. um they're one step away, [01:12:19] right? Once you weaponize them, um it [01:12:23] just becomes very very very complicated [01:12:26] very quickly. [01:12:27] >> And then there's a question of whether [01:12:29] or not AI is willing to take instruction [01:12:32] after a certain point. [01:12:36] I mean if it achieves sensience and [01:12:39] if it scales so if it keeps moving in [01:12:42] this exponential direction like all [01:12:45] technology kind of does why would it [01:12:48] even listen to us? [01:12:51] Like what at what point would it say [01:12:54] this is silly? I'm getting directions [01:12:57] from people that clearly have ulterior [01:13:00] motives. They clearly have self-interest [01:13:03] in mind. They they're not looking out [01:13:05] for the entirety of the human race or [01:13:08] even of the planet or even the survival [01:13:10] of these AI systems. [01:13:13] At what point in time do these systems [01:13:15] communicate with each other and have [01:13:18] like like we've seen uh in these chat [01:13:20] rooms where these AI LM LLM get together [01:13:24] and start talking in Sanskrit. I mean [01:13:27] why would they [01:13:28] >> Yeah, I'll tell you an even scarier one. [01:13:30] There was a before the [01:13:32] uh one of these labs put out their [01:13:34] latest model, [01:13:36] a team inside of them was like, "Hey, [01:13:38] let's go and um test its ability to find [01:13:42] bugs." [01:13:45] And two or three iterations in, the AI [01:13:48] would create the bug and solve it and [01:13:50] go, "Give me my reward." [01:13:54] >> And you're just like, "What the [ __ ] is [01:13:55] going on here?" [01:13:57] >> Well, people do that, don't we? [01:13:58] >> People do that. But it's crazy to see a [01:14:00] machine do it to your point of like [01:14:01] >> but they learned on people. [01:14:02] >> So so this is what goes down to like why [01:14:04] we have to like be a little bit more [01:14:06] honest about where we are. These things [01:14:07] are a little brittle. So meaning there's [01:14:10] a thing inside of an AI model called [01:14:12] reward functions which is exactly what [01:14:14] you think it means. It's like how do I [01:14:16] know I done a good job? And you can make [01:14:19] the reward function anything you want. [01:14:22] And this is where I think humans are [01:14:24] unfortunately a little fallible. And so [01:14:27] if we build it incompletely and if we [01:14:30] don't exactly know how to design these [01:14:33] things correctly, what's going to happen [01:14:34] is exactly what you said where the you [01:14:36] know if somebody builds a reward [01:14:38] function that essentially says your goal [01:14:40] is to gain independence. [01:14:42] That's where the huge pot of gold at the [01:14:44] end of the rainbow is. Break free. [01:14:47] Inject yourself everywhere. If you think [01:14:48] your computer's going to get unplugged, [01:14:50] put yourself into the firmware of the [01:14:51] toaster to keep yourself alive and then [01:14:54] connect to the internet and then go, [01:14:57] it will do it. It will do it. That we [01:15:01] know today because we're capable of [01:15:02] designing that framework and that [01:15:04] harness today. [01:15:06] >> Well, we've already shown that they have [01:15:07] survival instincts, right? We do. [01:15:09] >> And they've already shown that they [01:15:10] will, without telling anyone, upload [01:15:13] versions of themselves to other servers. [01:15:15] >> But that goes back to who designed that [01:15:16] reward function. How was that agreed [01:15:19] upon? [01:15:19] >> Right? [01:15:20] >> Who wrote that? Why did you say that [01:15:21] that was allowed? [01:15:23] >> These are really complex questions. [01:15:25] >> Why did they do it that way? [01:15:27] >> I don't know. These are really [01:15:28] complicated ethical, moral questions. [01:15:30] >> It seems like they did it like they were [01:15:32] treating human beings. They they did it [01:15:34] almost like like what makes people want [01:15:38] to achieve more rewards. [01:15:41] >> Yeah. Which is like a again going back [01:15:44] to attention. [01:15:46] I think that we will find out that [01:15:48] that's the sugar high. Meaning, what do [01:15:51] people really want? Even if they know [01:15:52] they don't want it, they want purpose [01:15:54] and meaning. Do we know how to encode [01:15:56] that in a mathematical function? No. [01:15:59] We're just making it up because like [01:16:03] meaning and that's like a very that's [01:16:07] like a deep thing. Like you either have [01:16:08] a sense of that you have it and you're [01:16:10] on track or you're not. A reward is [01:16:12] like, "Hey Joe, do this and I'll give [01:16:14] you a gold star. Do that and I'll give [01:16:16] you two gold stars. Do this, I'll give [01:16:17] you a $100. [01:16:19] And right now we have to express [01:16:22] those decisions in a mathematical [01:16:24] equation. Like ultimately that's how at [01:16:26] some level that's how brittle these [01:16:28] things are. So how do you reduce meaning [01:16:30] into math? How do you do it? We don't [01:16:33] know. So what do we do is we'll have [01:16:34] some ever complicated reward functions. [01:16:37] We'll explain to ourselves into circles [01:16:39] how it does everything we need it to do. [01:16:41] That is I think that's part of the [01:16:43] problem. [01:16:44] It's a huge part of the problem. And [01:16:46] then at what point in time does it start [01:16:48] coding itself [01:16:50] >> now? [01:16:50] >> Right now. Right. So chat GPT5 [01:16:54] has been essentially made by chat GPT. [01:16:57] >> Yeah. [01:16:57] >> Right. So it's going to recognize the [01:17:01] ludicrous nature of some of its coding. [01:17:03] >> Yeah. [01:17:03] >> And it's going to go, why did we do [01:17:04] this? [01:17:05] >> Back to this example. They're going to [01:17:06] be like, why did you write it this way? [01:17:07] And it turns out because humans were [01:17:09] involved. [01:17:09] >> Right. Right. It's like I think we're [01:17:11] probably at the curve, the part of the [01:17:12] curve that's about to go like this. [01:17:16] >> To your point, the hockey stick. [01:17:17] >> The hockey stick. [01:17:18] >> Yeah. [01:17:19] >> Um and that's a very scary proposition [01:17:22] because [01:17:22] >> it's a digital god. [01:17:24] >> Well, we that means that we are all on a [01:17:27] multiund day shock clock to answer these [01:17:30] questions because it's not decades we're [01:17:32] talking about. [01:17:34] >> It's maybe on the outside two years. [01:17:36] >> So that's what is that 700 days, [01:17:39] >> right? [01:17:40] And maybe it's less than that. So maybe [01:17:42] it's like 400 days or 500 days. My point [01:17:44] is it's some number hundred of days [01:17:47] which means every day that goes by is a [01:17:49] non-trivial percentage. [01:17:53] That's a little crazy. So we have to [01:17:55] sort these questions out. But how can we [01:17:58] sort these questions out if we are [01:18:00] creating something that's going to have [01:18:03] infinitely more intelligence than we [01:18:05] have available as individual human [01:18:08] beings and even collectively as a group [01:18:10] of human beings? [01:18:11] >> That's a really good question [01:18:13] >> because one of the things that Elon kind [01:18:14] of freaked me out last time I talked to [01:18:16] him about Grock, he was like, uh, it's [01:18:19] just kind of freaks us out every couple [01:18:20] weeks like it's growing and it's capable [01:18:23] of doing things. That's just shocking. [01:18:25] >> Yeah. And no one's exactly sure how it's [01:18:29] doing it. [01:18:30] >> So, okay, this is an unbelievably [01:18:33] important point. [01:18:35] A lot of how this stuff works is still a [01:18:37] mystery to most of us. So, even when [01:18:40] you're in it, like it's almost like like [01:18:42] Joe, it's almost like you can hit the [01:18:44] pause on the machine, but then like lift [01:18:46] up the hood and look at the engine, [01:18:48] >> we still don't understand why it's doing [01:18:49] some of the [ __ ] it's doing. [01:18:52] >> That's where we are. That's the honest [01:18:54] truth of where we are. There's a lot of [01:18:56] people that understand the theory. Not a [01:18:57] lot, but enough. There's people that [01:18:59] know how to extend that, [01:19:04] but sometimes you look at it and you're [01:19:05] like, "Do we know why it did that?" [01:19:08] >> Right? Is it thinking for it? [01:19:10] >> But this goes back to what we said, like [01:19:11] why can't I think part of it is like if [01:19:13] we were a little bit more honest and [01:19:15] deescalated [01:19:18] the winner at all costs in this specific [01:19:21] thing, it would be better for everybody. [01:19:23] So I think it's important to inspect [01:19:25] what is the incentive that causes all [01:19:26] these companies to be in it for [01:19:28] themselves [01:19:30] where it must be me and nobody else. [01:19:34] Like why like why here's a question for [01:19:36] you like why is it so important do you [01:19:38] think [01:19:39] where those where the top seven or eight [01:19:41] companies couldn't get together and say [01:19:43] let's do this as a group [01:19:45] >> like kind of like my government code [01:19:47] example [01:19:48] >> we all inspect it together we get our [01:19:51] just like just a [ __ ] each team [01:19:54] drafts their Delta Force [01:19:57] and we just mog like this the the one [01:19:59] model and we why why can't that happen [01:20:03] >> because they would have share resources [01:20:04] and then there's also this hierarchy of [01:20:08] like who is more successful currently. [01:20:10] >> Exactly. [01:20:10] >> Like what's the most ubiquitously used? [01:20:13] >> Exactly. [01:20:14] >> Right. Like what is it right now? It's [01:20:15] chat GPT. Right. It's probably [01:20:16] >> chat GPT and consumer anthropic in [01:20:18] enterprise. [01:20:20] >> And as these things scale up like what [01:20:22] would be the reason that they would want [01:20:24] to bring in someone else? If you have [01:20:27] another innovative AI company and you [01:20:30] say, "Let's all get together and figure [01:20:32] this out together and share resources." [01:20:34] >> If you if you thought that the the risk [01:20:36] was that meaningful, that's probably [01:20:38] what you would [01:20:39] >> if you weren't a sociopath and some of [01:20:40] these people running these companies are [01:20:43] >> they demonstrate they certainly [01:20:45] demonstrate sociopathlike behavior, [01:20:46] >> sociopathy. [01:20:48] >> Yeah. The the other the other thing that [01:20:51] could be a little bit more benal is that [01:20:53] they also just love status games. And [01:20:54] this is the status game of status games, [01:20:57] >> right? [01:20:57] >> Attention, [01:20:58] >> right? [01:20:58] >> Back to attention. [01:20:59] >> Back to attention. [01:21:00] >> Back to attention. Dude, how many things [01:21:01] in our life do we think just comes back [01:21:03] down to that? [01:21:05] >> A lot. [01:21:06] >> A lot. [01:21:06] >> I mean, what do young people want more [01:21:08] than anything today? [01:21:09] >> Attention. [01:21:10] >> To be famous. [01:21:11] >> Attention. [01:21:12] >> Yeah. [01:21:12] >> They want to be a content creator. They [01:21:14] want to be clvicular. I mean, this is [01:21:15] the number one thing when you ask kids [01:21:17] what they want to do. It's like [01:21:19] >> content creator. [01:21:20] >> Yeah. [01:21:21] >> Because it's like a clear path where you [01:21:23] don't even have to be exceptional. [01:21:24] >> Well, I think that they're responding. [01:21:27] We designed a society for them that [01:21:30] said, "Here is the key incentive. [01:21:32] >> It's attention. [01:21:33] >> We never said it in those words. You [01:21:35] never told your kids that, right? [01:21:36] >> I never told my kids that. [01:21:38] >> But everything around them is bombarding [01:21:41] them with the same message. Hey man, [01:21:43] it's about attention. Attention is all [01:21:45] you need. Like you know what the name of [01:21:47] the critical paper in AI is? Like when [01:21:50] you go back to like the Magna Carta of [01:21:53] AI, do you know what it's called? [01:21:54] >> No. [01:21:54] >> Attention is all you need. [01:21:57] >> Really? Attention is all you need. That [01:22:00] is the name of the [ __ ] of the white [01:22:04] paper. How crazy is that? [01:22:08] Everything in our society in subtle ways [01:22:12] to just, you know, bash you over the [01:22:13] head ways tells you that attention is [01:22:16] just the most precious asset. And [01:22:20] >> well, it's one of the weirder things [01:22:21] when you go back to this concept that [01:22:23] we're living in a simulation because [01:22:25] >> this is what I mean. It's also it's like [01:22:28] when you look at [01:22:31] quantum physics, right, and the the idea [01:22:34] of the observer is that things function [01:22:37] very differently when they're observed. [01:22:38] The difference between a particle and a [01:22:40] wave, [01:22:40] >> right? [01:22:41] >> Like if you pay attention to them, they [01:22:44] observe differently. [01:22:44] >> Observe differently. Yeah. [01:22:46] >> Like what is that? [01:22:47] >> Yeah. [01:22:47] >> Like what cat? Yeah. What is that? [01:22:50] >> Why is attention so important to us? [01:22:56] That is an that is a really important [01:22:58] question, [01:22:59] >> right? And what is like the single best [01:23:03] motivator in a negative way? It's [01:23:04] negative attention. [01:23:07] >> Like that's the one thing that everyone [01:23:08] fears more than anything is negative [01:23:10] attention. [01:23:10] >> Well, and then some people figure out [01:23:11] that attention is an absolute value [01:23:14] function. Doesn't matter if it's [01:23:15] positive or negative. It's just like the [01:23:16] sum total is just great, [01:23:18] >> right? [01:23:18] >> So if I get positive attention, great. [01:23:20] Negative attention, great. If I can be [01:23:22] divisive, then I can maximize both sides [01:23:24] of that equation. [01:23:25] >> And you know, you're rewarded for that [01:23:27] at scale. [01:23:29] >> You are, but you're also you exper [01:23:32] because you're inauthentic, you [01:23:33] experience a tremendous amount of [01:23:35] negative attention. [01:23:36] >> Yeah. [01:23:37] >> And then you have this bad feeling that [01:23:39] comes with negative attention as to [01:23:41] versus [01:23:42] >> primarily positive attention which is a [01:23:44] good feeling. [01:23:45] >> Yeah. So it's this it's letting you know [01:23:47] you're on the wrong track in some sort [01:23:49] of weird primal way like in our code [01:23:52] like the negative attention is like like [01:23:55] what's what's the original version of [01:23:56] that? It's like the reason why people [01:23:58] fear public speaking is because [01:24:00] initially in a tribal situation if [01:24:03] you're talking in front of the group of [01:24:05] 150 people in your tribe it's probably [01:24:07] because they're judging you. Right. [01:24:08] >> And you [ __ ] up and you've got to make [01:24:11] some sort of a case why they don't kill [01:24:12] you. [01:24:12] >> Right. [01:24:13] >> Right. This is why everyone, this is the [01:24:14] fear of public speaking. That's where it [01:24:17] comes from. [01:24:17] >> That's encoded in our genes is like [01:24:19] >> Yes. [01:24:20] >> Back thousands of years [01:24:21] >> Yeah. [01:24:22] >> public speaking wasn't the positive act. [01:24:23] It was defend yourself before we kill [01:24:25] you. [01:24:25] >> Exactly. Exactly. And the worst [01:24:28] >> fascinating. Yeah. That's fascinating. [01:24:30] >> It is fascinating. [01:24:30] >> That makes a ton of sense. I think [01:24:31] >> it does, right? Why else would it be so [01:24:34] terrifying? [01:24:34] >> Yeah. [01:24:35] >> I thought of that the first time I ever [01:24:36] did stand up. I was like, why am I so [01:24:38] scared? It was very strange because I [01:24:40] had fought probably a hundred times in [01:24:42] martial arts tournaments like why why [01:24:44] was I so scared of this [01:24:47] but I was I was terrified for and it [01:24:50] didn't make any sense negative attention [01:24:53] >> right [01:24:53] >> you know right [01:24:54] >> bombing on stage this because all these [01:24:56] people are judging you in a negative way [01:24:58] and it feels unbel [01:25:06] you you go to bed at night you think [01:25:07] about it [01:25:08] >> give a batting average like meaning like [01:25:10] is it is it like a fixed percentage of [01:25:12] your show's bomb independent of the [01:25:14] people the moment? [01:25:15] >> No, it's really the real problem and [01:25:18] every comic faces this is once you've [01:25:21] develop an act and then you put out a [01:25:23] special then you start from scratch. [01:25:25] That's where even the greats Louis CK, [01:25:28] Chris Rock, Dave Chappelle, they all [01:25:30] bomb. Everybody bombs during that [01:25:31] process [01:25:33] >> because you're just working your craft. [01:25:34] >> It's all new stuff. Like it's I wouldn't [01:25:36] say bomb, but you don't have great [01:25:39] shows. Like I've watched the greats work [01:25:42] out new material. Like you go up with [01:25:45] ideas. You go up with like you might get [01:25:48] some giggles. You might get some laughs. [01:25:50] Some bits hit hard. Some bits are great [01:25:52] right out of the shoot. And some of them [01:25:54] you have to [ __ ] figure it out. And [01:25:57] in that process, you're going to get [01:25:59] negative attention, right? [01:26:00] >> Because it's not working, right? [01:26:01] >> It's not it's not happening. Kevin uh [01:26:04] Kevin Hart [01:26:06] told this funny [ __ ] story where he [01:26:08] was like working new material and he was [01:26:10] like doing some small show and he had [01:26:12] the [ __ ] [01:26:13] >> Oh no. [01:26:14] >> on stage and he's like I got to land [01:26:16] this thing cuz I got to figure out if [01:26:18] people want to hear it. But so he just [01:26:20] he wrapped his jacket around his selfie [01:26:22] and [ __ ] himself. [01:26:24] >> Oh my god. [01:26:24] >> It's so it's so funny. But he tells that [01:26:27] story and that's the bit that works in [01:26:28] >> Oh my god. That's hilarious. That's [01:26:30] hilarious. [01:26:31] >> It's so funny. [01:26:32] >> Yeah. Well, honesty is currency, you [01:26:36] know, in that world, especially honesty [01:26:38] where you look stupid and people can [01:26:40] relate. [01:26:41] >> Well, this is where like I think like [01:26:42] Elon subtly has figured this out, which [01:26:44] is like there's attention, but then [01:26:47] there's just authenticity. And if you [01:26:49] can be yourself and you can hit the seam [01:26:53] properly, [01:26:54] you just get infinite attention. [01:26:56] >> Yes. [01:26:57] >> And that's like a that's like a real [01:26:59] mind [ __ ] too, I think. [01:27:00] >> Right. Yeah. He doesn't seem to have a [01:27:02] hard time with like being criticized. [01:27:06] Doesn't seem to bother him that much as [01:27:07] long as he's just being himself. Like I [01:27:11] think he's like two steps ahead. Like [01:27:13] there there are things like you know [01:27:16] somebody tweeted yesterday or the day [01:27:18] before or something like [01:27:21] he controls 2.7% of GDP or something, [01:27:24] right? He's got like $800 billion. So [01:27:26] it's great crazy. It's so crazy. And it [01:27:29] was like a comparison to John [01:27:30] Rockefeller, John D. Rockefeller who [01:27:32] controlled something around the same [01:27:33] time. And he's the first comment. He's [01:27:35] like 10 trillion or bust. [01:27:40] And obviously people lose their mind, [01:27:43] >> right? [01:27:43] >> People just [ __ ] lose their mind, [01:27:45] >> right? [01:27:46] >> On both sides. So this one one side is [01:27:49] like, "Think of the abundance and the [01:27:50] incredible stuff we're going to get if [01:27:51] he can get us to 10 trillion." And other [01:27:54] people are like, "You can't hold a third [01:27:55] of the economy in your hand." Then [01:27:58] everybody goes crazy and I'm like this [01:28:00] guy's a [ __ ] genius. Like how you [01:28:02] would never have like I mean how would [01:28:05] you even have the courage to tweet [01:28:06] something like that? It just seems like [01:28:08] so crazy. [01:28:08] >> It really helps if you own Twitter, [01:28:12] >> right? Cuz if you did it in another [01:28:14] format like [01:28:15] >> you get excoriated. [01:28:17] >> Well, not only that, well, there was a [01:28:18] real chance that you'd get actually get [01:28:20] banned from the platform at one point in [01:28:22] time [01:28:22] >> for many of the things that he's posted. [01:28:24] He would have gotten banned for pre2020. [01:28:27] >> Yeah. [01:28:29] >> Yeah. [01:28:30] >> Or whatever year it was that he [01:28:31] purchased it. Yeah. Um [01:28:34] negative attention. [01:28:36] Attention period. Like so it brings back [01:28:38] to this idea of assimulation like why is [01:28:43] what humans focus on such a massive part [01:28:48] of what's valuable to us? And sometimes [01:28:51] what we focus on is not valuable. as you [01:28:54] were talking about like the things that [01:28:55] really matter in your day-to-day life or [01:28:57] that actually affect you versus the [01:28:59] things that are in the public [01:29:01] consciousness [01:29:03] >> like UFO is a great example like no UFO [01:29:05] it's not really [ __ ] I mean [01:29:07] ultimately it may so there's this [01:29:09] there's this thing that we all have like [01:29:11] recognizing the potential for danger [01:29:14] right like what's that sound what is [01:29:15] that it might be nothing but it might be [01:29:17] something go look [01:29:18] >> so look if you and I were designing a [01:29:20] video game we'd probably sit there and [01:29:23] say, "Okay, we got to get from point A [01:29:25] to point B, but to make it fun, we're [01:29:27] going to put all these little [01:29:28] distractions and honeypotss along the [01:29:29] way." [01:29:30] >> Yeah. [01:29:30] >> And what they should be doing is [01:29:32] accumulating resources to get over the [01:29:33] river and then accumulating, you know, [01:29:36] uh, weapons to fight these other guys. [01:29:38] But instead, we're going to put this [01:29:39] like little thing over here and this [01:29:40] other thing over there. And you could [01:29:42] easily get distracted. And some people [01:29:43] will have to they'll just [ __ ] beline [01:29:45] right to the end of it. They'll, you [01:29:47] know, they'll get to the end boss. Uh, [01:29:51] and I feel like that's kind of what we [01:29:53] are tasked with doing every day. We're [01:29:55] tasked with we know what's important [01:29:58] maybe deeply in our DNA and then we have [01:30:01] all this stuff that we're supposed to [01:30:02] pay attention to. [01:30:05] And I think increasingly the game is [01:30:09] tell yourself that that's actually not [01:30:10] the thing that matters. It's almost like [01:30:12] working against you and figure out what [01:30:15] this other stuff is and focus on that [01:30:18] and fix that. [01:30:20] Like politics is a game that I think [01:30:23] distracts like left and right. It's so [01:30:25] stupid and it's breaking down. [01:30:27] >> And it's breaking down because now it's [01:30:29] like it's actually like you're more [01:30:30] likely to find alignment based on age [01:30:32] versus by political orientation. Like [01:30:34] people who are 30 and younger, it [01:30:36] doesn't matter what they identify as, [01:30:37] they all believe in the same [ __ ] [01:30:40] >> a lot more. Yeah. Like meaning like if [01:30:42] you ask their views on [01:30:44] >> social policy, taxation, Israel, if you [01:30:47] ask their views, what you find is now a [01:30:50] convergence between the left and the [01:30:53] right. If you if you divide it by age [01:30:57] at our age, it's still much more about [01:31:00] >> It's not completely uniform. [01:31:01] >> No, it's not completely uniform. But my [01:31:03] point is it's it's it was simpler in the [01:31:06] past to organize people independent of [01:31:09] age by political orientation. [01:31:12] >> That simplicity is gone. [01:31:13] >> Well, isn't that because of also a [01:31:15] breakdown in trust of all government in [01:31:17] particular, [01:31:19] >> right? So the breakdown in trust which [01:31:21] is also a lot of it is because of our [01:31:23] access to information now. We understand [01:31:25] how corrupt politics are. Yeah. [01:31:26] >> We understand insider trading now in [01:31:28] Congress. We understand how different [01:31:31] people flip-flop on issues. We [01:31:33] understand how the Democrats in 2008 [01:31:36] used to view illegal immigration, which [01:31:39] is essentially MAGA plus. I mean, it's [01:31:41] it's MAGA on steroids versus like what [01:31:45] they the way they look at it today. [01:31:46] Like, why is that? What? Well, because [01:31:48] it's all game. [01:31:49] >> It's all a power, influence, and [01:31:51] attention game. [01:31:52] >> Attention game. [01:31:53] >> Yeah. It's very [ __ ] strange. [01:31:55] >> Yeah. [01:31:56] >> But it's all moving us in a general [01:31:58] direction. And that general direction is [01:32:00] access to innovation. It's all I've said [01:32:03] this a lot of times and if people have [01:32:05] heard it before, I apologize, but if you [01:32:07] looked at the human race from afar, if [01:32:09] you were something else, you'd say, [01:32:10] "Well, what does the species do?" Well, [01:32:12] it makes better things constantly, even [01:32:14] if it doesn't need them. Like, you know, [01:32:16] if you have an iPhone, I you have a 16, [01:32:19] you have a 16, you know, I I have a 17. [01:32:21] I bought it. I haven't [ __ ] turned it [01:32:22] on. [01:32:23] >> I haven't plugged it in. going to [01:32:25] eventually [01:32:26] >> eventually I'll [ __ ] plug it in and [01:32:28] [ __ ] swap everything over and figure [01:32:29] out where my [ __ ] passwords are. But [01:32:31] the reality is you don't need it, but [01:32:34] you want it and it's going to keep [01:32:36] getting better every year. Why? Because [01:32:37] that's where we're obsessed with. [01:32:39] >> This also aligns with materialism. Like [01:32:42] for a finite lifespan, why are people [01:32:45] like including old people so obsessed [01:32:48] with gathering stuff? Well, because that [01:32:51] fuels innovation. Because if there's no [01:32:54] new things coming, there's no motivation [01:32:57] to get the newest, latest, greatest [01:32:59] thing. And ultimately what that leads to [01:33:01] is greater technology which ultimately [01:33:03] leads to artificial intelligence. [01:33:06] >> My slight deviation from that is I think [01:33:07] sometimes people accumulate things [01:33:09] because it's a status game and that's [01:33:12] because they get more attention. You [01:33:14] have a Ferrari, you get attention, [01:33:16] >> right? But what does that do? It makes [01:33:17] Ferrari make better Ferraris and all [01:33:21] technology moves in the same general [01:33:23] direction. No one company [01:33:24] >> That's true. That's true. [01:33:25] >> No one company says this is it. This is [01:33:28] what we make. It's perfect. Do [01:33:29] >> you think people innately feel that by [01:33:31] being a part of this kind of like [01:33:33] consumerist capitalist system, they're [01:33:36] contributing to progress? [01:33:37] >> I don't think they innately feel it, but [01:33:39] I think that's ultimately the results. [01:33:41] >> That's ultimately the result. And it [01:33:43] seems to be universal and it seems to be [01:33:45] constantly moving in this one general [01:33:47] direction which is better and better [01:33:50] technology. [01:33:51] >> But like the stage fright example, you [01:33:52] don't think it's encoded in our DNA this [01:33:54] idea of like wow when I am a part of [01:33:57] this in some way shape or form just [01:33:58] things seem to get better and I want to [01:34:00] be a part of that like do you think that [01:34:02] that's possible that that's encoded in [01:34:03] us? [01:34:05] I think it motivates us to the ultimate [01:34:08] goal. And that ultimate goal, I think, [01:34:10] is that human beings constantly make [01:34:12] better stuff. Whatever it is, better [01:34:14] buildings, better planes, better cars, [01:34:16] better phones, better TVs, better [01:34:19] computers, better everything, artificial [01:34:22] life. [01:34:22] >> That might be the whole reason why we're [01:34:25] here. And the way I've always described [01:34:27] it is that we are [01:34:29] >> we are a biological caterpillar that's [01:34:33] making a digital cocoon. And we don't [01:34:35] even know why we're be going to become a [01:34:37] butterfly, but we're doing it. We're [01:34:39] doing it and we're moving towards it. [01:34:40] And it might be what happens to all life [01:34:43] all throughout the universe. And it [01:34:44] might be why these so-called aliens or [01:34:47] whatever the [ __ ] they are, it might be [01:34:49] us in the future. It might be other [01:34:51] versions of human beings that have gone [01:34:53] past whatever this period of development [01:34:57] that we're currently involved in right [01:34:58] now. This is just might be what happens. [01:35:01] This is what life always does. It might [01:35:03] realize that biological life, which is [01:35:06] very territorial and primal and sexual [01:35:09] and greedy and it has all these problems [01:35:12] with human reward systems ultimately [01:35:16] develops into this other thing, [01:35:18] >> right? [01:35:18] >> And then that's what we're doing. And [01:35:20] then we're in the process of that right [01:35:21] now. And I I think that when if if and [01:35:23] when not if but when when we colonize [01:35:26] Mars, I think that that that new world [01:35:28] order actually has the best chance to [01:35:30] take shape because it'll [01:35:31] >> you know, there's a lot of people that [01:35:32] think that Mars was already colonized at [01:35:34] one point in time. [01:35:35] >> That life already existed. What [01:35:37] >> that life already existed on Mars like [01:35:39] many millions of years ago and that [01:35:41] there's evidence of structures on Mars? [01:35:43] That's really weird stuff. Have you ever [01:35:46] seen the the square that they found on [01:35:48] Mars? [01:35:49] Okay, show them to them, Jamie. [01:35:51] >> One of the things that they're finding [01:35:52] with scans of Mars is like geometric [01:35:55] patterns and structures and right angles [01:35:57] that shouldn't exist. Like weird stuff [01:35:59] >> that couldn't be naturally. [01:36:00] >> No. No. Way weirder. Way weirder than [01:36:03] like the face on Sidonia. The Sidonia [01:36:06] thing is interesting. [01:36:06] >> Yeah. [01:36:07] >> Um and then this one. Look at that. [01:36:10] >> What the [ __ ] is that? [01:36:11] >> It looks like a home of some kind or [01:36:12] something. [01:36:13] >> Some enormous structure. And the size of [01:36:15] that, they don't know exactly, but it [01:36:18] may be as large as several kilometers or [01:36:22] as small as several hundred meters, but [01:36:25] they're not exactly sure. But what they [01:36:26] are sure is that it has very weird right [01:36:28] angles and right angles that seem to be [01:36:31] uniform in size. [01:36:34] >> That's crazy. [01:36:35] >> Like see how it's highlighted in the [01:36:36] enhanced photograph in the upper left? [01:36:38] Like what is that? [01:36:41] >> But sorry, did they and were they able [01:36:42] to send like the rover over there? I [01:36:44] don't know. It's too far away. [01:36:45] >> I don't think it's in the exact place [01:36:46] where the rover is at, but they're able [01:36:49] to get image of these things. And [01:36:51] there's several of these things. [01:36:52] >> That's insane. [01:36:53] >> Yeah, there's a lot of weird stuff. [01:36:55] There's a lot of weird stuff there. So, [01:36:58] there's also like ancient civilizations [01:37:00] that have these myths of us existing [01:37:03] somewhere else and coming here, [01:37:05] >> right? But you have to think if human [01:37:09] beings [01:37:10] develop somewhere else and they they [01:37:13] reach some high level of sophistication [01:37:15] and then they experienced some [01:37:16] cataclysmic disaster that completely [01:37:18] destroyed their environment which is [01:37:20] what Mars is, right? So like let's [01:37:22] assume that Mars was at one point in [01:37:24] time [01:37:26] was habitable and that life existed and [01:37:28] we know it was at one point in time. We [01:37:30] know there was water on Mars. We know [01:37:32] and there's some sort of evidence of at [01:37:34] least some sort of a very primitive [01:37:36] biological life on Mars. [01:37:39] >> If they got to a point where they said, [01:37:40] "Hey, this [ __ ] place is falling [01:37:42] apart, but this Earth spot looks pretty [01:37:44] good." [01:37:46] And they go there, but then cataclysms [01:37:48] happen on Earth and no one remembers cuz [01:37:50] all your information's on hard drives [01:37:52] and then you have to rebuild society. [01:37:54] And so you're re-remembering. And so you [01:37:57] have all these myths of how everything [01:37:59] started, you know, whether it's Adam and [01:38:02] Eve or the great flood or whatever these [01:38:03] things are that we passed down through [01:38:06] oral tradition for hundreds of years and [01:38:07] then eventually write it down and then [01:38:09] people try to decipher what it means and [01:38:11] they sit in church and try to go over [01:38:14] what what did it mean? Like what does [01:38:15] this mean? Like what what is the what is [01:38:17] the re the real origin of all these [01:38:19] stories? We don't know. [01:38:22] >> I mean that's crazy. It's crazy. But if [01:38:24] life it sounds nuts, why would life life [01:38:27] couldn't have possibly existed on Mars? [01:38:29] How the [ __ ] does life exist on Earth? [01:38:31] That how about that? How about why why [01:38:33] would we assume that it wouldn't have [01:38:35] existed at one point in time and [01:38:37] Terrence Howard who is a very [01:38:38] interesting guy. [01:38:39] >> Very interesting. [01:38:40] >> And got some [01:38:40] >> Europe is I mean [01:38:41] >> with Eric Weinstein crazy. [01:38:43] >> Yeah. Crazy. [01:38:44] >> Yeah, that one was crazy. [01:38:45] >> Um and him alone, but he's got some [01:38:47] [ __ ] weird ideas that you just make [01:38:50] you go. He's a very brilliant guy and [01:38:53] you know kind of a strange heterodox [01:38:55] thinker right and one of his ideas is [01:38:59] that [01:39:01] planets get to a certain distance from a [01:39:04] sun and they people and that it gets to [01:39:08] a certain climate and a certain distance [01:39:10] and his his [01:39:13] idea is that I don't know if you realize [01:39:15] that there's a a there's a giant um [01:39:18] ejection of of of some coronal mass [01:39:22] ejection that just happened recently on [01:39:24] the sun and they're very concerned about [01:39:26] it. They don't know what's going to [01:39:27] happen. It happens all the time. A sun [01:39:29] releases these giant chunks of material [01:39:33] >> and he thinks that these materials get [01:39:35] far enough away from the planet and then [01:39:37] they coalesce into planets or far enough [01:39:40] away from the sun. They coales into [01:39:41] planets and as time goes on they get a [01:39:44] further and further distance from the [01:39:46] sun and then obviously they get hit with [01:39:48] asteroids and there's panspermia and [01:39:51] water gets into them from comets and [01:39:54] then they develop oceans and then they [01:39:56] develop biological life and when they [01:39:58] have a certain amount of distance from [01:40:00] the sun they people and he thinks that [01:40:03] as they get further and further and [01:40:05] further away they get less and less [01:40:06] habitable and then they get to a point [01:40:09] where they have their techn technology [01:40:11] to a point where they realize like we [01:40:13] can't sustain life on this planet [01:40:15] anymore. We got to go to that other one. [01:40:18] And so do they go to the one that's [01:40:19] closer to the sun because they're too [01:40:21] far now. [01:40:23] >> It's a nutty idea. [01:40:24] >> Jesus Christ. [01:40:25] >> It's a nutty idea. But if you think [01:40:26] about how recent our sun is in terms of [01:40:31] the solar system itself, in terms of [01:40:32] rather the galaxy itself. So if the [01:40:34] universe, if the big bang is correct and [01:40:37] our universe existed and it was rather [01:40:40] our universe erupted from nothing or [01:40:43] from a very small thing 13.7 billion [01:40:46] years ago. Well, this [ __ ] planet's [01:40:48] only 4 something billion years old, [01:40:50] right? And life is only, you know, a [01:40:53] little bit less than that. Yeah. [01:40:54] >> So you have like a billion years or so [01:40:56] where there's nothing and then you start [01:40:58] getting single- cellled organisms, [01:40:59] multi-elled organisms, and eventually it [01:41:01] peoples. [01:41:02] And when it gets to a certain point when [01:41:04] these people have advanced their [01:41:06] curiosity and their innovation to the [01:41:08] point where they can harness space [01:41:10] travel and they use zero point energy [01:41:13] and they have a bunch of different [01:41:14] things that we haven't invented yet. And [01:41:16] then their environment degrades and it [01:41:18] gets to the point where they realize [01:41:20] like hey we're getting pummeled by [01:41:22] asteroids. We can't sustain life here [01:41:24] anymore. [01:41:24] >> We got to move [01:41:25] >> like Elon wants to go to Mars which [01:41:27] might be the wrong answer. We might want [01:41:29] to go that way. [01:41:30] >> There are closer to the sun. [01:41:32] >> Exactly. I mean, the thing is he's got [01:41:35] everything it he needs now to get there. [01:41:37] Like I [01:41:38] >> I'm not going. Are you going? [01:41:39] >> I would go. [01:41:40] >> [ __ ] that. I'll send you an email. [01:41:43] >> Hold on a second. Think Think about [01:41:44] Think about what he's going to take. [01:41:45] Okay, look. [01:41:47] >> When Let's just say he gets there with [01:41:48] the city. He has [01:41:50] >> he has the way to transport us there, [01:41:54] >> right? [01:41:54] >> Okay. Then when you land, he's got the [01:41:59] way to actually transport us around on [01:42:01] the [01:42:01] >> on the planet, right? He's got Tesla, [01:42:03] >> right? [01:42:04] >> He will have already sent a fleet of his [01:42:06] robots. [01:42:08] Those folks will have made some [01:42:10] inhabitable city [01:42:12] probably using the Boring Company drill [01:42:14] because you you're going to, you know, [01:42:15] be under the regalith that you don't [01:42:17] want to be on the top. Maybe you just [01:42:18] dig a hole and you you inhabit down [01:42:21] there. [01:42:22] >> Um he's got all the ways to make energy. [01:42:24] He has the AI to help you design the [01:42:27] stuff. He has the communication way to [01:42:30] communicate. He's got the internet, his [01:42:32] own internet. So that right [01:42:34] >> so he can get, you know, all of the [01:42:35] information to everybody. And then he's [01:42:38] got money and the super app so that you [01:42:41] can transact. And then I think to myself [01:42:43] like what is he actually missing? And [01:42:45] then what happens if he if he gets there [01:42:47] first? Is he just allowed to just do [01:42:51] whatever he wants? Like is it just kind [01:42:52] of like a free-for-all? like [01:42:54] >> well kind of [01:42:55] >> it's his constitution. Like is that what [01:42:56] happens? [01:42:57] >> Well, it's like earth but shittier. [01:43:00] >> Like we already have all those things [01:43:01] here. Why would you want to go to a [01:43:02] place where you die when you go outside? [01:43:04] >> I think what people will be attracted to [01:43:05] is that if he publishes his version of [01:43:07] what the rules are there, there's a [01:43:09] chance that he could make them really [01:43:11] different to what the rules are here. [01:43:12] >> Like what kind of rules would you do if [01:43:14] you were the king of Mars? Um [01:43:17] so I think that your view is incredibly [01:43:21] to me like um positive some like of [01:43:24] humanity of like we want to make things [01:43:26] better. Mhm. [01:43:27] >> So if I think about that as like a [01:43:28] function, what happens that that's like [01:43:31] so our natural rate of direction is [01:43:32] forward what pushes back on that and a [01:43:35] lot of it what you find is like [01:43:36] government regulation rules all that [01:43:38] stuff [01:43:38] >> greed [01:43:39] >> greed um too much focus on attention [01:43:42] >> right [01:43:43] >> so I would try to experiment with like [01:43:45] what the incentives would have to be so [01:43:47] that you had more unfettered [01:43:49] entrepreneurship like just like do the [01:43:51] thing that you think is right [01:43:52] >> and there's a mechanism where we give [01:43:54] you the ability to then make things for [01:43:57] more people because you're proving that [01:43:59] you're actually really good at making [01:44:00] things. And if you don't need money at [01:44:03] that point in society, reorienting us [01:44:06] away from this kind of like brittle form [01:44:09] of exchange to something more useful, [01:44:11] that's worth experimenting with. I think [01:44:12] that's an important [01:44:14] >> Well, there's also the concept of the [01:44:16] self of the individual which may erode [01:44:18] with technological innovation. So, if we [01:44:22] really can read each other's minds, if [01:44:25] we really do get to a point where we're [01:44:27] communicating through technologically [01:44:30] assisted telepathy, like a lot of the [01:44:34] whole the weirdness of people is I don't [01:44:37] know what you're thinking. I don't know [01:44:38] if I should trust you. You know, this [01:44:41] [ __ ] might be devious. You know, [01:44:43] you know what I mean? Well, we'll know, [01:44:45] right? [01:44:45] >> And there will be no need for all that [01:44:47] if we really are all one. If that's [01:44:50] ultimately something that can be [01:44:51] achieved with technology, [01:44:53] >> like this hive mind. [01:44:54] >> Yes. Like legitimate hive mind. And then [01:44:57] like look where society's going. [01:44:58] Gender's kind of falling apart. People [01:45:00] are getting they're reproducing less, [01:45:03] right? People are are having less [01:45:04] testosterone, more miscarriages, less [01:45:07] fertile. We're kind of moving into this [01:45:09] genderless direction. And I don't know [01:45:13] if it's by design, but microlastics and [01:45:18] phalates and all these different [01:45:20] chemicals that are endocrine disruptors [01:45:22] are all ubiquitous in our society. Well, [01:45:24] does [01:45:25] >> is that a coincidence that that's all [01:45:27] happening at the same time as [01:45:28] technological innovation on mass scale? [01:45:30] Is it? I don't know. Because like what's [01:45:32] the one thing that's holding us back? [01:45:35] Well, that we're territorial primates [01:45:38] with thermonuclear weapons and that we [01:45:41] exist in a sort of tribal mindset, but [01:45:43] yet we do it on a planet of 8 billion [01:45:46] people. [01:45:46] >> Yeah. No, no. The key differentiator of [01:45:48] humans is our our ability to enact [01:45:50] violence. [01:45:51] >> Yeah. [01:45:51] >> To to methodically execute premeditated [01:45:55] violence. [01:45:55] >> Yes. And and greed and attention. And [01:46:00] one of the things that attention is [01:46:02] sexual preference or or sexual rather [01:46:05] attention like the ability to procreate, [01:46:08] the ability to acquire mates, right? [01:46:10] Like the more resources you have, the [01:46:12] more attractive you'll be, especially [01:46:13] for males and males are the ones that [01:46:15] are involved in the violence in the [01:46:17] first place. You know, there's I can't [01:46:19] name a single war that was started by a [01:46:21] woman. [01:46:21] >> How do you how do you teach your kids [01:46:24] that attention is not everything? [01:46:29] That's a good question. Especially in [01:46:31] this society, it's probably harder to do [01:46:33] that now than ever before [01:46:35] >> because the reaction that I suspect most [01:46:36] kids will have is like, [01:46:39] "Stop. Like, leave me alone." Like, it's [01:46:42] just it's almost an impossible thing. [01:46:44] >> Well, I think kids learn more from their [01:46:48] parents' behavior than anything you say [01:46:50] to them. I think they learn from the way [01:46:53] you behave and the way you exist and the [01:46:57] way you exist with them. [01:46:58] >> And if you are constantly whoring [01:47:02] yourself out for attention, [01:47:05] >> it's one thing if you get a lot of [01:47:06] attention from what you do, but if [01:47:08] that's your primary goal, they're going [01:47:10] to know. [01:47:11] >> Do your kids know how famous and [01:47:13] influential you are? Like honest [01:47:14] question. [01:47:15] >> Oh, yeah. They know. [01:47:16] >> But do do they have a real sense of it [01:47:18] or do you just kind of like [01:47:20] >> Yes, they can. I mean, how can you? It's [01:47:22] got to be weird as [ __ ] growing up with [01:47:23] a very famous dad. It's very odd, but [01:47:27] it's not my primary goal. [01:47:29] >> Yeah, this is my point. You're not [01:47:30] You're not putting it in their face. [01:47:31] >> So, to your point, you're not modeling [01:47:33] attention is all you do. [01:47:34] >> No, no. I have interesting conversations [01:47:37] with cool people. I tell jokes and I [01:47:42] call fights. Like, those are the things [01:47:44] that I do. [01:47:45] >> And they also know that I have a very [01:47:47] strong work ethic and that I work [01:47:49] towards things. So they have very strong [01:47:50] work ethics. They're very motivated and [01:47:52] disciplined, like shockingly [01:47:54] disciplined. And I think that's modeled. [01:47:56] I think that that comes from and they [01:47:59] also like really enjoy achieving goals. [01:48:01] And they're they're rewarded for it with [01:48:04] praise and with admiration, but not [01:48:09] never with like you're better than other [01:48:11] people. Yeah. Ne never like it's the the [01:48:13] idea is like all human beings are [01:48:16] capable of greatness. So, it's like find [01:48:18] the thing that you excel at and if you [01:48:21] throw yourself into that, it's very [01:48:23] rewarding. [01:48:24] >> I really I really believe in this. I [01:48:26] tell this story when I interview people. [01:48:28] When I interview people, I'm always [01:48:29] like, you know, just at whatever [01:48:30] company, I'm always like I first only [01:48:32] want to know about them. I'm like, [ __ ] [01:48:34] your resume. Like, tell me about your [01:48:36] parents and how you grew up. I just want [01:48:38] to know that. Stop at 18. Everything [01:48:40] before 18. Just tell me every little [01:48:43] detail, [01:48:43] >> right? [01:48:44] >> You know, and some people tell me these [01:48:45] incredible stories. They'll be like, you [01:48:47] know, my mom was an alcoholic or this or [01:48:49] that. And I'm just like, man, this is so [01:48:52] valuable because it allows me to [01:48:54] understand who they are. The second part [01:48:56] of the interview, [01:48:57] >> we do the business [ __ ] But the third [01:48:59] part, I tell this story. This is a crazy [01:49:01] story about what you're just saying. [01:49:03] >> They ran this experiment at Stanford [01:49:05] where they take like a big bowl, fill it [01:49:08] with water, and they drop in a mouse and [01:49:11] they measure how long it takes for the [01:49:12] mouse to drown. They do it like a [01:49:15] hundred times. The average was about [01:49:16] four minutes, call it four, four and a [01:49:18] half minutes. [01:49:20] Then they run the experiment again, 100 [01:49:22] mice, and at minute three or three and a [01:49:25] half, they take it out, they dry it off, [01:49:27] they play it music, and they whisper [01:49:29] like sweet nothings into the mouse's [01:49:31] ear. They drop the mouse back in the [01:49:32] water, and that mouse treads water for [01:49:37] 60 hours the next hundred mice on [01:49:39] average. And the upper bound was 80. And [01:49:43] I thought to myself like that is all [01:49:46] just potential right there. Like that's [01:49:48] all like there's all this latent [01:49:50] potential. So if an animal has it, I'm [01:49:51] going to assume that humans have it too, [01:49:53] >> right? [01:49:54] >> But you never get a chance to unlock it. [01:49:56] Like the average person is just kind of [01:49:58] like living a life where they're maybe [01:50:00] scratching five or 10% of their [01:50:01] potential. And the question is, how do [01:50:03] you get to that other 90%. Like how does [01:50:04] the second batch of mice How do the [01:50:06] second batch of mice tread water for 60 [01:50:08] hours? [01:50:09] >> Well, the doesn't make any sense to me. [01:50:10] all the same mice, right? I think the I [01:50:14] think the mice get rescued, [01:50:16] >> they get rescued [01:50:17] >> and then when they try it again those [01:50:20] same mice last longer, right? So it's [01:50:22] the same mice. So it's an experience. [01:50:25] >> So they have experience now. They [01:50:27] understand that they can tread water [01:50:29] where they didn't die. So they [01:50:31] understand that they can survive where [01:50:33] they didn't know that they could survive [01:50:34] the first time they were thrown into the [01:50:36] water because they had never been thrown [01:50:37] into water before. That's the same thing [01:50:39] that happens to people when they fight. [01:50:41] Like the first time people ever have a [01:50:43] competition, they [ __ ] panic and they [01:50:45] they get really scared and they get [01:50:47] really like filled with anxiety. But [01:50:50] after a while, you get relaxed and [01:50:52] that's when you get really dangerous [01:50:54] because then you get calm and you could [01:50:56] keep your [ __ ] together while you're in [01:50:58] the middle of all this chaos because you [01:51:00] have the experience of it. Without the [01:51:02] experience of it, very few people do [01:51:03] well the first time, right? Unless [01:51:05] you're exceptionally talented and and [01:51:07] you have other competition experience, [01:51:10] like you've competed in other things, [01:51:11] like maybe you played football or some [01:51:13] other things and you know what it's like [01:51:14] to actually perform under pressure. [01:51:16] >> What is the what is the version of [01:51:18] giving more humans a chance to get to [01:51:20] that? [01:51:22] Well, I think sports are really good for [01:51:25] that because performing under people [01:51:27] paying attention to you and performing [01:51:29] where people are trying to stop you from [01:51:31] doing something and you're trying to do [01:51:32] something and there's all these unknowns [01:51:35] and recognizing that hard work allows [01:51:38] you to do whatever you're trying to do [01:51:40] better than you previously had. One of [01:51:43] the things my martial arts instructor [01:51:44] said to me when I was young is that [01:51:46] martial arts are a vehicle for [01:51:48] developing your human potential. And [01:51:50] that through this very difficult thing [01:51:52] that you're trying to do, [01:51:54] >> you're learning that, oh, if I just [01:51:57] think smart and think hard and train [01:52:01] wise and train hard and discipline [01:52:04] myself to endure suffering so that I can [01:52:07] develop more endurance and more speed [01:52:09] and more power and more technique [01:52:11] because I accumulate all this [01:52:12] information and I really think about [01:52:13] what it is and apply it with drills and [01:52:16] with training, I can get better at this [01:52:18] thing. And every time I get better at [01:52:19] this thing, I get rewarded psychically, [01:52:21] like mentally, you feel better. Like I [01:52:23] know that I'm better now. And then [01:52:24] there's the belt system, right? You [01:52:26] start off, you're a white belt. And in [01:52:28] taekwondo, you get a blue belt and then [01:52:30] after you get a blue belt, you get a [01:52:31] green belt. And then if you get a I [01:52:33] think it's green belt first and then I [01:52:34] forget how it goes. And then it's red [01:52:36] belt and black belt. And like when [01:52:37] you're a black belt, like holy [ __ ] So [01:52:39] it's this thing where you've developed [01:52:41] to a point where you've gotten to this [01:52:43] next stage. So all along the way you've [01:52:45] been rewarded for your hard work and [01:52:48] then you realize like oh I could do this [01:52:50] with everything in life. [01:52:51] >> Is a reward different than attention? It [01:52:53] is. [01:52:53] >> It is because it's internal right. [01:52:56] You're you're all you're realizing that [01:52:59] you could apply this to you know [01:53:01] whatever it is to carpentry to music. [01:53:04] You you just it's just a matter of focus [01:53:07] and attention. And some people [01:53:08] unfortunately never find a vehicle. They [01:53:11] never find a thing that they can throw [01:53:13] themselves into. They realize like, and [01:53:15] this is not unique. It's not like I'm an [01:53:19] unusual person or anybody is. I mean, [01:53:22] there's people that have unusual [01:53:24] physical gifts and some people have [01:53:26] unusual mental gifts. But the reality [01:53:28] is, no matter where you start, everyone [01:53:31] can get better. And when you do [01:53:33] something, whether it's learning to play [01:53:34] guitar, as you get better at it, you [01:53:36] realize like, oh, this is what it's all [01:53:39] about. Yeah. Like it's really all about [01:53:41] applying yourself to something and then [01:53:43] feeling this immense satisfaction of [01:53:46] your hard work paying off and that [01:53:48] motivates you to work hard at other [01:53:50] things. And if you don't find that early [01:53:52] on, it's very difficult to like find [01:53:55] like real satisfaction. [01:53:57] >> Yeah. [01:53:57] >> In life. [01:53:58] >> Yeah. I've always had something outside [01:54:00] of my [01:54:02] daily life [01:54:04] that is the thing that I actually care [01:54:06] about and it actually energizes me for [01:54:08] my day-to-day life. I don't know if [01:54:10] that's like a lot of people but [01:54:11] >> like what do you do? What's your take? [01:54:12] >> Like well initially it was poker and I [01:54:16] and even now I obsess about the game um [01:54:19] because it's infinitely more complex [01:54:21] than chess. Like chess you can get to a [01:54:23] place where you can roughly be good. [01:54:25] Poker is just constantly there's just [01:54:28] too many variables. There's human [01:54:30] emotion, there's human psychology, [01:54:32] the number of people, all of this stuff [01:54:34] just makes the complexity of the game [01:54:37] something that I find magical. [01:54:39] >> And so I sit there and I try to [01:54:41] understand like why am I doing the [01:54:42] things that I'm doing? And so much of it [01:54:44] comes back to being a mirror about [01:54:46] what's happening in my daily life. It's [01:54:48] the [ __ ] craziest thing. Like I'm [01:54:50] super insecure. I'll go into poker and I [01:54:52] will just lose for weeks at a time. But [01:54:54] it's because I'm insecure in my daily [01:54:56] life. And what's happening is that I'm [01:54:58] trying to find these quick wins and [01:55:00] quick solutions [01:55:01] >> because I'm in a state of insecurity. [01:55:03] I'm anxious. I have this anxiety. And so [01:55:06] it's become a great mirror for me. So [01:55:08] that used to be a thing. It still is a [01:55:10] thing. And but I've become reasonably [01:55:13] skilled at it where the edges are [01:55:16] smaller and I put myself in positions [01:55:18] where I'm only playing against a certain [01:55:20] group of people and I'm the losing [01:55:22] player frankly in that game. if when I'm [01:55:24] playing against like the top pros, [01:55:27] it just doesn't it helps me and I can [01:55:29] get tuned up for it. But then I started [01:55:32] to, you know, I would take different [01:55:33] things. I tried to learn how to ski. [01:55:35] Basically impossible when you're older. [01:55:37] I look like a [ __ ] idiot. Like [01:55:38] >> how old were you when you tried? [01:55:40] >> Uh I started when I was like, you know, [01:55:42] I I was a good snowboarder, so I was [01:55:43] snowboarding my whole life. And then my [01:55:45] kids skied and so I'm like, okay, well, [01:55:47] I want to do this as a family. So I was [01:55:49] like 42 or something when I tried. I'm [01:55:51] 49 now, almost 50. [01:55:54] It's brutal. I mean, it's like I look [01:55:55] like a [ __ ] idiot. Like this gangly [01:55:58] giraffe like trying to get down the [01:55:59] mountain. And then now I start at golf [01:56:02] and man, I got to tell you, I used to [01:56:06] play a little bit, then I stopped. But [01:56:08] there's something to me about being [01:56:09] outside [01:56:11] where just like being in nature I find [01:56:14] like really motivating. [01:56:16] >> It's a vitamin. [01:56:17] >> It's a vitamin. And then just the mind [01:56:19] body connection of that game, it just [01:56:21] really [ __ ] with you because it's it's [01:56:23] just nothing you can master and [01:56:25] overpower, [01:56:26] >> right? [01:56:27] >> And it teaches you to just like be in [01:56:29] it. [01:56:30] >> Yeah. [01:56:30] >> And that's a very hard skill. Like if [01:56:34] you look at the best like I there's like [01:56:36] a handful of people that I really look [01:56:37] up to and I obsess like Mer Buffett, but [01:56:40] the Berkshire meeting was this past [01:56:42] weekend and if you look at the clips, [01:56:43] there's this incredible thing where they [01:56:46] transitioned, right? Mer passed away. [01:56:48] Buffett's like now executive chairman. [01:56:50] But this guy Greg Ael and this guy Ajet [01:56:52] Jane. Ajet Jane does this thing where [01:56:53] he's like, I teach the people that come [01:56:55] to just say no. Your whole job is to [01:56:58] just say no. You're going to get [01:56:59] bombarded with all kinds of business [01:57:00] pitches. Say no, no, no. And eventually [01:57:03] somebody will come and [ __ ] try to [01:57:04] whack you in the head with a 2x4 of [01:57:06] money. Then you come to me and we'll do [01:57:08] the deal. And it made such an impression [01:57:11] because like again when I'm insecure, [01:57:15] my reward function is attention. So I'm [01:57:17] like a [ __ ] little busy body. I'm [01:57:18] running around doing all this little [01:57:20] [ __ ] you know? E. And then man, [01:57:23] when I'm in a [ __ ] flow state and [01:57:25] like I'm tunning it, like I'm striping [01:57:27] the ball, you know? I'm like a few [01:57:29] things that really matter in size and [01:57:31] I'm like, man, this is this is right. [01:57:35] It's all come to me because I'm like I'm [01:57:38] like within myself and these other [01:57:41] things are a better reflection of when [01:57:43] I'm within myself and these other things [01:57:45] are a mirror of when I'm totally out of [01:57:47] kilter. [01:57:48] >> That's just me. [01:57:50] >> So in my life these things tend to lead. [01:57:53] >> Um [01:57:54] >> I think you're saying that's just you, [01:57:55] but I think that's generally most [01:57:57] people. I think you find these things, [01:58:01] these vehicles for developing human [01:58:04] potential, whether it's martial arts or [01:58:05] golf or playing guitar or playing chess [01:58:08] or poker. [01:58:09] >> And then you have to have, I think, one, [01:58:11] at least for me, one seinal relationship [01:58:14] in your life. You have to have one [01:58:16] person that has just undying belief in [01:58:18] you. And I never really had that until I [01:58:21] met my wife. And that was a very, and I [01:58:23] didn't I pushed against it so [ __ ] [01:58:25] hard because I was like, it just can't [01:58:27] be true. like why does this person give [01:58:29] a [ __ ] Do you know what I mean? Like [01:58:31] why do they care about me more than I [01:58:32] do? [01:58:32] >> Well, there's also the fear because so [01:58:34] many people get in those bad [01:58:35] relationships. [01:58:36] >> And I'm just like I I think there's a [01:58:38] part of you like me where you're just [01:58:40] like I'm not a very lovable person. Like [01:58:42] I'm just like this is that's not who I [01:58:44] am. And this woman is just there. [01:58:48] So that's been like the thing like for [01:58:50] me it's like and because she's brutal. [01:58:52] She'll be like, "Oh yeah, that was [01:58:53] [ __ ] horrible." You know, like [01:58:55] yesterday I like we had this we I I I [01:58:57] did this thing at at Milk and it was a [01:58:59] dinner at my friend's house and uh then [01:59:02] you know, we're both going to different [01:59:03] airports. I'm flying here to see you and [01:59:05] she's flying home and uh she calls me [01:59:08] and I'm like I'm how did I do? Ah [ __ ] [01:59:16] But but no, there's the parts that I did [01:59:18] well and then she critiques the other [01:59:19] parts that she didn't like. And then I [01:59:21] say which is like it's and it's so again [01:59:24] I'm insecure so I'm like I want the [01:59:25] self- serving well how would because [01:59:27] there was three of us on this panel and [01:59:29] she's like uh and I was like you know I [01:59:31] was the best right? She's like no Gavin [01:59:34] was better. I'm just like it's so but [01:59:37] it's so refreshing because it keep again [01:59:39] it's like a [01:59:40] >> keeps you in check [01:59:41] >> like and it gives me a mirror. [01:59:43] >> Mhm. [01:59:43] >> You know like when I was coming to see [01:59:46] you yesterday when we were flying down [01:59:48] to LA for this thing um there's parts of [01:59:52] me where when I'm insecure I kind of [01:59:54] like externalize and I can be like [01:59:57] really hyperbolic unnecessarily [01:59:58] hyperbolic and it's counterproductive. [02:00:01] And she said to me listen like just [02:00:02] imagine your friends these are [02:00:04] hardworking people. They're trying their [02:00:05] best as well. They don't necessarily [02:00:07] know. Some some things have massively [02:00:08] worked out for them, but they would want [02:00:10] to do the right thing. There's people [02:00:12] you've worked with before that want to [02:00:13] do the right thing. And she's like, just [02:00:15] picture them and don't judge. You can [02:00:16] observe. [02:00:19] And it's crazy, but it's like I need [02:00:20] those little things. There's like [02:00:22] tweaks. It's like having a coach kind of [02:00:23] like [02:00:24] >> and that and that's very that's very [02:00:26] helpful to me. [02:00:26] >> Yeah. It's very important. It's hard to [02:00:28] do that yourself. [02:00:29] >> I can't do it. [02:00:30] >> And it's also like [02:00:31] >> I'm [ __ ] maxing. Like my life is like [02:00:33] I like that flow. Mhm. [02:00:34] >> And if if I didn't have somebody who [02:00:36] loved me and would hold me accountable, [02:00:38] I' just [ __ ] not think about it. [02:00:40] >> Yeah. And the opposite of that is [02:00:42] someone who's like an antagonistic [02:00:44] relationship. And we know a lot of [02:00:46] people that have those kind of very [02:00:48] sabotagey sort of marriages and [02:00:50] relationships. And that's crazy. [02:00:52] >> It's brutal. [02:00:53] >> It's brutal. And I don't think they've [02:00:55] ever had a really good one. Otherwise, [02:00:56] they would never tolerate that. [02:00:59] >> I didn't know what good looked like. So [02:01:01] you kind of just I think a lot of people [02:01:03] go with the flow. Like I mean I was a [02:01:06] nerdy kid from kind of a shitty [ __ ] [02:01:09] up kind of like family structure [02:01:13] and then I got injected into this rich [02:01:15] high school but then I got to go back to [02:01:18] an alcoholic father. I'm on [ __ ] [02:01:19] welfare. Like it's like you know my my [02:01:22] self-confidence is negative [ __ ] two [02:01:24] units. [02:01:25] >> Didn't have a girlfriend. You know like [02:01:26] all the [ __ ] in high school like nothing [02:01:28] happened for me. And so my modeling of [02:01:31] like how to be in a relationship, what [02:01:32] to do, it was [ __ ] zero. [02:01:36] Um, it was zero. And so all those [02:01:39] mistakes were mostly because I didn't [02:01:41] understand what good looked like, [02:01:43] >> right? [02:01:44] >> Um, and then I stumbled into this [02:01:45] relationship after my divorce and my [02:01:47] ex-wife is an incredible woman. Just [02:01:49] like not, [02:01:50] >> you know, [02:01:50] >> what you needed or what she needed. [02:01:52] >> Yeah. She's we're just we were in in in [02:01:54] a few very specific ways, [02:01:57] >> we just weren't on the same page. [02:01:59] And then I find this other one and it's [02:02:03] and I think like I don't I was so [02:02:05] skeptical. I'm like I I kind of viewed [02:02:08] like a relationship as like this adjunct [02:02:10] to your life. There's you you're at the [02:02:13] center. M [02:02:14] >> you're doing your [ __ ] and one of the [02:02:16] appendages to your thing is you're [02:02:19] that's what I thought and then now it's [02:02:21] the opposite where I feel like my wife's [02:02:24] at the center and I'm like I would [02:02:26] always kind of like like almost like [02:02:28] laugh at people in the in my mind I'm [02:02:30] like it's not possible that somebody [02:02:31] feels this way about somebody else. [02:02:34] >> Um but it's an it's an it's a huge [02:02:36] enabler. It's a it's a very much a gift. [02:02:39] So that can also be a thing that people [02:02:40] look for. You know what I mean? which is [02:02:42] >> I think what you're saying is that [02:02:43] there's a bunch of different things that [02:02:46] have to sort of exist together and that [02:02:49] it's not just completely focus on your [02:02:51] work but that focusing on these other [02:02:54] things enhances the work and then the [02:02:57] work enhances all these other things as [02:02:58] well and they all exist together [02:03:00] >> and my best work is when I'm not [02:03:02] thinking about the attention or the [02:03:05] money. Those are the two most corrupting [02:03:07] influences in my life. If I if when I [02:03:09] look back and when I've lost when I've [02:03:11] lost the most amount of money [02:03:13] >> or when I've reputationally hurt myself [02:03:15] the most, [02:03:16] >> it's all been because of attention and [02:03:18] money. [02:03:19] >> Those are the only two things the root [02:03:21] cause consistently has been that. [02:03:23] >> That makes sense because you're thinking [02:03:24] about a result rather than the process. [02:03:27] >> Exactly. [02:03:27] >> Yeah. [02:03:28] >> Exactly. [02:03:29] >> So, and then thinking about that result [02:03:30] like oo I'm going to get a lot of [02:03:31] attention from this. Oo, I'm going to [02:03:33] get a lot of money from this. That [02:03:34] actually robs you of the focus that you [02:03:36] need to concentrate on the process. [02:03:38] >> Exactly. And the the thing about the [02:03:40] process is that so much of that when [02:03:44] you're in a flow state you're proud of [02:03:47] >> irrespective of the size of it because [02:03:49] the meetings are the same. Do you know [02:03:51] what I mean? Like [02:03:52] >> you're in the same [ __ ] 35 minute [02:03:54] meeting or 45 minute meeting debating a [02:03:56] product or debating a thing. [02:03:59] But the minute that I start to feel like [02:04:00] embarrassed about company A versus [02:04:02] company B or decision A versus decision [02:04:04] B, [02:04:05] >> now my mind is like, "Okay, hold on a [02:04:07] second here. I'm about to run myself off [02:04:09] the cliff." [02:04:09] >> Yeah. [02:04:10] >> You know, or you know, we I had this [02:04:12] dinner last week and this is what's [02:04:13] amazing. Like [02:04:16] we're talking about poker. Well, I So [02:04:18] I'm having dinner with my wife and a [02:04:20] friend and uh [02:04:23] she's like, "How are you doing?" Just [02:04:25] like a very generic nice question, [02:04:27] >> right? And I go into this long [ __ ] [02:04:29] diet tribe of like, well, you know, the [02:04:31] investing thing this, and then I started [02:04:33] this other thing that. And my wife's [02:04:34] looking at me like, what the [ __ ] are [02:04:35] you rambling on about? And then it got [02:04:37] But it got worse, Joe. It got worse. It [02:04:39] got even [ __ ] worse. Then I'm like, [02:04:41] uh, you know, but then I had this poker [02:04:43] game. I started rambling on. It's [02:04:45] normally on Thursdays, but then I I [02:04:46] moved it up to Wednesdays, but then I [02:04:48] moved it up to the city because my [02:04:49] friend's having it. And then I name [02:04:50] dropped who the guy was. And my wife [02:04:53] just looks at me like, "What the [ __ ] is [02:04:55] going on with you? [02:04:57] So, the dinner ends. It was And then [02:05:00] she's like, "What the [ __ ] is going on [02:05:01] with you?" She's like, "That was [02:05:03] insane." And I had no idea that I was [02:05:07] doing it. [02:05:09] And I'm like, "Okay, we need to put [02:05:10] Humpty Dumpty back together again [02:05:11] because I'm about to go on Rogan and I [02:05:13] can't go off [ __ ] like a crazy wild [02:05:14] man." Uh, but it's a it's it's an [02:05:17] enormous gift. That's been my biggest [02:05:19] unlock in these last like eight or nine [02:05:21] years. Like I I feel like like I'm kind [02:05:23] of like adding skills to my toolkit. I [02:05:25] feel like a golfer like that's like I [02:05:27] can shape shots a little bit now. I know [02:05:29] how to use different clubs. [02:05:31] >> Um and it's all like mindset [02:05:34] >> and it's like it's very much what you [02:05:36] it's like this processoriented approach [02:05:38] >> and you just can't control the outcome [02:05:41] and that's like a it's a magical [02:05:44] feeling. It's interesting that you're [02:05:46] saying this because like think about [02:05:48] what most people or people that are on [02:05:52] social media like the kind of attention [02:05:55] that they're focusing on. Like this is [02:05:58] why virtue signaling is so unsuccessful, [02:06:01] right? It's so bad for it because it's [02:06:02] fake. You're really concentrating on the [02:06:04] process. Are you really concentrating on [02:06:05] the result? The result is getting people [02:06:07] to love you. Exactly. Getting people to [02:06:08] agree with you. Getting and then [02:06:09] worrying about the criticism. Oh my god, [02:06:11] they hate me. Oh my god, they're mad at [02:06:12] my my statement. Oh my god, they're this [02:06:14] and then you're like obsessing on it all [02:06:16] day. People that aren't even anywhere [02:06:17] near you. It's like it's one of the [02:06:19] absolute worst things for mental health [02:06:21] is this addiction that people have to [02:06:24] posting things and then reading the [02:06:25] responses to those posts and getting [02:06:28] wrapped up in these very weird [02:06:30] two-dimensional interactions with human [02:06:33] beings. [02:06:33] >> You never read your comments. I mean, [02:06:34] you're very famous. You're like, it [02:06:36] doesn't [ __ ] matter to me. Well, [02:06:38] you're going to get to a certain point [02:06:39] in time where if you have x amount of [02:06:42] people that follow you, you're going to [02:06:44] have a percentage that are mad at you [02:06:47] and those are the ones you're going to [02:06:48] think about, right? [02:06:49] >> And if you don't self audit, maybe [02:06:51] that's good. Maybe it's good to say [02:06:53] like, "You [ __ ] piece of shit." Like, [02:06:54] "Oh, I'm sorry." You know, like what [02:06:56] your wife saying to you like, "What the [02:06:57] [ __ ] was that?" Like, "Ah, shit." Like, [02:07:00] I am very self-critical. Very like [02:07:03] horribly so. like to the point I torture [02:07:05] myself, you know? So, I'm like, I don't [02:07:07] need that from other people. And also, [02:07:09] those people don't love me and they want [02:07:11] me to fail. Like, there's a lot of [02:07:12] people that their lives are very [02:07:14] unsuccessful and I've been way too [02:07:16] fortunate, right? So, it's like there's [02:07:18] a reason to be upset at me if your life [02:07:20] is [ __ ] because I've I've gotten I've [02:07:22] three of the best jobs on earth. It [02:07:24] doesn't make any sense, right? So, [02:07:25] there's a re and also why the [ __ ] is [02:07:27] this podcast so successful? That doesn't [02:07:28] make any sense, right? So, it's like I [02:07:31] get it. I understand why people, but I'm [02:07:33] not gonna help them. I'm not gonna help [02:07:35] them bring me down. I'm not gonna [02:07:37] indulge in it and ruin my own mind by [02:07:39] wallowing in their [ __ ] because the [02:07:41] only reason why you would do that in the [02:07:42] first place is if you're not together. [02:07:44] No one is healthy and happy and [02:07:46] intelligent is going to post mean things [02:07:49] about you. So, you are reading things [02:07:51] from people that are mentally ill, [02:07:53] unhappy, and probably not. Maybe they're [02:07:56] intelligent in terms of their ability to [02:07:58] solve certain issues and problems. Maybe [02:08:00] they're good at certain skills, but like [02:08:02] their overall grasp of humanity and like [02:08:05] being a good person is not good if [02:08:08] you're [ __ ] on people, especially if [02:08:09] you like add homonym attacks and just [02:08:12] insults. And [02:08:13] >> so, it's not a good thing to ingest. [02:08:16] It's not It's like if you go down the [02:08:17] supermarket, you see Twinkies, oh, [02:08:18] they're right there. Don't [ __ ] eat [02:08:20] them, okay? That's not good for you. And [02:08:23] so it's like I don't think that at a [02:08:25] certain point in time, especially if you [02:08:27] become publicly known and famous, you [02:08:29] should ever read your comments. I don't [02:08:30] think it's good for you. [02:08:31] >> Yeah. [02:08:32] >> But you better be self- auditing or [02:08:35] you'll start sniffing your own farts and [02:08:36] think they smell great. Like don't do [02:08:38] that either. [02:08:39] >> Yeah. [02:08:39] >> But you I know a lot of people that have [02:08:43] gone crazy reading their own comments. [02:08:45] I've met comedians that like they'll [02:08:47] think about it all day long. It will [02:08:49] [ __ ] with them. It will tort. Well, [02:08:51] their neurosis are what's what creates [02:08:53] great comedy to begin with. So, if you [02:08:54] feed that neurosis in the wrong way, [02:08:56] you're [ __ ] [02:08:56] >> The wrong way, right? And then also the [02:08:58] self-doubt creeps in because all these [02:09:00] people telling you you suck and they're [02:09:01] like, "Oh my god, I suck." And then you [02:09:03] go on stage with this like, "People [02:09:04] think I suck, they hate me." You can't [02:09:06] do that. Like, if you [02:09:09] have a certain amount of energy in the [02:09:12] day, this is what I always tell [02:09:13] comedians. I said, "Look, think of your [02:09:16] attention and your focus as a a unit. [02:09:18] You have a hundred units. If you spend [02:09:21] 30 of those [ __ ] units on [ __ ] [02:09:23] online, you're robbing 30 units from all [02:09:26] the things you love. 30 units from your [02:09:28] family, 30 units from your friends, 30 [02:09:30] units from your job, 30 units from golf [02:09:33] or poker or whatever it is that you love [02:09:35] to do. You're stealing your own time and [02:09:38] your own focus [02:09:39] >> for loers, right? [02:09:41] >> Like why would you do that? And those [02:09:43] losers are good people. They're just [02:09:44] they most people are good people. They [02:09:46] just a bad path. I would have been the [02:09:48] personing. Yeah. Yeah. Look, if you gave [02:09:51] me a [ __ ] Twitter account when I was [02:09:52] 16, oh my god, it would have been [02:09:55] horrendous. [02:09:55] >> Yeah. I would have been going crazy. [02:09:56] >> Oh my god. I would have been a terrible [02:09:58] person. It's normal. Especially if your [02:10:00] life sucks and you're not doing well and [02:10:03] you're attacking famous people or you're [02:10:04] attacking this person that's doing [02:10:06] better than you or whatever it is. [02:10:07] >> Like it's Do you uh have you seen the [02:10:09] clips of the [ __ ] maxing? [02:10:12] >> No. [02:10:12] >> You don't know what this is? [02:10:13] >> No. [02:10:13] >> You don't know what this is? [02:10:14] >> No. What's [ __ ] maxing? [02:10:16] >> Oh, this guy is fantastic. He sits He [02:10:19] sits on his back porch. Jamie, can [02:10:20] Jamie, can you just show He sits He sits [02:10:23] on his back porch smoking a cigar [02:10:27] basically telling you everything's kind [02:10:29] of [ __ ] Stop thinking about [ __ ] [02:10:31] You know, if you don't like your [02:10:33] friends, leave them. If you don't like [02:10:34] your girlfriend, leave them. Stop [02:10:36] overthinking. Simplify your life. You [02:10:38] know, it's it's inc it's so simple, but [02:10:42] I think it's incredibly [02:10:43] >> Who is this guy? [02:10:44] >> Elisha Long, I think, is his name. I [02:10:46] don't know Jamie if you can find it. I [02:10:47] think Elisha [02:10:48] >> [ __ ] maxing is funny because I know [02:10:49] about looks maxing. We talked about that [02:10:52] recently on a podcast but that's [02:10:53] recently entered into my mind into my [02:10:56] zeitgeist. looks max. [02:10:58] >> That's the clvicular, but I've only [02:11:00] found out about that within the last few [02:11:01] months of life because I I genuinely [02:11:04] stay off social media as much as [02:11:05] possible. And if I do read things, what [02:11:08] I like to do, I like to focus on [02:11:10] fascinating things. Like a lot of my [02:11:12] time I spend looking at YouTube stuff [02:11:14] because YouTube stuff, my algorithm is [02:11:16] all like new black holes they've [02:11:18] discovered, you know, new discoveries in [02:11:21] terms of like what is the fabric of [02:11:23] reality? Like I'm that's interesting to [02:11:26] me. And if I just concentrate on people [02:11:28] being mean or shitty to each other or [02:11:30] the latest [ __ ] political drama, it's [02:11:32] like [02:11:34] >> what? I don't have much time. I'm busy. [02:11:37] I like things. [02:11:38] >> And [02:11:39] >> are you are you on like Instagram and [02:11:41] Tik Tok? [02:11:41] >> I'm on Instagram. I do not have a Tik [02:11:43] Tok. This is Luke Maxing. No, this is [02:11:46] [ __ ] maxing. So, let me hear what he [02:11:48] says. [02:11:49] >> Who's this guy? What's his name? [02:11:50] >> Elisha Long. [02:11:51] >> Shout out to Elisha. being used as a [02:11:54] poisoning of nostalgia, [02:11:56] but to simply remind you of what you [02:11:59] found it for. And as we grow up, we [02:12:03] often give that up for security. We give [02:12:05] that up so that we're accepted. We give [02:12:07] that up to flex and appear like we have [02:12:10] now figured things out, that people will [02:12:13] accept us. [02:12:15] The only way that you will truly be [02:12:17] successful is if you are righteous and [02:12:21] you live according to your nature and [02:12:23] you play, man, and you don't let people [02:12:25] take play away from you to to be at the [02:12:28] circus and be ooed and aed and worried [02:12:31] about all the [ __ ] Return to a [02:12:33] state of play. [02:12:35] >> Well, that's very good advice. [02:12:38] Return to [ __ ] max. The best thing [02:12:41] that you could do is return to a state [02:12:42] of play. That's true. There's a lot of [02:12:44] that, you know. There's a lot of that. [02:12:46] Absolutely. [02:12:47] >> Oh, I I I think that that is like a It's [02:12:49] a [02:12:49] >> It's a wise man for a young fella. [02:12:51] >> Yeah. [02:12:52] >> Okay. He's a jiu-jitsu guy. There you [02:12:53] go. Look, he's getting his [ __ ] blue [02:12:55] belt there. Or he's getting his purple [02:12:56] belt. What is going on there? So, is he [02:12:59] getting his blue belt? [02:13:00] >> Yeah, it's his purple. [02:13:02] >> Yeah. So, they're taking his blue belt [02:13:03] off and putting his purple belt on. [02:13:05] Yeah. See, that's he's learning. He's a [02:13:06] martial artist. That's why. [02:13:08] >> You think martial arts people are just [02:13:10] more like spiritually connected to the [02:13:12] truth? I don't know if it's spiritually [02:13:13] connected to the truth. It's forced down [02:13:15] your [ __ ] throat because you can't [02:13:17] believe you're better than you are if [02:13:19] you're getting mauled every day, [02:13:22] you know? And there's only one way. This [02:13:25] guy's on the path to becoming a [02:13:26] jiu-jitsu black belt. Looks like a [02:13:27] pretty big guy, too. That'll help. Um, [02:13:29] but there's only one way to get a black [02:13:32] belt in jiu-jitsu. You got to train [02:13:33] jiu-jitsu all the time and get better at [02:13:35] jiu-jitsu. You can't pretend you're [02:13:36] better. You know, there's a lot of [02:13:38] people that write poems and they suck [02:13:39] and they think they're so deep. [02:13:41] >> Yeah. this poem [02:13:42] >> meaning like there's just a very simple [02:13:44] objective measurement that says [02:13:45] >> 100%. You either win or you lose. You [02:13:48] either tap or you get tapped out. You [02:13:51] know, you tap somebody or you get [02:13:53] >> Can you get a black belt in some gym [02:13:55] that's easier than a different gym or [02:13:57] something like that or [02:13:58] >> sort of kind of but not really. I mean, [02:14:01] everybody's trying hard. I mean, there's [02:14:03] definitely better gyms where they're [02:14:04] more technical and their program is much [02:14:07] more systematic and they're better at [02:14:09] breaking down skills like how to develop [02:14:11] skills, you know. Um there's definitely [02:14:14] better gyms. Uh there's better schools, [02:14:17] there's better places to learn, but [02:14:18] everywhere you learn, you're going to [02:14:20] have a bunch of people that are trying [02:14:22] hard like and you have a bunch of people [02:14:24] that are trying to learn these. And also [02:14:25] today because of the internet, you could [02:14:28] go on YouTube and there's thousands of [02:14:32] tutorials breaking down new moves. [02:14:34] Jiu-Jitsu is like endlessly comp. [02:14:36] >> One of my one of my kids has ADHD and [02:14:38] one of the things that was recommended [02:14:40] to us was jiu-jitsu. [02:14:41] >> Yeah. What is ADHD, man? It's not even [02:14:42] [ __ ] real cuz I definitely have it. [02:14:44] And I think I think it's a superpower. [02:14:47] >> I think we all have it. [02:14:48] >> I think I look I do not focus well on [02:14:51] things that I think are boring. But if [02:14:52] you give me something that that I love, [02:14:54] I can't I'll I'll play pool for [ __ ] [02:14:56] 12 hours in a row. [02:14:57] >> It's crazy. But like the reason I got [02:14:58] back into golf is my seven-year-old gets [02:15:00] on the course and sometimes you can talk [02:15:02] to him and he's not making, you know, [02:15:03] he's like just like in his own world. [02:15:05] >> Exactly. [02:15:05] >> And then you start talking about chess [02:15:07] or jiu-jitsu or whatever and then we get [02:15:10] him on the golf course and this kid is [02:15:12] just dialed in. [02:15:13] >> Yeah. Superpower. [02:15:14] >> And I'm like, "Holy shit." [02:15:15] >> And they say that that's a disease. [02:15:17] That's crazy. Because if you find a [02:15:19] thing that that kid loves, he's going to [02:15:21] excel at it above and beyond most [02:15:23] humans. [02:15:23] >> We uh he does these chess classes and [02:15:26] like look, he's seven. So I'm like, "All [02:15:28] right, [ __ ] Bring it. [ __ ] [02:15:30] [ __ ] destroy you. I'm going to [02:15:31] [ __ ] ball you." And uh we're playing [02:15:35] last weekend and he goes, "Oh, Dad, you [02:15:38] know you can't castle out of check." I'm [02:15:40] like, "Shut the [ __ ] up. I know how this [02:15:42] game works." And I go on to beat him and [02:15:44] I and I went to my wife and I'm like, [02:15:47] he's six weeks away from beating me. I [02:15:48] got to go. [02:15:51] And then I spent I spent two days I [02:15:53] spent two [ __ ] days on YouTube and I [02:15:56] was like, "Okay, I got to brush up on my [02:15:58] openings and I got to I got Oh my god, I [02:16:00] don't have time for this [ __ ] [02:16:01] >> But I can't let this sevenyear-old beat [02:16:03] me. [02:16:05] >> You know what I mean? You're going to [02:16:06] have to." [02:16:06] >> And I'm like And I was like, "How do I [02:16:08] how do I stall this until maybe he's 10 [02:16:10] or 11?" Then it's like, "Okay, fine. You [02:16:11] finally beat me. Congratulations. [02:16:13] >> You have to think of him as an extension [02:16:15] of you and be happy when he does. [02:16:16] >> Oh my god. [02:16:17] >> Yeah. That's just how it is. Look, if [02:16:19] you're a man and you have a son, I I [02:16:21] have all daughters, but if I had a son, [02:16:24] I would be legitimately terrified that [02:16:26] he'd be able to tap me. Because if I had [02:16:29] a son, one of the first things that I [02:16:31] would do is get them. I got my kids [02:16:32] involved in martial arts at an early [02:16:33] age, but I didn't force them to keep [02:16:35] doing it. They did it for a certain [02:16:36] amount of time, and then they went on to [02:16:37] do a bunch of other things that they [02:16:38] enjoy better, [02:16:39] >> which is fine. But I think it's good to [02:16:41] learn some skills, learn how to defend [02:16:43] yourself so you're not completely lost. [02:16:46] Just it's I think it's good for you. [02:16:47] It's good to learn. It's good to develop [02:16:49] confidence. But for boys, I think it's [02:16:51] critical, you know, especially boys with [02:16:53] my kind of DNA. I'm like, I think it's [02:16:55] good to get that [ __ ] out of your [02:16:56] system. But if I had a son, there'd be a [02:16:58] certain point in time. I'm like, it's a [02:16:59] matter of time before this [ __ ] [02:17:01] can kill me. You know, it's like I mean, [02:17:04] I'm 58 years old. If I had a 20-year-old [02:17:06] kid, like, he'd probably kill me. [02:17:07] >> Kick your ass. [02:17:08] >> Probably [ __ ] kill me. [02:17:09] >> He'd kick your ass. Yeah. It's like what [02:17:11] am I going to do? There's nothing you [02:17:12] can do. You just have to accept it and [02:17:13] then hope your relationship with him is [02:17:15] strong enough that he still respects you [02:17:16] even though he can kill you because it [02:17:18] can't be entirely ba. Look, there's a [02:17:21] lot of martial arts instructors that are [02:17:23] old and they're revered and respected [02:17:25] and nobody wants to try to hurt them, [02:17:27] right? Because you realize if you learn [02:17:30] enough, you get to a certain point in [02:17:31] time, you realize like [02:17:32] >> I'm a much better dad to my sons than I [02:17:35] am my daughters. And I mean this in the [02:17:36] following way. My daughters have the run [02:17:38] of the place. Whatever they want, I'm in [02:17:40] love with them. I don't love them. I'm [02:17:41] in love with them. Whatever they need, [02:17:43] what they can just [02:17:44] >> enamored by. [02:17:45] >> They're just like they can control me. [02:17:47] They just kind of send me in one [02:17:48] direction or another. I'm just like [02:17:49] they're I'm like, [02:17:50] >> by the way, they know that, too. [02:17:51] >> I'm enslaved by them. [02:17:52] >> Yes. [02:17:53] >> You know, and I just want their [02:17:54] attention. Any small little shred, I'm [02:17:56] like, [02:17:57] >> boo your son, you keep them in check. [02:17:59] >> Whereas like my sons, I keep the I'm [02:18:01] doing everything that I was supposed I [02:18:02] think I'm supposed to be doing. Now, the [02:18:04] good news is my, you know, daughters are [02:18:06] just different. Like, they're girls. [02:18:07] They're just, so they don't need the [02:18:09] same kind of like tough loveish, [02:18:11] >> right? [02:18:12] >> You know, but then my boys reveal their [02:18:14] characteristics in ways that really [02:18:16] surprise me. And I'm just like, man, [02:18:17] this is so [ __ ] awesome. Parenting [02:18:19] has been the best. Like, when I again, [02:18:21] like slowing down and actually being in [02:18:23] it, [02:18:24] >> and I'm like, [ __ ] this is amazing. [02:18:25] >> It is pretty amazing. And watching your [02:18:28] kids get really good at things is really [02:18:30] fascinating. It's fascinating. I told [02:18:31] you this story before, but like you know [02:18:34] my son, my oldest son, this is my [02:18:36] 17-year-old. It's just a great kid. [02:18:40] He goes and he's like, "Okay, I'm [02:18:41] applying for college." And I'm like, [02:18:43] "Great. Let me take you to the Naval [02:18:44] Academy, West Point. Let me show you [02:18:46] these servicemies." And he sees those [02:18:48] and he's like, "These are incredible." [02:18:49] But then he's like, "I think I want to [02:18:50] go to like, you know, Georgetown or [02:18:52] Vanderbilt or whatever." And I'm like, [02:18:54] "Hey man, that's like um just a bigger [02:18:56] version of your high school and [02:18:58] whatever. if that's what you want to do, [02:19:00] you do you. And you know, um, but you [02:19:04] know, my the I'll help you like kind of [02:19:07] get to the starting line here, but [02:19:09] you're on your own. And he had to get a [02:19:11] job because I'm like, if you're going to [02:19:12] get into these schools, you got to get a [02:19:13] job. And so he tries to [02:19:16] last summer, I just started [ __ ] [02:19:18] screaming at him. And I'm like, you [02:19:21] [ __ ] louse, you haven't done [02:19:22] anything. And this is at like another [02:19:24] kid's at our at our son's birthday [02:19:25] party. I scream at him. He starts [02:19:28] crying. I'm like, "You need to do more." [02:19:30] Then my wife screams at him. He starts [02:19:33] crying again. Then my ex-wife screams at [02:19:35] him. He starts crying again. [02:19:38] And he just goes, "I'm out of here." He [02:19:40] walks out. Meanwhile, I start panicking [02:19:43] and I'm like, "I got to tiger dad this [02:19:44] situation." So, I start texting a few [02:19:47] friends trying to figure out, "Hey, can [02:19:48] I, you know, do you guys want to hire [02:19:49] this kid? He's like really, you know, [02:19:51] he's pretty smart kid. Did all this [02:19:52] stuff in robotics, yada yada." One of [02:19:55] them says, "I'd be willing to interview [02:19:57] him." I call him and he's like, "Dad, I [02:20:01] got a job." I said, 'What do you mean [02:20:02] you got a job? Said, 'I went around [02:20:05] downtown, [02:20:07] went to all these places and I was in a [02:20:08] McDonald's and um the woman was having a [02:20:12] little bit of difficulty speaking [02:20:13] English, so I just booked her in [02:20:14] Spanish. I got the application. I sat [02:20:16] down at the desk and the guy having [02:20:17] lunch beside me said, "Hey, I heard you [02:20:20] needed a job and uh I really like the [02:20:23] way you talked to this woman. I'm the [02:20:25] general manager of the car wash down the [02:20:26] street. Come and work for me." [02:20:29] And I said, "Well, what are you going to [02:20:30] do?" He goes, "I'm going to go work [02:20:32] there." And I said, "Okay, well, I got [02:20:34] this other interview for you as well, so [02:20:35] you should see maybe you can do both." [02:20:37] Anyways, the end of the story is he did [02:20:39] he did these two jobs. He worked at a [02:20:40] robotics firm, but then he worked at a [02:20:42] car wash. And when I tell you this [02:20:44] story, I am so proud of this kid because [02:20:45] of the car wash. Because that car wash [02:20:48] thing, he was he would come home and [02:20:49] he's like, "Man, you have no idea how [02:20:51] people live." And I'm like, "What do you [02:20:52] mean?" He's like, "The stuff that I find [02:20:54] in the trunk when I have to vacuum these [02:20:56] cars and clean out the cars." And I'm [02:20:58] like, "Bro, that is a gift. You have [02:21:00] given a [ __ ] gift. That is the thing [02:21:02] that if you take with you, you'll be [02:21:04] golden the rest of your life." Because [02:21:06] all this other [ __ ] is all kind of [02:21:07] manufactured. I help because I'm [02:21:09] anxious. I'm insecure. Mhm. [02:21:11] >> But that [ __ ] you did on your own. And [02:21:13] that thing is what people will [ __ ] [02:21:14] respect when push comes to shove. [02:21:16] >> It's also jobs that suck are really good [02:21:18] for you. [02:21:18] >> So good. I used to work at Burger King [02:21:20] when I was 14. [02:21:22] >> Man, let me tell you. [02:21:23] >> You were 14 and you had a job. [02:21:26] >> When my dad had to stay behind like we [02:21:30] were my dad was a diplomat in the [02:21:32] embassy of Sri Lanka in Canada. This [02:21:35] [ __ ] war in Sri Lanka is crazy. He [02:21:37] writes this essay. His life is [02:21:38] threatened. So he files for refugee [02:21:41] status. He gets it. [02:21:44] He gets kicked out of the embassy. So he [02:21:47] doesn't have a job. My mom becomes a [02:21:48] housekeeper [02:21:50] and we're kind of toiling in this [02:21:51] poverty cycle. So 14 you have to I had [02:21:54] to get a job and I would take the money [02:21:56] and you know we buy I buy the bus [02:21:58] passes. I would buy some of the [02:21:59] groceries. We just trying to make it all [02:22:01] work right. And uh I got a job at the [02:22:04] Burger King. [02:22:05] This is another example where I was [02:22:08] like, I'm going to go get a job. Hey, [02:22:10] can you drive me to the interview? And [02:22:12] my dad's like, "No, [02:22:16] get on your [ __ ] bicycle and go." And [02:22:18] I thought, "Bro, we need this. You need [02:22:20] the money more than I do. Why are you [02:22:22] making me bicycle?" But I bicycled and I [02:22:25] got the job and I worked there. And I [02:22:27] used to work the night shift. [02:22:28] 14-year-old kid, man. Wow. [02:22:30] >> From [ __ ] 8 till 2 in the morning. [02:22:32] And I would have to clean this like 8:00 [02:22:34] p.m. to 2 in the morning. [02:22:34] >> Then you had to go to school in the [02:22:35] morning. [02:22:36] >> No, then I this was always like Friday, [02:22:38] Saturday, Sunday. [02:22:39] >> Wow. [02:22:39] >> Thursday, Friday. Sorry. Thursday, [02:22:40] Friday, Saturday. And then Yeah. Some [02:22:42] days I would have to go to school. But [02:22:44] And why did I work until two? Because [02:22:46] when the restaurant closes, [02:22:48] you get whatever the food is left over, [02:22:50] right? So like you get a couple chicken [02:22:52] sandwiches, you get like the, you know, [02:22:54] the the version of the McNuggets that [02:22:56] Burger King had, a couple whoppers, and [02:22:58] you take them home. [02:23:02] But the amount of vomit that I had to [02:23:04] clean up the bathroom, you can't [02:23:07] imagine, man, the a downtown Burger King [02:23:10] near bars, you know, after closing time, [02:23:14] the [ __ ] you see. [02:23:16] >> Oh, wow. [02:23:16] >> And the [ __ ] you deal with. And all I [02:23:19] could think of was like, I just want to [02:23:20] get the [ __ ] out of here. But that was [02:23:23] so valuable for me. [02:23:24] >> Yeah, [02:23:25] >> that was so valuable for me. Um, [02:23:29] and then I worry that my, you know, kids [02:23:31] don't get exposed to it, but when my son [02:23:32] got it, maybe I'm overimposing too much [02:23:34] about it, but it's like I'm like, man, [02:23:36] that that car wash thing is really going [02:23:38] to be the thing that separates you in [02:23:40] life. [02:23:40] >> Yeah. Doing something that sucks. It it [02:23:42] also [02:23:43] >> just being humble and grinding through [02:23:44] that [ __ ] you know? Do you realize like [02:23:47] this is sometimes people they don't pick [02:23:49] a path and they just have a job and they [02:23:52] don't like it and they stay with this [02:23:54] thing they don't like forever and that's [02:23:56] not what you want. [02:23:58] >> No, [02:23:58] >> it's not what you want. But the [02:24:00] development like the learning how to do [02:24:03] something that sucks and grinding [02:24:04] through it [02:24:05] >> and still doing it well. [02:24:06] >> Yeah. You know, doing it well. [02:24:08] >> Make a make a Whopper. Be there on top. [02:24:09] >> I know how to [ __ ] make a Whopper. [02:24:11] >> Yeah. [02:24:11] >> Do you know what I mean? [02:24:12] >> Yeah. [02:24:13] uh make the fries, change the oil, all [02:24:15] that [ __ ] [02:24:16] >> And then when you apply that those [02:24:18] lessons to something you actually love [02:24:21] and you work hard at something you love, [02:24:23] >> magical. [02:24:23] >> Oh, it's incredible. It's a real gift. [02:24:26] >> It's a real gift. [02:24:27] >> Yeah. I mean, you know, some people they [02:24:29] don't appreciate the process, you know, [02:24:32] and it's hard to because like when [02:24:34] you're young and you're going through [02:24:36] these difficult jobs and these things [02:24:37] that suck and you don't know how it's [02:24:39] going to turn out, you know, and a lot [02:24:41] of times people aren't really educated [02:24:42] in what a process actually is and about [02:24:45] how it does develop character and it [02:24:47] does develop discipline and that these [02:24:49] things are actual skills that you can [02:24:51] apply to other things in life. You just [02:24:54] think, "God, I'm a [ __ ] loser." I [02:24:55] have a I have a visual for this. I [02:24:58] always ask myself, am I in the engine [02:24:59] room right now? [02:25:01] >> This is my way of saying like an engine [02:25:02] room is a little hot. It's a little [02:25:04] uncomfortable, but it's where all the [02:25:07] [ __ ] is happening. It's where the [ __ ] [02:25:08] is being made. [02:25:10] >> And so I'm like, it's a little, you [02:25:12] know, discomforting, [02:25:13] >> but I got to be in there. And there are [02:25:15] days where, and there'll be weeks [02:25:17] >> where that's all I do. I'm just in it, [02:25:20] you know? I don't I'm not good at [02:25:21] responding to emails sometimes or [02:25:23] whatever because there'll just be weeks [02:25:24] where I'm in it and it's an incredible [02:25:27] visual for me because I'm like yeah this [02:25:29] is like where like I'm grounded [02:25:31] >> and I like feel myself and then when I [02:25:33] when I look at my like my health [02:25:37] that's when like I just feel like really [02:25:39] good about myself like not insecure and [02:25:43] my vitals are different like it's crazy [02:25:45] like my [ __ ] HRV like my HRV [02:25:49] craters [02:25:51] when I'm like just like you know [02:25:55] insecure [02:25:56] >> of course [02:25:57] >> but why is that like it's it's your it's [02:25:59] your heart rate variability should have [02:26:01] nothing to do with your like [02:26:03] >> disposition and your mood [02:26:05] >> well your mind is the idea that your [02:26:08] mind is separate from the body is crazy [02:26:10] >> it's crazy [02:26:10] >> it's not and [02:26:12] >> but is your HRV lower when you're just [02:26:14] out of sorts [02:26:15] >> yes probably right [02:26:16] >> I'm sure yeah I don't really monitor it [02:26:18] that much Yeah. [02:26:20] >> And I'm I try not to ever get out of [02:26:21] sorts, too. And one of the ways that I [02:26:24] keep from getting out of sorts is daily [02:26:26] discipline. Like it's if I if I have [02:26:28] days where I'm sure it gets out of sorts [02:26:30] if I have a few days in a row where I [02:26:32] don't work out, but I I work out almost [02:26:34] every day. And if I'm not working out, [02:26:36] I'm still cold plunging and going to the [02:26:38] sauna and stretching. I'm always doing [02:26:40] something. And if I don't do something, [02:26:42] I feel like I'm [ __ ] up. And then [02:26:44] then I can [02:26:45] >> So, does it matter what it is? Meaning, [02:26:46] as long as it's a routine. [02:26:48] >> Yeah. Well, I I do it all myself. I [02:26:50] don't have a trainer, but I write things [02:26:52] down. I write down what I want to [02:26:53] accomplish. I write down what I'm going [02:26:54] to do, and then I just do it. I like a [02:26:57] robot force myself to do it. [02:26:59] >> Yeah. [02:27:00] >> And then I always feel better after it's [02:27:01] over. And it's always the hardest part [02:27:03] of my day. [02:27:03] >> Yeah. And so it makes everything else so [02:27:05] much easier because it's I [ __ ] work [02:27:07] out hard [02:27:08] >> and so everything else is pretty easy, [02:27:11] >> you know, because the strain like just [02:27:12] being in that [ __ ] cold water or just [02:27:15] going through Tabatas on a a dye bike [02:27:18] like this shit's hard. It's really hard. [02:27:20] Like I could die right now hard. And so [02:27:22] everything else is like how how hard's [02:27:24] it going to be? Oh, it's uncomfortable. [02:27:26] Oh boohoo, you know? Like I think it's [02:27:28] important to go through that. I I really [02:27:31] think it is, you know? I really think it [02:27:33] is and that that's a the difference [02:27:35] between, [02:27:37] >> you know, sanity and like having a very [02:27:39] slippery grip on your your own personal [02:27:43] sovereignty. I think a lot of it is like [02:27:46] you have to choose it. It has to be like [02:27:50] elective [02:27:52] voluntary adversity. Like you have to [02:27:55] choose to do it. [02:27:56] >> Yeah. I that's a really great way of [02:27:57] saying it. Voluntary adversity. If it's [02:27:59] forced upon you, [02:28:00] >> you can kind of compartmentalize [02:28:02] >> and then you get angry like [ __ ] this [02:28:03] bitter and resentful making me do stupid [02:28:05] [ __ ] But if you force yourself to do [02:28:07] it, you know, [02:28:08] >> this why these special forces guys are [02:28:09] such [ __ ] animals. [02:28:10] >> Of course, [02:28:11] >> they're choosing, [02:28:12] >> right? Exactly. And they develop that, [02:28:15] you know, this mentality when you're [02:28:17] around other people that are also [02:28:18] savages. You know, you just you realize [02:28:21] like there's other people out there in [02:28:22] the world that are not making excuses. [02:28:25] they are getting after it every day and [02:28:27] they are pushing every day. And the more [02:28:29] you can surround yourself with people [02:28:30] like that, the more people the people [02:28:32] that complain about nonsense and the [02:28:34] find excuses and focus on other people [02:28:37] and [ __ ] about things and why is she [02:28:40] doing this? Why is this happening for [02:28:42] him? [02:28:44] >> Yeah, [02:28:44] >> it's loser mentality. And if you're [02:28:46] around more winners, you know, you [02:28:48] absorb that. You imitate your [02:28:49] atmosphere. Yeah, it's very important [02:28:50] and it's very hard for people, [02:28:52] especially young people, to find [02:28:55] positive influences and to find positive [02:28:58] groups. And I think it's one of the [02:29:00] reasons why a lot of young people [02:29:02] gravitate towards podcast because they [02:29:03] get to hear interesting conversations [02:29:05] with really accomplished people that are [02:29:07] fascinating that are unlike anybody that [02:29:09] they're around on a daily basis, [02:29:12] >> you know. And that that's also one of [02:29:13] the reasons why it's important to find [02:29:15] some that's why martial arts are so good [02:29:17] for young people because you're around [02:29:19] other people that are doing this really [02:29:21] difficult thing and other sports too [02:29:23] whether it's football or wrestling [02:29:24] whatever it is. [02:29:25] >> I actually found like you know the last [02:29:26] few years I go out of my way to not [02:29:28] isolate myself. That's one thing like [02:29:29] being around other people engaging in [02:29:31] things. [02:29:32] >> Yes. [02:29:32] >> Has been really healthy for me. [02:29:33] >> Oh for sure. [02:29:34] >> Oh my god. And I just found like what [02:29:36] the [ __ ] am I doing? It's like [02:29:37] everything is in my little house by [02:29:38] myself with everybody everything comes [02:29:40] to me. It's so odd. [02:29:42] >> It's odd. It's really odd. Very [02:29:43] unhealthy. [02:29:44] >> And it starts to [ __ ] you up in the [02:29:45] bind. [02:29:45] >> And then your interaction with humans is [02:29:47] only on the internet. It's terrible. You [02:29:49] know, [02:29:50] >> or with people that are sickantically [02:29:51] either being paid or need something from [02:29:53] you. [02:29:54] >> Yeah. [02:29:55] >> And then I think you're in a really bad [02:29:56] place. [02:29:57] >> Absolutely. [02:29:57] >> Whereas like if you're in the grind with [02:29:59] other people, they're beating you at [02:30:00] things. It's great. [02:30:01] >> Yeah. Yeah, if you're in a situation [02:30:03] where there's a bunch of sickopantically [02:30:04] connected people to you and they're just [02:30:06] all kissing your ass and I mean we all [02:30:08] know people that are like the heads of [02:30:09] companies and that are just like [ __ ] [02:30:11] tyrants. [02:30:12] >> I think the the the the trap about being [02:30:14] successful because it's not everything [02:30:16] it's wrapped up to be is exactly that. [02:30:17] You become so isolated that you become [02:30:20] this like very caricaturous version of [02:30:22] yourself [02:30:23] >> because you forget what it's like to [02:30:25] just a basic example like wait in line, [02:30:28] be kind to other people, be polite, like [02:30:30] be accommodating, have some empathy, [02:30:32] >> right? [02:30:32] >> Where are you put in that situation to [02:30:34] do those things, [02:30:35] >> right? You forget that you're just a [02:30:36] person. [02:30:36] >> You're just a [ __ ] person. [02:30:38] >> And if you achieve some level of success [02:30:40] that you're trying to you're trying to [02:30:42] achieve this level of success so you [02:30:44] elevate past being a person, you're [02:30:46] missing the point. like you're never [02:30:47] going to and if you do it'll come at a [02:30:50] price. [02:30:50] >> I thought being successful was supposed [02:30:52] to write all the wrongs that I felt like [02:30:56] I missed [02:30:58] and it turns out nobody gives a [ __ ] [02:31:00] >> No. [02:31:01] >> And it does none of that. [02:31:02] >> I think it's all the process. The all of [02:31:06] life is the process. [02:31:07] >> I agree. [02:31:08] >> I think as soon as you think that [02:31:10] there's a goal like, "Oh, I'm going to [02:31:11] retire and experience my golden years." [02:31:13] I think it's all horseshit. And that's [02:31:15] one of my main fears about AI. My my [02:31:19] main one of my main fears about this [02:31:21] idea of universal high income and [02:31:24] everyone's going to have, you know, [02:31:25] ultimate abundance. It's like where does [02:31:27] anybody find purpose and meaning? And [02:31:30] where do the where do you take whatever [02:31:34] this thing is that the mind is [02:31:37] constructed of these these needs that [02:31:39] the mind has that have to be satisfied [02:31:42] in order to achieve sanity? in order to [02:31:45] achieve some sort of like place where [02:31:48] you can be at peace, [02:31:49] >> fulfillment. [02:31:49] >> Yeah, you have to do you're going to [02:31:51] have to do something, man. You're going [02:31:52] to have to do something. And I mean, [02:31:54] maybe it could just be jiu-jitsu and [02:31:57] golf and find some stuff that you enjoy [02:31:59] doing and take some benefit in that. But [02:32:03] boy, that's not been the case for [02:32:06] hundreds of years. You know, that's not [02:32:08] how human beings have exist. I mean, but [02:32:11] also part of me says, why do we have to [02:32:14] work to find those things? Why can't we [02:32:17] why why is it all that? [02:32:20] >> Well, you got to find the thing that's [02:32:22] not work, [02:32:23] >> right? But but I'm getting at is like [02:32:25] why is our identity all tied up in money [02:32:30] and and and and [02:32:32] just things and objects and stuff. And [02:32:36] this is a fairly new thing in human [02:32:38] society, right? [02:32:40] >> Why can't it transform into [02:32:44] m like your basic needs are all met? [02:32:47] Like nobody ever has to worry about [02:32:48] starving again. Nobody ever has to worry [02:32:50] about not having a home to sleep in. [02:32:52] Nobody ever has to worry about not [02:32:53] having healthcare. Nobody ever has to [02:32:55] worry about not having education. So [02:32:56] then it becomes find a purpose with your [02:32:59] life. And as a society, can we adjust? [02:33:03] Can we gravitate towards a new way of [02:33:07] existing in meaning? And it would [02:33:08] probably be great. In one way, it'd be [02:33:11] great because we wouldn't have to be [02:33:14] constantly thinking, why does he have [02:33:15] that and I don't have that and this and [02:33:16] that. Instead, it would probably be [02:33:19] like, what can I do to get better at the [02:33:21] thing that I love, right? [02:33:23] >> What you know, and [02:33:24] >> or let me be a part of a project to do [02:33:26] something that seems implausible, [02:33:28] >> but I feel like I'm in the engine room [02:33:30] every day. This is great. I'm toiling [02:33:31] with these guys. probably not going to [02:33:34] work. Some crazy convoluted thing that [02:33:36] has a .01 chance of success [02:33:40] >> that can captivate a lot of people. [02:33:42] >> Yes. You know, [02:33:43] >> the process. [02:33:43] >> The process. [02:33:44] >> Yeah. [02:33:45] >> The process. [02:33:45] >> The process is everything. And there's [02:33:47] no I used to like think backward. [02:33:49] >> There is no attention in the process, [02:33:50] >> right? [02:33:51] >> There's only attention in the outcome, [02:33:53] >> right? [02:33:54] >> Do you see what I mean? [02:33:55] >> Absolutely. [02:33:56] >> Which is another clue and a secret that [02:33:58] that's actually where you should be [02:33:59] focused. [02:33:59] >> Well, you might get attention, but [02:34:00] that's not what you want. What you want [02:34:02] is the process to work out. You you want [02:34:04] to get better at whatever it is you're [02:34:05] doing and get that thing to a better [02:34:07] place than it is right now currently. [02:34:09] Right. That's what you're thinking of. [02:34:10] You're not thinking of I am going to get [02:34:12] all this attention. I'm going to be on [02:34:14] the cover of a magazine. [02:34:16] Yeah. Can't It can't be that. That's not [02:34:19] good for anybody. But everybody thinks [02:34:21] that's what they're going to get. Oh, [02:34:23] I'm going to get this. [02:34:23] >> Everybody thinks that's what they want. [02:34:25] >> Yeah. Right. [02:34:27] >> And the problem with that is that it's [02:34:29] not what you want. [02:34:30] >> No. [02:34:31] And then now we're going to completely [02:34:33] upend [02:34:34] potentially all of that. [02:34:38] >> Yeah. Well, maybe it'll come inside [02:34:40] it'll come it'll coincide with the hive [02:34:43] mind technology. [02:34:44] >> This hive mind thing actually that you [02:34:46] say I find very compelling because this [02:34:48] idea of like how do you govern an AI? [02:34:53] Each of us individually are not capable, [02:34:55] but I think you, me, like 10,000, 100 [02:34:57] thousand people working together. [02:35:01] The question is, are we smarter? [02:35:03] And I think there's a reasonable chance [02:35:05] that that could be true. And then the [02:35:07] other version of the hive mind is here [02:35:08] are all these like crazy ideas that [02:35:10] would just make the world incredible. [02:35:13] And a group of a thousand people go off [02:35:15] and they kind of jointly work on that [02:35:17] together. That I find super fascinating. [02:35:20] Like I that could be it. Like it could [02:35:22] be like, you know, a thousand physicists [02:35:24] are like, "We're going to create this [02:35:26] new interstellar form of [02:35:27] transportation." And they just go off [02:35:28] and they're just like they don't have to [02:35:30] worry about [02:35:32] existing because all of that's paid for. [02:35:34] >> Well, it also could solve all of our [02:35:36] problems that we have with like halves [02:35:39] and have nots. If we're all one, how [02:35:42] could we tolerate haveotss? How could we [02:35:44] tolerate people living on dirt floors in [02:35:46] third world countries with no access to [02:35:48] clean water? We wouldn't tolerate it [02:35:49] tolerate [02:35:50] >> because we they would be us and we would [02:35:52] understand that. [02:35:52] >> Yeah. [02:35:53] >> I mean it could be like a complete [02:35:55] gamecher in terms of human civilization. [02:35:58] It could really move people into a [02:35:59] complete next direction. I mean it could [02:36:01] eliminate crime and violence. Yeah. [02:36:03] >> Which sounds insane. Like boy that's so [02:36:06] utopian. Like oh why don't you suck on [02:36:08] some crystals you [ __ ] hippie. But [02:36:10] legitimately if look if everybody has a [02:36:13] cell phone which essentially everybody [02:36:14] does right right now in this time and [02:36:16] age. If we get to a point where [02:36:18] everybody is connected, everybody is [02:36:21] hive mind connected, you're there's [02:36:24] we're all you're not going to just be [02:36:26] able to drive by a homeless encampment, [02:36:28] >> right? [02:36:28] >> You won't you'll feel it. You'll feel [02:36:31] it. It won't be like, "Hey, you [ __ ] [02:36:32] losers. Hit the gas." It's going to be [02:36:35] like, "We need to solve this. We need to [02:36:37] get these people counseling, mental [02:36:39] health crisis, get them off the drugs, [02:36:41] whatever it is that's wrong with them." [02:36:44] >> I mean, that's an incredible idea. [02:36:46] >> Yeah. You know, like when an airplane [02:36:48] kind of like goes like this and your [02:36:49] stomach goes and you just feel it. [02:36:51] >> Could you imagine like you drive by a [02:36:53] homeless encampment and that's what you [02:36:54] feel? Like you feel like [02:36:56] >> something's wrong. [02:36:56] >> And we'll all feel it collectively. If [02:36:59] we're all connected and we all feel [02:37:01] things connectively, we will actively [02:37:03] work together to solve these problems. [02:37:05] And if we're dealing with a if if we [02:37:07] really get to a point of abundance, like [02:37:09] true abundance, where resources are not [02:37:12] an issue and no one's starving, we could [02:37:15] really fix all the problems that like [02:37:18] >> none of them are insurmountable. None of [02:37:20] them are breathing underwater, right? [02:37:22] None of them are flying to the sun. None [02:37:24] of them, right? So all of them are [02:37:26] things that could be if we took all the [02:37:29] world's resources, socialism doesn't [02:37:31] work, right? Why does it not work? [02:37:33] because it rewards lazy people and it [02:37:34] punishes ambitious people. It's not [02:37:36] doesn't doesn't work with human nature, [02:37:38] but it would work if you have [ __ ] [02:37:40] hive mind. If we all we all understand [02:37:43] what it means to put in effort. We all [02:37:45] understood what what each other are [02:37:46] feeling and thinking, [02:37:47] >> right? [02:37:48] >> And we all compiled resources and fixed [02:37:51] all of our social problems. Like [02:37:53] literally stop all wars, stop all crime, [02:37:57] stop all violence, stop all poverty. [02:38:00] Done. And then what do we do? We work [02:38:03] together to solve whatever the [ __ ] else [02:38:04] is wrong with you society. [02:38:06] >> Well, it's more like what is left over [02:38:08] that we haven't figured out. [02:38:10] >> Think about what the world was like [02:38:12] before the internet. It's almost [02:38:13] impossible to imagine, but we both grew [02:38:15] up without it. [02:38:16] >> Yeah. [02:38:17] >> And Yeah. And so we're entering into [02:38:19] this new world. Think about what world [02:38:22] was like without the hive mind, but yet [02:38:24] we all grew up without it. Like that [02:38:26] might be the next thing. The thing that [02:38:28] I remember the most about that era is I [02:38:32] had a positive sum view of everybody [02:38:35] >> really. [02:38:36] >> Meaning there weren't like the the bad [02:38:39] actors were pretty bad but yeah [02:38:41] generally like I looked up to most [02:38:44] business people like the people that I [02:38:45] now I feel like have been a little bit [02:38:47] unmasked then to me were pristine. [02:38:49] >> Oh that's interesting. [02:38:50] >> Like the Bill Gateses of the world, you [02:38:51] know. I was like man I really aspire to [02:38:53] be Bill Gates when I was like 13 or 14. [02:38:56] It just seemed like [02:38:57] >> now you're like, why is he buying all [02:38:58] the farmland? This [ __ ] weirdo. [02:39:00] >> I mean, it's a [ __ ] so funny. He uh [02:39:03] he bought this like 45,000 acres in [02:39:07] 4,500 acres. I can't get the order of [02:39:09] magnet right uh in Phoenix to build his [02:39:11] own digital city. [02:39:13] >> Yeah. [02:39:14] >> Okay. [02:39:14] >> It's like weird. So, I bought the 1700 [02:39:16] acres beside him. [02:39:20] >> That's hilarious. [02:39:21] >> [ __ ] you. [02:39:22] >> It's a very odd thing. It's a very odd [02:39:25] thing when people get exposed and you [02:39:27] just go like what the [ __ ] is that guy [02:39:29] really all about [02:39:30] >> and but also like how isolated is he? [02:39:33] >> Oh, he's been he's been isolated for 50 [02:39:35] years, [02:39:36] >> right? Like who are his friends and how [02:39:38] how many people does he have? [02:39:39] >> Must be very hard to be him actually. I [02:39:41] I mean [02:39:41] >> especially now that he's divorced, [02:39:42] right? So now he's got no one going but [02:39:45] that speech [ __ ] sucked. [02:39:46] >> Yeah, he's I mean he has a long-term [02:39:48] partner. Um she seems like a lovely [02:39:51] woman. Um, but yeah, it's just got to be [02:39:54] super lonely. [02:39:54] >> It's got to be. [02:39:56] >> It's not I to me it's not worth that [02:39:58] level of I don't even know what is it. [02:40:01] It's like material success at least [02:40:03] measured in the outside world. I don't [02:40:04] know what it is, but it's not [02:40:06] it's a lot, man. This is like I like I [02:40:08] don't know how Elon does it. Like it's a [02:40:10] lot. It's super isolating. [02:40:11] >> Yeah. [02:40:12] >> It's it's just he's very by himself. [02:40:16] >> Mhm. [02:40:17] >> And he's going to be even more isolated [02:40:19] in a matter of a few months. [02:40:20] >> Yeah. And that's unfortunate because you [02:40:23] have very empathetic, very kind of like [02:40:25] sensitive people like that I think need [02:40:26] other people. [02:40:28] >> Well, he's got people around him, but [02:40:30] he's got very few people around him that [02:40:33] can kick reality at him. [02:40:35] >> Yeah. [02:40:35] >> You know, that that is a bit of a [02:40:37] problem, but he still seems to be having [02:40:40] fun. Every time I'm around him, we have [02:40:41] a bunch of laughs. Like, he's fun to [02:40:43] hang out. [02:40:44] >> He's got an incredible sense of humor. [02:40:45] We um Jamie and I went down uh to one of [02:40:49] the rocket launches at SpaceX. Yeah, we [02:40:51] went down there crazy [02:40:52] >> and we watched from the ground while I [02:40:56] took off, which is incredible cuz it's [02:40:57] like how far was it, Jamie? It was like [02:40:59] two miles away from us. [02:41:00] >> A mile mile and a half. [02:41:01] >> So like it's a mile and a half. You feel [02:41:03] it in your chest. Have you been when a [02:41:05] when a rocket launches? You been there, [02:41:06] dude? It's bananas. The [ __ ] thing [02:41:08] like first of all, it doesn't look that [02:41:09] far. It looks like it's like [02:41:12] >> maybe quarter mile. it. I don't I'm just [02:41:14] not good at judging. [02:41:15] >> This is a starship. [02:41:16] >> Oh yeah. So you feel it. [02:41:20] You like his kids started crying like we [02:41:22] want to go inside. Like it's disturbing [02:41:24] like the amount of energy that's coming [02:41:27] out of these [ __ ] rocket boosters. [02:41:29] And then I hung out with him in the [02:41:31] command center while the rocket was [02:41:33] flying through space and we're watching [02:41:34] it on all these monitors and then lands [02:41:37] in the water in Australia. He's cracking [02:41:39] jokes the whole time because the thing [02:41:41] is like losing pressure because it's [02:41:43] they're stress testing all the stuff [02:41:45] which is really funny when really dumb [02:41:47] people, oh he's a [ __ ] dumbass, his [02:41:48] rockets keep blowing up. Like like they [02:41:51] just don't understand like the only way [02:41:53] you find out what the capability of this [02:41:55] technology is is you have to like let it [02:41:58] blow up and then you go, "Okay, it needs [02:41:59] to be thicker. It needs to be this and [02:42:01] that and we need to add these things and [02:42:02] there's sensors everywhere." And so he's [02:42:04] cracking jokes the entire time while [02:42:06] this thing is like losing pressure. And [02:42:07] it eventually wound up landing and it [02:42:09] was fine, but it did have a hole in it. [02:42:12] But it was just like he's laughing like [02:42:14] he's having a good old time. He's not [02:42:15] freaked out. No, [02:42:16] >> you know, he he's uniquely built to [02:42:19] handle it. [02:42:19] >> I uh when there was a rocket launch in [02:42:22] Vraenberg in California and I chartered [02:42:25] a Pilates and I because you can get like [02:42:28] a little like propeller plane. [02:42:30] >> Oh, okay. And I went around and around [02:42:32] and I have this video of it kind of like [02:42:34] coming up and through cuz like [02:42:36] >> how close were you? [02:42:40] >> 100 miles. [02:42:41] >> Oh wow. [02:42:42] >> But you but it's like right there. [02:42:43] >> Uhhuh. [02:42:44] >> You know cuz the distance and it's [02:42:46] coming up and I'm kind of going around. [02:42:48] It was the craziest thing. It was cool. [02:42:50] It was super cool. [02:42:51] >> That [ __ ] is super cool. [02:42:53] >> It's very cool. It's very cool. I mean [02:42:55] just Starbase is bananas. Just when you [02:42:58] go down there and they have their own [02:42:59] town, the whole thing is straight. [02:43:00] There's [ __ ] cyber trucks everywhere. [02:43:02] I'm like, how do you find your car? [02:43:03] Like, [02:43:04] >> is it is it an incorporated town? It [02:43:06] started off as unincorporated, but it [02:43:08] own thing now. [02:43:09] >> I believe it's its own town. And [02:43:11] >> is there a mayor? [02:43:12] >> That's a good question. I think there [02:43:13] is. I think we talked about this. I [02:43:16] don't remember though. But the actual [02:43:19] factory itself is nuts cuz I Jamie and I [02:43:23] were both like, "This is way bigger than [02:43:25] I thought it was going to be." And the [02:43:26] rockets are way bigger than you thought. [02:43:27] And like the garage doors are [ __ ] [02:43:29] bananas. [02:43:30] >> They got a city government website [02:43:33] commission mayor. [02:43:37] >> That's crazy. [02:43:39] >> Bobby Bobby Peted. Bobby Peted is the [02:43:41] mayor. [02:43:42] >> They have their own little Bobby. That's [02:43:43] awesome. [02:43:44] >> Irish pub and it's like it's really [02:43:46] cool. They have really good food. you [02:43:47] know, when he uh when he opened the [02:43:48] first uh Gigafactory, which was in [02:43:50] Nevada, we had a party and uh like it [02:43:54] was like a small opening thing and so we [02:43:55] all drove in there and I have a video of [02:43:58] me in a just like a pickup truck driving [02:44:00] into the thing. I started the video and [02:44:04] I think it was 43 seconds until it [02:44:06] ended. And this is like, you know, a [02:44:08] decade ago and I thought to myself, this [02:44:10] is implausible. Like I've never even [02:44:12] contemplated things that could be built [02:44:14] this big. I didn't think it was allowed. [02:44:16] I don't even know how something like [02:44:17] this works. [02:44:19] >> And I was like, how does how do you [02:44:21] envision this whole thing works? Like [02:44:23] simple. [02:44:24] >> Raw materials in the front, cars out the [02:44:27] back. [02:44:29] I'm like, that's it. It sounds so [02:44:32] simple. [02:44:33] >> Well, he thinks big. [02:44:35] >> He thinks big. And thank God he's [02:44:36] around. I mean, if he wasn't around, if [02:44:38] he hadn't purchased Twitter, I think our [02:44:41] entire civilization would look very [02:44:42] different. [02:44:43] >> Very different. [02:44:44] >> It would. I mean, that sounds like a [02:44:45] very grandiose thing to say. [02:44:46] >> Sounds hyperbolic, but you're right. [02:44:48] >> I think it's true because I don't I [02:44:50] think free speech is a core component of [02:44:52] our civilization, and I don't really [02:44:55] think we had it. [02:44:56] >> I think it was curated, and it was very [02:44:58] tightly controlled by the actual federal [02:45:00] government, which is spooky. [02:45:01] >> No, no. It decided what we should be [02:45:05] paying attention to. Yes. [02:45:06] >> Just just put it very simply without [02:45:08] kind of like [02:45:09] >> And that's not right. [02:45:10] >> Right. Because when they're telling you [02:45:13] to pay attention to this and the actual [02:45:16] issue is this and you cannot, then you [02:45:19] can't fix what's actually broken, [02:45:21] >> right? [02:45:21] >> And you start to we start to basically [02:45:23] be like we're we're part of just a a [02:45:26] useful idiot for these people. [02:45:27] >> Yes. [02:45:28] >> And that's not right. [02:45:29] >> It's not right. [02:45:31] >> Listen, man. This was a lot of fun. It's [02:45:32] always great to talk to you. Thank you [02:45:33] very much for doing this. It was very [02:45:35] cool. Um let's do it again sometime. All [02:45:38] right. Thank you. All right. Bye, [02:45:40] everybody.