Full Transcript
https://www.youtube.com/watch?v=10MdOvK-aG4
[00:00] How do I sound?
[00:00] You sound perfect.
[00:01] You sound great.
[00:02] How do I look?
[00:03] Yeah, you sound great.
[00:04] Better than you look.
[00:05] You have a face made for radio.
[00:08] You don't look as tired as you have in recent weeks.
[00:10] That's true.
[00:11] Yeah.
[00:11] Oh, yeah.
[00:11] Somebody was slagging me for the bags under my eyes.
[00:16] I mean, this audience is brutal.
[00:17] They're brutal.
[00:18] They're brutal.
[00:19] It's a good thing I'm rich.
[00:24] Let your winners ride.
[00:31] Rain Man David Sacks.
[00:31] And it said we open sourced it to the fans, and they've just gone crazy with it.
[00:35] Love you guys.
[00:39] All right everybody, welcome back to the number one podcast in the world.
[00:43] It's the All-In podcast.
[00:43] With us today, Chamath Palihapitiya, David Sacks, and our fifth bestie, Mr. Brad Gerstner, is here.
[00:52] I think uh David Friedberg is suffering from some socialist-related flu.
[00:58] He's very sick of reading about socialists, but he'll be back next week with two incredible, incredible interviews.
[01:05] You guys see those Spencer Pratt ads?
[01:08] Wow.
[01:09] It's one of the best political ads I've ever seen.
[01:12] Oh, there's like three or four of them.
[01:13] There are multiples.
[01:13] Yeah.
[01:15] Whoever that social media team is is on fire.
[01:17] If you get a good social media team and you get a good ad production team, I think it's next-gen, because these things go crazy.
[01:22] And Spencer Pratt, if he wins this election, which I think he's going to in Los Angeles, the reason is what Brad said.
[01:30] Those ads are incredible.
[01:31] Well, he's also quite a good debater.
[01:34] Did you see clips from this debate?
[01:37] Incredible. He's so funny. He's so chill.
[01:39] Yeah. Well, he's up against Karen Bass, who's the mayor, who is basically extremely leftwing.
[01:43] And then there's someone who's a city council woman who's even further to the left of Karen Bass.
[01:49] I mean, she's off in like Fidel Castro territory.
[01:53] She's an Indian Fidel Castro Ramen.
[01:56] So, she was basically, I guess, criticizing the mayor for the homelessness problem.
[02:03] And then Pratt pointed out that this council woman is actually in charge of all these homeless programs already.
[02:08] He eviscerated her.
[02:11] Eviscerated her.
[02:11] And he basically made the key point, which is, look, the problem here is not lack of housing.
[02:14] It's an addiction issue, and it's a mental illness issue.
[02:18] And he said, "If she went to the street, she'd get stabbed in the neck."
[02:23] Yeah.
[02:25] Which is pretty accurate if you've been to Skid Row.
[02:26] I mean, you would not want to walk through there.
[02:29] It was like the Spider-Man photo.
[02:31] Wait, Nick's got the clip.
[02:32] Play the clip, Nick.
[02:33] Oh god, this clip is brutal.
[02:34] This is a different clip, but this went super viral.
[02:36] It reminded me of Trump a little bit.
[02:38] Let's see.
[02:39] I'm not sure how to respond to that vision of Los Angeles.
[02:42] This is a MAGA Republican's idea of what Los Angeles looks like.
[02:49] This is really unbelievable.
[02:51] For those of you listening, he put his hands up and wiggled his head like, "Oh my god."
[02:58] Hey, hey, Sacks.
[02:58] You know, Spencer Pratt wins mayor.
[02:59] Spencer Pratt, the, you know, the ballot initiative, the Retirement Protection and Savings Act, right?
[03:03] It's going to pass.
[03:05] Going to pass with big numbers.
[03:06] This is the, you know, referendum that effectively is going to knock out the wealth tax.
[03:08] Can you imagine if California effectively passes a constitutional amendment protecting retirement savings and personal assets and banning the wealth tax, and Pratt gets elected?
[03:10] The message that would send to the country.
[03:11] That's a very non-consensus view that I'm becoming increasingly optimistic about.
[03:14] Well, from your lips to God's ears, but until that message actually is sent, I think I'm going to be uh in Texas.
[03:18] Yeehaw.
[03:20] Well, I mean, this is also, I don't know, just a follow-up story here, but Mamdani did like an attack video on Ken Griffin's house.
[03:22] We talked about it on the pod a couple weeks ago, and like literally stood in front of his house, pointed at it, and this is in the face of like a CEO getting shot for ideological reasons, Sam Altman's house being targeted.
[03:24] This is like a really dangerous thing for Mamdani to do.
[04:04] And Ken Griffin came out today, or yesterday, in an interview and said,
[04:05] "Hey, listen.
[04:07] I'm out. We're going to be putting our efforts into um Florida."
[04:11] And this is the same thing that happened to Chicago.
[04:12] And he basically said, like, I really felt offended and I was, you know, nervous about this because of my personal safety.
[04:19] And he called him out.
[04:21] Mamdani came out with like a mealy-mouthed response that didn't even apologize for what he did.
[04:25] Just doubled down on it, essentially.
[04:28] All right, let's get to the docket.
[04:30] I don't know if you guys saw it or not, but
[04:31] >> Who cares?
[04:32] New York is becoming a flyover city.
[04:34] It's an interesting way to put it.
[04:36] Uh, don't disagree.
[04:38] All right, first story.
[04:40] Elon just leased all of Colossus 1, his data center.
[04:42] >> He did.
[04:45] >> What? WHAT? YES.
[04:49] Shocking to Dario and Anthropic. Chamath, on last week's pod, uh, go ahead and give yourself a pat on the back.
[04:52] You said Elon and Dario should do a deal tomorrow.
[04:54] It didn't happen the next day.
[04:56] It happened 5 days later.
[04:58] So, you came close, Chamath.
[05:00] Uh, but no cigar, because of Anthropic's obvious compute constraints.
[05:05] Anthropic just added over 220,000 Nvidia GPUs, over 300 megawatts of energy.
[05:12] The deal is already having an impact.
[05:15] As we've discussed here, Claude users have been experiencing rate limits.
[05:16] Well, Claude has now doubled the Claude Code rate limits, removed peak usage caps for paid users, and increased API volumes for Opus models.
[05:26] xAI is now training their models at Colossus 2.
[05:29] So, they have more than enough compute.
[05:31] Elon made a great bet on compute and built up those data centers really fast, and that is now paying off.
[05:38] We had the Cursor deal we talked about last week.
[05:40] Let's talk about the emergence of Elon Web Services, EWS, Chamath.
[05:43] He is now a hyperscaler competing against Google Cloud, Amazon Web Services, and Azure.
[05:51] And uh, I don't know if you had inside information or just a brilliant uh epiphany, but uh, take us behind the call, and what do you think about the deal itself?
[06:00] I think the deal is fantastic.
[06:03] I'll say maybe three quick things.
[06:09] The first is, as I mentioned a couple weeks ago, Anthropic and OpenAI's revenue performance has nothing to do with demand.
[06:13] Zero.
[06:16] It is entirely to do with the supply constraints that exist in data centers, and specifically in power.
[06:20] If they had infinite power, I think that their revenues would probably be even more parabolic.
[06:23] And so all the breathlessness about either exceeding or underperforming a forecast, in my opinion, means nothing.
[06:26] I think the five-year view for those two companies is quite robust.
[06:29] The thing that they really need is more compute and more power.
[06:30] That's the first thing.
[06:33] The second thing is, while they need that, we have a very big problem, which is we unfortunately have very poor leadership at the head of most of these AI firms.
[07:09] I think they are coming off as untrustworthy or too self-interested.
[07:16] The political reaction now is starting to turn negative.
[07:20] The community reaction is negative.
[07:23] You have about nine gigawatts that are supposed to come online this year.
[07:25] Almost 50% of it now is being protested.
[07:28] More than likely, if history holds, most of that will get turned off.
[07:35] So, they will get even more supply constrained.
[07:36] So, that's the setup.
[07:39] So, what's the opportunity, I think, for Elon?
[07:42] If you look inside of how people try to nitpick the SpaceX valuation case, or, sorry, let's be more generous:
[07:49] When people try to paint the bear case, or they try to red team the valuation, the biggest element is the on-the-come value around the orbital data centers.
[08:02] And by actually landing a bunch of terrestrial capacity, I think you start to blunt that, because you can now start to say that even if the orbital data centers get delayed by a few months or a few quarters, even if the technological derisking of it takes longer, he now has this structural core business that will effectively subsidize his ability to train Grok, which I think is a really important and underreported theme.
[08:29] So you have all this infrastructure.
[08:31] He somehow saw the tea leaves before most people.
[08:35] He built to a level of scale and secured power before most people.
[08:39] It has now become the critical asset.
[08:41] And now he's kind of kingmaking.
[08:43] And I think that that's a really interesting valuation reinforcement as SpaceX goes through testing the waters and the road show.
[08:53] Brad, your take?
[08:55] Yeah. No, I think it's well said.
[08:57] I mean, first, we know that there's nobody better on planet Earth than Elon at converting electrons to tokens.
[09:01] It's a critically important evolution to the story.
[09:04] You know, I think our friend Shaun Maguire sent out a tweet that summed it up well, and he said SpaceX has this five-layer cake: launch, connectivity, compute hyperscaler, space data centers, and then applications and models, and then other bets. Right?
[09:22] The question on the road show has been, but xAI isn't on the revenue trajectory of OpenAI and Anthropic, and yet there are huge commitments. And now we see the ace card that Elon's playing.
[09:34] He said he was building AWS all along, or EWS all along.
[09:40] And so I estimate that this is going to generate this year an incremental 4 to 5 billion of revenue on top of what I've seen in analyst estimates in the mid-20s.
[09:49] That's a material amount of incremental revenue to offset the cost of the investments that he's made here.
[09:55] And that will subsidize, to Chamath's point, all that he's investing to build the next generation of Grok.
[10:03] Remember too that he has three facilities: Colossus, Macrohard, and Macroharder.
[10:08] 1.2 gigawatts in Macrohard and Macroharder in Blackwell.
[10:13] So he's given the one that's kind of less connected, H100s, great for inference, to Anthropic.
[10:19] He's monetizing it in a big way.
[10:21] It's terrific for Anthropic, and it solves what I think was the biggest question in the valuation story, which is, what if he spends ahead of xAI's revenue?
[10:33] It takes the pressure off, Chamath, of xAI delivering immediate revenue.
[10:39] Now he becomes uh an immediate competitor in the hyperscalers.
[10:41] I don't think this is the last announcement.
[10:45] I think he's going to make a lot more, you know, moves in this direction.
[10:47] I think it will be a material part of their story and their revenue projections uh as they come together.
[10:51] And I would just say finally, you know, again, everybody has talked about how we don't have enough power, how we don't have enough compute, how the revenues would not show up this year, you know, but the chaos that is American capitalism somehow finds a way, okay?
[11:05] And there's tremendous demand for Anthropic, and we find a way.
[11:11] I was so happy to see kind of the détente and the kind exchange between the team at Anthropic and Elon, because we need all of this in order to produce uh American frontier models, to stay at the frontier.
[11:28] And then finally, I'd just say, you know, Chamath, you referenced these activists that are protesting, delaying these data centers in these localities.
[11:35] One thing, I want to dispel this myth: this is not like organic, hyper-local protests by people in a community that aren't being spurred on.
[11:45] This is highly organized activists that are moving across the country to stir up trouble in the exact same way they did to stop all fission reactors being built 30 years ago in America.
[11:57] Now we have no nuclear reactors being built.
[11:59] China's got a hundred of them.
[12:02] Who was funding those activists?
[12:04] I think we need to really look into who's funding the activists now.
[12:06] I'm not saying that there aren't any concerns, but the misinformation about water, the misinformation about electricity bills... electricity bills are going up in the places that are not building data centers, New York and California, because they haven't built any supply on the grid.
[12:21] In Texas, where you're building the most data centers in the country, electricity costs are going down.
[12:26] So, um, I think that's a boogeyman that we got to take on.
[12:31] Sure. Sacks?
[12:33] Well, look, the deal is highly complementary.
[12:35] As Chamath and Brad pointed out, SpaceX has a profitable, I think very profitable, space and telecommunications Starlink business, the satellite business, but the xAI business had huge losses.
[12:45] The reason's pretty straightforward.
[12:50] You need these super large training clusters, but they cost a lot of money.
[12:53] And until you have a model that's capable of competing at the frontier, you're not making any revenue.
[12:57] And that problem is compounded by the fact that right now all the revenue is in enterprise, which is to say coding.
[13:01] We know that xAI just did that deal with Cursor to try and catch up, but they don't have a coding product yet.
[13:08] So they're not participating in the revenue, but they're participating in all the cost.
[13:13] So this deal fixes that problem.
[13:19] Elon's now able to have a frontier model company, but he's able to now not have these massive unpaid-for capex commitments, right? Because he's able to kind of lease that capacity.
[13:30] So I think it solves a major problem for them and their balance sheet.
[13:33] And then you have to say that for Anthropic this is a really great thing, because they were compute constrained.
[13:40] And just to build on that point, I mean, I guess let me be the first to congratulate Dario on winning the AI race.
[13:47] And you've been, let's be honest, Sacks, you have been on this podcast, you've been moderately critical of that company and Dario himself for being, um, you know, a little p-doomer 110.
[13:57] And on your X account, you've been even a little spicier.
[14:02] So now that there's peace in the Middle East of, uh, of the AI business, what's your take here?
[14:09] My take is, look, let's just honestly and accurately assess where the state of this AI market is at right now and Anthropic's place within it.
[14:19] So for the last 3 years, Anthropic has been growing at a rate of 10x a year.
[14:24] I think going into this year, probably the conventional wisdom was that there'd be no way to sustain that kind of rate of growth at this level of scale.
[14:32] And what happened in the first four months of the year?
[14:34] First, we find out that from January 1st to March 31st, they grew from roughly 10 billion of ARR to 30 billion.
[14:41] So, it tripled.
[14:43] And then in April, if anything, the rate of increase seemed to accelerate.
[14:45] They went from 30 to 44 billion of ARR.
[14:51] Nobody in Silicon Valley has ever seen anything like it.
[14:53] Forget about the rest of the country.
[14:54] I mean, all we do in Silicon Valley is deal with exponentials.
[14:56] And still, people have never seen that kind of growth at that level of scale.
[15:02] The only thing holding them back in the future was compute.
[15:03] Now they've made this deal.
[15:06] They've made other deals as well to get that compute.
[15:07] I think it's pretty much a foregone conclusion that they will hit that forecast of 10x this year exiting the year.
[15:15] Call it roughly 100 billion of ARR.
[15:18] And now the only question is whether they hit a trillion in 2027.
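The trajectory Sacks lays out here is easy to sanity-check with a few lines of arithmetic. A quick sketch (the ARR figures are the ones quoted in this exchange; the implied-growth-rate calculation is an illustration, not a forecast):

```python
# Sanity-check of the ARR trajectory quoted above. The dollar figures are
# the round numbers from the conversation; the compounding math is mine.

jan_arr = 10.0   # ~$10B ARR on January 1st (quoted)
mar_arr = 30.0   # ~$30B ARR on March 31st (quoted) -> tripled in a quarter
apr_arr = 44.0   # ~$44B ARR at the end of April (quoted)

# Implied average monthly growth factor over the first four months.
monthly = (apr_arr / jan_arr) ** (1 / 4)

# A flat "10x per year" pace corresponds to this monthly factor,
# so the first four months actually ran ahead of the 10x pace.
tenx_monthly = 10 ** (1 / 12)

# Exiting the year at 10x the January number is roughly $100B of ARR,
# which is the forecast referenced above.
exit_arr = jan_arr * 10

print(f"implied monthly growth so far: {monthly:.2f}x")
print(f"monthly factor for a 10x year: {tenx_monthly:.2f}x")
print(f"10x exit ARR: ${exit_arr:.0f}B")
```

At the quoted numbers, the January-to-April run rate (~1.45x per month) is well above the ~1.21x per month a flat 10x year requires, which is why the 10x exit looks like a foregone conclusion in this conversation.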
[15:22] And we can debate
[15:24] >> Getting on board.
[15:26] We can debate whether that's true or not.
[15:28] But look, if they do that, I think they'll easily be the most valuable tech company in history.
[15:33] In fact, they might even be more valuable than the rest of the Mag 7 put together.
[15:37] Just to give people some basis for comparison here, you know, the biggest tech companies, Apple, Nvidia, Google.
[15:45] I think they kind of do around 4 to 500 billion a year right now of revenue.
[15:50] I guess Nvidia is in a little bit of a different category, but you look at the hyperscalers, the three hyperscalers.
[15:53] Yeah. I mean, Google is doing what, like 120 billion a quarter, something like that.
[15:58] 100 billion a quarter.
[16:00] >> Correct.
[16:01] >> But growing at what, 20% year-over-year?
[16:03] Not 100%.
[16:05] It's certainly not a thousand percent.
[16:06] So, the fact that Anthropic could be on track...
[16:10] In fact, let me correct that.
[16:11] >> You see them going to the track.
[16:13] It'll be a Mag 8.
[16:15] I'm saying something else, which is that unless something about their current trajectory changes, Anthropic will be the most powerful monopoly ever created in human history.
[16:28] Oh, again, it will be, you know, a trillion dollars of ARR growing at some exponential.
[16:34] Interesting.
[16:36] Dario calls it AGI. I call it the biggest monopoly in human history.
[16:40] Interesting to hear that word monopoly.
[16:42] Sacks, very interesting placement.
[16:45] Chamath, go ahead, and then I'll move on.
[16:46] Apple in 2025 was 420 billion, Microsoft was 300 billion, Alphabet was 390 billion, Amazon 700 billion, Nvidia 190 billion, Meta 185 billion, Tesla 110 billion. Total, about 2.3 to 2.35 trillion.
[17:00] So if Sacks is right and Anthropic, you know, can tack on a trillion, it won't be the Mag 7, it'll be the Mag 8.
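Chamath's tally checks out. A quick sketch summing the quoted figures (note: the first company name in the list is garbled in the transcript; Apple, the one Mag 7 member otherwise missing from the tally, is assumed here):

```python
# The Mag 7 revenue tally quoted above, summed. Figures are the round
# numbers from the conversation, not audited data; "Apple" for the first
# $420B entry is an assumption (its name is garbled in the transcript).
mag7_revenue_billions = {
    "Apple": 420,
    "Microsoft": 300,
    "Alphabet": 390,
    "Amazon": 700,
    "Nvidia": 190,
    "Meta": 185,
    "Tesla": 110,
}

total = sum(mag7_revenue_billions.values())  # ~2.3 trillion, as quoted
print(f"Mag 7 total: ${total}B")

# Sacks's scenario: Anthropic tacks on a trillion of ARR on top.
print(f"With a $1T Anthropic tacked on: ${total + 1000}B")
```

The sum lands at $2,295B, squarely inside the "2.3 to 2.35 trillion" range quoted in the conversation.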
[17:13] Just to put it in perspective, though, Dario on Dwarkesh said he thought the combined AI revenue of the market leaders would be about a trillion in '29.
[17:21] I love what you're saying, Sacks. I think there is unlimited TAM.
[17:25] We may be over our skis a little bit in terms of, you know, the forecast.
[17:27] If you back your way in from compute, right, they expect to have 5 gigs by the end of this year, 10 gigs by the end of next year.
[17:35] It's kind of hard to get to those numbers for a single company, but I do believe that the, you know, trajectory that they're on, I totally agree with you, is on an exponential um that not many people believed in 4 months ago.
[17:46] Right. So then the question is, okay, I think we all agree they're on an exponential curve and that the TAM is big enough to support that.
[17:53] Just one data point on TAM.
[17:56] My understanding of the total market size just on coding is one trillion, meaning that a trillion dollars a year roughly is spent on software developers and all things related to the creation of software.
[18:06] Now, I'm not saying that they eat that entire market, but I can easily see the market for software doubling.
[18:14] Well, hold on.
[18:16] Doubling from a trillion to two trillion, given that coding tokens basically 10x or 100x the value of that market and the ability to generate code.
[18:28] So, I think we all agree that the TAM here is large enough to support a trillion dollars of revenue.
[18:32] Brad, I think you bring up a couple of really important constraints.
[18:34] First, there may not be enough compute, and there's not enough energy.
[18:38] I'd say the second big one is what's the competitive reaction going to be?
[18:41] Totally.
[18:41] Because I would say at the beginning of this year, all these frontier labs were playing around with a lot of different things.
[18:49] I mean, Anthropic was the porcupine.
[18:49] They believed in one thing.
[18:53] All these other companies were kind of acting like the fox who thinks they're good at a lot of different things.
[18:56] They were doing Nano Banana.
[18:56] They were doing Sora.
[18:59] They were doing, you know, image generation.
[19:01] They were doing fantasy character chatbots.
[19:06] In hindsight, they were doing a lot of things that appear to be kind of a waste of time.
[19:10] The whole market appears now to be coding and the things that will be built on coding tokens, like coworker-like agents.
[19:16] And so there is going to be a competitive response here, where all the other guys realize, oh, wait a second, we were misfocused.
[19:22] They're going to get focused.
[19:24] I just don't know how much share they're going to be able to take.
[19:26] It does look like OpenAI has already made the pivot.
[19:28] We hear very good things about Codex now, based on GPT 5.5.
[19:34] 5.5 is based on a new base model called Spud.
[19:37] I think they're very optimistic about continuing improvements.
[19:40] Their rate of growth appears to be accelerating now because of uh 5.5.
[19:45] So look, there's reason to believe that OpenAI can take some share here.
[19:49] I'm sure that Google won't be asleep at the wheel.
[19:53] They're very very good at coding.
[19:55] They've got a really good team and Elon just tied up with Cursor.
[19:57] So there is going to be more competition. But still, what you have to say, and I think all of us know this from Silicon Valley, is you always want to be the company in the lead that's on that trajectory where all you have to do is maintain inertia, whereas the other people have to change something
[20:13] in order to put themselves back in the race.
[20:15] So this is when I say, somewhat facetiously, congratulations Dario on winning the AI race. I don't mean that he's won it, but he is winning it right now.
[20:26] Well, here's the brilliance of what Elon's doing. If you look at the existing business, which is Starlink and basically the launch services at SpaceX, incredible business. Obviously, 20 billion this year, I think, is the estimate. But if you look at the footprint of Amazon Web Services, Azure, and GCP, you're looking at, you know, $300 billion in revenue and a combined market cap of 5 trillion, 4 trillion if these were independent companies.
[20:57] And if you look at what is Elon's core competency at Tesla, it's building factories. And if you look at the footprint of these factories, they're huge. What are data centers? They're basically big giant factories. And then if you look at energy, what else is Elon extremely good at? This is the battery deployment, and he's also got solar deployment from the often-criticized SolarCity acquisition he did years ago.
[21:18] So you put this all together. If this is 5 billion, as I think you referenced, Brad, if it's $5 billion in incremental Elon Web Services business and he's a neocloud, what could he build on planet Earth? What could he build inside of Teslas in terms of extra compute? What could he build inside the Powerwall? What if the Powerwalls had his new fabs in them and you built a distributed system from home to home? The Powerwall has compute in it. The cars have compute in it.
[21:48] And of course, the ultimate manifestation of this, where nobody can complain, is you go right out into space. And that's what he's going to do. And the sneaky small part of this announcement from Elon and, uh, from Anthropic and Dario was they're also interested in space. So look for the race to go from factories and data centers to homes. The Powerwall with compute in it. It's already online, right? And Starlink also gives him the ability to do distributed compute across homes. Again, you could be paying people to put Powerwalls with computing in it. That's going to be the next shoe to drop, I believe.
[22:24] Did you guys see the deal that was announced yesterday between PY Homes, which is a huge builder, and SPAN?
[22:34] >> Yes.
[22:35] >> Nick, just throw this up here. It's super cool. What's happening is that these guys are putting mini data centers with Nvidia GPU clusters beside every home and then allowing people to actually run those things. And that's just incredible. I thought that was so cool.
[22:50] >> It's a great pivot. What this company did originally, Chamath, was they did the power panel. They made smart power panels. So, you know, when you flip your breakers, all those breakers are in an app. I looked at it for my house, but I guess they pivoted to add this, and I think Base Power, Brad, you're an investor in it, they're going to do the same thing.
[23:05] >> Zach Dell is doing that. One of the things I'd just say in response, Jason, to what you just said about Elon, right? This is why the SpaceX IPO is going to trade at 40 to 50 times revenue. Okay. So next year, if they do 40 to 50 billion and this thing goes out at 2 trillion, right, they're going to trade at a really high revenue multiple compared to the Mag Five that are trading at like 25 times earnings. And there's only one person on the planet who has a future pipeline of innovation and the largest TAM in the world, because he's playing in all these different spaces, that can command that multiple. And it's Elon, and it's deserved, and it's great for the country.
[23:45] >> Tesla has that same Elon variable in it as well, which is people value his companies at, I would say, two times market, three times market, four times market, because of the future pipeline. And they devalue Apple because they don't have somebody like Elon or Steve Jobs there who is giving them the future.
[24:03] >> I don't think it's devalued. I think...
[24:04] >> Or properly valued, if you don't have an Elon and you have somebody like... I think that's exactly what it is. We talked about this last week, but explain why you think it's different.
[24:12] >> I think all of these companies are actually very fairly valued, and then Elon World gets a premium.
[24:18] >> Totally.
[24:19] >> And that premium is because of what you guys said, that I agree with. The big message that I take away from this, which the markets and retail are telling you, is: you guys have stopped innovating. There's a lot of incrementalism, and we as a society aren't benefiting broadly the way that you told us we would be. And so maybe this is the best way for them to get this message, which is to whack their valuation.
[24:46] And by the way, I'll just say it again: when Tesla and SpaceX merge and we have all things Elon and Elon Corp, okay, which will happen probably by the end of the year. Maybe it'll happen in the middle of next year. It's going to then break everybody's brains again, because you'll have this one asset, as you guys said, that will trade at a valuation premium. And some people will say it's unexplainable. And I think it's logically explainable, which is everybody else has stopped innovating. People know how to draw more blood from the stone, how to target better ads.
[25:21] That does nothing for society anymore.
[25:23] >> That's it. Literally, in fact, it does the opposite. There is no good left. That was literally the exact point I was making when you cut me off. If you look, Tim Cook's greatest innovation, Tim Cook's greatest innovation before you cut me off, was Apple TV. Not even the hardware product. It was just spending money and making a Netflix knockoff. There's been no other product in...
[25:43] There's been no other product in >> Hold on. Let me finish again before you
[25:45] >> Hold on. Let me finish again before you interrupt me making my
[25:46] interrupt me making my >> You don't like that? Oh, you like me?
[25:47] >> You don't like that? Oh, you like me? >> Oh, no. Oh, yeah. Well, okay. Okay. Go
[25:49] >> Oh, no. Oh, yeah. Well, okay. Okay. Go back to
[25:50] back to >> meat kettle.
[25:52] >> If you look at their track record, and I think this is why we had a change there, is they have not done anything innovative. And in fact, the things they were doing that were innovating, in AI or self-driving cars, they shut down. They won't take any swings of the bat. So they are getting penalized in their valuation.
[26:08] >> Penalized? They're just not getting a premium. They're not getting penalized.
[26:12] >> I think they're getting penalized.
[26:13] >> Every metric, they're trading at incredible valuations. Just look at them.
[26:17] >> Oh no, I don't... I mean, if you compare the two valuations, I think they're being penalized. Anyway, let's... Anybody else want to get in on this before we move on to the next one? Yeah.
[26:25] >> There is no world in which Google and Meta and Apple and Amazon could be viewed as being penalized in valuation. There is very clearly a world where Elon gets a massive premium because he's innovating.
[26:41] >> You're saying the same things. You're saying the same thing, the same thing.
[26:44] >> It's not the same.
[26:45] >> Listen, I think we're debating semantics here. I'm not letting you off the hook, Saxy Poo. Sacks is very deliberate in how he speaks. They said he's the captain of the debate club in his 20,000-word article this week, and that he's a master debater. He's a masturbator. And you slipped in. You slipped it in. Are you saying that the FTC or whoever should be going in and looking at Anthrop... Oh, Brad's book is getting attacked. Headwinds. You said they're a monopoly, or they're heading to monopoly tactics. Is that what you're saying?
[27:19] >> Well, look, I mean, we know that tech markets have a history of consolidating down and turning into either monopolies or duopolies. And if you just look at the revenue right now, there are only two companies making substantial revenue on AI: it's Anthropic and OpenAI. We know that OpenAI is growing at 3 to 4x, which is incredible at the level of scale they're at. Anthropic, though, we said, is growing at an exponential 10x a year. And if they just do that for 18 more months, they'll be by far the most valuable company in human history. And they'll have unprecedented control over the most important technology of our time. So I don't know what you call that, but it is something to think about.
[28:02] And I guess I do have a thought experiment for you guys, which is: I just want you to think for a second about the case of John D. Rockefeller, who I think is known as probably the most successful, most ruthless monopolist in American history. But he wasn't very good at PR. He was terrible at PR. Everyone sort of recognized how ruthless he was. We've seen movies like There Will Be Blood, which is basically about him. In any event, imagine if John D. Rockefeller was way better at public relations, and instead of calling his company Standard Oil, he called it Safe Oil. Okay, let's just play this thought experiment.
[28:42] >> Clean, beautiful coal. Yes.
[28:43] >> Safe Oil. He called it Safe Oil because, as we know, kerosene is dangerous. Their first big product was kerosene. And kerosene can light your house or it can burn it down. And in the wrong hands it can torch a city, or you can use it to make a bomb. So John D., let's say, should have called for the creation of a new government agency to regulate the safety of his product, and they could have done rigorous testing, licensing, common-sense regulation. There would have been a very intense debate over safety standards, you know, what should the proper wick thickness be, and should we allow all those dangerous independent refiners, right?
[29:21] And I think people would have gotten so wrapped up in this debate over what constituted safe oil or safe kerosene that they would have missed what was really going on, which is that Rockefeller was building the richest, most powerful monopoly of all time. In fact, people might even have called Rockefeller an effective altruist, because of course he was so concerned about the safety of his product.
[29:46] >> I love it. Shout out to David Sacks's writers. Great, great writers. Newman. Newman wrote this.
[29:52] >> No, I wrote it.
[29:53] >> Emmy award for best writing in a dramatic monologue goes to Newman. Wow. Sacks writing. He landed it. Very good, Sacks.
[30:01] >> I thought after the Elon-Anthropic détente, where Elon, you know, complimented Anthropic, and David started off with a bit of a compliment, I thought maybe we were past this. First, it's ridiculous to think of this as a monopoly. You know, we're talking about annual run-rate revenues, David, but on a GAAP basis, they're doing about the same revenue as OpenAI in the month of March. Okay? So we're way ahead of ourselves. By the way, five months ago, everybody thought OpenAI was going to run away with this. Google's revenues are very substantial in AI. And by the way, Google, Amazon, etc., these companies are producing a hundred billion dollars of free cash flow to justify their incremental investment. At the same time, you have these two startups that are still fledgling, that are still fragile in the scheme of things.
[30:44] You of all people should know we've got the best competition in AI on the planet, which is why we're at the frontier and kicking the tail of everybody else on the planet. So I just want to see these companies compete. I want to see DC stay out of the way. The last thing I want to be doing is, you know, seeing people talk about this and throwing roadblocks into the way of the competition. Um, I think... I agree.
[31:08] >> Well, let me... hold on. Let me translate Brad for you. Don't with my paper is what he's saying. He's got bets on these. So, Sacks, Washington, DC, don't with Brad's paper.
[31:19] >> Sacks, uh, do you want to get into the, uh, regulation stuff right now as a segue, or...
[31:24] >> Let me respond to Brad and also translate what I'm saying satirically. Okay.
[31:29] >> First of all, nobody wants to see these companies compete vigorously more than me. That was the whole premise of the action plan that we worked on last year: we want to bring out the best in everyone. This is how America's going to win the AI race. We have five major labs vigorously competing, and as long as that competition is taking place, I think that's a good thing. Doesn't mean we can't have guardrails and the rest of it, but basically competition should be our north star.
[31:57] All of that being said, okay, what I am pointing out, and I think it's historically true, is that people in Washington have woken up to monopolies on the late side, not early, right? Because, I mean, once a company has won 80% of the market, that's when they wake up and say, "Oh, we have a monopoly here." And I'm not saying that they have a monopoly yet. But if the trajectory continues for just 18 more months,
[32:23] continues for just 18 more months, >> then I think it will be in this
[32:25] >> then I think it will be in this unprecedently powerful position. I mean
[32:29] >> And hold on, I don't think people should be distracted from that fact by this rhetoric around safety, because someone like Rockefeller could have used it too. And I do think, I mean, just one last point on this: I do think that if you actually look at what a lot of the safetyist policies are calling for, they're basically calling for a form of regulatory capture, and they're calling for things that would create a stronger moat around this monopoly or duopoly that's in the process of being created, and it would get in the way of competition. So again, I think that people might not have such a charitable view of all this safety rhetoric if they understood that what was being created here is the biggest monopoly in human history. And I think we should just be a little bit more skeptical about some of these altruistic claims.
[33:20] >> I can't believe that David is, like, you know, talking monopolies when we haven't even left the starting gate of AI. I think this is a... uh, to me...
[33:34] >> There's only two companies with revenue.
[33:36] >> The last thing I want is DC trying to preemptively, preemptively, which would be like a disastrous consequence, get in the game of picking winners and losers at the starting line of AI. That would be a disaster.
[33:49] >> Brad, did you just put another soapbox on top of the soapbox you were standing on?
[33:54] >> Look, Brad, like I said, my north star is competition. As long as there's competition going on, I support it. However, hold on, hold on. We know that monopolists want to stop competition, and they use regulatory capture to do it. And furthermore, they do things like ban their competitors from using their product. What conceivable reason did Anthropic have for banning OpenClaw from using its models? That is anti-competitive, is it not?
[34:21] That is anti-competitive, is it not? >> I I I would double click on it. I would
[34:23] >> I I I would double click on it. I would double click on it. I might not, you
[34:24] double click on it. I might not, you know, file, but I would double click.
[34:26] know, file, but I would double click. Okay, listen. Chimoth, the girls are
[34:28] Okay, listen. Chimoth, the girls are fighting. Let's keep moving through the
[34:30] fighting. Let's keep moving through the docket. We're going to be here all day
[34:31] docket. We're going to be here all day with these two. And uh one thing that
[34:34] with these two. And uh one thing that you're going to need to act on very
[34:35] you're going to need to act on very quickly is the all-in summit. It's
[34:37] quickly is the all-in summit. It's selling out fast. Don't miss it.
[34:38] selling out fast. Don't miss it. Speakers are top tier again. Freeberg
[34:40] Speakers are top tier again. Freeberg busy working on some amazing speakers.
[34:43] busy working on some amazing speakers. Saxs will be there. Uh he's flying in
[34:44] Saxs will be there. Uh he's flying in and out every day for four hours. And
[34:46] and out every day for four hours. And then we're we're going to have a lot of
[34:49] then we're we're going to have a lot of networking stuff going down. We're
[34:50] networking stuff going down. We're building some networking software. So
[34:52] building some networking software. So when you come to our events, you get to
[34:53] when you come to our events, you get to meet people. That's what we always say.
[34:55] meet people. That's what we always say. my playbook for events. If you learn
[34:57] my playbook for events. If you learn something from the speakers every day,
[34:59] something from the speakers every day, one or two things, if you meet somebody
[35:01] one or two things, if you meet somebody new and you eat some great food and have
[35:03] new and you eat some great food and have some fun, you get two or three of those
[35:05] some fun, you get two or three of those things. Ah, man. Even if you get one,
[35:06] things. Ah, man. Even if you get one, you're going to come back to the event,
[35:07] you're going to come back to the event, you're going to get all three all day
[35:09] you're going to get all three all day long. allin.com/events. Los Angeles,
[35:12] long. allin.com/events. Los Angeles, September 13th, 14th, and 15th.
[35:15] September 13th, 14th, and 15th. Apologies to everybody asking, but
[35:16] Apologies to everybody asking, but liquidity is sold out and we've shut
[35:18] liquidity is sold out and we've shut down the wait list. There's just no more
[35:20] down the wait list. There's just no more room. All right. The White House
[35:22] room. All right. The White House allegedly, possibly is considering,
[35:25] allegedly, possibly is considering, according to reports, an FDA for AI that
[35:29] according to reports, an FDA for AI that would vet, you heard that correct,
[35:31] would vet, you heard that correct, folks, that would vet new models for
[35:33] folks, that would vet new models for safety. The thing we've been talking
[35:35] safety. The thing we've been talking about not doing here, the thing David
[35:37] about not doing here, the thing David Saxs has spent the last year on, the
[35:39] Saxs has spent the last year on, the White House is considering. New York
[35:41] White House is considering. New York Times reported Trump is considering an
[35:43] Times reported Trump is considering an executive order to create an quote AI
[35:46] executive order to create an quote AI working group. This group would include
[35:47] working group. This group would include tech exeacts and government officials
[35:49] tech exeacts and government officials who would quote examine potential
[35:52] who would quote examine potential oversight procedures
[35:54] oversight procedures including quote a review process for new
[35:57] including quote a review process for new AI models. Oi according to the report
[36:00] AI models. Oi according to the report the catalyst was wait for it anthropics
[36:03] the catalyst was wait for it anthropics mythos model which reportedly scared
[36:06] mythos model which reportedly scared spooked made people really nervous at
[36:09] spooked made people really nervous at the White House. quote, "The White House
[36:11] the White House. quote, "The White House wants to avoid any political
[36:13] wants to avoid any political repercussions if a devastating AI
[36:16] repercussions if a devastating AI enabled cyber attack were to occur. They
[36:18] enabled cyber attack were to occur. They want a CYA, according to the New York
[36:20] want a CYA, according to the New York Times." Kevin Hassid, that guy, the
[36:23] Times." Kevin Hassid, that guy, the director of the National Economic
[36:25] director of the National Economic Council, confirmed the report on Fox
[36:27] Council, confirmed the report on Fox Business. Here's your 15-second clip.
[36:28] We're studying possibly an executive order, uh, to give a clear roadmap to everybody about how this is going to go, and how future AIs that also potentially create vulnerabilities, uh, should go through a process so that, you know, they're released to the wild after they've been proven safe, just like an FDA drug.
[36:46] >> Additionally, friend of the pod Scott Bessent had something to say.
[36:51] What we had in the past month was a step change in the power of one large language model, but we're going to see it from the other AI companies. What we are determined to do is work with our AI companies to allow them to continue to innovate. But our charge at the US government is maintaining safety. And there is a very important calculus here between innovation and safety. And at the US government, we're going to make sure that things stay safe.
[37:19] There you go. Kevin Hassett and Bessent. Slightly different positions here. Brad, what do you think?
[37:27] >> Actually, I don't think they're slightly different positions, but I would agree that Kevin bringing up the FDA kind of muddied the waters. I talked to Kevin last night after that clip ran, you know, and I asked him, I just said, "Do you think the FDA is the right analog here?" And he said, you know, "I was only bringing it up to say that we want them to show us the models so that we can coordinate. Obviously, our job is to make sure that the government is prepared, that we harden our systems, that our intelligence agencies are up to speed."
[37:56] But he does not think, and I can't find anybody on the right, you know, uh, who believes, that we're going to move to an approval regime, right? The approval regime, this idea that you're going to have to share every model with an FDA in Washington and they're going to have to pre-approve the model, is a disaster. Sacks has been effectively fighting against this, correctly, over the course of the last year.
[38:18] It would just lead to three bad things. Number one, we do not want to put Washington in the position of picking winners and losers when it comes to these models. We're winning. We're on the winning horse in America. We're out in front of the rest of the world. There's no reason to change horses and regimes at this point. And we don't want to burden this with more bureaucracy.
[38:36] But at the same time, obviously, for what I call these pre-AGI or AGI models (Mistral, Spud, etc.), I see a lot of coordination going on between the industry and government. I think we can do an even better job of evolving that framework so that everybody in government is on the same page.
[38:52] We need to build more capacity in government to quickly be able to do the cyber review on these models. Right now it takes too long when the coordination does occur. So we need to have a finite amount of time in which they get government feedback, etc.
[39:07] But the last thing that we want is an FDA of models sitting in Washington. Kevin understands that. Scott Bessent understands that. So I expect that we will continue down the path that we've been on.
[39:18] Chamath, obviously, I think we all agree we don't need an FDA for AI, but there are things that reasonable people would want to have guardrails around in AI. I'm sure you would agree it shouldn't be a total free-for-all. So, what's your take on this? Is it just that somebody gave a bad analogy here, or maybe some people were weaseling their way into the White House to try to shift things while Sacks was back at home, or something? What's going on here?
[39:46] Give us the, uh... because that's what people say. They say the last person to talk to Trump kind of has his ear, and that things can bend a certain way.
[39:55] >> I don't think it's that. I think that there's a pretty profound vibe shift with respect to tech, tech oligarchs, Silicon Valley, and particularly AI. That vibe shift has already happened on Main Street, and I think that that's starting to seep into Washington. I think that regulations are coming. I think they'll be worse under a Democratic regime, but I think that some form of oversight is going to exist under a Republican regime.
[40:25] The question that I think is worth asking is why. And if you listen to everybody's tone, it's all around the negatives of AI. So I think we suffer from two things. Number one is we have horrible messaging. Nobody spends the time and the money to articulate the positive upside case so that there's broad-based support. And two, the idea that there's going to be, as Sacks said earlier, a few winners and many, many, many potential losers, I think, is really disconcerting to everybody.
[41:00] And the response from the tech community, again, should be the leadership of the tech world coming together and actually reinvesting in America at large. They're not doing that at enough of a scale to blunt this. So what you're seeing is the buildup of antibodies. Is it avoidable? Yes. Are we doing a good job of avoiding it? Absolutely not. We're doing a horrible job. I'd give the community, the tech leaders, a D-minus, trending to an F. The response is what we're seeing.
[41:33] So I think the question, Jason, isn't regulation or no regulation. It's why did we get here? And I think we got here because the other version, the glass-half-full version, the demonstrated investment, the broad-based uplifting of American society, hasn't happened. And if it has, it's been very poorly communicated. And so the response is, "Hey, hold on. We're going to give three guys trillion-dollar net worths and we're going to allow them to control the keys." That's why this is happening.
[42:04] keys. That's why this is happening. >> Exactly. Correct. And it's very easy,
[42:07] >> Exactly. Correct. And it's very easy, Sachs, to imagine all the bad things
[42:09] Sachs, to imagine all the bad things that can happen. Our minds are
[42:11] that can happen. Our minds are constructed to do that. We're vigilant.
[42:13] constructed to do that. We're vigilant. We look out for the tiger or, you know,
[42:15] We look out for the tiger or, you know, the tornado to keep ourselves safe.
[42:18] the tornado to keep ourselves safe. Humans have a bias towards safety and
[42:21] Humans have a bias towards safety and they're going to think about, you know,
[42:22] they're going to think about, you know, deep fakes. They're going to think about
[42:24] deep fakes. They're going to think about robotics. They're going to think about
[42:25] robotics. They're going to think about self-driving cars, taking people's jobs.
[42:27] self-driving cars, taking people's jobs. They're going to think about, you know,
[42:28] They're going to think about, you know, all the dark things that could happen,
[42:30] all the dark things that could happen, boweapons, etc. And we don't have
[42:32] boweapons, etc. And we don't have anybody out there really talking about
[42:34] anybody out there really talking about all the positives that could happen.
[42:37] all the positives that could happen. What's your take on the palace intrigue
[42:40] What's your take on the palace intrigue we all have here? What's going on in the
[42:43] we all have here? What's going on in the palace in the 47th administration around
[42:46] palace in the 47th administration around this debate? Who's leading Trump down
[42:49] this debate? Who's leading Trump down the path of regulation and creating this
[42:52] the path of regulation and creating this AI FDA? We know you're part of the camp
[42:55] AI FDA? We know you're part of the camp that wants to keep this train moving and
[42:58] that wants to keep this train moving and not overregulated, not have regulatory
[43:00] not overregulated, not have regulatory capture. Who are the people trying to
[43:01] capture. Who are the people trying to slow this down?
[43:02] slow this down? >> Well, look, I I think there's several
[43:04] >> Well, look, I I think there's several things going on here. The first one is
[43:06] things going on here. The first one is there's a lot of fake news. This whole
[43:08] there's a lot of fake news. This whole idea of an FDA for AI, I don't think any
[43:11] idea of an FDA for AI, I don't think any senior official supports it. Just like
[43:13] senior official supports it. Just like Brad was saying, I spoke to Hassid as
[43:15] Brad was saying, I spoke to Hassid as well. That's not where his head is at.
[43:17] well. That's not where his head is at. So, I don't think anybody in the
[43:19] So, I don't think anybody in the administration is saying they want an
[43:21] administration is saying they want an FDA for AI. Certainly, I don't think
[43:24] FDA for AI. Certainly, I don't think that's the way the president thinks
[43:25] that's the way the president thinks about these issues. He's the most pro-
[43:27] about these issues. He's the most pro- innovation president we've ever had. And
[43:29] innovation president we've ever had. And the White House chief of staff, Susie
[43:30] the White House chief of staff, Susie Wall, just put out a statement last
[43:32] Wall, just put out a statement last night that I think pretty much shoots
[43:33] night that I think pretty much shoots this down. So I think there's a big fake
[43:35] this down. So I think there's a big fake news component. Remember it was not
[43:37] news component. Remember it was not really the White House who was saying
[43:38] really the White House who was saying it. It was the New York Times and there
[43:40] it. It was the New York Times and there and really I think actually Andrew Ross
[43:42] and really I think actually Andrew Ross Sto I'm not criticizing him but he's a
[43:45] Sto I'm not criticizing him but he's a commentator and he's the one who said
[43:47] commentator and he's the one who said this first and then somehow that spin or
[43:50] this first and then somehow that spin or that gloss somehow took on a life of its
[43:53] that gloss somehow took on a life of its own. And I think Silicon Valley reacted
[43:57] own. And I think Silicon Valley reacted accordingly. There's a very visceral
[43:58] accordingly. There's a very visceral negative reaction here because we know
[44:00] negative reaction here because we know how damaging that would be to
[44:01] how damaging that would be to innovation. But look, I think the good
[44:03] innovation. But look, I think the good news is that that was fake news. Second,
[44:06] news is that that was fake news. Second, I think that there's another thing going
[44:08] I think that there's another thing going on, which is a straw manning of what the
[44:12] on, which is a straw manning of what the Trump administration did on AI in its
[44:15] Trump administration did on AI in its first year. And in the same way that
[44:18] first year. And in the same way that they want to spin this FDA for AI,
[44:20] they want to spin this FDA for AI, they're also trying to spin what we did
[44:22] they're also trying to spin what we did as this completely lazy fair attitude
[44:25] as this completely lazy fair attitude where there'd be no regulations
[44:26] where there'd be no regulations whatsoever, no guardrails. It's a way of
[44:29] whatsoever, no guardrails. It's a way of criticizing what we did. They're trying
[44:31] criticizing what we did. They're trying to portray it as unsafe. In fact, if you
[44:33] to portray it as unsafe. In fact, if you look on March 20th, the White House
[44:35] look on March 20th, the White House released a national AI regulatory
[44:38] released a national AI regulatory framework that I worked on in which we
[44:41] framework that I worked on in which we put out a four-page bulleted list of
[44:43] put out a four-page bulleted list of legislation that we would support if
[44:46] legislation that we would support if Congress wants to pass it. So, we have
[44:49] Congress wants to pass it. So, we have not been against every conceivable
[44:52] not been against every conceivable regulation or every conceivable law. We
[44:54] regulation or every conceivable law. We just believe that there should be
[44:56] just believe that there should be specific solutions to specific problems
[44:59] specific solutions to specific problems as opposed to a giant power grab by
[45:01] as opposed to a giant power grab by Washington that would squash innovation.
[45:03] Washington that would squash innovation. So I think that's point number two.
[45:06] Point number three is there is a legitimate thing happening here with, let's call it, Mythos or cyber. Okay, we know that it's not just Mythos. OpenAI now has a model that's just as cyber-capable as Mythos. And within three to six months, all the major frontier labs, including Chinese models, will have cyber capabilities.
[45:28] In response to that, we do need there to be a hardening of systems, and we do need there to be a scanning of code bases to find these vulnerabilities and patch them before the hackers do, because the hackers will have these capabilities in a matter of months. That's a certainty, because the same capabilities you use for cyber defense can also be used for cyber offense. It's the same tool set, and the open-source models will have these capabilities anyway.
[45:56] have these capabilities anyway. >> They already have it to a certain
[45:57] >> They already have it to a certain extent. Let's be honest, they have 80%
[45:59] extent. Let's be honest, they have 80% of it.
[46:00] of it. >> It's simply the case that AI will be
[46:02] >> It's simply the case that AI will be good at cyber and so we do need a
[46:04] good at cyber and so we do need a response to that. Now my view on what
[46:07] response to that. Now my view on what should that response be should be first
[46:10] should that response be should be first of all we should want the government and
[46:13] of all we should want the government and the private sector to work cooperatively
[46:15] the private sector to work cooperatively and I think they are we have a giant
[46:18] and I think they are we have a giant cyber security industry in the United
[46:20] cyber security industry in the United States whose sole job it is to protect
[46:23] States whose sole job it is to protect systems and protect against breaches.
[46:26] systems and protect against breaches. >> We have the best companies in the world
[46:28] >> We have the best companies in the world at doing that. We have Crowd Strike. We
[46:30] at doing that. We have Crowd Strike. We have Palo Alto Networks. We talked about
[46:32] have Palo Alto Networks. We talked about that before. We have the best defense.
[46:34] that before. We have the best defense. >> Right. Exactly. And so what we should be
[46:36] >> Right. Exactly. And so what we should be doing I think is getting these tools
[46:38] doing I think is getting these tools mythos and then the open AI model and
[46:41] mythos and then the open AI model and and others like it in the hands of our
[46:43] and others like it in the hands of our cyber security industry. And by the way
[46:45] cyber security industry. And by the way not just the public companies like Palo
[46:47] not just the public companies like Palo Alto Networks and Crowd Strike although
[46:49] Alto Networks and Crowd Strike although certainly they're two of the most
[46:50] certainly they're two of the most noteworthy but there's also some
[46:52] noteworthy but there's also some incredibly strong startups on the way up
[46:55] incredibly strong startups on the way up there that are at the cutting edge of
[46:57] there that are at the cutting edge of doing AI powered pen testing and all the
[47:00] doing AI powered pen testing and all the rest of it. We need to get these tools
[47:02] rest of it. We need to get these tools into their hands as quickly as possible
[47:03] into their hands as quickly as possible because they're a force multiplier for
[47:05] because they're a force multiplier for all the companies out there that aren't
[47:07] all the companies out there that aren't that good at cyber security or maybe
[47:10] that good at cyber security or maybe they've got IT departments. They can use
[47:12] they've got IT departments. They can use these companies as vendors. So I think
[47:14] these companies as vendors. So I think that there is a role for
[47:15] that there is a role for >> Can I ask you a question? Yeah.
[47:16] >> Can I ask you a question? Yeah. >> Do you think that the models should
[47:20] >> Do you think that the models should have a KYC rapper going forward?
[47:22] have a KYC rapper going forward? >> KYC for the audience is know your
[47:25] >> KYC for the audience is know your customer.
[47:25] customer. >> Yeah. So really what it would mean is
[47:27] >> Yeah. So really what it would mean is that before you can use mythos you have
[47:30] that before you can use mythos you have to identify yourself so that we can try
[47:32] to identify yourself so that we can try to know that you're not a state
[47:35] to know that you're not a state sponsored actor or you know a bad guy.
[47:38] sponsored actor or you know a bad guy. >> I think that's the type of thing that we
[47:39] >> I think that's the type of thing that we should be thinking about. So first of
[47:41] should be thinking about. So first of all I want to say that both anthropic
[47:43] all I want to say that both anthropic and open AI acted responsibly here. No
[47:46] and open AI acted responsibly here. No one was trying to release these super
[47:48] one was trying to release these super powerful models. So, in a way, all the
[47:51] powerful models. So, in a way, all the people who are saying that we need
[47:52] people who are saying that we need pre-release approvals for models,
[47:54] pre-release approvals for models, they're trying to solve a problem that
[47:55] they're trying to solve a problem that didn't exist regulated.
[47:57] didn't exist regulated. >> Yeah, they probably wasn't trying to
[47:58] >> Yeah, they probably wasn't trying to release. I wasn't trying to release
[48:00] release. I wasn't trying to release this. They all understood the power and
[48:01] this. They all understood the power and they were all acting responsibly.
[48:03] they were all acting responsibly. >> They understood the ramifications. They
[48:05] >> They understood the ramifications. They would have been sued. So, they there is
[48:06] would have been sued. So, they there is a self-p policing going on here, which
[48:08] a self-p policing going on here, which is the ultimate way to do this.
[48:10] >> Yes. But to your point, Chamath, look, I think that before giving out your API for a super powerful model, you should not give it to a company or an actor when you don't know who they are. So, yeah, some basic KYC makes sense. They should know who they're giving these tools to. And I guess my view on the mythos preview, and whatever the equivalent is of what OpenAI is doing, is that we very rapidly need to get these tools into the hands of more good guys. You need to know who those good guys are. So yeah, KYC is like a predicate for that, right? You got a new
[48:44] >> Just to be clear, we'd all agree that if you did have identity for those frontier models, which they're probably doing anyway right now, and you logged what people were doing with them to look for security breaches, that wouldn't necessarily happen when you released it to the public, because of privacy issues.
[48:59] Here's your Polymarket for Trump ordering a federal review of AI models by May 31st: 21% chance, I think, uh, to our partner at Polymarket. Man, I got to get in here. Do I have inside information here, being, uh, the world's greatest moderator on this podcast? Can I collect this money? Chamath, what am I going to do here?
[49:17] >> I would not... do not place a bet, J Cal.
[49:20] >> Don't place a bet.
[49:21] >> Don't place a bet. But anyway, Chamath's point... I mean, look, I think we're kind of workshopping this in real time.
[49:26] >> We are.
[49:27] >> I think that for the preview period, we should definitely have KYC.
[49:31] >> Maybe.
[49:31] >> What about logging? What about logging?
[49:33] Well, look, once you're past the preview period and it's in general release, I'm not sure if the KYC matters as much, because so many people are going to have it. But during the preview period, there should be KYC.
[49:43] >> Let me just say one thing. All the labs are already tracking API use, okay? And anything suspicious... because there are major anti-distillation efforts going on by all the labs, there's a ton of coordination going on with the government. There's way more happening, I think, in terms of API use, um, and anything suspicious is being flagged and shared with the government. So the idea that we have no idea who's doing it, I think, is not the case. And in fact, in some cases we may want to allow people to use it, so that we can see exactly the types of things that they are extracting. So I would just say we're already down that path, but better coordination may in fact be called for.
[50:27] >> Yeah. And just one last point in this whole thing: I want to build on my point that pre-release approvals are solving a problem that didn't really exist, because again, Anthropic and OpenAI weren't trying to release these models yet. There is a substantial faction of, let's say, AI ideologues or doomers who are basically employing the classic never-let-a-crisis-go-to-waste strategy. Right? Yes, we do have this cyber issue that is real. You know, everyone needs to harden their systems now.
[50:56] >> Totally. Over the next 3 to 6 months.
[50:58] That is a real issue. But it is a problem that we will solve over the next 6 months. We have to. But what they're trying to do is use that issue to try and create a permanent new infrastructure in Washington. Again, that's not the administration's intention. That's not the administration's agenda. But you saw a lot of people on social media, a lot of the think tanks, and even Bernie Sanders weighed in, and he said: for the first time, I like something that the Trump administration wants to do.
[51:26] >> The administration understands. The 1% of the 1%, Sacks, and everybody understand that this is out of control. The AI is going to take the jobs. They're going to take my summer home. It's going to be terrible.
[51:37] >> So, there are people who have this agenda. Look, Bernie Sanders just wants to stop the progress. I mean, he's
[51:43] >> He wants to ban data centers. He's put out a bunch of
[51:46] He basically has bought into the whole doomer narrative. So look, that's why he likes the FDA idea: because it would put the kibosh on innovation.
[51:56] >> It's enough already. Let's go back to paper and pen. It was a better society. Sacks?
[52:00] >> Jason, what do you think?
[52:01] >> I think there's two really interesting things I want to build on here. The first is your point, Chamath, around how do we turn around the sort of bad vibes around AI. I think we have to have two strategies here. One is giving, which is what you've been working on, Brad, with your project; we should see more people giving. There's no reason why Nvidia, SpaceX when they go public, Anthropic when they go public, OpenAI if and when they go public, or if they stay a nonprofit... there's no reason those folks in an IPO couldn't give a portion of the IPO to every American citizen. So IPO K, IPO for kids: they all take, you know, whatever it is, 5%, 1%, whatever they choose, and they put it into the Invest America accounts, and we should see some major giving from the people who are becoming trillionaires, hundred-billionaires, whatever it happens to be. There's no reason not to. But those people haven't been doing that. We had this Giving Pledge, which was a little bit of virtue signaling, and it wasn't real. It was just, you know, at the end of your life, you promise to give away half your money. So let's have something real. Let's have something where, you know, people say: I'm going to give away 1% of my stock over the next 20 years of my life; every year, 1% will go into Invest America, whatever it is. It won't cost anybody anything; you can't spend this money, whether it's Bezos or whoever. Second, in that same vein of giving back: we have not talked about how massive this could be for health, extending people's lives, and reducing suffering. We need to work on that. That's where contributions to basic science could come in, and obviously education and lowering the cost of education. And if you look at what Americans in the bottom half... you were talking about the, you know, cup half empty... there's really two or three things they really feel anxiety about. One of them is income, the second is healthcare, and on the margins, housing and their kids' education, the cost of those things. We should really take a deep look at, and I know this is very unpopular amongst capitalists, including myself, we should really look at the minimum wage, and study what happened in New Zealand, Sweden, Switzerland, uh, Australia when they raised it. What actually happened when they raised it? There was a lot of hand-wringing about it, but when they slowly raised it, what they found was those consumers don't save money. They spend it. They're always behind the eight ball in terms of their spending. We should opt in to trying to raise the minimum wage, company by company, and just give people who are at that end of the spectrum the understanding that, hey, year over year, whether it's Amazon or Target, etc., restaurants, we're all collectively going to add a little bit to that minimum wage and try to lift the bottom third of society. That's the stuff we're not talking about. We don't talk about it here on this podcast. We don't talk about universal healthcare. We don't talk about the minimum wage. But that's what capitalists should be talking about. And if we did that, if we increased the minimum wage, and I'm not a socialist, I'm a capitalist who thinks this is good for capitalism, if we increased the minimum wage just modestly each year and we opted into doing that, and we figured out a way to give universal healthcare, companies wouldn't have to deal with healthcare, and we would have customers. And we're a customer-driven economy: like 60, 70, 80% of what happens in this country is driven by the consumer. We need consumer spending. It's great for companies if we had more people being able to buy Netflix or order on Amazon. Anyway, that's my TED talk. Thanks for coming.
[55:19] >> How do we get from AI to the minimum wage? I'm still
[55:23] >> No, because of the black eye. The black eye we have in this country with polarization of wealth and people scared of losing their jobs. We should look at why they're scared, David. And I've talked to you privately, and you said to me privately... you can strike this if you want, but you said to me privately you wouldn't necessarily be against figuring out a way to do universal healthcare if there was a way to do it. You want to see every human have healthcare, yes?
[55:48] >> Sure. Sure. I mean, the issue is not the desirability of it. It's the cost. I mean
[55:53] >> And so, you're a great entrepreneur of our time. How would you do it? Have you given any cycles to it?
[55:57] >> I just... I haven't studied that issue, so I don't know. I just know.
[56:01] >> How about you study the country's
[56:03] >> I remember what P.J. O'Rourke once said, which is: if you think healthcare is expensive now, just wait until you make it free.
[56:10] >> Yeah.
[56:11] >> So you take away all the incentives, and you have an even bigger problem.
[56:15] >> Minimum wage. Yeah. Go ahead.
[56:16] >> Well, let me just please get back to AI. Listen, you guys are right about the unpopularity of AI. We've all seen those polls, but I want to put up this additional poll that came out about the salience of this issue, which is: how important do people think it is? And AI ranked 29 out of 39. So although AI is not very popular, it is certainly not top of mind for voters. It's not in the top 10 issues. It's not in the top 20 issues. What is top of mind for voters? Number one, cost of living. Number two, the economy. And we know that AI is deflationary. It helps with the cost of living, and it's creating an economic boom right now. Okay? It was 75% of GDP growth in Q1. By the way, that economic growth is not just limited to startups in Silicon Valley. We're seeing a construction boom. We're seeing a blue-collar boom. We're seeing 25 to 30% wage increases for, uh, construction workers.
[57:09] >> Absolutely.
[57:10] >> And so on down the line. So,
[57:12] >> And Brad, if you look at that chart, there's healthcare in there, too.
[57:15] >> So, look, my point is that AI may not be popular, but the effects of it actually are popular, if the media would honestly report what is happening: AI is creating an economic boom right now.
[57:27] >> It couldn't be better said, David. You know, Bernie Sanders calling for a moratorium, shutting down all data centers: we'd have negative GDP growth this year. The stock market would be down 15 to 20%. Unemployment would be on the rise. You know, there is a consequence to the government controlling the economy through command and control. In 1929 we had 4% unemployment; 3 years later, unemployment was 23%, because government got involved in regulating everything and shutting down, you know, what was working. That is the greatest threat we have here. AI is delivering huge net benefits today, in terms of the unemployment rate, in terms of economic growth and productivity growth. We need to tell the story. But to Chamath's earlier point, we also need to deliver net benefits. Jason, thanks for the shout-out. Yes, every American having an investment account that compounds with the upside of AI. We're going to do that, going to deliver that, and that's going to be massive. But I also think ideas like: if we're going to put a data center in Abilene, Texas, let's make electricity free for the households in Abilene, Texas, right? There are ideas that can deliver net benefits. We've got to deliver those. I think optimism will be on the march. I think we're in the trough right now.
[58:39] >> What's your take on minimum wage and universal healthcare? Brad, how do you think about it as a capitalist, as an innovator, and in the face of AI, which could have a dramatic impact on these issues?
[58:48] >> You know, like Sacks... to me, I haven't spent a ton of time thinking about those, except that as a society, we're $38 trillion in debt. We haven't been able to afford to deliver those things. I generally think the market works out those issues better than the government, top-down, trying to... you know, the government gets more and more involved in healthcare, and the only thing that happens is it gets more expensive. So we've all seen the charts of the most expensive categories where we've had inflation: education, healthcare, etc. It's where the government's involved. I actually think if you just let the markets work, entering into an age of abundance, a lot of these problems are going to be solved. People are going to have a lot of AI coaches in healthcare and education, etc. Let the market work, government stay at bay, keep things safe. We're on a good march.
[59:31] keep things safe. We're on a good march. >> I think this proves my point perfectly.
[59:33] >> I think this proves my point perfectly. If you talk to any founder, they're not
[59:35] If you talk to any founder, they're not thinking about housing. They're not
[59:37] thinking about housing. They're not thinking about higher education costs.
[59:39] thinking about higher education costs. They're not thinking about minimum wage,
[59:41] They're not thinking about minimum wage, you know, and they're not thinking about
[59:42] you know, and they're not thinking about healthcare all that often. Some some do,
[59:44] healthcare all that often. Some some do, though. There's some innovation there.
[59:45] though. There's some innovation there. And it's because it's so regulated that
[59:48] And it's because it's so regulated that entrepreneurs and VCs are just like,
[59:49] entrepreneurs and VCs are just like, "That's kryptonite. The government has
[59:51] "That's kryptonite. The government has poisoned the well. We can't participate
[59:53] poisoned the well. We can't participate in that." And that's the roadblock. And
[59:54] in that." And that's the roadblock. And that's where Americans are suffering.
[59:56] that's where Americans are suffering. And that's where it would be great if
[59:58] And that's where it would be great if founders actually put their minds to it.
[59:59] founders actually put their minds to it. And the government's got to get rid of
[01:00:01] And the government's got to get rid of all that regulation and let us cook in
[01:00:03] all that regulation and let us cook in those specific verticals. All right. The
[01:00:05] All right. The market is in hyperdrive. Hyperscaler revenue has made the markets move up. We hit on this briefly, but we didn't have you here, fifth bestie BG. Cloud computing is on a tear. I referenced it earlier, but AWS is now on a $150 billion run rate, Azure 108 billion, GCP, Google Cloud, 80 billion. There's a little bit of fun with numbers there, because Azure and Microsoft include some of their software products in there, and Google Cloud includes things like Google Office or Google Suite in there. But the growth numbers are tremendous: AWS, which is the more pure play of the three, 28% growth on a very big number; Azure 39%; Google Cloud stunning everybody with 63% growth. It is incredible what the ARR numbers are: Google Cloud added 10, AWS 10, Azure 9.5, so basically 30 billion collectively. Jamin Ball, who works for you, uh, I think put out some data on the Twitter. Brad, markets at all-time highs. Mag 7 cooking. Uber blowing out growth. Disney blowing out growth. The consumer seems absurdly strong based on those two bellwethers. Tech seems extremely strong based on the cloud computing. What's your take on the overall market and overall economy? Obviously, inflation up a bit, people hand-wringing about the never-ending war and the cost of oil.
[01:01:32] >> Let's just telescope way up. You know, the level of criticism directed at this administration, right? Tariffs were going to cause hyperinflation. We're going to destroy GDP. Conflicts in Venezuela and Iran were going to do the same. We've heard all of the negative stories, but what's happening? Accelerating GDP, a ten-year that's sitting at 4.3, inflation totally under control. AI, AI, AI; compute, compute, compute. We're leading the world. It's contributing massively, right, to GDP growth in the country. We see the S&P only up 8% this year, right? So we're not into bubble territory here. Meta's trading at 17 times fully taxed GAAP earnings, Nvidia at 19 times, Microsoft at 20 times, Google at 24 times. And then the memory stocks that everybody's excited about: we have 25% of our portfolio in SK Hynix at five times fully taxed GAAP earnings, Samsung six times, Micron 7 times. Right? This is not the stuff that bubbles are made of. You know, David referenced it earlier: we started the year with OpenAI and Anthropic doing combined about 30 billion in revenue. Now, four months later, combined 80 billion in revenue. The policies of this administration on the economy are working. They're working in spades. Our gap on the rest of the world in AI is growing. And so from my perspective, you know, we've been all in on the market. I talked about it earlier in the year. We're heavily...
[01:03:05] >> When did you make that switch to go all in on the market? Because you were bearish.
[01:03:09] >> I would say toward the end of last year. The market had run up a lot. We had a lot of these questions. Listen, entering this year, there was a huge question hanging over the market: would the AI revenues show up? If the Anthropic revenues hadn't shown up and we didn't see this re-acceleration out of the hyperscalers, the market would be down 10 to 15%, because people would say there's no ROI on all of this infrastructure. Exactly. When I saw the numbers start showing up in December and into January, we went from medium to large in terms of our exposures, and 80% of our exposures or more have been in compute, AI, memory, etc.
[01:03:49] >> And this is why it's great to operate in the private market and the public market, because you can see things in the private markets that inform the public markets. But the question remains, Brad: as much credit as you're giving to the administration, how much better would the economy have been doing if they didn't start a hundred-billion-dollar war that we did not need to go into, according to all reports? And if we didn't do a bunch of tariffs that wound up being ruled unconstitutional by the Supreme Court, which Trump himself put in, we would have been further ahead. That's my take on it. We would be ripping even more if we didn't have those.
[01:04:24] >> Jason, it's hard to imagine. Okay, just to set up again here: it's hard to imagine a more Goldilocks situation for the United States of America. We have reset the table geopolitically. The discount rate globally is actually coming down, not going up, evidenced by markets at all-time highs and the bond market in control. And then look at the private markets. We have multiple trillion-dollar companies that have been created in the private markets that are now coming public. SpaceX coming public is going to be a multi-trillion-dollar, you know, OpenAI, Anthropic. Like at some point, you just have to acknowledge the USA is winning. Of course, there are always things that we could be doing better, but there's not a country in the world that wouldn't trade all of its fortune for the United States' fortune today.
[01:05:12] >> 100% in agreement: American exceptionalism as embodied by the great companies in America, SpaceX, Google, etc., all the ones we've been talking about here. That is the story. And I give infinite credit to this administration for being business-friendly. I do think they've made two critical mistakes: I think the tariffs were poorly executed, and I think we shouldn't have gone to this war. And we should find a quick resolution to it, which the administration seems to be desperately doing. Sacks, your take on the economy?
[01:05:39] >> Well, look, we have an AI boom going on right now, and I think that's thanks to President Trump's policies. Remember, the first week he was in office, he rescinded the Biden policies on chips and models. And what were those policies? It was the approval regime that we're talking about. Models would have to go to Washington to get approved if they were trained with some number of flops. And then every sale of a GPU worldwide would have to be licensed from Washington unless it fit into some narrow exemptions. So the whole approach of the Biden administration that President Trump inherited was everything approved in Washington. He rescinded that. He declared that we had to win the AI race, and he unleashed our companies to do that. Now, one other really important thing is energy. Remember, it was this president, going back a decade, who said drill, baby, drill. He said we have to unleash American energy. That's the basis for the American economy. It's also the basis for AI. He has also said that he wanted to allow our AI companies to become energy companies so they could bring their own power to these data centers. So they're not drawing off the grid. They're not competing with consumers for electricity. They're generating their own power. And it's thanks to this president that we have seen this blue-collar construction boom right now powering all of this infrastructure. What would the alternative have been? We know, I mean, Bernie Sanders has said it would have been a ban on data centers. So...
[01:07:01] >> Yeah, raise taxes on data centers. That would be a much, much worse choice. Chamath, I'll give you the last word here as we wrap on the economy generally.
[01:07:12] >> I think the markets are going to keep going up for a while, and then at some point they're going to go down.
[01:07:19] >> Okay, I wrote it down. Chamath, you said markets are up and then eventually they're going to come down. Those are the two things: up, and we put a U here and then a D here.
[01:07:29] >> He's got the... I think you're doing an impression of the taking-notes emoji. Is that what you're doing?
[01:07:34] >> Yes. Okay. Up and down. Wow. Thank you for tuning in to All-In, where you can get your great calls and market action advice. It's going to go up and then down, folks, act accordingly. Chamath, in all seriousness, what makes you bullish, let's say in the next 6 months, 12 months, the catalysts, and then what do you think the headwinds are as well? Let's take the short to midterm, 6 months to two years.
[01:07:58] >> I think that in the short term, the people that make the new thing need to get valued and need to demonstrate value. So who are the people making the new thing? It's the Nvidias, it's the memory makers, it's the Anthropics, it's the SpaceXs, and it's the OpenAIs. But eventually it all comes home to roost. And you can't just make things for a market that then doesn't have a measurable benefit itself. To be very clear and blunt, there is literally not a scintilla of evidence that AI has helped lift the operating margins of the S&P 500. There's all kinds of bluster. There's going to be an important fork in the road, probably two or three years from now. One path will be opex shrinks, hence margins increase; the other path is revenues grow and margins expand while opex stays flat, or maybe it even goes up. Those two are very important differences, because in the former you're talking about a shrinking workforce and shrinking opex as a percentage of operating margin and revenue. In the latter, you're actually growing through it. The answer to that question, I think, is critical to how the markets will respond and how society will respond. So I think we have, call it, 500 days where you just have to be net long. But I think it's literally in the hundreds of days from now, 500, that you're going to have to have an important reckoning moment. The people that are paying for all these tokens need to see an actual benefit.
[01:09:41] >> Yeah, that's reasonable.
[01:09:42] >> That's a really interesting point. Let me connect a couple of dots here between something Chamath said and what Brad said. Brad said at the beginning of the year, we went into this year with this massive capex, this massive investment in infrastructure, but people weren't sure that the ROI was going to be there in terms of model revenue. And that was true, and then the model revenue has proven out. And now what Chamath is saying is that we're going to be at another fork in the road soon in terms of whether there's going to be ROI on all those tokens that are being sold and generating the revenue for those model companies. And I agree with you that that is not proven out yet. But I'm optimistic that it is going to be proven out, and otherwise you wouldn't be seeing...
[01:10:23] >> Look, I...
[01:10:23] >> ...enterprises continue to buy. Hold on, let me just make the point. You wouldn't be seeing enterprises continue their month-over-month spend on coding tokens if they didn't feel like the ROI was going to be there. But you make a good point, which is: what is the impact on the economy going to be when all of this new software, this bespoke software, is created through, again, all these coding tokens being bought? It's going to power a wave of productivity like I think we've never seen before. So I think what you're seeing is the ROI is sort of trickling down from infrastructure to model to application to end user, and I think it's going to create an economic boom.
[01:11:02] >> I'm with you, Sacks. This is deja vu all over again. We watched this happen with the PC revolution, the internet revolution, the cloud revolution, the mobile revolution. We had all this hand-wringing: will this ever pay off? Should I build an app? Should I build a website? Should I not? Should I move to the cloud? Should I keep it on-prem? All of these questions, over and over and over again. And then they went from question marks to exclamation points. I can tell you, inside of my firm, we started, we were using agents, then we started building code. And I've got three people on the team who are making all the interfaces and products that a 22-person investment firm should not be making internally. They should be using SaaS software. And they are shipping product day in and day out. The ROI is a fait accompli, Brad. It is a fait accompli. I think this has been decided. I think it's been decided.
[01:11:46] >> It has not been decided at all. It has not been decided at all.
[01:11:49] >> You have 8090. You're working with the big enterprises. I invest in 100 startups a year. I work with the small ones. It is a fait accompli with startups. They are building software. They're shipping. They are getting massive value from these tokens. And they're getting so much value that they don't have to add, you know, but half the number of employees that they would with the same amount of capital. They're getting further with less money. It is working in startup land. I don't know what's happening at 8090. You would have a better picture, obviously, of the enterprise. Tell us what you see there.
[01:12:16] >> I mean, our business is doing well, but what I'm trying to get across to you guys is that you can't will profits to go up. Okay? So ultimately what happens is, I'm just going to take a company randomly. Anheuser-Busch: they have to eventually sell more beer. Okay? Take Nike: they ultimately have to sell more shoes. Take a medical devices company: they have to sell more artificial hips and knees. So the point I'm trying to get across is, right now there's an enormous amount of very constructive and creative experimentation. But what is also true is that a lot of that has not yet proven value. I don't think that means it's going to stop. All I'm trying to say is, until a company can trace very directly, I spent X and I made Y, where Y is now greater than X and it's lifted my margins, that is the thing that causes the flywheel to spin faster. And right now we've started the first part of that equation. We've spent the X and we have not seen the Y. You would see it in global GDP. You haven't. You would see it in global productivity. You haven't. You would see it in the global profit margins of the S&P 500. We haven't. It doesn't mean it's not coming.
[01:13:41] >> Brad, do you want to pick up on this? Because, yeah, I'm definitely taking the other side of it, because I'm seeing, with a lot of these companies, massive lowering of costs. Their ads are getting more effective. At the same time, they've stopped hiring. They're not adding positions in a lot of cases. And things like, just to pick the Nike example, a lot of the photo shoots they used to do, excuse me, a lot of the imagery they used to make, now they're able to make more of it without having to hire photographers and do that stuff. I know this example because we have a startup that does this specifically for brands like Nike. They've seen a massive drop. We have one that helped DoorDash with their food pictures. All those pictures used to have to be taken by photographers. Now it's all done by AI. Massive reduction in cost. And they're using ads and ad creative now that is, you know, double-digit percentage more effective while costing half as much. So I definitely think we're seeing it in the earnings. But is that true? Are you seeing it in the earnings of these companies yet?
[01:14:36] >> Yeah. So, two data points. Number one, we just saw Azure grow 39% in the quarter. We saw Google Cloud grow 63% in the quarter. Headcount growth for those companies over the last 3 years, Mag 5 combined, is about 3%. So their operating margins are all expanding. If you look at the S&P 500 at large, in Q1 of '24, operating margins were about 11.8%. That was up from 11% in '23. This year, they're 13%. So we've had a 200 basis point improvement in the operating margins of the S&P 500, which is massive.
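The basis-point arithmetic above is simple: a basis point is one hundredth of a percentage point, so the move from 11% to 13% is 200 bps. A minimal sketch, where the helper name is hypothetical and the inputs are the margin figures cited in the discussion:

```python
def margin_change_bps(old_pct: float, new_pct: float) -> float:
    """Change between two operating margins, in basis points.

    Margins are given in percent; 1 percentage point = 100 bps.
    Illustrative helper only, not from any library.
    """
    return round((new_pct - old_pct) * 100, 1)

# S&P 500 operating margins cited: ~11% in '23, ~11.8% in Q1 '24, ~13% this year
print(margin_change_bps(11.0, 13.0))  # 200.0 -> the "200 basis point improvement"
print(margin_change_bps(11.0, 11.8))  # 80.0
```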
[01:15:12] >> Do you think that's AI across those businesses? And Chamath, I think that's the question where— dollars to donuts, it's not AI.
[01:15:22] >> Yeah. So, any amount of money. I bet you it's the same financial engineering that got these earnings to rise in the last decade.
[01:15:29] >> Yeah. So I think that's the question: is this margin expansion durable? The consensus forecast and estimate is that margins are going to continue to expand over the course of the next two years. You and I both know, back in '22, '23, we went from the age of excess to the age of fitness, right? A lot of these companies were able to shed people, you know, with the excuse of AI, just because they had become, uh, you know, too excessive during the period of COVID. So I think it's a legitimate question whether or not that's all from AI. But I will tell you, anecdotally, it maps for me. I'm hearing, like Jason and David, a lot of these companies that are, you know, really growing their top lines at an accelerating rate without expanding headcount at nearly the same pace.
[01:16:14] >> Okay, Sacks, I'll give you the final word while we wrap.
[01:16:17] >> Brad was talking about how we got all these operating efficiency improvements. The unemployment rate stayed at historic lows during that time. I mean, economists consider full employment to be 4 to 5%, and we've stayed at, you know, the low 4s, 4.2% roughly, during this time. So you're able to get these efficiency improvements while unemployment is still extremely low. Moreover, there was just a big article saying that the unemployment rate for young college graduates has dropped. So, you know, there was this whole narrative recently that recent college graduates were going to have the hardest time finding jobs because, you know, there's going to be no work left for entry-level jobs because of AI. And in fact, it has gotten easier for recent college graduates to find work recently. Maybe that's because they're AI natives. Maybe that's because they know how to use AI better. So in any event, I mean, we're just not seeing any evidence yet of these theoretical downsides of AI around job loss and unemployment, and we are starting to see big productivity gains.
[01:17:17] >> Yeah, this is going to be a circular discussion, but yeah, there's a lot of conflicting evidence. The last piece of conflicting evidence, obviously, is the labor participation rate, because if you are not even opting in to participate, then you, you know, don't get counted as unemployed. And that's been, I think, a big challenge: 61.9% in March. The labor participation rate back, uh, before, we were at 63.3%. Yeah. And college graduates are hearing different stories: certain degrees getting jobs, other ones not getting jobs.
[01:17:49] >> It's too early to tell, I think, is probably what we'd all agree on. And it's a mixed bag.
[01:17:55] >> Agree with that.
[01:17:55] >> No. Look, whenever I have data to refute one of your narratives, you always say it's too soon to tell.
[01:18:01] >> No, no, no. What I do—
[01:18:02] >> Online Wall Street Journal. Nick, put it on the screen: "College graduates are finally catching a break in this job market."
[01:18:08] >> Yeah.
[01:18:08] >> Jake, you should be happy about this.
[01:18:10] >> I know. Listen, I'm happy anybody gets a job. But what you do is then you say we don't trust the numbers, and we should get rid of the Fed, and we should get rid of the numbers. So, we all know—
[01:18:18] >> We should get rid of the Fed. That was—
[01:18:19] >> But you, you get rid of the Fed because we don't like the numbers. Listen, it's all great. Welcome to the debate club.
[01:18:25] >> What did I say?
[01:18:25] >> You said abolish the Fed. Abolish the Fed.
[01:18:30] What is the Fed here for? All right, listen. Enough. We're getting into Trump derangement syndrome or Trump bend-the-knee syndrome. It's the end of the show. We had a great show, everybody. We had some laughs. We all learned. We workshopped some stuff. Let's leave it where it is. Great job, President Trump.
[01:18:44] >> I want to— I want to congratulate—
[01:18:46] >> Oh, here we go.
[01:18:47] >> I want to congratulate all of our innovators. And I want to congratulate Elon and, um, Dario D. Rockefeller on their recent deal.
[01:18:55] >> Oh, shots fired. He's getting straight— Come on anytime, Dario. Hey, you know Dario well, Brad. Well, get him on the program next week. I want him on the program. Have him come.
[01:19:04] >> Will you ask him for me?
[01:19:06] >> I, um—
[01:19:07] >> Will you ask him for me?
[01:19:09] >> Well, sure. I'll ask him. And yeah, you know, the fact of the matter is—
[01:19:13] >> I thank our lucky stars that we have Elon, that we have Anthropic, that we have OpenAI, that we have Google, that we have Amazon, all innovating in this country. And you know, I know we like to, you know, kind of poke fun on the edges at these things, but the fact of the matter is, you know, I see them all showing up, sharing their models, driving as hard as they can to innovate. We have the best competitive framework in the country. David's right. It's been transformed over the course of the last 14 months. We need to stay the course. We're on the winning horse. We just had the Derby last week. We're on the winning horse. Stay on the horse. America for the win. America for the win.
[01:19:54] >> There it is: Senator Brad Gerstner. I think if you're gonna run, you've got to get rid of the red glasses. We gotta get maybe tortoiseshells in there. But I think you got a serious shot. Senator. Senator—
[01:20:04] >> I like— I like secretary better.
[01:20:06] >> Secretary.
[01:20:07] >> Secretary. Who's— Yeah. Secretary of the Treasury, Brad Gerstner.
[01:20:12] >> Secretary.
[01:20:13] >> Secretary of State David Sacks. Secretary of Cashmere and Wine, Chamath Palihapitiya. What are you doing with your, uh—
[01:20:20] are you doing with your uh >> there's so much fake news out there
[01:20:21] >> there's so much fake news out there because I mean look I totally agree with
[01:20:22] because I mean look I totally agree with everything Brad said look I poke fun at
[01:20:24] everything Brad said look I poke fun at at some of these companies for some of
[01:20:26] at some of these companies for some of the things they do but I am happy that
[01:20:27] the things they do but I am happy that they are American companies and that
[01:20:29] they are American companies and that they're innovating here congratulations
[01:20:32] they're innovating here congratulations on your dad
[01:20:33] on your dad >> yes and look there's so much fake news
[01:20:36] >> yes and look there's so much fake news out there I mean we just covered on this
[01:20:38] out there I mean we just covered on this podcast how beneficial some of these
[01:20:41] podcast how beneficial some of these economic trends are you never get it
[01:20:42] economic trends are you never get it from the media
[01:20:43] from the media >> no
[01:20:44] >> no >> and they are trying to derail us from
[01:20:46] >> and they are trying to derail us from from you know the the policies that have
[01:20:48] from you know the the policies that have been so successful Yes, but but they did
[01:20:50] been so successful Yes, but but they did some great inspiring coverage of micro
[01:20:52] some great inspiring coverage of micro looting. So get your microl looting on.
[01:20:55] looting. So get your microl looting on. Congratulations, New York Times. We'll
[01:20:57] Congratulations, New York Times. We'll see you next time, everybody. Bye-bye.
[01:20:59] see you next time, everybody. Bye-bye. Love you, boys. Love you.
[01:21:03] >> We'll let your winners ride.
[01:21:06] >> Rain Man David.
[01:21:10] And we open sourced it to the fans and they've just gone crazy with it.
[01:21:14] >> Love you. Queen of Quinoa. Winners—
[01:21:23] >> Besties are gone.
[01:21:26] >> That is my dog taking your driveway.
[01:21:31] >> Oh man, my appetiter will meet you.
[01:21:33] >> We should all just get a room and just have one big huge orgy, cuz they're all just useless. It's like this, like, sexual tension that we just need to release somehow.
[01:21:43] >> Wet your beak.
[01:21:46] >> That's going to be good. We need to get merch.
[01:21:57] I'm going all in.