Full Transcript
https://www.youtube.com/watch?v=NdU6UdUKaYc
[00:01] I think the greatest brand destruction over the last 18 months is the US brand abroad, and AI.
[00:06] So with America, Trump has been convinced that this was his defining moment of being known as the president that liberated the Middle East.
[00:11] But I think the Trump administration will be known for criminal corruption and incompetence.
[00:17] And that incompetence is bubbling up.
[00:20] And then the second greatest fall is AI.
[00:22] And also Sam Altman, who's gone to the dark side.
[00:24] And I'm an AI optimist.
[00:26] But here's what we fail to understand.
[00:28] These techs, they do not have our best interests at heart.
[00:30] Scott, I often come to you to help myself form my opinions on some of the most sort of consequential issues going on in the world.
[00:34] And when I look at some of the quotes from the CEOs of these AI companies, I've got one here from Elon that says, "AI and robots will replace all jobs."
[00:42] What's your view here?
[00:44] I think the catastrophizing is mostly nothing more than a thinly veiled attempt to say my technology is so devastating that it's going to shift society, and you should invest at this crazy valuation.
[00:52] And the data doesn't reflect that there's some big exogenous meteor coming for the employment market.
[00:58] But I believe it's actually going to create more jobs than it destroys.
[01:04] But the scary thing for me is that, because of AI and because of these frictionless relationships that people are engaging in online,
[01:08] I think a lot of young people are losing the ability to endure rejection.
[01:11] But I think a more important technology in terms of how it's going to change the world and what will create more shareholder value is not AI.
[01:20] It's really... >> This is super interesting to me.
[01:22] My team has given me this report to show me how many of you that watch this show subscribe.
[01:25] And some of you have told us, according to this, that you've been unsubscribed from the channel randomly.
[01:29] So a favor to ask of all of you: please could you check right now that you've hit the subscribe button, if you are a regular viewer of the show and you like what we do here.
[01:36] We're approaching quite a significant landmark on this show in terms of a subscriber number.
[01:40] So if there was one simple free thing that you could do to help us, my team, everyone here to keep this show free, to keep it improving year over year and week over week, it is just to hit that subscribe button and to double check if you've hit it.
[01:52] Only thing I'll ever ask of you.
[01:55] Do we have a deal?
[01:56] If you do it, I'll tell you what I'll do.
[01:59] I'll make sure every single week, every single month, we fight harder and harder and harder and harder to bring you the guests and conversations that you want to hear.
[02:04] I've stayed true to that promise since the very beginning of the Diary of a CEO, and I will not let you down.
[02:10] Please help us. Really appreciate it. Let's get on with the show.
[02:18] Scott, welcome back.
[02:22] I often come to you to help myself form my opinions on some of the most sort of consequential issues going on in the world, because I find you always have a very interesting perspective, and one of the things that's been front of mind for me, and I think the audience as well, is this subject of artificial intelligence.
[02:38] It's moving incredibly quickly.
[02:39] It feels like a moving target.
[02:41] And I think I saw some stats the other day that said the subject of artificial intelligence is less popular, as an industry, than even ICE in the United States.
[02:50] Mhm. In part because the big CEOs of these companies are saying that our jobs and our way of life are going to be fundamentally disrupted, and people don't feel like they have a say in that.
[02:59] We didn't choose this.
[03:02] What's your perspective on everything that's going on?
[03:05] Few brands have fallen further, faster in the last 18 months than two brands.
[03:08] The United States brand abroad.
[03:10] We used to be the enforcer of the operating system to keep rogue nations in check.
[03:14] The US is now that rogue nation.
[03:17] So the US brand abroad — for the first time in history, more people feel that China is a force for good in the world than the US.
[03:23] That's never happened before.
[03:25] So the brand US has fallen furthest fastest.
[03:28] The second greatest fall is AI.
[03:33] Your view of AI is directly correlated to your wealth.
[03:35] The only cohort that has a positive rating of AI is people making over $200,000.
[03:41] Because generally speaking, wealthy people look at AI as something that's fueling their portfolios and uh wealthy people are the biggest users of AI.
[03:52] So they see it as a positive.
[03:53] They see it as innovation.
[03:55] They see it as the S&P going up.
[03:59] But if you're an average middle-class person, what you may see is that your electricity bills have gone up and you don't even have access to invest in these companies.
[04:10] And you see some statements from people like Sam Altman saying, "Stop complaining about energy costs. Think about the amount of energy it takes to raise a child."
[04:16] So they haven't really managed the brand well.
[04:18] In the last 18 months, it's gone from something kind of scary, but where we can be optimistic, and a wealth creator, to something that's very scary and a wealth creator for the wealthiest already.
[04:29] The brand in the last 18 months has had tremendous, tremendous erosion.
[04:32] When I look at some of the quotes from the CEOs of these AI companies, I've got one here from Elon that says, "AI and robots will replace all jobs.
[04:39] Working will be optional, like growing your own vegetables instead of buying them from a store.
[04:43] The challenge will be fulfillment.
[04:44] How do you derive fulfillment and meaning in life?"
[04:48] And Sam Altman said, "By the end of 2028, more of the world's intellectual capacity could reside inside data centers than outside them."
[04:57] And when I went through Sam's blog several times — I mean, he's talking about a fairly dystopian future.
[05:00] And many of the CEOs are — I mean, even Dario Amodei at Anthropic talks a lot about, you know, his bear case for AI and the job disruption that will follow.
[05:12] And so I think I'm trying to get clarity for myself, really, on what's marketing and what's the truth.
[05:16] And I want to add another point which is I hire a lot of people.
[05:21] And I'm already finding that our framework, specifically for entry-level roles, has quite radically changed.
[05:27] And so I didn't believe this stuff — a lot of people told me, oh, Sam Altman, these guys, they're doing marketing to make their company seem powerful.
[05:33] But then when I noticed a change in my own behavior when I'm sifting through these CVs, yeah, I thought, "Oh, maybe things are going to change."
[05:41] I think it's mostly catastrophizing and a means of fundraising.
[05:45] Every technology in history goes through a similar arc.
[05:47] There's some catastrophizing, there's some job loss, then the increase in productivity results in additional margin, new business opportunities, and employment growth.
[05:58] I don't see any reason why this would be any different.
[06:00] And I think that catastrophizing and talking about this massive destruction in jobs is a way of justifying the massive investments these companies want enterprises to make in their companies.
[06:11] Because if you look at the amount of capital they have committed and the valuations of these companies, one of two things needs to happen in the next 5 years, at least in the US.
[06:18] There either needs to be a trillion dollars in incremental revenue from new products from companies that have licensed AI.
[06:24] So L'Oréal does a big site license with, say, OpenAI. Does it come up with new products? It's like, how does it justify that investment?
[06:30] We're not seeing a lot of AI moisturizer, or cars that are built by AI. I mean, AI has hit industrialized robots at Amazon, but there's not a lot you would call new AI-driven products.
[06:43] AI plays a role in the background, but it's very hard for companies right now to point to incremental growth or revenues from AI.
[06:50] So, in order to justify these valuations, then they have to say, all right, there's going to be massive cost savings or efficiencies.
[06:57] And there are some examples. I think Meta has just announced new layoffs because of efficiencies they're getting with AI targeting.
[07:03] So, one of two things needs to happen in the next three years.
[07:09] Either these companies' valuations need to be cut by 50 or 70%.
[07:12] Or you need a massive destruction in the labor market that creates tons of efficiencies that their customers can then flow to the bottom line.
[07:16] And I think what the CEOs are saying is that there's going to be massive efficiencies.
[07:23] I also think that you sound more interesting, and it makes your technology sound more seminal, when you say it's changing the world and we don't know how to control it.
[07:32] And quite frankly, I find it a little bit obnoxious that some of the founders and key figures in AI catastrophize just about the time they take a secondary and peace out to the Côte d'Azur.
[07:41] That's not helpful.
[07:42] I'm Dr. Frankenstein and I've created this monster, but I don't know how to deal with it.
[07:46] So, I'm going to go peace out to Saint-Tropez.
[07:47] That's just not very helpful.
[07:49] They always talk about the peril.
[07:51] It's like, well, what do you mean by that?
[07:52] So, the first big fear, the catastrophizing, is just this unprecedented job destruction.
[07:57] They've been talking about that for three years, but let's look at the data.
[08:01] The unemployment rate in the US is 4.5%.
[08:03] Among youth, it's 8.8%.
[08:10] That is slightly below the historical average.
[08:13] The number of new business permits or new businesses started per capita in the United States has doubled in the last 10 years.
[08:20] And we hear about, well, what about Meta?
[08:21] They announced they were laying off I think eight or 10,000 people yesterday.
[08:26] From 2019 to 2025, they went from 16,000 people to 80,000.
[08:29] So even if they go back to 60,000, it only takes them back about 24 months.
[08:33] So I don't doubt that there'll be a real dip or a severe V down in certain industries, customer service, probably the legal field.
[08:41] But I believe over the medium and the long term it's actually going to create more jobs than it destroys.
[08:47] And I think some of the catastrophizing is nothing more than a thinly veiled attempt to say my technology is so devastating that it's going to shift society and you should invest at this crazy valuation.
[09:00] Could you be wrong?
[09:02] Oh, 100%, I'm wrong all the time, Stephen.
[09:04] No, but I mean, what would have to be the case for you to find out that you were wrong, in terms of how you've reasoned up from the first principles of your thinking there?
[09:13] Job destruction, layoffs.
[09:14] But what would you have misunderstood for that to be the case?
[09:16] Like, what would you have missed or misestimated for that to be the case?
[09:20] Ground zero for the job that was going to go away was radiologists.
[09:24] You could scan billions of images, and radiologists would kid, "Mama, don't let your daughter grow up to be a radiologist," because that job's going away.
[09:34] New job listings for radiologists in 2026 are up, because it turns out that while scanning the image is important, it's a small part of the job.
[09:42] The value out of a radiologist is diagnosing the illness and then coming up with a treatment plan.
[09:46] And that's as important as ever.
[09:51] Job listings for coders year on year are up 11%.
[09:55] So for people who understand how to use these technologies and come up with different prompts or different means of vibe coding, that demand has gone up, because now AI can be applied to almost any startup.
[10:06] Where I will be wrong is if there is sustained job destruction, and the new jobs created — the new businesses and the employment those jobs create — don't keep up.
[10:17] And there is a scenario. You don't need 100% unemployment like Musk is predicting.
[10:22] At 20% unemployment, the French had a revolution; Weimar Germany turned very ugly.
[10:26] At 20% unemployment, especially among youth, especially young men, people tend to get very angry and take to the streets.
[10:33] So you could see a V that's so severe — even if there's a job recovery — that if it hits 20% unemployment, that could cause real civil unrest.
[10:41] But I just look at the data.
[10:42] I look at the employment numbers.
[10:45] If you didn't know there was this seminal technology that very smart people were predicting a job apocalypse around, and you just looked at the data, you wouldn't know there's anything going on there.
[10:57] The data doesn't reflect that there's some big exogenous meteor coming for the employment market.
[11:03] Is there a case that this technology — because it's, you know, built on the internet, so it has inherent rapid distribution; every day Anthropic releases a model, ChatGPT releases something, it spreads across the world very quickly, and everybody's updating their technology at the speed of light, unlike the Industrial Revolution, where you had to build physical infrastructure —
[11:17] is there a case that, because of the speed of this technology and the proliferation of it, it will be unlike the revolutions of the past?
[11:27] Right, that's the fear — that the V is so severe and so vicious that we don't have time to recover,
[11:32] and even if there are more jobs on the other side, that there's too much civil unrest around the V.
[11:35] But so far the V isn't decelerating or diving as quickly as people had predicted.
[11:43] I was kind of reminiscing on the way over here.
[11:45] I moved here almost four years ago.
[11:48] The first thing I did — I got off a plane at night and I went to this amazing party at Annabel's,
[11:50] this Brazilian party with all these hot people everywhere, and I thought, I love London.
[11:52] And the next morning I woke up feeling pretty hungover, and my team said, "You have this podcast with this young guy."
[11:54] And I said, "Cancel, I'm hungover."
[11:55] And they said, "No, he's really up and coming. You'll like him. He's a nice kid."
[12:15] And I came into this little place in Shoreditch or wherever, with all these, like, cool coffee houses.
[12:17] I had no idea who you were.
[12:20] And within two years, you were everywhere.
[12:22] It was like, I'm in the airport and I see the banner for your book.
[12:24] And then I get on the plane and the safety video has you in it.
[12:26] I'm like, make him stop.
[12:27] And my point — where I'm going with this — is that this place is like, I mean, it feels like the Googleplex.
[12:29] You're hiring like crazy.
[12:33] You're one of the most technically sophisticated people I know in media, and yet you're hiring a bunch of organic things, walking around and eating and going out and drinking and having kids.
[12:35] So you're you're the example.
[12:38] Old media is going to lose jobs, but you're creating — how many people have you hired in the last 24 months? >> Across all of our companies?
[12:41] About, I'd say, about 220.
[12:43] Okay. And those are high-paying jobs.
[12:47] Those are young people probably making six figures plus.
[12:49] That's a lot of employment.
[13:15] So is the BBC or is the Daily Mail or whatever laying off people?
[13:17] Yeah.
[13:17] But I'm not sure they're laying off people faster than you're hiring them.
[13:19] Yeah.
[13:21] And what I found is that we're hiring a different type of person, and AI fluency is becoming increasingly important.
[13:25] I'd say the other thing is something I only noticed about 60 days ago. I consider myself to be head of recruitment.
[13:31] We have a recruitment team downstairs and I own a recruitment business.
[13:34] But as I go through that inbox, people that I otherwise, two months ago, would have been, you know, jumping to hire — I'm now pausing, because there's new technology that's been launched by these companies in America.
[13:48] That means there's alternative solutions and that's what's giving me pause.
[13:51] People do need to consider ways that they can upskill themselves with these technologies.
[13:57] The labor market is definitely reshaping and it might reshape faster, but I'm not sure it's going to be the apocalypse.
[14:01] There'll be winners and losers.
[14:02] For example, for the last 30 or 40 years, the unemployment rate among college grads has been lower than among non-college grads.
[14:11] It's shifted this year.
[14:15] The unemployment rate among non-college grads, because of the boom in vocational work, is now lower than the unemployment rate among college grads.
[14:20] Mhm.
[14:21] And if you think about it, you're like, okay, well, that's bad for new college grads.
[14:24] But at the same time, all of these data centers need carpenters, welders, plumbers.
[14:30] So, there's been booms in other parts of the employment market.
[14:33] But yeah, the employment market's absolutely reshaping.
[14:36] But when I graduated from business school — a lot of it, quite frankly, is that your generation is spoiled.
[14:40] What do I mean by that?
[14:42] When I graduated from Berkeley in 1992, 40% of us had a job on graduation day.
[14:46] Meaning 60% did not have a job.
[14:49] I would bet that of the kids in my class at Stern, 40% of them will be starting their own business, because of AI.
[14:55] When I graduated, there were two entrepreneurs in my class, and the second person was my co-founder.
[15:01] We were the only business started out of the Haas School of Business in 1992.
[15:05] I would bet there are going to be 30 to 50 businesses started on graduation day.
[15:10] There are kids dropping out of their second year of business school to start businesses because they just raised $10 million in a Series A.
[15:18] Now, you have to have certain skills, a certain type of personality, a certain facility with technology.
[15:24] I'm actually quite excited about this.
[15:26] We're starting to see an uptick in productivity, which should result in incremental um profitability.
[15:32] Will some people be on the wrong side of the trade? Absolutely.
[15:34] Do we need to do a better job of providing more unemployment benefits, more retraining?
[15:38] Denmark spends 2% of GDP on retraining and vocational training.
[15:41] We spend a fraction of that in the US.
[15:42] So, we're not good at retraining.
[15:45] We're really bad at it, actually.
[15:47] We're not good at taking care of the people on the wrong side.
[15:49] But in terms of global employment, I just think that catastrophizing is dressed-up fundraising.
[15:58] What do you think about what Elon's doing with Optimus?
[16:00] Because if you bring those two forces together — you've got Optimus robots, these humanoid robots, where he says, I mean, some of the quotes that he's said, it might be marketing, but he's predicting that you won't need to be a surgeon, because there'll be so many Optimus robots that are more advanced than any surgeon on earth. So Optimus robots will be doing surgery.
[16:19] And if you combine intelligence with the sort of physical power that comes from these robots, it does beg the question where human skills remain, outside of the relational stuff and serving Maslow's hierarchy of needs. That's one of the things I contend with.
[16:37] You're seeing in China especially how robots are really changing production lines. And I saw this video the other day — I think it was in India or Bangladesh — where they had their factory workers have cameras on their heads. >> Yeah.
[16:51] >> So that they could film their hands, because they're intent on replacing them with robots. >> Yeah.
[16:56] >> This combination of intelligence plus robotics feels like two tsunamis at the same time.
[17:03] >> I think that's right. But let's use the surgeon as an example. I think the robot in the context of surgery will be not a replacement but a supplement, and that is, there'll be a large class of surgeons who know how to weaponize robotics to be better surgeons and, quite frankly, do two surgeries a day instead of one.
[17:20] I think most brain surgeries take four or six hours using robotics and precision instruments.
[17:24] A great neurosurgeon will be able to increase the productivity and the accuracy of their surgeries.
[17:31] My big tech stock pick of '25 was Alphabet, because it was trading at a P/E of 17.
[17:35] I just saw it as being ridiculously cheap, because everyone thought OpenAI was going to put it out of business.
[17:39] My big tech stock pick for '26 is Amazon, because I think where robotics really hits — where the rubber meets the road in terms of shareholder value — is the collision of AI and industrialized robots. There are 400,000 industrialized robots in the US, and there's a million at Amazon.
[17:57] So Amazon has two and a half times the number of industrialized robots as the rest of the nation combined.
[18:04] But the notion that we're going to have someone in our house bringing us our tea — a robot — I don't buy that at all.
[18:07] I think that automated robotics — the collision of advanced manufacturing in China and using AI with robotics — will yield tremendous advantage.
[18:18] Amazon is saying they're going to double their largest business, which is their retail business, by 2032 without an incremental hire, using industrialized robots.
[18:26] And they were early.
[18:28] They made an acquisition 10 or 15 years ago of a robotics company called Kiva.
[18:33] But the notion that there's going to be a robot bringing me my tea here.
[18:37] Oh, that's — the scary thing for me, that I'd want to know more about, is weaponized, industrialized robots as warriors,
[18:43] because, I wonder, you know — you can go some weird places. But again, I think with AI and robotics, yes — this notion that we're going to have robots serving us our food or in our house, I don't see that.
[18:57] It's interesting, because, well, I guess one could argue that we already have robots in our houses, but they just don't move.
[19:01] So quite the leap, really, is: would you allow a robot to move through your house?
[19:07] Because, I mean, a Hoover is a robot.
[19:08] A fridge is kind of like a robot — especially the smart fridge — and one could say a TV is a robot, a smart TV. But the difference is they don't move through our house. Would we allow the Hoover to move itself, and then, once we allow that, would we allow it to potentially bring us something?
[19:20] Yeah, I get it.
[19:22] I just don't — the stuff I've seen on actual application of these robots — and keep in mind, with these individuals, the job of the CEO now: it used to be to underpromise and over-deliver.
[19:31] Now it's to overpromise and underdeliver, and create a vision that creates cheap capital so you can pull the future forward.
[19:36] My understanding is that three years ago we were supposed to have a million autonomous Tesla taxis on the road, and that did not happen.
[19:40] In 2016 or 2017, Musk said autonomous driving was two years away.
[19:49] So their job is to predict a very exciting future.
[19:51] I also think Musk in particular is very good at saying "look over here" as he stuffs a rabbit into a hat.
[19:55] I mean, Tesla is an automobile company, so he's got to create stories to justify its trading at 155 times earnings when most automobile companies trade at 10 or 15.
[20:04] So it's robots, it's space, it's connectivity with AI and autonomous driving and robots.
[20:10] It's constantly look over here because I think it's very difficult to justify the valuations he's raising money at.
[20:14] So, and to be clear, he as much as anybody is able to pull that vision forward.
[20:22] He's launched — 90% of the rockets sent into space have been by SpaceX. But this is a company that has $16 billion in revenue with 8 billion in profits, and it's going to go out in its IPO at a projected valuation of 90 to 110 times revenues.
[20:39] When Google went public, it went out at 10 times revenues, and it was growing 10 times as fast.
[20:43] So, what is this? The key attribute of an innovator right now is storytelling. And that is to make sure the promise is way ahead of the performance, such that you can access cheap capital and pull the future forward.
[20:56] But a lot of this stuff I find is a lot of jazz hands. He's, you know, often known for being wrong about his timelines.
[21:02] But he does seem to deliver magic to the world.
[21:04] >> The best product in my mind — the two products that have changed my life from a technology standpoint — the most underrated product is AirPods.
[21:13] If AirPods alone — you know, Tim Cook just stepped down — were its own product, it'd be a Fortune 50 company.
[21:19] I don't even think it's technology. I think it's the most profitable, ubiquitous piece of jewelry in history. I just walk around with them in my ears now, right?
[21:29] And then the best product, I think, of the last few years has been Starlink.
[21:31] >> That's amazing.
[21:32] >> I just think it's absolutely — I've done podcasts from planes. I can talk to my sons on FaceTime.
[21:41] That product — you know, all airlines are flying the same tin can, same routes, same bad food — is a real point of differentiation for them, and also in maritime.
[21:50] I think Starlink is the best tech product. So, power to him.
[21:52] When they go public, is SpaceX an amazing company or is it overvalued? The answer is yes. Two things can be true at the same time.
[22:01] >> What do you think of Tesla? Because for me it's magic.
[22:03] I don't think people in Britain realize this, or other parts of the world, but for me, the first time I got in the Tesla — that I got in LA and pressed the location and took my hands off the wheel; all I had to do is occasionally look forward, which they're removing now — it took me 3 and a half hours to Joshua Tree without any intervention.
[22:23] >> And the safety record in it is safer than if I had driven myself there.
[22:27] >> It's just magic to me. And then when I watched the space rockets come down and be caught with the chopsticks. Incredible.
[22:32] >> How — this is just magic.
[22:33] >> Incredible.
[22:34] >> And so when he — I've got to be honest, when he says that these Optimus robots are going to be... I go, do you know what,
[22:42] >> he's delivered a lot of magic in the past that no one would have thought possible. Can you even imagine 10 years ago what people would have said if he said, "We're going to launch this massive sort of 70-foot skyscraper into the air and then we're going to catch it on chopsticks"? Staggering. Incredible.
[22:54] >> And then we're going to relaunch it again, and then we're going to catch it again.
[22:56] >> You would have gone: impossible.
[22:59] >> The Edison of our generation. But you asked about Tesla.
[23:03] Uh, I bought a Tesla. Incredible product, far superior to anything on the market.
[23:07] He deserves credit for inspiring the EV race, which will be good for the environment. But the reality is everyone's caught up.
[23:15] The fastest-growing automobile company in history is BYD.
[23:19] BYD is basically 80% of a Tesla — some people would say 100% of a Tesla — for 40% of the price.
[23:25] If there wasn't a ridiculous tariff war right now, BYD, I think, would be the number one EV in the United States. It's eating Tesla's lunch.
[23:34] In addition, what I think is going to happen with Tesla — credit where it's due: it changed the market, inspired the EV race, good for the planet — but when you look at its valuation right now, I think what you're going to see is, when SpaceX goes public, there's a lot of money in the market that wants to get some of that Musk RZ.
[23:51] What you were talking about — this guy is incredible, I don't care how expensive it is, I want to invest behind Elon Musk. I get it.
[23:57] That money is about to come out of Tesla, which people are finally figuring out is just a great car company that should be trading at 30 times earnings, not 150. And it's going to go into SpaceX.
[24:08] So, I think the boost, the retail surge in SpaceX you're going to see — and the valuation they're planning to go out at is just extraordinary — some of that is going to come at the cost of Tesla.
[24:20] I just think people are waking up to the fact that Tesla's a great auto company. It should be trading at about a fifth of the valuation it is now.
[24:27] >> On that point, what jobs do you think will be impacted by AI? I've heard you talk about logistics and transport.
[24:34] I think London has just announced that Waymo and Tesla's full self-driving technology are on the way. They've greenlit Waymo, for example.
[24:40] >> I'd hate to be a truck driver. To me, a long-haul truck driver is the first place.
[24:44] Trucks can drive between 10:00 p.m. and 4:00 a.m. when there's no traffic on the road. I don't think it's a very good job.
[24:50] What's interesting — or one of the things interesting — about truck drivers: it's the biggest employer of non-high-school-graduate males in America.
[24:58] It's actually one of the two or three biggest jobs by number of people in the United States.
[25:03] I would think within 10 years we're going to have very few long-haul truckers.
[25:08] Obviously, customer service — it feels like that will likely go away.
[25:11] >> Yeah. But where I see job destruction in my world is: I used to, anytime I got an agreement from anybody — an advertiser, for a contract, an employment agreement, whatever it is — I'd send it to my lawyer and say, "Can you review this?"
[25:24] And because there's latency in the bills, it probably cost somewhere between 400 bucks and 2,000 to review every contract, right? Because I'm a narcissist and I think I signal by having a name-brand law firm.
[25:36] And so they hire some kid at 80 to 120 bucks an hour and they charge me 4 to 500.
[25:41] Right now, I say to the person in charge of that project — and they say, "Well, I'm sending the contract review to our lawyer" — I'm like, "No, no, no, no. Put in the agreement. Describe what we're looking for. Ask Claude or ChatGPT to pretend it's a $1,200-an-hour employment lawyer or contract lawyer, and ask it to review this document and redline it, and do the same thing at another LLM." And then you make those changes.
[26:07] Congratulations, you are now a senior associate at a law firm.
[26:11] I probably spend $100,000 to $300,000 a year on legal fees across my organization. I think I'll cut that easily by a third this year. So, that's one way I see it.
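Scott's two-LLM contract-review workflow — paste in the agreement, describe what you're looking for, ask one model to act as a $1,200-an-hour contract lawyer and redline it, then do the same at a second LLM and compare — might look something like the minimal Python sketch below. The model names, prompt wording, and file path are illustrative assumptions, not anything specified in the conversation.

```python
# A minimal sketch of the two-LLM contract-review workflow described above:
# send the same agreement and instructions to two different models, then
# compare the redlines yourself before making changes.
# Assumptions: model names, prompt wording, and the file path are illustrative;
# API keys are read from OPENAI_API_KEY / ANTHROPIC_API_KEY.
from pathlib import Path

from anthropic import Anthropic
from openai import OpenAI

PROMPT = (
    "Pretend you are a $1,200-an-hour contract lawyer. Review the agreement "
    "below, flag risky clauses, and propose redlines.\n\n{contract}"
)

contract = Path("advertiser_agreement.txt").read_text()  # hypothetical file

# First opinion from an OpenAI model.
first = OpenAI().chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": PROMPT.format(contract=contract)}],
)

# Second opinion from an Anthropic model ("do the same thing at another LLM").
second = Anthropic().messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=4000,
    messages=[{"role": "user", "content": PROMPT.format(contract=contract)}],
)

print("--- Redline 1 ---")
print(first.choices[0].message.content)
print("--- Redline 2 ---")
print(second.content[0].text)
```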
[26:21] >> But I have a podcast company, not nearly the industrial behemoth this thing is, but I keep asking everybody, how do we use AI to do this?
[26:30] And I find that they're getting better at it, and it enhances our work, but we're still hiring. Again, I think it's a supplement.
[26:39] And if you want to protect against AI, it's pretty easy — I've said this for a while: AI is not going to take your job. Someone who understands AI is going to take your job.
[26:47] So, what I tell people is have a second screen. That is, always have a second screen open that has nothing but AI on it — LLMs.
[26:54] And anything you get digitally, port it into your AI screen and start playing with it.
[26:58] >> port it into your AI screen and start playing with it. Do you know that quote
[26:59] playing with it. Do you know that quote is so interesting because I've heard
[27:00] is so interesting because I've heard that quote a lot that AI won't take your
[27:01] that quote a lot that AI won't take your job someone that understands AI will
[27:03] job someone that understands AI will take your job and I as you said it then
[27:05] take your job and I as you said it then I thought you know I was thinking about
[27:06] I thought you know I was thinking about different um roles in my team and my
[27:09] different um roles in my team and my open roles that we haven't hired for and
[27:11] open roles that we haven't hired for and I thought
[27:12] I thought >> someone with AI will take your job but
[27:14] >> someone with AI will take your job but they won't just take one job
[27:16] they won't just take one job they'll take five of those jobs and so
[27:18] they'll take five of those jobs and so if you think about an analyst
[27:19] if you think about an analyst >> right
[27:20] >> right >> we've got Molly downstairs analyst for
[27:22] >> we've got Molly downstairs analyst for my investment fund we were well intent
[27:23] my investment fund we were well intent on hiring maybe five analysts for my
[27:26] on hiring maybe five analysts for my second fund we only need Molly now
[27:28] second fund we only need Molly now because Molly's He's got two agents that
[27:29] because Molly's He's got two agents that she works with in two two Mac minis and
[27:32] she works with in two two Mac minis and she's got a a setup that screens the
[27:34] she's got a a setup that screens the inbound interest and also goes
[27:36] inbound interest and also goes proactively into the market looking for
[27:37] proactively into the market looking for new opportunities and pulls them back
[27:38] new opportunities and pulls them back into her runs them through a framework
[27:40] into her runs them through a framework scores them prepares them for the IC. So
[27:43] scores them prepares them for the IC. So Molly enabled by AI, she's probably
[27:45] Molly enabled by AI, she's probably taken out five jobs. And this is I think
[27:47] taken out five jobs. And this is I think even with the law example, that that
[27:50] even with the law example, that that lawyer that's using these models um
[27:52] lawyer that's using these models um probably doesn't need to have five
[27:53] probably doesn't need to have five junior lawyers now um if they're really
[27:56] junior lawyers now um if they're really really competent with and if you think
[27:57] really competent with and if you think about an executive assistant.
[27:59] about an executive assistant. >> Mhm.
[28:00] >> Mhm. >> So with executive assistants, we might
[28:02] >> So with executive assistants, we might for our executives would have hired
[28:03] for our executives would have hired maybe probably 10 um EAS in total. Mhm.
[28:06] maybe probably 10 um EAS in total. Mhm. >> Now, you really need one that has is
[28:09] >> Now, you really need one that has is powered by a to do all the travel, one
[28:10] powered by a to do all the travel, one that does all of the scheduling, and
[28:12] that does all of the scheduling, and then one that meets people at the door.
[28:14] then one that meets people at the door. >> So, you need three versus potentially
[28:15] >> So, you need three versus potentially 10. And this is kind of what I'm saying.
[28:18] 10. And this is kind of what I'm saying. Um, in most roles, but not all of them.
[28:20] Um, in most roles, but not all of them. There are still some roles when I looked
[28:21] There are still some roles when I looked at the rogue chart where I go, actually,
[28:23] at the rogue chart where I go, actually, you know, this sales, a lot of the sales
[28:25] you know, this sales, a lot of the sales we do from a media perspective are take
[28:27] we do from a media perspective are take someone for lunch, whine and dine, call
[28:29] someone for lunch, whine and dine, call them, relationships, pitch the deck.
[28:31] them, relationships, pitch the deck. >> Those kinds of things seem to be
[28:33] >> Those kinds of things seem to be completely un uh untouched. But let's
[28:36] completely un uh untouched. But let's continue down that path. You no longer
[28:38] continue down that path. You no longer need five mollies than you'd want.
[28:40] need five mollies than you'd want. >> Yeah.
[28:40] >> Yeah. >> That creates additional EBIT margin.
[28:42] >> That creates additional EBIT margin. Creates a more profitable diary of a CEO
[28:44] Creates a more profitable diary of a CEO empire means you can raise more money
[28:46] empire means you can raise more money and then you go buy more podcasts or
[28:48] and then you go buy more podcasts or invest in them who hire more people. So,
[28:50] invest in them who hire more people. So, >> and it creates a faster cycle of money
[28:52] >> and it creates a faster cycle of money and new ideas and when we were walking
[28:54] and new ideas and when we were walking in here, you're talking about acquiring
[28:56] in here, you're talking about acquiring podcasting and you're getting into CPG
[28:58] podcasting and you're getting into CPG and you're getting into different forms
[29:00] and you're getting into different forms of media and you're driving growth and
[29:01] of media and you're driving growth and innovation. Will some people be on the
[29:03] innovation. Will some people be on the wrong end of it? My mom was a secretary.
[29:06] wrong end of it? My mom was a secretary. She started as a typist. Typus went
[29:08] She started as a typist. Typus went away. She oversaw the the the
[29:10] away. She oversaw the the the secretarial typist pool at a law school
[29:12] secretarial typist pool at a law school downtown and I worked in the mail room
[29:13] downtown and I worked in the mail room in high school. But my mom realized that
[29:16] in high school. But my mom realized that the hard part of her job was interacting
[29:18] the hard part of her job was interacting with the senior level executive and she
[29:20] with the senior level executive and she became an executive assistant and she
[29:22] became an executive assistant and she never made a lot of money but she made
[29:24] never made a lot of money but she made good money. Realizing the typing was the
[29:26] good money. Realizing the typing was the easy part of her job. The hard part was
[29:27] easy part of her job. The hard part was managing some dude's life. Right.
[29:30] >> So accounting should go away. But what you're seeing is there are actually more accountants this year than there were last year, because accountants — the smart ones — are moving into wealth management and tax optimization, which keeps getting more and more complicated.
[29:45] So yeah, do you have an onus to update your skills to understand new technologies?
[29:46] I remember when I first moved to New York and they hired an assistant for me, and she came in and said, "Oh, I don't use computers." And I'm like, "Well, you can't work here."
[29:53] So to say you're not going to learn AI, or at least the basics of how to use it — it's like being in 1998 and saying, "Well, I don't use PCs." Mhm.
[30:03] >> So yeah, you've got to update your skills. The cycle time is decreasing. A lot of people will find themselves on the wrong side of the trade.
[30:11] I have to force myself to use AI. You know, the part of your brain around understanding new technologies begins to die, right?
[30:20] I can no longer perceive a calendar. I can't tell you what I have next.
[30:21] For some reason — I used to be able to do a Scotsman accent, Stephen, like no one's business, a few years ago. That part of my brain has died. And when I try to do an accent, it sounds like a dead language that twins speak to each other, and it offends everybody.
[30:34] There are parts of your brain that die. And the part where you can understand new technologies, or at least have the will to learn them, starts to go away as you get older.
[30:44] I have to force myself to play with these new technologies. Everyone who wants to make more money or wants to hold on to their job should have the same onus to learn these new skills.
[30:53] these new skills. >> On that individual level, there's people
[30:54] >> On that individual level, there's people listening now that that want to make
[30:56] listening now that that want to make sure that they have the skills, their
[30:57] sure that they have the skills, their kids have the skills of the future,
[30:58] kids have the skills of the future, right? What are some of those important
[31:00] right? What are some of those important skills? I mean, you've got kids. What do
[31:01] skills? I mean, you've got kids. What do you What would you say to them?
[31:02] >> I get asked that all the time: what is the skill? So the honest answer is I have a view, but nobody knows. Do you realize 10 years ago in private schools, the biggest incremental investments in curriculum were two things? Computer science and Mandarin.
[31:16] >> How's that worked out? Like, "Thank God my kid knows Mandarin," said nobody right now. But they thought it was going to be computer science, and you could argue that hasn't worked out as well as everyone thought. I would say that the enduring skill is storytelling, and that is your ability to look at data, create a narrative arc, and then communicate that story in a compelling way via all the different mediums,
[31:38] >> whether it's podcasting. I think you have to write well to be a great storyteller. But if you think about the most successful people in the world, at the end of the day they're usually storytellers, the great CEOs. I read Jeff Bezos's 1997 letter to shareholders, where he focused on those three principles, and I'm like, take my money, right? I see even a guy like Alex Karp at Palantir walking around doing a live earnings call on his phone. It's very compelling. Jensen Huang, you know, when he does these giant Buffett-like stadiums where he gets up and he's like a rockstar. That's storytelling.
[32:12] Technology, I think, is going to create an equalization among products. Reverse engineering created parity among manufactured products. I think most technologies are going to converge, and we're seeing that convergence in AI models. So I think the point of differentiation is relationships, right? Do I want to work with this person? Do I know about their kids? Do I like them? At the end of the day, I have three different law firms pitching me business, three different investment banks, three different CRM companies. Who do I have the best relationship with and who do I want to work with? So storytelling, it sounds very p, but also the ability to establish strong relationships with other sentient beings. I still believe that everything reverse engineers to biology. I think a certain fundamental understanding of the sciences, those seem to be pretty enduring.
[32:58] You know, my oldest said he wanted to be a marine biologist. I'm like, who are you kidding? You get seasick. I mean, at 18, what do you want? You want them to be smart, good people, aggressive; you support them becoming their own person, which I've had trouble with. I want them to be mini-me. And then you help them find something they're great at. But telling your kid, "No, you need to go into this because this is where the future's going"? Nope, we don't know. We know the basics. I would want my kid to be able to write well, to be able to look someone in the eye, to be competitive. You know, I encourage them to play sports, to do chores, which they do none of. But other than that, I don't know. What do you think are going to be the skills of the future? I think it's really difficult to know.
[33:42] >> I think what you said about relationships is really important. You said storytelling, and I think storytelling is a proxy for sales and persuasion, whether that's persuading an investor to believe in you, or people to come and work for you, or customers to come and buy your thing. So I think storytelling and sales is going to be an enduring skill. The skill that is under the biggest threat, that young people are losing, especially young men, and it's hugely underrated, is the ability to endure rejection.
[34:11] And because of AI and because of these frictionless relationships that people are engaging in online, I think a lot of young people, especially young men, are losing the perseverance, endurance, willingness to hear no. Whether it's expressing friendship, whether it's applying for a job you're not qualified for, whether it's approaching someone and expressing romantic interest, I think a lot of young people, especially young men, can have a frictionless relationship online, and they're losing this sense of resilience and aggressiveness.
[34:42] So I think that skill, when I mentor young men, the first skill I try to reincorporate back into their life is what I call "no." And that is: I need you to go out. I need you to put yourself in the agency of strangers, whether it's a church group, a sports league, a riding class, and I want you to make an overture, an expression of friendship. Hey, do you want to grab the Arsenal game, or come over to my house and watch it? And then most of these kids, when you ask them what they want, they usually want two things. The men I mentor, they want to move out of their parents' house and they want a girlfriend. And so once we do the friend approach, then I say, "Okay, find someone you may be potentially interested in. Ask them out for coffee." And this is the goal: the goal is no. Because what's going to happen is you might get a no. You're probably going to get a no. And then I'm going to call you the next day and I'm going to say, "Are you okay?" And the answer is going to be yes. Is he or she okay, the person who said no? Yeah, they're fine.
[35:31] Because, and I'm boasting, the secret to my success is rejection. I ran for sophomore, junior, and senior class president of my high school. I lost all three times, and based on my track record, I decided to run for student body president, where I went on to lose. Never bothered me. I mean, I mourned and I moved on. Every entrepreneur, how many nos have you gotten?
[35:51] >> Oh, hell. Yeah, more so than anybody.
[35:53] >> I mean, so any person you look at and think that person has made more money than I would have thought, or any person who's hanging out with someone of much higher character and hotter than them, has one thing in common. They either have very rich parents or, more likely, they're comfortable with no. And unfortunately, I think young people, especially young men, are becoming less and less resilient to no. Apply for jobs you're not qualified for. Apply to graduate programs you shouldn't get into. Approach people who you perceive as being cooler and hotter than you, and express friendship and romantic interest. That is the key. You want to punch above your weight class, get out a big spoon and get ready to eat. That is the only thing that is common across all great entrepreneurs who are self-made: they have the ability to mourn and move on.
[36:38] >> I've been thinking about this a lot lately. It was inspired by listening to Adam Neumann on a podcast a couple of weeks ago. He did one with Rick Rubin, I think it was, and in it he tells the story of sitting in the back of a cab with Masa, who's the owner of SoftBank. He was going to ask him to give him $300 million, and Masa turns to him and says, "You're just not ambitious enough." And he says by the time that 23-minute cab ride had finished, Masa had offered him $4 billion and he had signed a little napkin thing that Masa had in the back of the cab. And that story, and other stories like it, have made me realize that some of the game in life and success is what I now call selling yourself long, and most of us go through our lives selling ourselves short. I think in part it's because we don't see ourselves on an exponential curve. We see ourselves as a fixed state: this is who I am, this is what I'm worth. But you've almost got to factor in your own exponential improvement. If you take you from 21 to now, your intelligence and wisdom and connections have all compounded, but you would have sold yourself at the value of you at 21 years old.
[37:40] So I actually now think in every season of my life I've sold myself short. When I was 18 years old, I sold 20% of my company for 5K, $5,000. And I couldn't believe it. I was celebrating in my room at the time. I was stealing Chicago Town pizzas to feed myself. And then a couple of years later, I sold another 30% of my company for 300K and I thought I'd hit the EuroMillions,
[38:01] >> right?
[38:01] >> That company ran up and was on the stock market for many hundreds of millions many years later. I thought, God, I've sold myself short my whole life. So assume I'm doing the same now. And if I assume that, what would I say? What opportunities would I go for? And I think that should apply to everybody. I think it's self-fulfilling.
[38:22] >> Yeah. There's something about imagining where you want to be in five years, outrageously, and then reverse engineering back. But the moment you took that 300K, you really needed it. I mean, it's a function of your opportunity set and what the market actually says to you. I think pricing is a signal. I struggle with pricing, and whenever we're writing a proposal, whatever I get back, I'm like, okay, increase the pricing 30 or 50% more, because pricing is a signal, and you can always take the price down; the client will always come back and ask for the price to come down. But quite frankly, and this is sexist, I think men are better at imagining an unrealistic self and women are more measured, and I think that's one of the things that has held back women as entrepreneurs, along with the fact that not as much capital has been available to them, because the people allocating capital have been white dudes from Stanford or Harvard. 40% of VCs are from two schools.
[39:15] I think it was Charlie Munger who said that someone who has a crazy vision of their own potential, that's stupid and obnoxious, but his attitude was: never bet against that person, because occasionally they're right. And so, we talked about Musk: for a guy to think, I'm going to raise so much money and command technology that I can be responsible for 90% of launches and control 70% of the Earth's low-orbit satellites? That's pretty arrogant. So yeah, I agree with you, and, you know, let me put it this way: if you're going to err, err to the upside, because the market will bring you back on its own.
[39:57] >> That's so true. Um, Sam Altman has been in all the news lately, in part because of this article that the New Yorker did, two reporters there, Ronan Farrow and Andrew Morren.
[40:07] >> Um, which was this article, and then he responded with this blog post here. Two years ago, I thought Sam had a really good brand,
[40:15] >> because he was out in Congress, he was, you know, asking for regulation of this tech. It seems it has turned unbelievably quickly, unbelievably fast.
[40:22] Well, we've said this before: he's a proxy for all of AI. I think the greatest brand destruction other than the US over the last 18 months has been AI, and also Sam Altman.
[40:33] But we keep falling into this trap over and over, and that is, something happened through the '70s, '80s and '90s: America's heroes used to be athletes, government officials and actors, and then there was a dramatic decline in attendance at religious institutions. But people still have really big questions and want sort of an authority or a godlike figure. The closest thing we have to religion is technology. Most people don't understand it. It has a mystical feel to it. I have no idea how my phone works. I can ask it anything and it comes back with a relatively authoritative answer that I trust, similar to if I was talking or praying. Most of the time when we're on AI, we're kind of praying. We're sending a question into the ether to a being we think is smarter than us, and it's going to come back with an empathetic, loving, correct answer.
[41:20] So the new Jesus Christ was born in the '90s, and that was Steve Jobs. And he was taken from us early, like Jesus Christ. The idolatry of innovators has absolutely gone crazy. And the new Jesus Christs, if you will, are these tech CEOs. And here's what we fail to understand: they do not have our best interests at heart. They are not concerned with our emotional well-being. They are not going to comfort us when we're older. They are there, and they play a key role in capitalism, to do anything that is required to get their earnings up one cent per share every day. And they will make incremental decisions that justify the harming and self-harm of teen girls, Sheryl Sandberg; the weaponization of our platforms and the coarsening of our discourse, Mark Zuckerberg; I mean, the radicalization of young men, the people running YouTube.
[42:13] We keep thinking that the newest tech CEO is going to save us, that this guy cares. Sam Altman was the gay son we all wanted. Like, super nice, super friendly, hushed voice: "Well, Senator, I'm worried about that as well." Right. These guys would sleep with their cousin for a nickel. That's their job. Their job is to increase earnings. They're not here to save us. We're supposed to elect people who put in guardrails for them.
[42:43] So everyone is saying or asking me, can we trust Sam Altman? I'm like, no. And we shouldn't have to. We should have regulators putting in guardrails on AI. We should be testing these things. Government agencies should be testing these things. We should be ensuring that these technologies are not used to surveil Americans, right? We would still have cartoon-like characters on cigarettes marketing to 12-year-olds if we hadn't put regulation in place, and the tobacco executives would claim that they weren't marketing to young people. Ford would still be pouring mercury into the river if we didn't have an EPA and regulation. So Sam is doing exactly what he's supposed to be doing.
[43:21] The latest hero that's going to save us is Dario Amodei. We've decided he's the good guy. Now, the villain's journey in tech is the same. Some compelling person we think is a wonderful guy, and it's almost always a guy, occasionally a woman (you know, we need to do better gender balance; that was Sheryl Sandberg, in the workplace), and then we find out that they are doing their job, and that is doing anything they can legally to increase shareholder value, regardless of whether it prevents a tragedy of the commons or not. And then we get angry at them. The journey from Anakin Skywalker to Darth Vader gets shorter and shorter, but they all follow the same path. And then 18 months later, we realize that they too will say and do whatever they can to try and delay and obfuscate regulation. And sometimes, when we have a government with a lack of domain expertise, with a lack of insight, with people who are a cross between the Land of the Dead and the Golden Girls and don't understand these technologies, when these technologies are allowed to run free with no regulation, they do bad things.
[44:24] So Sam Altman is the latest person we've discovered as Darth Vader, who's gone to the dark side. It happens with all of them because they're doing their job. We as citizens aren't doing our job, and that is, we're not putting in place elected officials that regulate these companies. So yeah, can we trust Sam Altman? No. But we shouldn't need to trust him. We should be able to trust that we have smart elected officials that will regulate these companies.
[44:45] >> Did you see this article he wrote in response to having a Molotov cocktail thrown at his house?
[44:50] >> I read this article without reading it. So there have been death threats on his life. And let me be clear, there's absolutely no justification whatsoever for this. I don't even like it when people yell at JD Vance when he's skiing with his family. We don't want to be that nation. In the West, if you're operating within the confines of the law, which Sam is, you deserve to live in peace. It is sad. Do you realize in the US there are now more people working in private security than there are cops? Because there's always going to be a number of people, I don't know about the perpetrator here, who are mentally ill and who will find an excuse. Essentially, the makeup of these perpetrators, these criminal actors, is usually young men looking to restore their social capital through what they perceive as a historic act of violence against a famous person. This has gone on for a long time. He's now famous. I don't think it's fair. I've seen some people try to equate AI's dangers as if these acts of violence are somehow justified. They're not. There's no justification for a healthcare executive being executed in the streets. There's no justification for violence, or attempted violence, against Sam and his family. I think it's young men with a lack of opportunity, mental illness, and unfortunately access to firearms everywhere. But it's always happened. I don't make the connection that somehow it's more justified. Famous people: one in three presidents has been shot at.
[46:16] >> Crazy, isn't it?
[46:16] >> So the more famous you get, quite frankly, the more personal risk there is, because someone is going to decide that taking you out, for whatever reason, is going to restore their social capital. Actually, if you look at assassination attempts and violence, it's actually gone down. But these stories are cinematic and very interesting and they get a lot of attention. But I think it's a shame we live in a society, especially in the West, where they have to endure that kind of risk. One of the loveliest moments I've seen, I don't know if it was the Danish or the Norwegian prime minister who stepped down, they applauded and he got on a bike and rode home.
[46:50] >> I thought, you can't do that in the US. You know, Obama's kids still have Secret Service protection.
[46:55] >> So there's definitely something, and I quite frankly blame it on big tech, that tries to convince people that this person is their enemy and this person is evil. Sam's not evil. I don't know him personally. I sat next to him at this party I went to. But he's doing his job. Quite frankly, we're not doing ours.
[47:12] >> Do you think he's a nihilist? And I don't really know the definition of the word nihilist, but I guess it's someone who believes that life is essentially meaningless.
[47:20] >> I think there's a nihilist vein running through big tech, where a lot of them have go bags. They have a plan for if things get real. If there's a zombie apocalypse or there's a revolution or there's a massive pandemic, they have definitive plans and they've spent tens of millions of dollars. They meet their pilots at Oakland airport. They get on their Gulfstream 650. They fly to Auckland, New Zealand, and they have a bunker built out.
[47:47] >> Is this true?
[47:48] >> Oh, there's a lot of them.
[47:49] >> Really?
[47:49] >> There's a lot of go plans amongst the wealthy. And sometimes it's not as dramatic. Sometimes it's just a home. Nuclear attack, cyber attack, revolution, but I would say amongst billionaires, I would bet conservatively one in three have some sort of go plan. And that's nihilist. And also, to a certain extent, I think wanting to be an interplanetary species is a little bit nihilist.
[48:12] >> Also, you know, if they're saying that in the future people aren't going to have jobs, and there's robotics and there's this AGI, this superintelligence which is going to be smarter than everybody, then even if there was a 1% chance, a 1% chance, that these doomer predictions around AI could come true, I would stop. A 1% chance of humanity sort of self-imploding or destroying itself. Wouldn't a rational person stop?
[48:39] >> Well, okay, you're Oppenheimer. It's going to take 3 million American deaths to invade Japan. Japan will not surrender. And you've been charged with splitting the atom. There was much more than a 1% chance that our ability to split the atom, and then show we would use it against a civilian population, would result in the end of the world. Should he have stopped development of it? A lot of nuclear scientists killed themselves. They committed suicide because they were convinced that once we'd split the atom and used it against civilians, it was the end of the world; it was just a matter of time before we were all using nuclear devices. So I don't, I think you keep moving forward.
[49:16] And on nihilism, where nihilism impacts these people, let me go back to the go bag. I talked to one of these people. He outlined his plan for me. Right.
[49:23] >> He's a billionaire.
[49:24] >> Yeah. Outlined his plan, his plan B if things get really ugly, right? And my view was, "Okay, boss, let's play this out. Something happens. There's an event. It's too dangerous to stay here. You meet your pilots at the airport. I mean, it really gets real. People scouring for food. People become feral. Like, you don't think your pilots are going to kill you and your wife? Like, what do you think's going to happen? You don't think the people in New Zealand are going to be like, let's go take the rich guys'?" It strikes me that these are such misallocated efforts and capital, that if you're really focused on your own future, whether you're wealthy or not wealthy, your resources and your talent should be allocated towards trying to make this place a little bit more habitable, not trying to colonize Mars, not having a go bag.
[50:17] not having a go bag. And this is part of the problem with the 0.1%. There are 900
[50:20] the problem with the 0.1%. There are 900 billionaires in the United States and I
[50:23] billionaires in the United States and I would imagine 300 of them 300 of them
[50:24] would imagine 300 of them 300 of them give money. They're responsible for 20%
[50:27] give money. They're responsible for 20% of all political donations. And that 20%
[50:29] of all political donations. And that 20% number is misleading because they can be
[50:31] number is misleading because they can be more strategic. So unions are big
[50:33] more strategic. So unions are big givers, but they just give money to
[50:35] givers, but they just give money to whoever's pro- union. So it's not that
[50:37] whoever's pro- union. So it's not that strategic. Billionaires can allocate
[50:39] strategic. Billionaires can allocate capital to a specific issue, really ramp
[50:41] capital to a specific issue, really ramp it up. So they have a disproportionate
[50:43] it up. So they have a disproportionate amount of control over our elected
[50:46] amount of control over our elected population. And the problem is the 0.1%
[50:48] population. And the problem is the 0.1% are not invested in the health of
[50:50] are not invested in the health of America. Why? They don't have to put up
[50:53] America. Why? They don't have to put up with TSA lines. They fly private. They
[50:56] with TSA lines. They fly private. They don't have to put up with shitty
[50:57] don't have to put up with shitty healthcare. They have concierge medicine
[51:00] healthcare. They have concierge medicine and access to the best medicine in the
[51:01] and access to the best medicine in the world. They don't have to put up with
[51:03] world. They don't have to put up with 40% of third graders can't read or
[51:05] 40% of third graders can't read or write. Their kids go to amazing schools
[51:06] write. Their kids go to amazing schools that spend on average $75,000 a year on
[51:10] that spend on average $75,000 a year on their kid versus 15,000 at a public
[51:12] their kid versus 15,000 at a public school, 10,000 at a public school in a
[51:14] school, 10,000 at a public school in a low-inccome area. They don't need
[51:16] low-inccome area. They don't need police. They have dormant security.
[51:18] police. They have dormant security. Where I live in Soho, there's no
[51:19] Where I live in Soho, there's no homeless. As far as I can tell, there's
[51:21] homeless. As far as I can tell, there's no crime. There's no cops. There's
[51:23] no crime. There's no cops. There's cameras everywhere and dormant
[51:24] cameras everywhere and dormant everywhere. My point is, they are no
[51:26] everywhere. My point is, they are no longer invested in the well-being of
[51:29] longer invested in the well-being of America. They've totally dissociated
[51:32] America. They've totally dissociated from the investments required to make
[51:35] from the investments required to make America a better place because they're
[51:37] America a better place because they're sequestered from it. They're no longer
[51:38] sequestered from it. They're no longer invested in the well-being of America.
[51:40] invested in the well-being of America. And what I find strange about this
[51:41] And what I find strange about this nihilism is that these individuals think
[51:44] nihilism is that these individuals think that they are no longer as invested in
[51:46] that they are no longer as invested in the well-being or the safety of the
[51:49] the well-being or the safety of the world. I think that's very unhealthy.
[51:51] >> These are some of the most concerning conversations I've had. One of the things that happens when you have a podcast, and interesting people come on, and you start hanging around with more interesting and more powerful people, is you get little peeks into new worlds that you didn't know existed. You kind of get to see behind certain curtains. And I remember having a particular conversation in London, at a kitchen table, where someone was describing one of these AI CEOs to me, and basically making the case that they believe there's a roughly 10% chance, I think they said seven, if I'm not misunderstanding, percent chance, that it will result in the end of the world, some sort of catastrophic event. But they really didn't care, because they think that being the person that summoned this new intelligence amongst us was probably more consequential than whatever happens. And if I wasn't privy to, again, hearing this secondhand from someone I know who knows this person extremely well, and I know he knows him extremely well because they hang out, him telling me that, I wouldn't have believed it. I would have thought that's nonsense. I would have thought to myself, you know, they must be kind because they have kids. But actually, now I very much believe what you're saying about this nihilism, where I don't actually believe some of them give a... I actually think some of them are playing a game, and they're playing a video game with our futures.
[53:14] Or the incentives. Look, in the United States, when I was growing up, my dad's boss had a slightly bigger house, and he drove a Cadillac and we drove a Gran Torino, but we were members of the same country club and his house was a five-minute drive away.
[53:29] >> And maybe my dad's boss got to fly business class and we flew coach. The delta between having money, having a decent, you know, middle-class life, and an upper-class life was like this. Now the delta is this. Let's look at air transportation. Flying on Spirit Airlines is a humiliating, dangerous experience. United Airlines: economy, economy comfort, economy plus, business class, first class. Oh wait, that's not enough. Let's go to private. There's charter. There's fractional. There's a Cessna Citation. There's a Bombardier Challenger 300. There's a Global 650. Now there's a new G7.
[54:07] Now there's a new G7. The delta in your life, in your healthcare. My mom was
[54:09] life, in your healthcare. My mom was very sick when I was very young. And
[54:11] very sick when I was very young. And when I think about how humiliating it
[54:13] when I think about how humiliating it was for the two of us having her to
[54:15] was for the two of us having her to endure cancer when we were underinsured,
[54:18] endure cancer when we were underinsured, >> it was a devastating, traumatic,
[54:20] >> it was a devastating, traumatic, humiliating experience.
[54:22] >> Now... I take Lunesta when I travel, cuz I have trouble sleeping. I have a medical concierge company. They get me the drug delivered within two hours to my house. I have NAD treatments. They arrange for someone to come to my house when it's convenient for me and do it in my living room. I can text my doctor. I have a medical kit with all these things in it. If I'm abroad, I can text my doctor and say, "My son's not feeling well." They FaceTime me, I open the kit, and they can tell me exactly what to do. The health care I am receiving right now: I have my blood taken every 3 months. They send it to some Bond lair in Norway with PhDs who can tell me that I have less than a 7% chance of any cancers, as evidenced by a lack of, whatever it is, antibodies, the cancer cells. The health care now for the 1% is dramatic. And it's all the same thing. The bottom 99% of Western societies are essentially being optimized and monetized to make the life of the 1% just unbelievable.
[55:25] The life you can lead in the 1% is so dramatic and so incredible. Broader selection set of mates. Your kids have better health care, better educational opportunities. You have influence; people want to know you. So the incentives to not just be rich but to be very rich are so incredible. And the way you get very rich is through stock options, typically in technology, that I think the incentives are so great that people will make incremental decisions and ignore the fact that, oh, one in 18 girls who are self-harming in the UK cite Instagram as a reason for it. Well, let's find research that maybe creates some doubt, because once I become a billionaire, I'll be a better person. And they're getting so much criticism and so much posting, and it's so competitive, and there's fewer and fewer winners that become much bigger, right? It used to be five companies won and got hundreds of millions of dollars. Now one company wins and gets tens of billions. So you can see how, incrementally, this path to Darth Vader is pretty tempting. So I think they just get caught up in this competition and they lose all sight.
[56:34] Also, my co-host on the Pivot podcast said something that was illuminating to me. During COVID, I was walking down the street with my sister and a homeless person kind of stirred. There were a lot of homeless people, because the homeless shelters had turned out everybody, and she jumped out of her skin. And I said, "Well, that was a bit of an overreaction. He's not going to hurt us." And she said, "You know, you're 6'2", 190. Easy for you to say. You understand? Women walk around in a constant state of fear. And that fear is rational." And it really struck me. And then Kara said, "These guys don't understand anything about victimization because they've never been victims. They're traditionally white males who have grown up in upper-income households, and they've never really understood what it's like to be a victim." So they don't put in place the safeguards. It's so competitive that it's just easier for them to give money, some strategic money, to key people and ensure that they're not regulated, such that they can get to be billionaires, because all the incentives are: do whatever is required to win. Sam Altman isn't going to be remembered for being a good or a bad guy. He's going to be remembered as the guy who either was the number one AI company or blew it. That's what he's going to be remembered for. And right now he's blowing it, it appears, right? But people are going to look back and say, "Well, he didn't go public. Anthropic ended up being worth more. It was kind of a disappointment financially, but he was a good guy." That's not what he's going for. And quite frankly, that's not what our society rewards.
[58:03] Whenever I talk to new founders, they all carry the same belief: that the biggest threat to their small business is having a bad product. It's not. The biggest threat they face is hiring the wrong person for their small team. Because bad hires put the handbrake on growth and rapidly infect company culture. And I get it. When you're starting out, you need A-players to grow your company, but you don't have 40 spare hours to spend sifting through interviews. So, you hire fast to fill the seat. But this ends up costing you even more in the long term. What any small business actually needs is a reliable and efficient way to find the right talent without stalling momentum. That's where our long-term sponsor LinkedIn, and LinkedIn Hiring Pro, can help you. It streamlines the entire hiring process, from drafting a job description to shortlisting candidates and conducting AI-powered interviews for initial screenings. Nearly 60% of hirers who've used this tool have found a candidate to interview within a week. Hire right the first time around and get started by posting your first job for free at linkedin.com/doac. That's linkedin.com/doac. And terms and conditions apply.
[59:03] Inflammation is something I'd never really thought to check until fairly recently. There aren't any symptoms or obvious signs for a lot of people, and yet it's consistently linked to some of the most serious health outcomes that we know of. But our sponsor, Function, tests twice a year for inflammation markers alongside hormones, heart health, stress, and toxins, giving you a far more specific view of what's going on inside your body. I did my tests recently and I was shocked a little bit, a little bit amazed, a little bit surprised by my results. But what I really loved was having all of the information about my body in front of me. It makes it so much easier to make a more informed decision immediately about what I do in terms of exercise, my diet, and everything else. So, if you want to look into it yourself, go to functionhealth.com/doac and use code doc25 for a $25 credit towards your membership. It gives you 160 lab tests for $365 a year, which is just $1 a day.
[01:00:03] I think your assessment of AI, in terms of it creating more opportunity than it destroys, is accurate. I also think that in the near term all of our lives are going to be quite different. I spend a lot of time on this podcast... we've spoken to a lot of the big CEOs from AI companies and tried to parse through this, and I've never really gotten to a solid conclusion of what the future looks like. But do you have a case for the future where this technology is the first that actually delivers on the promise of making us more human? There's talk about loneliness. There's talk, you know, that algorithms are going to get more attentive. But I have this thesis that potentially... you know, social networking said it was going to connect us and make us more human. That's kind of the narrative of all these tools. But maybe this technology renders us only useful for that which humans can do, which is the relationships thing you were talking about, and connection, and being out touching grass with our friends and family. Is there a utopia?
[01:00:53] friends and family. Is there a utopia >> just with social media has some real
[01:00:55] >> just with social media has some real upsides? I would argue at this point
[01:00:57] upsides? I would argue at this point meta is actually a net negative. It's I
[01:00:59] meta is actually a net negative. It's I think done so much damage.
[01:01:01] >> But with AI, there's a lot of upsides. A really encouraging piece of data I saw is that when you spend time on social media, it takes you to the extremes. The algorithms see money, and more Nissan ads, in figuring out if you're conservative or liberal and then demonizing the other side and tickling your sensors. Right?
[01:01:18] >> Yeah. I'll get served a video of Ted Cruz or Viktor Orban looking like an ass, because I'm a progressive and that makes me happy. And then they elevate content of some progressive, of AOC calling out somebody, and I start to hate the other side. What they're finding with AI is that for people who spend more time in AI, it's actually moderating their views, because AI is about taking the median or the average of every piece of data. Also, do you notice how nice AI is?
[01:01:47] >> Yeah.
[01:01:47] >> "Great question, Stephen." You know, no matter what you say or how stupid it is, it won't say, "That's a really stupid idea, Stephen." It'll say, "I can understand why you're asking this question." And generally speaking, because it looks for the median of things, it looks for the average, what is the most often used seventh word after these six words are strung together?
[01:02:04] >> Mhm.
[01:02:04] >> ...it's pushing people towards the middle. It's actually having a moderating effect, which I find very encouraging.
[01:02:11] I think old people in senior care, who maybe have lost family members or don't have a lot of relationships in their life, I can see AI characters playing a big role in their lives. I have believed for a long time that the biggest danger of AI is not weapons, it's not income inequality (we voted for income inequality) or contamination of our elections. I mean, all of these things are big issues. The biggest downside of AI, in my view, is loneliness. And that is, and we talk about this a lot, AI is convincing people they can have a reasonable facsimile of life on a screen with an algorithm. Why go through the effort of having friends when you have Discord and Reddit? Why go through the hassle of getting a job when you can make money on Coinbase or Robinhood, supposedly? And why would you go through the effort of trying to have a romantic relationship when you have lifelike synthetic porn? And the young male brain, I think, is especially susceptible to this. Men aged 20 to 30 are spending less time outdoors than prison inmates. 42% of men 18 to 24, and we talked about this on your last show, have never asked a woman out in person. So people are starting to believe they can have a reasonable facsimile of life. And what they need to know is that 40% of the S&P, of our economy, is trying to sequester you from the most important and rewarding thing in your life, and that is your relationships. So my prediction for America is we will never have been as prosperous: incredible prosperity, incredible economic growth, and massive loneliness, depression, anxiety, and obesity.
[01:03:38] anxiety, and obesity. >> I wanted to get your POV as well on all
[01:03:40] >> I wanted to get your POV as well on all this war stuff that's going on over in
[01:03:42] this war stuff that's going on over in the Middle East. What you thought Trump
[01:03:44] the Middle East. What you thought Trump was trying to do and actually what
[01:03:46] was trying to do and actually what happened because, you know, he's he's
[01:03:47] happened because, you know, he's he's talking about, oh, no, we said it would
[01:03:49] talking about, oh, no, we said it would be 6 weeks, it's 5 and a half weeks,
[01:03:51] be 6 weeks, it's 5 and a half weeks, everything's going to plan. We're we're
[01:03:53] everything's going to plan. We're we're winning the war. And I you're good at
[01:03:56] winning the war. And I you're good at seeing through things. What what
[01:03:57] seeing through things. What what actually happened in your view here? I
[01:03:59] >> I think that he was talked into it by some of his security advisers and Netanyahu: that his legacy could be restoring, or making the Middle East more peaceful, and being the president that severely diminished the capacity of what has been arguably the number one source of terror in the region, and maybe even around the world, for a long time. That this was his defining moment as a president to go in; that they were weakened, they were hobbled, and we had the military assets and the intelligence to go in and topple the regime; and he would be known as the president that liberated the Middle East. I think he bought into that.
[01:04:34] >> Off the back of Venezuela.
[01:04:36] >> Well...
[01:04:37] >> Which was kind of a success.
[01:04:38] >> Not kind of. That was arguably one of the greatest military operations in history, but it was a military operation. It wasn't a war. I mean, supposedly his guards didn't know what happened, and all of a sudden they were on the floor throwing up because there was some sort of crazy radiation or radio waves. Not a single American soldier killed. Lifted him out. I mean, that was an incredible operation. I think he got drunk on that macho and that success and thought, well, I can do the same thing.
[01:05:05] And the problem with wars is that the enemy has a say. And what we have found since... I think a lot of our leaders believe that wars, if you have a bigger army, will ultimately result in unconditional surrender on the other side. And there really hasn't been unconditional surrender since World War II. And all the enemy needs to do, whether it's the Viet Cong or the Taliban or the IRGC, all they need to do is survive, and they win.
[01:05:29] In addition, I would describe the war as operational excellence and strategic incompetence. I actually believe there was justification and some merit to the idea of a military operation that tried to take advantage of a weakened Iran whose air defenses were down: to take out the IRGC, or at least maybe provide cloud cover for a revolution against the IRGC; to diminish their navy, diminish their nuclear capabilities, diminish their missile launch capability. I think there was real justification for that. The problem was the strategic incompetence here: not enlisting some European allies, not briefing Congress, not coordinating with Gulf allies, not recognizing the game theory around what happens if they start firing at ships in the Strait of Hormuz. It just was strategically incompetent.
[01:06:16] And now we're in a situation where we want to diminish their nuclear capacity. But the reality is, if they can maintain an ability to turn the Strait of Hormuz off and on, that is probably more powerful than them having nuclear weapons. If you look at what happens if the fertilizer, or the products that create fertilizer, are further sequestered, you could have a mass starvation event around the world. And so we've essentially given them, potentially, a weapon that's more powerful than a nuclear weapon.
[01:06:54] And also, we look like we have a bit of a glass jaw. We go in, we lose 14 soldiers. Tragedy for them and their families. The Russians are losing a thousand people a day, and they don't show any signs of giving up. We look like this giant boxer who's the most skilled and biggest boxer in the world. We go in, argument over whether we should go in or not, but now, if we withdraw, what incentive do the Iranians right now have to negotiate, when every day this goes on they look like the little guy that stood up to the evil America and Israel?
[01:07:25] I think this has been a disaster from a US brand standpoint. And I would argue that blocking inbound and outbound traffic from Iranian ports is actually the way to go. But that lends itself to the argument that we should have just continued with economic sanctions. But I have to own it: I thought military action was a good idea. I would argue, and I apologize for the word salad here, that we've been at war with Iran for 47 years. The first act of this regime, in 1979, was to take 110 Americans hostage. 70% of the IEDs in Iraq were built in Iran.
[01:07:59] We have been at war with Iran for a long time. The question is whether or not this military action, this war, was worth it. And it looks as if it was so poorly executed on a strategic level that it has really diminished our brand. And we're now in the definition of a quagmire. If we leave, we look weak, and any nation we perform military action against just says, start firing missiles at their neighbors. You know who summarized every war for the US right now, including this one? Ho Chi Minh. When he was describing Vietnam, he said, "They will kill a lot of our people. We will kill some of theirs. They will tire. They will go home."
[01:08:38] >> We killed a million people in Vietnam. We lost 58,000, and we left.
[01:08:44] >> Yeah.
[01:08:44] >> Right. So, Americans' tolerance for pain... And I'm not suggesting that we should go on military misadventures. But this right now is no doubt turning out to be a giant win for the IRGC and a pretty severe loss for America, one that further alienates us from our allies and shows kind of our soft tissue around our willingness to actually finish the job.
[01:09:06] The other statement that kind of perfectly summarizes this war is: we broke it, now you fix it. We're leaving the world a much more insecure place, much more economically vulnerable. Gulf states thought that having a US military base was Kevlar, and it ended up just putting a giant bullseye on them. So, people are going to think twice before they cooperate with the US military by hosting bases. We appear to have not thought through getting our expats out of the Gulf, protecting the Gulf states. And I'm biased: I think the Trump administration will be known for criminal corruption and incompetence. And that incompetence is bubbling up.
[01:09:41] >> I watched a press conference of his last night where he said this, which kind of links to what you were saying.
[01:09:47] >> "They're delaying it because we don't know who to deal with. They are, you know... they know who the leader is in this country. We don't know who the leader is in Iran."
[01:09:57] >> And I just thought, do you know what? If I was running Iran, that's exactly what I would want. We don't know who to deal with, Trump; we don't even know who to negotiate with. If he can't negotiate, he can't do anything. So I think the Iranians, potentially the IRGC, are saying, "No, we don't even know who's leading," because then what can you do? You can't just keep bombing. That's not going to work. You also can't negotiate. So it runs the clock down.
[01:10:22] You can't effect regime change from the air. The Iranians are smart. They have distributed power.
[01:10:28] >> Yeah.
[01:10:28] >> So there's no kind of head of the snake to cut off. But also, a lot of this is incompetence on our part. Do you know what we have done in the US to our diplomatic corps? We've absolutely gutted it. When Vance goes to Islamabad thinking he's going to do a deal... if you look at the history of every truce, of every deal, of every summit, 97% of the work is done before they hit the tarmac, by diplomats who go through it line by line and iron things out and talk to each other and negotiate. We've gutted our diplomatic corps. Our counterintelligence operations have been gutted. So we're kind of flying blind, without instruments. And they think that JD Vance going to Islamabad for a TikTok moment, where he tries to look presidential and then blames it on the IRGC... it's just not going to work.
[01:11:16] So I think a lack of investment in key areas, a lack of soft power, whether it's through cutting USAID, a lack of respect for our diplomatic corps and the engine room, like, you know, actually working with people and talking to them... I think this level of incompetence and arrogance is all bubbling up. And you're right, they don't even know who to talk to. That's good. They don't even know who to negotiate with. And by the way, you've got to look at incentives. If I'm the IRGC, I'm like, all we need to do is survive. Yeah. Every day this goes on.
[01:11:48] And by the way, we're in the middle of what I'd call the first AI meme war. And that is, Trump is communicating with Truths on Truth Social and these macho videos and memes of him as Jesus Christ, but he's been bested by kind of the sloppy memes of the IRGC. Have you seen these Lego memes? They understand him better than he understands them. They are outstanding. And it's a group of 18-to-25-year-olds, probably people who look like this crew.
[01:12:20] >> None of the... none of this... what we're involved in...
[01:12:21] Tehran figuring out AI-driven memes: highly produced, highly effective, a very strong message going right after the, you know, portrayals of the Epstein island. And I mean, they are doing a much better job. Because, as Lincoln said, and it goes back to basics, you can't win a war without public support, and you can't lose a war with public support. And right now Iran is winning what you would call the slop war. They're putting out better information and propaganda than us right now.
[01:12:50] >> And how do you predict this ends? Because, as you said, we can't back out of this war as the West. Trump can't back out of it, but he also just can't keep doing it. The midterms are coming up. His approval ratings are, I think, at an all-time low because of this war. His base is turning on him. So he can't... it's like lose-lose at this moment.
[01:13:07] >> Yeah. So, the honest answer is I don't know, cuz he's unpredictable. If I had to pick anything, I think there'll be a multinational force that has a vested interest in ensuring the Strait of Hormuz is open. There are so many things we've taken for granted in the West, and one of them is freedom of navigation. What if Malaysia, Singapore, and Indonesia decide, "We need money or we need geopolitical power; we're going to close the Strait of Malacca or the Strait of Singapore"? What if China closes off certain straits in the South China Sea, where something like 60% of trade goes through? We have always enforced, we've gone to war over, this basic notion of freedom of navigation: that pretty much any ship gets to go pretty much anywhere with pretty much anything on it.
[01:13:44] So I think European nations, Asian nations, all have a vested interest in ensuring the Strait of Hormuz is open. The way to do that is by sequestering, or blocking, the offloading and onloading of oil at Iranian ports, because at some point it not only shuts them off economically... I didn't know this, but within, I think, about 12 weeks, if you can't offload the oil and get it somewhere, it backs up and actually starts to damage the source, the well.
[01:14:14] >> Oh, okay.
[01:14:15] >> It like stops up.
[01:14:18] So I think, ideally, the optimistic scenario is there's a multinational force that enforces the Strait of Hormuz, and he comes up with a series of objectives. If he had outlined at the start, "We're going to take out most of their navy. We're going to give the Iranian people an opportunity to overthrow the IRGC. We're going to take out the majority of their missile capability," and after 72 hours said, "This is a win," and got the hell out, and coordinated with Gulf states to say, "What can we do to ensure your safety?", I think it would have been a win. But now, I don't even know what his objectives are. Can you list what his objectives are? I don't know what they are. They seem to keep...
[01:14:54] >> "We're going to be out very soon, which is why we need to continue this war." The communications and the thinking here have been so sclerotic and difficult to track. I'm like, who's boarding what ships? Is it a ceasefire? Isn't it a ceasefire? He claims that them commandeering two Western vessels is not a violation of the ceasefire. Of course it is. That's actually an act of war. So we don't even know what to look for. You ask me how this turns out? I think a multinational force that opens the Strait of Hormuz. Too many people have too much vested interest in free navigation.
[01:15:25] >> But in such a scenario, they're probably building up drones right now, during a ceasefire, underground. They'll be building drone factories in every bunker in Iran right now. The minute you do a multinational force in the Strait of Hormuz, those drones start hitting ships, and then the big companies say, "I'm not putting my ships through there."
[01:15:40] >> I think that's a fair point, and actually not that many ships have gone down. But it's actually an insurance problem, because if you have 2 million barrels on a tanker, you've got around a $160 million payload. Someone has to take receipt of that. Someone has bought it. So there need to be middlemen in the form of insurance, and no one wants to insure, to take that kind of risk, right now. So it's a lack of insurance more than it is a lack of safety from the threat. But I do think the reality is, with the Iranians, the IRGC, I say the bottom line is you go after the money. You don't bomb civilian infrastructure; that's a war crime. But you make it very difficult for them, and you create a carrot: you can have, you know, free flow of oil. If you look at the Arab Spring, if you look at revolutions, they're usually a function of unemployment or economics. So if you were to inhibit their ability to export oil, that brings them to the table, because the IRGC has been able to survive because millions of families in Iran are dependent upon the IRGC for their check. They control the most profitable, biggest parts of the economy. So if you cut off their economics, or threaten to, I think that brings them to the table. I think the most optimistic scenario is the straits are opened by a multinational force and there's a feeling of safety because it's enforced. Iran agrees to keep it open, and somehow we develop a series of objectives, we claim those boxes have been checked, and we declare victory and get the hell out of Dodge. But every day this goes on, we cede, in my opinion, advantage and power to the IRGC, because they just look like they're winning.
[01:17:11] >> I think it's clear that Trump miscalculated this situation quite horrifically. You don't have to be a genius to look at what he's saying and see the contradictions, the constant contradictions, week over week. And the big one he's now got himself trapped in is that he was handing out a six-week timeline at the start of this, and we're approaching 6 weeks. So if you watch this press conference from yesterday, he is incredibly irritated, because all the journalists are saying, "You said 6 weeks; it's 6 weeks." And for the first time, there's a real, measurable contradiction in what he said at the start versus now. And so if you watch it, he's so agitated. He's calling the reporters a disgrace. I think in this press conference yesterday he personally insulted around 10 journalists.
[01:17:52] >> He's crashing out.
[01:17:53] >> He's crashing out.
[01:17:54] >> Yeah.
[01:17:54] >> Yeah. >> And for me, that was like, "God, this
[01:17:56] >> And for me, that was like, "God, this guy's lost. But I find the most
[01:17:59] guy's lost. But I find the most interesting thing about this, one of the
[01:18:01] interesting thing about this, one of the most interesting thing is someone who
[01:18:02] most interesting thing is someone who follows the markets. If you didn't know
[01:18:05] follows the markets. If you didn't know there was a war in Iran that threatens
[01:18:06] there was a war in Iran that threatens geopolitical,
[01:18:08] geopolitical, I don't know, safety, world oil flows.
[01:18:11] I don't know, safety, world oil flows. >> Yeah.
[01:18:12] >> Yeah. >> If you looked at the market, would you
[01:18:13] >> If you looked at the market, would you know?
[01:18:13] know? >> No.
[01:18:14] >> No. >> The market just hit an all-time high in
[01:18:15] >> The market just hit an all-time high in the US. And again, it goes back to
[01:18:17] the US. And again, it goes back to something very unhealthy. And that is
[01:18:19] something very unhealthy. And that is there's certain elements of our economy,
[01:18:21] there's certain elements of our economy, specifically the wealthy, that have
[01:18:22] specifically the wealthy, that have totally disassociated from
[01:18:24] totally disassociated from >> they can't feel it at the pump. Do you
[01:18:26] >> they can't feel it at the pump. Do you feel it?
[01:18:27] feel it? >> No.
[01:18:28] >> No. >> So 50% of consumer spending in the US is
[01:18:30] >> So 50% of consumer spending in the US is the top 10%. The top 10% don't give a
[01:18:33] the top 10%. The top 10% don't give a if gas is at six bucks a gallon. It
[01:18:35] if gas is at six bucks a gallon. It doesn't matter. Young uh poor people,
[01:18:38] doesn't matter. Young uh poor people, lower-inccome people spend 22% of their
[01:18:40] lower-inccome people spend 22% of their household income on energy. And when
[01:18:43] household income on energy. And when energy prices are going up, it really
[01:18:45] energy prices are going up, it really impacts them. So unfortunately, we've
[01:18:47] impacts them. So unfortunately, we've again outsourced the downside of war to
[01:18:49] again outsourced the downside of war to less wealthy nations who are very oil
[01:18:51] less wealthy nations who are very oil dependent, to the Gulf, which is
[01:18:53] dependent, to the Gulf, which is incurring damage here. But America is
[01:18:56] incurring damage here. But America is somewhat sequestered other than
[01:18:58] somewhat sequestered other than reputational risk in who gets elected
[01:19:00] reputational risk in who gets elected president.
[01:19:02] president. We're somewhat squ again we've
[01:19:04] We're somewhat squ again we've outsourced the pain to military families
[01:19:06] outsourced the pain to military families to people who who spend a
[01:19:08] to people who who spend a disproportionate amount of their income
[01:19:10] disproportionate amount of their income lower income households on energy. But
[01:19:12] lower income households on energy. But it's somewhat dangerous that the most
[01:19:14] it's somewhat dangerous that the most powerful people in the world and to a
[01:19:15] powerful people in the world and to a certain extent the most powerful nation
[01:19:17] certain extent the most powerful nation in the world seems to be unaffected by
[01:19:20] in the world seems to be unaffected by some of these uh by some of this. uh we
[01:19:23] some of these uh by some of this. uh we stocks are at an all-time high.
[01:19:26] stocks are at an all-time high. >> This is something that I've made for
[01:19:28] >> This is something that I've made for you. I've realized that the direio
[01:19:29] you. I've realized that the direio audience are striv
[01:19:33] goals that we want to accomplish. And
[01:19:35] goals that we want to accomplish. And one of the things I've learned is that
[01:19:37] one of the things I've learned is that when you aim at the big big goal, it can
[01:19:40] when you aim at the big big goal, it can feel incredibly psychologically
[01:19:42] feel incredibly psychologically uncomfortable because it's kind of like
[01:19:44] uncomfortable because it's kind of like being stood at the foot of Mount Everest
[01:19:46] being stood at the foot of Mount Everest and looking upwards. The way to
[01:19:47] and looking upwards. The way to accomplish your goals is by breaking
[01:19:49] accomplish your goals is by breaking them down into tiny small steps. And we
[01:19:52] them down into tiny small steps. And we call this in our team the 1%. And
[01:19:54] call this in our team the 1%. And actually this philosophy is highly
[01:19:56] actually this philosophy is highly responsible for much of our success
[01:19:57] responsible for much of our success here. So what we've done so that you at
[01:20:00] here. So what we've done so that you at home can accomplish any big goal that
[01:20:02] home can accomplish any big goal that you have is we've made these 1% diaries
[01:20:05] you have is we've made these 1% diaries and we released these last year and they
[01:20:07] and we released these last year and they all sold out. So I asked my team over
[01:20:09] all sold out. So I asked my team over and over again to bring the diaries back
[01:20:11] and over again to bring the diaries back but also to introduce some new colors
[01:20:12] but also to introduce some new colors and to make some minor tweaks to the
[01:20:14] and to make some minor tweaks to the diary. Now we have a better range for
[01:20:18] diary. Now we have a better range for you. So if you have a big goal in mind
[01:20:20] you. So if you have a big goal in mind and you need a framework and a process
[01:20:22] and you need a framework and a process and some motivation, then I highly
[01:20:25] and some motivation, then I highly recommend you get one of these diaries
[01:20:26] recommend you get one of these diaries before they all sell out once again. And
[01:20:28] before they all sell out once again. And you can get yours at the diary.com.
[01:20:31] you can get yours at the diary.com. And if you want the link, the link is in
[01:20:33] And if you want the link, the link is in the description below.
[01:20:35] On this point of stocks, and tying it back into what we were saying about AI: there's been this overinvestment in artificial intelligence technology companies. As you said earlier, a kid in Stanford right now, or anybody, could raise 10, 20 million for some AI idea they have, and I'm seeing it everywhere. The companies are overinvesting in infrastructure. I mean, it was crazy. I think OpenAI did the biggest ever fundraise. Was it almost $200 billion
>> to build data centers, on revenues of, I don't know, is it $30 billion or something? Craziness is going on in the markets. At some point, things correct?
[01:21:07] >> Oh, 100%. And if you look at the greatest spends on infrastructure, when they get above 2 or 3% of GDP, there's almost always a crash afterwards. It happened in the railroads. It happened in electrification. It happened in the internet. It happened in the huge telco buildout, Global Crossing. Now, having said that, that doesn't mean those companies don't come back at some point, but there's almost always a dip or a correction afterwards. I absolutely was on the wrong end of that dip in 2000, and again in 2008. But if you look at the most valuable companies in the world, all of them have, in a single year, had a 40 to 97% correction. Amazon went down 94, 97% from '99 to 2001. Facebook was off 72% in 2022. The difference now is, if those companies go down, they're such a big part of the market: if they sneeze, the global economy could catch a cold. So I think the technology will absolutely survive. I do think it's seminal. It's a breakthrough. But that doesn't mean that we're not going to incur a pretty massive correction from a stock market perspective.
[01:22:10] Also, let me catastrophize for a moment. Your generation, and the majority of investors right now, are under the impression that any breakthrough in technology results in a small number of companies over time that are able to use IP, distribution, the ability to raise capital, to sequester and capture trillions of dollars in market cap. E-commerce: that's Amazon, eBay, Shopify. Social media: obviously Meta, or Snap, or YouTube. Or AI. The problem is that what we forget is there have been seminal technologies that have not necessarily resulted in any small number of companies being able to capture shareholder value. Let me give you some examples. If someone said to you right now, for the next 36 months you either have to go without AI or jet transportation, what would you pick?
>> Without jet transportation or without AI?
>> Prop planes for the next 3 years, or AI? I would keep the AI.
>> Okay. Hands down, jet transportation. I use AI every day. I invest in AI companies. Jet transportation is much more important to me. I think that skirting along the surface of the atmosphere at eight-tenths the speed of sound has unlocked emotional and financial well-being across the world. It's an unbelievable innovation. If you added up all the shareholder value, the losses and the gains, the profits and the losses right now, as we sit here now, the entire airline and jet manufacturing industry is at break even. It still hasn't made money. If you look at all the airlines that have gone out of business, if you look at the government subsidies of all the jet plane manufacturers, it has been a shitty business. It's at zero.
[01:23:50] PCs. I was on the board of Gateway Computer. I realize that's the weakest flex in the world.
>> I have no idea what that is.
>> You don't know what Gateway is? Oh god, I feel so old. It was this company. Ted Waitt, who was sort of the Michael Dell of his generation, figured out a way to assemble computers in, I think, South Dakota, and it became this hugely valuable company. When I was on the board, it was the second largest computer manufacturer by volume in the world. We were ahead of Apple. We got sold for $600 million, or maybe 760, which is what Alphabet will lose, or gain, in about three trading seconds today. No one's been able to capture a lot of money around PCs. People say Apple. No, it wasn't PCs. It was the iPhone.
[01:24:25] Vaccines. I think vaccines are the second greatest innovation in history, only bested by the American middle class. Millions of lives saved through vaccines. Moderna is down 90%. There's no one company that's been able to capture and sequester shareholder value. So my thesis is that there's a one in three chance that AI becomes as important as vaccines, as important as jet transportation, as important as PCs, but there's no one or small group of companies that are able to capture shareholder value. Why is that? AI puts AI out of business. And that is, if you look at the convergence of the technologies, all the models are converging.
[01:25:04] >> Yeah, the AI reverse-engineers any feature, and basically they all started with this delta and they're all converging towards the same thing. So I wonder if the big winner, or the stakeholder that wins in AI, is us.
>> Yeah.
>> And that is, we have amazing vaccines, PCs, transportation, but a small number of companies haven't become worth trillions of dollars. And I wonder if the same thing might not happen with AI, with open-weight models out of China, with basically great models that are free. My prediction would be to go short the AI ecosystem from a shareholder standpoint, but from a stakeholder standpoint, I think it's going to be great.
[01:25:44] And let me back up. I think a more important technology in terms of how it's going to change the world is not AI. You know, every year I say this is my technology of the year. In '24, I said it was AI. In '25 and '26, I've had the same technology that I said is more important than AI. Any guesses? We talk about this a lot. You would know this technology, or use it. That's a hint.
>> Okay. Testosterone replacement.
>> That hurts, cuz you know I'm on it. That hurts.
>> Okay. No. Um, okay. Ozempic.
>> Okay.
>> GLP-1.
>> GLP-1. Yeah.
>> Talk to somebody who's on GLP-1 and uses AI every day and ask them which one they would give up.
Yeah.
>> I think GLP-1. If you look at what's really going to have an impact on people's lives and what will create more shareholder value, I think it's GLP-1 more than AI. And a lot of people much smarter than me say, you're full of ..., AI is going to change everything. I don't. I think GLP-1 is a more important technology than AI. And my thesis is there's a one in three chance that AI ends up being more like vaccines than e-commerce or social, and that it's going to be impossible for a small number of companies to capture all the shareholder value they're raising money at.
[01:26:55] The other thing, you know, fun to speculate or catastrophize about: if someone were to say the US economy crashed in the next 24 months, or valuations came down, not 30%, but the market dropped 40 or 50%, which has happened before, I wouldn't think it's because of its misadventures in the Middle East. I think there's a decent chance, if I were advising Xi and China, and I saw America as a real adversary and said, we are sick of these guys messing with us, treating us so poorly, these ridiculous tariffs, very difficult to understand what they're thinking or how we deal with them: I would do what I think they're doing, and that is I would engage in modern-day steel dumping. So back, I think it was in the '80s or the '90s, China wanted to ramp up, they wanted to consolidate the global steel market. So they began dumping cheap steel into the US, and US steel manufacturers could not compete. And the idea was: price it below market, consolidate the market, and then you get margin power. That's basically what Amazon and Netflix have done. They've sold you a dollar worth of goods for 80 cents until they wrapped up the market. Then they started raising prices.
[01:27:58] I think China is beginning to engage in what I'll call AI dumping. And that is, they're going to have a series of open-weight models. About a third of corporations now are supposedly using Chinese lightweight open-weight models that are cheaper. If I were Xi, I would just dump cheap AI into the US market. And the moment large corporations start announcing they're not engaging these multi-million dollar site licenses with Anthropic or OpenAI, they're using these inexpensive Chinese models, and the market realizes that there's no way they can justify these incredible valuations, I think the US market crashes, because 40% of the S&P now is directly or tangentially related to this giant bet America is making on AI. The majority of GDP growth over the last two years has come from AI capex. If that slows down, we're immediately in recession.
[01:28:45] >> That's such an interesting idea: that if you're sat in China as a leader now, you go, you know what? Give Americans cheap AI and you'll kneecap their economy.
>> 100%. That's what I would do.
>> It does make sense, right?
>> It makes a lot of sense. I've heard a lot of founders get quite scared that there will be an economic crash in the next 12 or 24 months because of the overinvestment in AI, and investors are going to start to realize that the returns just aren't there for a lot of these companies that have raised at a 100 million valuation on an idea, and so the market will contract. And if the market does contract, what does history tell us that the individual should do? The person listening, they're not the average person in such a market, because they're scared they're going to be laid off. If all this investor money suddenly contracts, investors go risk-off, they might be laid off.
Well, again, Jamie Dimon was asked, "What is the definition of a recession?" And he said: something that happens every seven years. Your generation isn't used to a recession.
>> No.
[01:29:38] >> I was on the board of the New York Times, and within like 60 days, 70% of our revenues went away. 2008, the credit crisis, advertisers. 70, 80% of the New York Times revenue was advertising. You know, in May we're doing X million in revenue.
>> And then you joined the board and then...
>> Yeah. No, I think I joined the board at exactly the wrong time. Quick story: I raised 600 million to become the largest shareholder in the New York Times, and overnight I turned it into 200 million. So, but yeah, that was a learning experience for me. Anyways, just about the time I started having kids. God, that was stressful. Anyway, they basically within 60 days lost 70% of their ad revenue, and we had to go and find a Mexican billionaire to basically bail us out. Your generation really doesn't know what a recession looks like. Like, everything stops. Imagine 70% of your subscription and advertising revenue declines from one month to the next. What would that do to your business?
[01:30:36] >> You'd have to consider letting people go, and you'd have to cut costs. And...
>> But here's the thing: to a certain extent, it's healthy. You start developing all these fatty deposits, and the best time to start a business is coming out of a recession, because people are cheaper, things are cheaper, people have a new way of looking at stuff. And also for your generation, people don't realize, I mean, quite frankly, if I'm a 28-year-old who's talented and doesn't have kids and dogs yet, I think a recession that takes down asset prices might not be the worst thing in the world that happens to your generation. Because when there was the crash of '08, we let the banks fail... I'm sorry, we bailed out the banks, but we didn't bail out the economy. And I was coming into my prime income-earning years. I was in my early 40s, and I was still lucky enough and talented enough to make good money. So what did I get in 2009? Amazon, Apple, and Netflix for 8, 10, and 12 bucks a share. Those companies have 20x'ed.
[01:31:36] >> We don't let anything fail now. We bail out the markets. Where do you and your colleagues find value? Like, what's cheap right now? Where do you find value? And here's the thing. There's two parts of your life from a financial standpoint. There's the investing part of your life and there's harvesting. You're in the investing part of your life. There's a certain advantage to getting to invest when asset prices are low. There's nowhere for you to find value.
>> Yeah.
>> I'm pretty comfortable saying Apple or Nvidia aren't going to 20x from here. Where do you find value? Real estate? What? Brooklyn is $3,000 a square foot. So while you're in the investing part of your life, and you can survive a bit of a shock, I don't know if, quite frankly, a good thing to happen for your generation wouldn't be a correction in asset prices, because the economy has an unbelievable malleability and resilience to reform, reshape, and come back. Recessions usually don't last longer than 18, 24, 36 months. And I think a correction in asset values would be good. But what we've decided, what the leadership of my generation has decided, is we'd rather pull out your credit card and artificially prop up the markets through deficit spending or bailouts. They're talking, we mentioned Spirit Airlines, they're talking about a $500 million government loan to Spirit Airlines to bail them out. That's nothing but you transferring money to me.
>> Because who owns shares in Spirit? People my age. Who's trying to buy for cheap? People your age. It should be allowed to go out of business. It should decline in price. Asset values should go down. When we bailed out every baby boomer owner of a restaurant or a small business in COVID, that just robbed opportunity from the new graduate of a culinary academy who wanted her shot to go buy a restaurant for pennies on the dollar. Everyone in my generation has had those asset dips where, if you were resilient and coming into your prime income-earning years, you could buy assets for a lower price. We've decided that the government is here to bail my generation out and smooth out our assets, make sure that our assets never go down by too much. All that does is rob opportunity from your generation.
[01:33:50] You wrote this book, the MLP profile, the simple formula for success. Brilliant, brilliant book. And linking to what you were just saying there: where are you investing now? Like, if you're a young person that's trying to defend your money, or just really anybody at any age that's trying to find a place to put your money where you'll make a return. I guess there's actually two questions here. There's the investing part, and they're like, "How do I set myself up just to make more money, especially if there's going to be an economic collapse and things might get a little bit uncertain?" I think I heard you say before that fewer people are leaving their jobs, and it's harder for entry-level people to get into newer roles potentially at the moment. So, in such a world where there is uncertainty, and I want to make sure that I don't go broke, and I have $10,000, let's say, and I want to make sure I increase my earning potential, what's your advice for those dual strategies?
[01:34:35] >> Well, I'll tell you what I'm doing, and it's different. I'm at a point where I'm not looking to get rich. I'm looking to not get poor. So I just diversify like crazy. I don't invest more than 3% of my net worth in any one thing.
>> Yeah.
>> And I've been diversifying out of the US market, into the Latin American and European markets, because the reality is nobody knows. The only Kevlar against the unknown, which is everywhere, is diversification. Now, a younger person can take more risks, right? I mean, what am I doing? I'm diversifying, and I'm investing in Pokémon, because I do it with my son and he loves it, and I think collectibles are actually the only place where there's value right now. Everything to me just looks crazy overvalued. I look at everything and think, that's going to get cut in half. Oh, that's going to get cut in half. But you always want to be in the market. It's very hard to be a stock picker.
[01:35:20] So if you're your age: you're a talented entrepreneur, you're investing in yourself. There's no ROI like finding a business you're good at, investing in it, and working your ass off. That's how you get wealthy for young people in this generation. Or you find a company that's going fast, you do well there, you get stock options. But the key is to make sure that a certain amount of your income never comes into your hands. People your age can't save money. If they get $100, they'll spend 105. So the key is to find every matching program or every vehicle or hack to make sure you never see the money, and from an early age it goes into low-cost index funds. Take 30% of your capital and have fun with it. Buy Nvidia, whatever you want where you think you're smarter than everyone else, and then you're going to find out you're not, and the market outperforms you. But how do you get rich? The only answer I have is slowly. And that is, figure out a way to make sure that money every month is invested in low-cost index funds. In terms of trying to pick the next big thing, oh Christ, your guess is as good as mine. I just don't. And anyone who tells you they know doesn't know. So I know how to get you rich. That's the good news. The bad news is the answer is slowly, and it requires some discipline. But people your age, just find a way to start saving: when you're a teenager, 25 bucks a month, then in your 20s, 100, then 500, then a thousand. And regardless of whether you have a platinum record or a bestselling book or a podcast empire, you're going to be fine. But I can't look at a sector and say, "Oh, it's AI." Well, AI is overvalued right now. I don't know. Your job is to find something you're good at, to focus such that you can become great at something which commands margin, to show some discipline, save some money, to diversify like crazy, and then to let time take over.
[01:37:05] Cuz I look at you and I think, "Oh, I'm like Stephen. I'm a young entrepreneur." And then I look in the mirror, and I'm like, "Fuck." And I'm like, "Oh my god, I'm 5 years from death." It goes, but it happened in a blink. So take advantage of that. Because if I gave you a magic box, and said, you're making real money now, if you put $100,000 in a box, by the time you're my age it's going to be worth a million bucks. And imagine that that box is like a second: 100,000 in, a second later it's a million dollars. This to this feels like a second. So how much money would you put in that box?
[01:37:41] >> Oh my gosh.
>> Yeah. So the moment you have some capital, just think about trying, at a young age, to show some discipline and put some money in that box, because compound interest is just... it's staggering, the power of it. But trying to predict where you should invest, other than investing in yourself, additional skills, certifications, investing in relationships, trying to do as many nice, kind things for other people as you can... I think that compounds, especially when you're younger. People remember people who helped them when they were younger and maybe not that powerful. I imagine you're very loyal.
>> The person who gave you 5,000 bucks, do you resent the money they make? They made...
>> I actually emailed him this weekend. He's called Alistair. Thanked him again, sent him a big letter for...
>> ...that, because that person took a chance on you. There was a certain amount of kindness there, right?
>> It wasn't even the 5,000. It was that a smart person said, "I believe in you." And that meant that I could go back to my mother, who was ignoring me, and say, "Look, the smart person believed in me," and therefore it made me believe in myself. That's really the investment they made. That's everything. And they took a chance, and it probably was they wanted a return, but what they wanted to do was help out a young man. Those people...
>> Your opportunity to make those investments as a young person, whether it's helping someone get a job, being kind, a kind word, a kind text, telling them how impressive they are, whatever it might be: that stuff is like investing. It compounds, and you wake up at 50 and you find those relationships are really powerful assets.
>> Mhm.
>> So look, I don't have a silver bullet here: save money, diversify, compound interest, invest in relationships early. Those compound too.
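[Editor's note] The arithmetic behind the "magic box" and the low-cost index fund advice above is easy to make concrete. Below is a minimal illustrative sketch in Python, under my own assumptions rather than figures the speakers gave: a roughly 30-year horizon (about the age gap being described) and a constant 8% nominal annual return, which is simply the rate that turns $100,000 into about $1,000,000 over 30 years.

```python
# Rough sketch of the compounding math discussed in this exchange.
# Assumptions (mine, purely illustrative): ~30-year horizon, constant 8%
# nominal annual return (1.08 ** 30 is roughly 10x, i.e. $100k -> ~$1M).

def grow_lump_sum(amount: float, annual_return: float, years: int) -> float:
    """Future value of a one-time investment compounded annually."""
    return amount * (1 + annual_return) ** years

def grow_monthly(contribution: float, annual_return: float, years: int) -> float:
    """Future value of a fixed monthly contribution with monthly compounding."""
    r = annual_return / 12
    n = years * 12
    return contribution * ((1 + r) ** n - 1) / r

if __name__ == "__main__":
    rate, years = 0.08, 30  # assumed rate and horizon, not a forecast
    print(f"$100,000 once      -> ${grow_lump_sum(100_000, rate, years):,.0f}")
    print(f"$100 every month   -> ${grow_monthly(100, rate, years):,.0f}")
    print(f"$1,000 every month -> ${grow_monthly(1_000, rate, years):,.0f}")
```

The exact long-run return is unknowable in advance; the point of the exchange is that money invested early multiplies many times over by the time it is needed, whatever the precise rate turns out to be.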
[01:39:14] relationships early. Those compound too. >> What are the most important decisions
[01:39:15] >> What are the most important decisions you made that resulted in the biggest
[01:39:17] you made that resulted in the biggest sort of wealth upside for you? And they
[01:39:19] sort of wealth upside for you? And they could be any kind of decision. It's not
[01:39:21] could be any kind of decision. It's not like I invested in this, but just I
[01:39:23] like I invested in this, but just I don't know, a philosophy, a mentality
[01:39:24] don't know, a philosophy, a mentality and approach that when you look back on
[01:39:26] and approach that when you look back on your career, you go, it was that that
[01:39:28] your career, you go, it was that that that was the biggest step change in my
[01:39:30] that was the biggest step change in my money. Well, my superpower is I I I've
[01:39:34] money. Well, my superpower is I I I've gotten shot in the face a couple times
[01:39:35] gotten shot in the face a couple times personally and professionally and I
[01:39:37] personally and professionally and I mourn and I get up and I go try and
[01:39:40] mourn and I get up and I go try and raise money again and start another
[01:39:41] raise money again and start another company. I had an e-commerce incubator.
[01:39:43] company. I had an e-commerce incubator. It was basically out of business 6
[01:39:44] It was basically out of business 6 months from starting it. My e-commerce
[01:39:46] months from starting it. My e-commerce company that went public in 2002 went
[01:39:48] company that went public in 2002 went through restructuring, which is a fancy
[01:39:50] through restructuring, which is a fancy word for bankruptcy. In 2008, I started
[01:39:53] word for bankruptcy. In 2008, I started a video delivery company that went out
[01:39:55] a video delivery company that went out of business. So, I mean, you know,
[01:39:58] of business. So, I mean, you know, people talk about their successes. I
[01:39:59] people talk about their successes. I think generously I'm sort of like three,
[01:40:02] think generously I'm sort of like three, four and two. I've lo I've had more
[01:40:04] four and two. I've lo I've had more losses than wins every time I've been
[01:40:07] losses than wins every time I've been rejected from a school. I had my
[01:40:10] rejected from a school. I had my affections weren't returned from a
[01:40:12] affections weren't returned from a potential romantic partner. I got fired.
[01:40:14] potential romantic partner. I got fired. I had a company go out of business. I
[01:40:16] I had a company go out of business. I had investment. I've always been able
[01:40:18] had investment. I've always been able and I think this is a key skill. I think
[01:40:19] and I think this is a key skill. I think you have to do whatever you need to do.
[01:40:21] you have to do whatever you need to do. Whether it's be around people who care
[01:40:23] Whether it's be around people who care about you, exercise, you have to be able
[01:40:25] about you, exercise, you have to be able to stand in front of a metaphorical
[01:40:26] to stand in front of a metaphorical mirror and go. I can add a lot of value
[01:40:29] mirror and go. I can add a lot of value to a company. I can raise money again
[01:40:32] to a company. I can raise money again and start a business if I need to. I can
[01:40:34] and start a business if I need to. I can make someone really happy.
[01:40:36] make someone really happy. >> That's been my superpower. So the the
[01:40:39] >> That's been my superpower. So the seminal moment in my professional success
[01:40:43] seminal moment in my professional success was a willingness to say, "Okay, I just
[01:40:44] was a willingness to say, "Okay, I just got shot in the face. The whole world
[01:40:46] got shot in the face. The whole world says I'm a failure because my business I
[01:40:47] says I'm a failure because my business I raised a ton of money and I had to call
[01:40:48] raised a ton of money and I had to call my investors and say we're shutting
[01:40:50] my investors and say we're shutting down." That's humiliating. It's public
[01:40:51] down." That's humiliating. It's public failure. I mourn and then I go out and I
[01:40:54] failure. I mourn and then I go out and I raise more money. I try another company
[01:40:56] raise more money. I try another company because only one in seven businesses
[01:40:58] because only one in seven businesses succeed. So I started nine
[01:41:01] succeed. So I started nine and I knew at some point if you work
[01:41:04] and I knew at some point if you work hard enough, I mean you can't guarantee
[01:41:05] hard enough, I mean you can't guarantee success, but so much of it is out of
[01:41:07] success, but so much of it is out of your control. You just want to step up
[01:41:09] your control. You just want to step up to the plate as many times as possible.
[01:41:11] to the plate as many times as possible. You want to have a great swing. You want
[01:41:12] You want to have a great swing. You want to be in shape. You want to be a good
[01:41:13] to be in shape. You want to be a good person. But a lot of it is resilience. I
[01:41:16] person. But a lot of it is resilience. I have a lot of really talented friends
[01:41:19] have a lot of really talented friends who came up through the alternative
[01:41:20] who came up through the alternative investments community. Masters of the
[01:41:22] investments community. Masters of the universe making one, two, three million
[01:41:25] universe making one, two, three million bucks a year working for hedge funds in
[01:41:27] bucks a year working for hedge funds in the '90s and the aughts, and then they went
[01:41:29] the '90s and the aughts, and then they went out on their own, raised a bunch of
[01:41:30] out on their own, raised a bunch of money, hit some bumps and then closed
[01:41:33] money, hit some bumps and then closed their funds or they had a business. I
[01:41:36] their funds or they had a business. I have friends who are entrepreneurs who
[01:41:38] have friends who are entrepreneurs who left a good job and started a business
[01:41:41] left a good job and started a business and they get stuck. These are people
[01:41:43] and they get stuck. These are people who've never known anything but success.
[01:41:45] who've never known anything but success. They got into an Ivy League school. They
[01:41:47] They got into an Ivy League school. They got a great job. Everything has been
[01:41:50] got a great job. Everything has been this. And then they hit a failure and
[01:41:54] this. And then they hit a failure and they just get they lose their mojo. They
[01:41:57] they just get they lose their mojo. They just get stuck. They can't get over it.
[01:41:59] just get stuck. They can't get over it. They lose the confidence to go out and
[01:42:01] They lose the confidence to go out and raise more money. And that's the key.
[01:42:04] raise more money. And that's the key. The key to success is getting shot in
[01:42:06] The key to success is getting shot in the face and then just getting up again.
[01:42:09] the face and then just getting up again. >> I'm 30 years old. How old are you,
[01:42:11] >> I'm 30 years old. How old are you, Scott? I'm 61. Are
[01:42:13] Scott? I'm 61. Are >> you actually 61?
[01:42:14] >> you actually 61? >> No, I'm lying. Yeah, I'm 61. It's the
[01:42:16] >> No, I'm lying. Yeah, I'm 61. It's the testosterone therapy show.
[01:42:18] testosterone therapy show. >> No, I thought I didn't think you were
[01:42:19] >> No, I thought I didn't think you were 61.
[01:42:19] 61. >> I appreciate that.
[01:42:20] >> I appreciate that. >> Um I thought you were
[01:42:22] >> Um I thought you were >> 62 and 61.
[01:42:22] >> 62 and 61. >> 62 in November.
[01:42:23] >> 62 in November. >> If you'd said to me anywhere between
[01:42:27] >> If you'd said to me anywhere between 49 and 61, I think I would have believed
[01:42:31] 49 and 61, I think I would have believed you. Um so
[01:42:33] you. Um so >> I'm sorry. Let me take that out. I'm 49.
[01:42:35] >> I'm sorry. Let me take that out. I'm 49. Stephen,
[01:42:36] Stephen, >> what is it that you know, you're 61, I'm
[01:42:39] >> what is it that you know, you're 61, I'm 33. Yeah. What is it that I don't know
[01:42:41] 33. Yeah. What is it that I don't know as a 30-year-old that I'm going to find
[01:42:43] as a 30-year-old that I'm going to find out in the next 30 years that I should
[01:42:46] out in the next 30 years that I should know? What is the most important thing
[01:42:47] know? What is the most important thing that I should know about the next 30
[01:42:48] that I should know about the next 30 years?
[01:42:50] years? >> Nothing's ever as good or as bad as it
[01:42:52] >> Nothing's ever as good or as bad as it seems. You've raised a ton of money.
[01:42:54] seems. You've raised a ton of money. You're on fire right now.
[01:42:58] >> Be humble
[01:43:00] >> Be humble >> because a lot of this is out of
[01:43:01] >> because a lot of this is out of your control. If there's an AI dump in
[01:43:03] your control. If there's an AI dump in the US or the war gets out of control
[01:43:05] the US or the war gets out of control and the market shuts down,
[01:43:08] and the market shuts down, >> this goes away. Yeah.
[01:43:10] >> this goes away. Yeah. And it's not your fault, but nor is your
[01:43:12] And it's not your fault, but nor is your success. So when things are going really
[01:43:14] success. So when things are going really well, be really humble and invest a lot
[01:43:17] well, be really humble and invest a lot in other people and realize be careful
[01:43:19] in other people and realize be careful not to believe your own press. In 1999,
[01:43:22] not to believe your own press. In 1999, my company was going public. I was
[01:43:23] my company was going public. I was looking at jets. I just thought I was
[01:43:25] looking at jets. I just thought I was the man. I wasn't very kind to my
[01:43:27] the man. I wasn't very kind to my ex-wife. I really thought a lot of
[01:43:30] ex-wife. I really thought a lot of myself. And I I credited my character
[01:43:33] myself. And I I credited my character and my grit for my success without
[01:43:36] and my grit for my success without crediting enough to how lucky I was that
[01:43:39] crediting enough to how lucky I was that I was in the right place at the right
[01:43:41] I was in the right place at the right time. So be humble when you're
[01:43:43] time. So be humble when you're successful. At the same time,
[01:43:45] successful. At the same time, when you fail, and I failed a lot,
[01:43:47] when you fail, and I failed a lot, realize a lot of that isn't your fault,
[01:43:49] realize a lot of that isn't your fault, too. And forgive yourself and move on.
[01:43:53] too. And forgive yourself and move on. So a lot of your success and a lot of
[01:43:54] So a lot of your success and a lot of your failure isn't your fault. And when
[01:43:56] your failure isn't your fault. And when you look back on your life, when you
[01:43:58] you look back on your life, when you look back on the biggest
[01:43:59] look back on the biggest disappointments, and my sense is I don't
[01:44:01] disappointments, and my sense is I don't know much about you, but my sense is you
[01:44:02] know much about you, but my sense is you haven't had a lot of tragedy or a real
[01:44:04] haven't had a lot of tragedy or a real disappointment, at least professionally,
[01:44:05] disappointment, at least professionally, it feels like it's been upward for you.
[01:44:07] it feels like it's been upward for you. When you look back on the most
[01:44:09] When you look back on the most disappointing things that happened to
[01:44:10] disappointing things that happened to you in this decade, you're not going to
[01:44:13] you in this decade, you're not going to be disappointed about the thing that
[01:44:14] be disappointed about the thing that happened. You're going to be
[01:44:15] happened. You're going to be disappointed about how upset you were. >> Mhm.
[01:44:17] disappointed about how upset you were. >> Mhm. >> So forgive yourself and realize this
[01:44:20] >> so forgive yourself and realize this will pass because at the end of life
[01:44:23] will pass because at the end of life people wish they'd allowed themselves to
[01:44:25] people wish they'd allowed themselves to be happier. They think I wish I had not
[01:44:29] be happier. They think I wish I had not beaten myself up over that breakup, over
[01:44:32] beaten myself up over that breakup, over that loss of job, over that loss of
[01:44:34] that loss of job, over that loss of money. They're upset about how upset
[01:44:36] money. They're upset about how upset they were, not about what actually
[01:44:37] they were, not about what actually happened. So nothing's ever as good or as
[01:44:39] happened. So nothing's ever as good or as bad as it seems. And to really try and
[01:44:44] bad as it seems. And to really try and embrace, and I didn't do this, embrace
[01:44:46] embrace, and I didn't do this, embrace relationships. And that is, I got on a
[01:44:49] relationships and that is I got on a hamster wheel. I just never had enough
[01:44:51] hamster wheel. I just never had enough money. I I was my whole identity was
[01:44:54] money. I I was my whole identity was professional and from like the age of 25
[01:44:56] professional and from like the age of 25 to 45 I don't think I established many
[01:44:59] to 45 I don't think I established many healthy relationships. I just didn't
[01:45:00] healthy relationships. I just didn't invest in them. I was always good with
[01:45:02] invest in them. I was always good with friends. I always stayed in contact. But
[01:45:05] friends. I always stayed in contact. But I kind of woke up in my early 40s. I
[01:45:08] I kind of woke up in my early 40s. I just decided I wanted to move to New
[01:45:09] just decided I wanted to move to New York and not have any obligations. I got
[01:45:12] York and not have any obligations. I got divorced. I wouldn't say I left my friends,
[01:45:15] divorced. I wouldn't say I left my friends, but I had enough money. I was living
[01:45:16] but I had enough money. I was living like a caveman, just occasionally
[01:45:18] like a caveman, just occasionally leaving for food or alcohol or a good
[01:45:20] leaving for food or alcohol or a good time. And I liked having absolutely no
[01:45:23] time. And I liked having absolutely no obligations and no connection to
[01:45:24] obligations and no connection to anybody. I actually enjoyed it. And then
[01:45:26] anybody. I actually enjoyed it. And then I realized I started reading a lot of
[01:45:28] I realized I started reading a lot of literature on happiness and longevity.
[01:45:30] literature on happiness and longevity. And I realized if I do this, I'll be
[01:45:31] And I realized if I do this, I'll be dead at 55 because I don't have any
[01:45:34] dead at 55 because I don't have any connection to anybody. And so I started
[01:45:36] connection to anybody. And so I started investing in relationships, started
[01:45:39] investing in relationships, started making an effort to reinvigorate
[01:45:41] making an effort to reinvigorate friendships,
[01:45:42] friendships, really started to think
[01:45:44] really started to think thoughtfully about wanting to be around
[01:45:46] thoughtfully about wanting to be around people a lot more. I was becoming very
[01:45:48] people a lot more. I was becoming very much an introvert and isolated. I mean,
[01:45:50] much an introvert and isolated. I mean, and I was like a poorer version of
[01:45:52] and I was like a poorer version of Howard Hughes at some point, but so I
[01:45:54] Howard Hughes at some point, but so I would say really focus on as much as you
[01:45:57] would say really focus on as much as you can relationships and investing in them
[01:46:00] can relationships and investing in them uh in your 30s. I think it's super
[01:46:02] uh in your 30s. I think it's super rewarding or it pays off. But more than
[01:46:04] rewarding or it pays off. But more than anything, just forgive yourself
[01:46:06] anything, just forgive yourself >> and become a dad.
[01:46:08] >> and become a dad. >> Well, look, I didn't want to have kids.
[01:46:11] >> Well, look, I didn't want to have kids. So, and I think you can be happy without
[01:46:13] So, and I think you can be happy without kids. I kind of got forced into it. I
[01:46:16] kids. I kind of got forced into it. I was with someone of much higher
[01:46:19] was with someone of much higher character, much more attractive than me,
[01:46:21] character, much more attractive than me, and she decided she wanted to have kids,
[01:46:22] and she decided she wanted to have kids, and I said, "Well, I'm never getting
[01:46:23] and I said, "Well, I'm never getting married again." And she called my bluff
[01:46:25] married again." And she called my bluff and said, "We don't have to be married."
[01:46:27] and said, "We don't have to be married." So, I'm Anyways, so I didn't want kids.
[01:46:32] So, I'm Anyways, so I didn't want kids. Uh, but now, and this goes back to
[01:46:35] Uh, but now, and this goes back to purpose. Everyone talks about finding a
[01:46:36] purpose. Everyone talks about finding a purpose. And the way I approach life,
[01:46:38] purpose. And the way I approach life, which was really screwed up, is I
[01:46:39] which was really screwed up, is I approached it from a capitalist
[01:46:40] approached it from a capitalist standpoint. I want more out of this
[01:46:42] standpoint. I want more out of this friendship than I'm getting. I want more
[01:46:44] friendship than I'm getting. I want more out of this investment than I invested.
[01:46:45] out of this investment than I invested. I want to pay this employee less than
[01:46:48] I want to pay this employee less than the value they're creating. I saw
[01:46:49] the value they're creating. I saw everything as a transaction that I
[01:46:50] everything as a transaction that I wanted to be on the right side of. And
[01:46:52] wanted to be on the right side of. And what I figured out is that finding your
[01:46:55] what I figured out is that finding your purpose is finding that thing that you
[01:46:57] purpose is finding that thing that you can never get a a real positive return
[01:47:01] can never get a a real positive return on. What do I mean by that? I will never
[01:47:03] on. What do I mean by that? I will never get a positive return for my children.
[01:47:05] get a positive return for my children. There's they're never going to be up at
[01:47:07] There's they're never going to be up at 2 a.m. worried about me, right? I I they
[01:47:11] 2 a.m. worried about me, right? I I they are costing me so much money. I don't
[01:47:12] are costing me so much money. I don't care how nice the senior home is that
[01:47:14] care how nice the senior home is that they put me in. The amount of love,
[01:47:16] they put me in. The amount of love, concern, and anxiety I feel over my
[01:47:18] concern, and anxiety I feel over my boys, it would be almost impossible for
[01:47:21] boys, it would be almost impossible for them to return that. And that's the
[01:47:24] them to return that. And that's the point. I finally have something where I
[01:47:26] point. I finally have something where I feel like that sense of purpose. The
[01:47:28] feel like that sense of purpose. The most loyal, proud Americans are veterans
[01:47:31] most loyal, proud Americans are veterans because there's no way America can
[01:47:32] because there's no way America can really pay them back for leaving their
[01:47:34] really pay them back for leaving their families and risking their person.
[01:47:37] families and risking their person. That's an unparalleled investment.
[01:47:38] That's an unparalleled investment. They'll never get that investment back,
[01:47:40] They'll never get that investment back, but what they get back is purpose. And I
[01:47:42] but what they get back is purpose. And I didn't realize that finding something
[01:47:44] didn't realize that finding something you were so passionate about, whether it
[01:47:46] you were so passionate about, whether it was a nonprofit or saving dogs from kill
[01:47:49] was a nonprofit or saving dogs from kill shelters or getting involved in a
[01:47:51] shelters or getting involved in a woman's right to have bodily autonomy
[01:47:53] woman's right to have bodily autonomy where you give so much that there's just
[01:47:56] where you give so much that there's just no way you'll ever really get that
[01:47:57] no way you'll ever really get that kind of tangible ROI. And what I
[01:48:00] kind of tangible ROI. And what I realized is when you find that, that's
[01:48:01] realized is when you find that, that's your purpose. And I didn't realize that.
[01:48:04] your purpose. And I didn't realize that. So having kids for me has given me
[01:48:07] So having kids for me has given me purpose. Um I enjoy it, but that's my
[01:48:10] purpose. Um I enjoy it, but that's my thing now. I realized that's my job is
[01:48:13] thing now. I realized that's my job: to overinvest and not to measure it,
[01:48:15] to overinvest and not to measure it, because I'm just not going to get that
[01:48:16] because I'm just not going to get that back. So for me, having kids has been my
[01:48:19] back. So for me, having kids has been my purpose. Also, building something with a
[01:48:22] purpose. Also, building something with a partner is really rewarding. It's like
[01:48:25] partner is really rewarding. It's like you travel a lot. Inevitably, I don't
[01:48:27] you travel a lot. Inevitably, I don't know if this happens to you. I always
[01:48:28] know if this happens to you. I always get upgraded to the presidential suite
[01:48:30] get upgraded to the presidential suite when I'm alone. And it's like it doesn't
[01:48:32] when I'm alone. And it's like it doesn't happen. If you don't have someone to
[01:48:34] happen. If you don't have someone to share with, it doesn't happen. And the
[01:48:37] share with, it doesn't happen. And the most rewarding businesses, the most
[01:48:38] most rewarding businesses, the most rewarding things in my life have been
[01:48:40] rewarding things in my life have been when you build something with a
[01:48:41] when you build something with a co-founder or you build something with a
[01:48:43] co-founder or you build something with a partner. And when you have kids who go
[01:48:45] partner. And when you have kids who go up and down in terms of their health and
[01:48:48] up and down in terms of their health and well-being and they end up being good
[01:48:49] well-being and they end up being good kids, which my kids are, and you've
[01:48:52] kids, which my kids are, and you've built that with somebody, that's
[01:48:54] built that with somebody, that's immensely rewarding. It's like the most
[01:48:56] immensely rewarding. It's like the most rewarding businesses I've ever had have
[01:48:58] rewarding businesses I've ever had have been with co-founders where we build
[01:49:00] been with co-founders where we build something together and it pays off. It's
[01:49:02] something together and it pays off. It's just so rewarding. And when you don't
[01:49:04] just so rewarding. And when you don't build something with someone else, when
[01:49:05] build something with someone else, when you build it on your own, it's like
[01:49:07] you build it on your own, it's like getting upgraded without anyone there.
[01:49:10] getting upgraded without anyone there. It's like it didn't happen.
[01:49:11] It's like it didn't happen. >> So, so true.
[01:49:13] >> So, so true. >> Yeah.
[01:49:13] >> Yeah. >> What I tell any man, there's never a
[01:49:16] >> What I tell any man, there's never a good time to have kids. You want to be
[01:49:18] good time to have kids. You want to be somewhat financially stable. That was my
[01:49:20] somewhat financially stable. That was my problem. The biggest source of stress in
[01:49:21] problem. The biggest source of stress in my life was when I had kids in '08 when
[01:49:24] my life was when I had kids in '08 when I went broke. That was hugely stressful.
[01:49:27] I went broke. That was hugely stressful. And I think you want to make sure you
[01:49:29] And I think you want to make sure you have a not only a partner you're
[01:49:30] have a not only a partner you're committed to, but I didn't realize how
[01:49:31] committed to, but I didn't realize how important competence was in a partner
[01:49:33] important competence was in a partner because once you have kids, it's like
[01:49:35] because once you have kids, it's like operating a panzer tank division.
[01:49:36] operating a panzer tank division. There's a lot of moving parts and you
[01:49:38] There's a lot of moving parts and you have to have a competent partner. But I
[01:49:40] have to have a competent partner. But I think if you're economically somewhat
[01:49:42] think if you're economically somewhat secure and you have a competent partner,
[01:49:44] secure and you have a competent partner, for me, I'm not going to tell anyone
[01:49:47] for me, I'm not going to tell anyone what to do. Having kids has given me my
[01:49:50] what to do. Having kids has given me my purpose. Hands down the most rewarding
[01:49:52] purpose. Hands down the most rewarding thing I will ever do. That is that is
[01:49:54] thing I will ever do. That is that is the last thing I I will think about when
[01:49:55] the last thing I I will think about when I die. I see so much emotion in you when
[01:49:58] I die. I see so much emotion in you when you talk about it. Your eyes filled with
[01:50:01] you talk about it. Your eyes filled with with tears as you started talking about
[01:50:02] with tears as you started talking about your boys.
[01:50:03] your boys. >> Yeah.
[01:50:04] >> Yeah. >> Because I know a lot about you because
[01:50:05] >> Because I know a lot about you because I've interviewed you a couple of times
[01:50:06] I've interviewed you a couple of times now and I've read your books and so on
[01:50:07] now and I've read your books and so on and it's um I find that fascinating
[01:50:09] and it's um I find that fascinating because I don't know what it is. I don't
[01:50:11] because I don't know what it is. I don't know what where where that emotion comes
[01:50:13] know what where where that emotion comes from because lots of people sit here and
[01:50:14] from because lots of people sit here and talk about their kids,
[01:50:15] talk about their kids, >> right?
[01:50:15] >> right? >> But when you the minute you started
[01:50:16] >> But when you the minute you started mentioning them and the impact they've
[01:50:18] mentioning them and the impact they've had on you in terms of find helping you
[01:50:19] had on you in terms of find helping you find your purpose, I could see the
[01:50:20] find your purpose, I could see the emotion in your eyes.
[01:50:21] emotion in your eyes. >> Yeah. Well, there's and there's some
[01:50:23] >> Yeah. Well, there's and there's some things that surprised me.
[01:50:26] things that surprised me. one,
[01:50:28] one, I didn't fall in love with my kid. I I
[01:50:29] I didn't fall in love with my kid. I I didn't love my kids when they first came
[01:50:31] didn't love my kids when they first came out into the world. You know, I felt a
[01:50:33] out into the world. You know, I felt a sense of responsibility and anxiety. I
[01:50:35] sense of responsibility and anxiety. I did not like having babies. I was able
[01:50:38] did not like having babies. I was able to be selfish up to that point. I didn't
[01:50:39] to be selfish up to that point. I didn't feel this immense level of love and
[01:50:41] feel this immense level of love and support that you were supposed to. I
[01:50:44] support that you were supposed to. I fell in love with my boys. It was a
[01:50:46] fell in love with my boys. It was a sort of slow incremental
[01:50:48] sort of slow incremental incremental thing. And like I mean for
[01:50:52] incremental thing. And like I mean for me and I think everyone's a little bit
[01:50:54] me and I think everyone's a little bit different. The reason I'm just
[01:50:56] different. The reason I'm just fascinated and obsessed with my older
[01:50:58] fascinated and obsessed with my older one is he's me. I look at him you know
[01:51:03] one is he's me. I look at him you know the bad skin he had in his junior year.
[01:51:05] the bad skin he had in his junior year. He's scrawny, 6'1", 130 pounds. The way he
[01:51:10] He's scrawny, 6'1", 130 pounds. The way he laughs, I'm like, when I hug him I'm
[01:51:12] laughs I'm like when I hug him I'm hugging me at 17 or 18. It's like so
[01:51:14] hugging me at 17 or 18. It's like so rewarding and that's why I love him so
[01:51:17] rewarding and that's why I love him so much. My second is a different species
[01:51:19] much. My second is a different species than me. I observe him with fascination
[01:51:22] than me. I observe him with fascination because I can't get over the fact that I
[01:51:25] because I can't get over the fact that I made that because he is so different
[01:51:27] made that because he is so different than me. So I'm fascinated by him in
[01:51:30] than me. So I'm fascinated by him in just such a different way. Right? My
[01:51:33] just such a different way. Right? My oldest is a pleaser who used to come into
[01:51:35] oldest is a pleaser who used to come into the bedroom on a Sunday and get in bed
[01:51:37] the bedroom on a Sunday and get in bed with us and then stand up and say,
[01:51:39] with us and then stand up and say, "Let's make a plan." And I'd be like,
[01:51:41] "Let's make a plan." And I'd be like, "Where are the cameras? This is like a
[01:51:42] "Where are the cameras? This is like a hall this like about Hallmark film,
[01:51:45] hall this like about Hallmark film, right? My youngest is a terrorist
[01:51:47] right? My youngest is a terrorist assessing the household for
[01:51:48] assessing the household for vulnerabilities so he can strike us at
[01:51:50] vulnerabilities so he can strike us at our weakest. The one recommendation
[01:51:53] our weakest. The one recommendation I make to anyone who has a kid: have two,
[01:51:55] I make to anyone who has a kid: have two, because it's fascinating. The only thing
[01:51:57] because it's fascinating. The only thing I can guarantee you when I meet someone
[01:51:58] I can guarantee you when I meet someone with one kid I'm like okay the only
[01:52:01] with one kid I'm like okay the only thing I know about the second is it will
[01:52:03] thing I know about the second is it will be entirely different. If you want to
[01:52:06] be entirely different. If you want to believe in nature over nurture have two
[01:52:08] believe in nature over nurture have two kids cuz they just could not be more
[01:52:11] kids cuz they just could not be more different. And it's fascinating.
[01:52:14] different. And it's fascinating. We just haven't treated our boys that
[01:52:16] We just haven't treated our boys that differently. I don't We just haven't.
[01:52:17] differently. I don't We just haven't. Birth order, whatever. There's something
[01:52:19] Birth order, whatever. There's something in the batter. First Lady Obama,
[01:52:21] in the batter. First Lady Obama, Michelle Obama said something that
[01:52:23] Michelle Obama said something that really stuck with me. She said, "They
[01:52:25] really stuck with me. She said, "They come to you." You know, I forget. I
[01:52:27] come to you." You know, I forget. I think you've had that guy on the the
[01:52:29] think you've had that guy on the the child psychologist guy.
[01:52:32] child psychologist guy. We think we're engineers and that we get
[01:52:34] We think we're engineers and that we get to make the sheep. We don't. We're
[01:52:36] to make the sheep. We don't. We're shepherds. We get to point them in the
[01:52:37] shepherds. We get to point them in the right direction. We get to pick what
[01:52:38] right direction. We get to pick what food they have. But they come to you. Oh
[01:52:41] food they have. But they come to you. Oh my god. I've seen that. And I I like to
[01:52:44] my god. I've seen that. And I I like to think I always thought that my kids were
[01:52:47] think I always thought that my kids were going to be super into World War II
[01:52:49] going to be super into World War II movies and CrossFit because that's what
[01:52:50] movies and CrossFit because that's what dad is into and they just want to hang
[01:52:51] dad is into and they just want to hang out with me. No, no, no. The reason I'm
[01:52:53] out with me. No, no, no. The reason I'm up at night looking at Pokemon cards is
[01:52:55] up at night looking at Pokemon cards is because my kid is interested in it. And
[01:52:57] because my kid is interested in it. And what you realize as a dad is if you want
[01:52:59] what you realize as a dad is if you want to be a good dad, you have to lean into
[01:53:01] to be a good dad, you have to lean into their interests. I'm not fascinated by
[01:53:03] their interests. I'm not fascinated by Pokemon. That's not something I will do
[01:53:04] Pokemon. That's not something I will do once my kid is out of the house. But
[01:53:07] once my kid is out of the house. But what you realize is you have to engage
[01:53:08] what you realize is you have to engage in their interests and get into it. And
[01:53:11] in their interests and get into it. And it took me a while because I was very
[01:53:13] it took me a while because I was very selfish to fully embrace that. I hated
[01:53:16] selfish to fully embrace that. I hated giving up my weekends. I liked having
[01:53:18] giving up my weekends. I liked having fabulous brunch with interesting people.
[01:53:20] fabulous brunch with interesting people. I liked going to St. Barts. All that
[01:53:22] I liked going to St. Barts. All that went away and I resented it for a while.
[01:53:25] went away and I resented it for a while. And then there's a certain ease and
[01:53:27] And then there's a certain ease and relaxation that overwhelms you where
[01:53:28] relaxation that overwhelms you where you're like, I'm not trying to be
[01:53:30] you're like, I'm not trying to be fabulous. I know what I'm doing this
[01:53:31] fabulous. I know what I'm doing this weekend. I'm going to some lame birthday
[01:53:32] weekend. I'm going to some lame birthday party for four-year-olds and then I'm
[01:53:34] party for four-year-olds and then I'm taking my kid to the soccer game. And
[01:53:36] taking my kid to the soccer game. And there's a certain ease about it that's
[01:53:38] there's a certain ease about it that's relaxing and liberating. Building some
[01:53:41] relaxing and liberating. Building some something with somebody, seeing the way
[01:53:43] something with somebody, seeing the way they evolve is just fascinating. Just
[01:53:46] they evolve is just fascinating. Just fascinating the things they start asking
[01:53:47] fascinating the things they start asking you and you find your purpose. So I
[01:53:51] you and you find your purpose. So I would recommend it to anybody, but I
[01:53:53] would recommend it to anybody, but I also want to be clear, it's not
[01:53:55] also want to be clear, it's not one-size-fits-all. Some people can be very
[01:53:56] one-size-fits-all. Some people can be very happy not having it. So many of the
[01:53:58] happy not having it. So many of the themes we've talked about today,
[01:53:59] themes we've talked about today, especially at the end of this
[01:54:00] especially at the end of this conversation, are in your new book,
[01:54:02] conversation, are in your new book, Notes on Being a Man: How to Address the
[01:54:04] Notes on Being a Man: How to Address the Masculinity Crisis: Build Mental
[01:54:05] Masculinity Crisis: Build Mental Strength and Raise Concerns. I've had so
[01:54:07] Strength and Raise Concerns. I've had so many of my friends, actually, both men
[01:54:09] many of my friends, actually, both men and women, talk to me about this book
[01:54:11] and women, talk to me about this book because I think so many of us have been
[01:54:12] because I think so many of us have been looking for a road map on exactly these
[01:54:14] looking for a road map on exactly these subjects, notes on being a man. Some of
[01:54:17] subjects, notes on being a man. Some of the themes are here on the back about
[01:54:19] the themes are here on the back about being kind, about being a good dad, and
[01:54:21] being kind, about being a good dad, and what that means. Um, but a lot of
[01:54:23] what that means. Um, but a lot of the stuff you said about getting out of
[01:54:24] the stuff you said about getting out of the house and really attacking the world
[01:54:25] the house and really attacking the world and who one should become in such a
[01:54:27] and who one should become in such a world. So, I highly recommend anybody
[01:54:28] world. So, I highly recommend anybody who is raising sons, who is a son, who
[01:54:30] who is raising sons, who is a son, who is a father, or just wants to understand
[01:54:32] is a father, or just wants to understand what it is to be a good man in the
[01:54:34] what it is to be a good man in the modern world gets this book. And I'm
[01:54:35] modern world gets this book. And I'm going to link it below.
[01:54:36] going to link it below. >> Thank you.
[01:54:37] >> Thank you. >> All of you. It's been such a smash hit
[01:54:39] >> All of you. It's been such a smash hit, and it's really rare that some
[01:54:40] and it's really rare that some of my close friends will send me
[01:54:42] of my close friends will send me messages (I'm thinking of one particular
[01:54:43] messages (I'm thinking of one particular friend) about this book, and I think it's
[01:54:45] friend about this book and I think it's a testament to the impact it's having on
[01:54:47] a testament to the impact it's having on the world. So, thank you for writing
[01:54:48] the world. So, thank you for writing this. I know you took a long time to
[01:54:50] this. I know you took a long time to write this book because we talked about
[01:54:51] write this book because we talked about it a couple of times when we had
[01:54:52] it a couple of times when we had conversations, but it is finally here.
[01:54:55] conversations, but it is finally here. >> Who is it for? I write with only one
[01:54:58] >> Who is it for? I write with only one objective. I hope my sons read my stuff
[01:55:00] objective. I hope my sons read my stuff in 30 or 40 years and feel like they
[01:55:03] in 30 or 40 years and feel like they understand me and the world a little bit
[01:55:05] understand me and the world a little bit better. That's it. And I tried I use
[01:55:07] better. That's it. And I tried I use that as a means of trying to be fearless
[01:55:08] that as a means of trying to be fearless because there's so many comments and so
[01:55:11] because there's so many comments and such a narrative out there
[01:55:13] such a narrative out there trying to shape your views around the
[01:55:14] trying to shape your views around the orthodoxy that whatever your political
[01:55:17] orthodoxy that whatever your political leanings are, it's pretty easy to be
[01:55:18] leanings are, it's pretty easy to be intimidated into having a certain
[01:55:20] intimidated into having a certain narrative. So I try to write as if no
[01:55:22] narrative. So I try to write as if no one's going to read it but my sons in 20
[01:55:23] one's going to read it but my sons in 20 or 30 years. We have a closing tradition
[01:55:25] or 30 years. We have a closing tradition where the last guest leaves a question
[01:55:26] where the last guest leaves a question for the next not knowing who they're
[01:55:27] for the next not knowing who they're leaving it for. The question left for
[01:55:29] leaving it for. The question left for you is in one sentence, what is the most
[01:55:31] you is in one sentence, what is the most challenging setback you've experienced?
[01:55:32] challenging setback you've experienced? And what's the lesson you want to pass
[01:55:34] And what's the lesson you want to pass on to others?
[01:55:36] on to others? That's easy. My mother dying and you can
[01:55:40] That's easy. My mother dying and you can never tell your parents how much you
[01:55:42] never tell your parents how much you love them too much. Forgive them. And
[01:55:50] my mom died slowly, which is bad for
[01:55:52] my mom died slowly, which is bad for her, but it was good for me cuz nothing
[01:55:54] her, but it was good for me cuz nothing went outside.
[01:55:59] >> What What is What is the emotion you're
[01:56:01] >> What What is What is the emotion you're experiencing?
[01:56:01] experiencing? >> I miss my mom terribly. I'm I'm a
[01:56:04] >> I miss my mom terribly. I'm I'm a middle-aged man who hasn't gotten over
[01:56:06] middle-aged man who hasn't gotten over the death of his mother. Light of my life.
[01:56:14] Raised me on her own on a secretary's salary.
[01:56:18] Raised me on her own on a secretary's salary. Gave me confidence,
[01:56:22] everything.
[01:56:26] >> Is there a way to You said you're a
[01:56:28] >> Is there a way to You said you're a middle-aged man that hasn't gotten over
[01:56:29] middle-aged man that hasn't gotten over the loss of his mother. Is there a way
[01:56:31] the loss of his mother. Is there a way to
[01:56:31] to >> I don't want to. I think the receipts
[01:56:35] >> I don't want to. I think the receipts for love are grief. I hope my boys feel
[01:56:37] for love is grief. I hope my boys feel the same way about me. Hasn't got in the
[01:56:39] the same way about me. Hasn't got in the way of my life.
[01:56:41] way of my life. Uh, it makes me more bold with my
[01:56:44] Uh, it makes me more bold with my emotions.
[01:56:46] emotions. Yeah. I don't I see it I used to see it
[01:56:49] Yeah. I don't I see it I used to see it as a problem. I went to grief
[01:56:50] as a problem. I went to grief counseling. Now I see it not as a bug
[01:56:53] counseling. Now I see it not as a bug but as a feature.
[01:56:55] but as a feature. The receipts for love are grief
[01:56:58] The receipts for love are grief and anxiety. And so what I would tell
[01:57:00] and anxiety. And so what I would tell every young person is I I hope they have
[01:57:03] every young person is I I hope they have a lot of joy in their life. I also hope
[01:57:04] a lot of joy in their life. I also hope they have a decent amount of grief
[01:57:06] they have a decent amount of grief because that means they have people they
[01:57:07] because that means they have people they loved immensely.
[01:57:11] Amen.
[01:57:13] Amen. >> Thank you, Scott.
[01:57:13] >> Thank you, Scott. >> Thank you, Steve. Good to see you.
[01:57:15] >> Thank you, Steve. Good to see you. Congratulations on your burgeoning
[01:57:17] Congratulations on your burgeoning empire.
[01:57:19] empire. >> Jesus Christ, man. This is out of
[01:57:21] >> Jesus Christ, man. This is out of control.
[01:57:24] control. >> You say this every time.
[01:57:25] >> You say this every time. >> Every time I come here, I'm like, "Where
[01:57:27] >> Every time I come here, I'm like, "Where are the helicopters?"
[01:57:30] are the helicopters?" Is is it's like pretty soon I'm going to
[01:57:32] Is is it's like pretty soon I'm going to see like a nuclear tower power plant
[01:57:34] see like a nuclear tower power plant towering next to It's just amazing. and
[01:57:37] towering next to It's just amazing. and you're you're like everywhere. At one
[01:57:40] you're you're like everywhere. At one point at one point I was on I was I was
[01:57:42] point at one point I was on I was I was at the airport with my boys and I'm
[01:57:44] at the airport with my boys and I'm staring at this like 100 foot like
[01:57:46] staring at this like 100 foot like testimonial deal and I'm like make it
[01:57:48] testimonial deal and I'm like make it stop.
[01:57:51] stop. God
[01:57:52] God Jesus. YouTube have this new crazy
[01:57:55] Jesus. YouTube have this new crazy algorithm where they know exactly what
[01:57:57] algorithm where they know exactly what video you would like to watch next based
[01:57:59] video you would like to watch next based on AI and all of your viewing behavior.
[01:58:01] on AI and all of your viewing behavior. And the algorithm says that this video
[01:58:04] And the algorithm says that this video is the perfect video for you. It's
[01:58:06] is the perfect video for you. It's different for everybody looking right
[01:58:07] different for everybody looking right now.