Jensen Huang: NVIDIA - The $4 Trillion Company & the AI Revolution | Lex Fridman Podcast #494
https://www.youtube.com/watch?v=vif8NQcjVf0
[00:00] The following is a conversation with Jensen Huang, CEO of Nvidia, one of the most important and influential companies in the history of human civilization.
[00:11] Nvidia is the engine powering the AI revolution, and a lot of its success can be directly attributed to Jensen's sheer force of will and his many brilliant bets and decisions as a leader, engineer, and innovator.
[00:26] This is the Lex Fridman podcast.
[00:28] And now, dear friends, here's Jensen Huang.
[00:33] You've propelled Nvidia into a new era in AI, moving beyond its focus on chip-scale design to now rack-scale design.
[00:41] And I think it's fair to say that winning for Nvidia for a long time used to be about building the best GPU possible.
[00:47] And you still do, but now you've expanded that to extreme co-design of GPU, CPU, memory, networking, storage, power, cooling, software, the rack itself, the pod that you've announced, and even the data center.
[01:03] So, let's talk about extreme co-design.
[01:05] What is the hardest part of co-designing a system with that many complex components and design variables?
[01:10] Yeah, thanks for that question.
[01:13] So first of all, the reason why extreme co-design is necessary is because the problem no longer fits inside one computer to be accelerated by one GPU.
[01:24] The problem you're trying to solve is that you would like to go faster than the number of computers you add.
[01:32] So you added, say, 10,000 computers, but you would like it to go a million times faster.
[01:38] Then all of a sudden you have to take the algorithm, break it up, refactor it; you have to shard the pipeline, shard the data, shard the model.
[01:52] Now all of a sudden, when you distribute the problem this way, not just scaling up the problem but distributing it, then everything gets in the way.
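The sharding described here (splitting the pipeline, the data, and the model across machines) can be pictured with a toy sketch; everything in it (the batch, the layer list, the 4x8 weight matrix, the worker count) is hypothetical, just to show the three axes of splitting:

```python
# Toy illustration of the three sharding axes: data, pipeline, and model.
# All names, shapes, and splits here are hypothetical.
batch = list(range(8))                      # 8 training examples
layers = ["embed", "attn", "mlp", "head"]   # 4 pipeline stages
weight = [[1.0] * 8 for _ in range(4)]      # one layer's 4x8 weight matrix
n_workers = 4

# Data parallelism: each worker sees a strided slice of the batch.
data_shards = [batch[i::n_workers] for i in range(n_workers)]

# Pipeline parallelism: each worker owns a contiguous run of layers.
pipeline_shards = [layers[i:i + 1] for i in range(n_workers)]

# Tensor (model) parallelism: each worker holds a column slice of the weights.
tensor_shards = [[row[2 * i:2 * i + 2] for row in weight]
                 for i in range(n_workers)]

print(data_shards[0])      # [0, 4]
print(pipeline_shards[0])  # ['embed']
```

Each axis alone is simple; the difficulty Jensen is pointing at is that real systems combine all three, and every split adds communication between machines.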
[02:05] This is the Amdahl's law problem, where the amount of speedup you get for something depends on how much of the total workload it is.
[02:13] And so if computation represents 50% of the problem and I sped up computation infinitely, like a million times, I only sped up the total workload by a factor of two.
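The law being invoked here is Amdahl's law; a quick sketch of the arithmetic behind the 50% example:

```python
def amdahl_speedup(parallel_fraction: float, factor: float) -> float:
    """Overall speedup when only `parallel_fraction` of the workload
    is accelerated by `factor` (Amdahl's law)."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / factor)

# Speed up 50% of the workload a million times: the overall gain is still ~2x.
print(amdahl_speedup(0.5, 1e6))
# Even an infinite speedup of that half can never exceed 2x.
print(amdahl_speedup(0.5, float("inf")))  # 2.0
```

This is why the rest of the conversation is about attacking every component at once: whatever you leave unaccelerated becomes the cap on the whole system.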
[02:25] Now all of a sudden, not only do you have to distribute the computation, you have to shard the pipeline somehow.
[02:35] You also have to solve the networking problem, because all of these computers are connected together.
[02:42] And so in distributed computing at the scale that we do it, the CPU is a problem, the GPU is a problem, the networking is a problem, the switching is a problem, and distributing the workload across all these computers is a problem.
[02:57] It's just a massively complex computer science problem, and so we've got to bring every technology to bear. Otherwise we scale up linearly, or we scale up based on the capabilities of Moore's law, which has largely slowed because Dennard scaling has slowed.
[03:17] I'm sure there's trade-offs there.
[03:19] Plus you have completely disparate disciplines here. I'm sure you have specialists in each one of these: high-bandwidth memory, the networking, the NVLink, the NICs, the optics and the copper that you're doing, the power delivery, the cooling, all that. I mean, there are world experts in each of those. How do you get them in a room together to figure it out?
[03:35] >> That's why my staff is so large.
[03:37] >> Can you take me through the process of the specialists and the generalists? Like, how do you put together the rack when you know the set of things you have to shove into a rack together? What does that process look like, designing it all together?
[03:51] >> The first question is: what is extreme co-design? We're optimizing across the entire stack, from architectures to chips to systems to system software to the algorithms to the applications. That's one layer.
[04:10] The second thing, which you and I just talked about, goes beyond CPUs and GPUs and networking chips to scale-up switches and scale-out switches.
[04:15] And then of course you've got to include power and cooling and all of that, because all these computers are extremely power hungry.
[04:23] They do a lot of work and they're very energy efficient, but in aggregate they still consume a lot of power.
[04:30] So that's the first question: what is it? The second question is: why is it? We just spoke about the reason: you want to distribute the workload so that you can exceed the benefit of just increasing the number of computers. And the third question is: how do you do it?
[04:51] And that's kind of the miracle of this company. Just as when you're designing a computer you have to have an operating system, when you're designing a company you should first think about what it is that you want the company to produce.
[05:04] You know, I see a lot of companies' organization charts and they all look the same: hamburger organization charts, software organization charts, car company organization charts. They all look the same, and it doesn't make any sense to me. The goal of a company is to be the machinery, the mechanism, the system that produces the output, and that output is the product we'd like to create. Also, the architecture of the company should reflect the environment in which it exists; it almost directly tells you what you should do with the organization.
[05:39] My direct staff is 60 people. I don't have one-on-ones with them because it's impossible; you can't have 60 people on your staff if you're going to get work done.
[05:51] >> So you still have 60 reports. You still have more?
>> Yeah.
[05:56] >> And most of your staff at least have a foot in engineering.
[05:59] >> Almost all of them. There's experts in memory, there's experts in CPUs, there's experts in optical. Yeah: GPUs and architecture, algorithms, design.
[06:13] So you constantly have an eye on the entire stack, and you're having to do intense discussions about the design of the entire stack.
[06:18] And no conversation is ever one person.
[06:21] That's why I don't do one-on-ones.
[06:23] We present a problem and all of us attack it, because we're doing extreme co-design; literally, the company is doing extreme co-design all the time.
[06:33] So even if you're talking about a particular component like cooling, networking, everybody's listening in.
[06:41] Yeah.
[06:42] And they can contribute, well, this doesn't work for the for the power distribution.
[06:46] This doesn't exactly, this doesn't work for the for the memory.
[06:49] This doesn't work for this.
[06:51] Exactly. And whoever wants to tune out, tune out.
[06:54] You know what I'm saying? And the reason for that is because the people who are on the staff know when to pay attention.
[07:01] If there's something they could have contributed to and they didn't, I'm going to call them out: hey, come on, let's get in here.
[07:07] So, as you mentioned, Nvidia is this company that's adapting to the environment.
[07:11] At which point did the environment change, and Nvidia began adapting, sort of secretly in the early days, from GPU for gaming, through the early deep learning revolution, to: we're now going to start thinking of it as an AI factory?
[07:28] What does Nvidia do? It produces AI. Let's build a factory that makes AI.
[07:33] >> I could reason through it just systematically. We started out as an accelerator company, but the problem with accelerators is that the application domain is too narrow.
[07:42] It has the benefit of being incredibly optimized for the job.
[07:45] You know, any specialist has that benefit.
[07:49] The problem with intense specialization is that of course your market reach is narrower, but even that's fine. The problem is the market size also dictates your R&D capacity, and your R&D capacity ultimately dictates the influence and impact that you can possibly have in computing.
[08:15] And so when we first started out as an accelerator, a very specific accelerator, we always knew that was going to be our first step.
[08:21] We had to find a way to become accelerated computing.
[08:26] But the problem is, when you become a computing company, it's too general-purpose and it takes away from your specialization.
[08:32] I connected two words that actually have a fundamental tension: the better a computing company we become, the worse we become as a specialist; the more of a specialist, the less capacity we have to do overall computing.
[08:47] And so I connected those two words together on purpose: the company has to find that really narrow path, step by step by step, to expand our aperture of computing without giving up the most important specialization that we had.
[09:05] Okay.
[09:07] So the first step that we took beyond acceleration was we invented the programmable pixel shader. That was the first step towards programmability, our first journey towards moving into the world of computing.
[09:20] The second thing that we did was we put FP32 into our shaders. That FP32 step, IEEE-compatible FP32, was a huge step in the direction of computing.
[09:34] It was the reason why all of the people who were working on stream processors and other types of dataflow processors discovered us, and they said: hey, all of a sudden we might be able to use these GPUs. They're incredibly computationally intensive, and now they're IEEE-compliant; I can take the software that I was previously writing on CPUs and see about using the GPU for that.
[10:04] Which led us to put C on top of FP32; we called it Cg. That Cg path took us eventually to CUDA, step by step by step.
[10:19] Putting CUDA on GeForce, that was a strategic decision that was very, very hard to do, because it cost the company enormous amounts of our profits and we couldn't afford it at the time. But we did it anyway, because we wanted to be a computing company, and a computing company has a computing architecture. A computing architecture has to be compatible across all of the chips that we build.
[10:42] >> Can you take me through that decision? Putting CUDA on GeForce, which you could not afford to do: why boldly choose to do that anyway?
[10:53] Excellent.
[10:56] >> That was, I would say, the first strategic decision that came as close to an existential threat as any.
[11:04] >> For people who don't know, it turned out to be, spoiler alert, one of the most incredibly brilliant decisions ever made by a company.
[11:12] >> So CUDA turned out to be an incredible foundation for computation in this AI infrastructure world.
[11:25] >> So you're just setting the context. It turned out to be a good decision.
[11:28] >> Yeah, it turned out to have been a good decision.
[11:31] So here's the way it went. We invented this thing called CUDA, and it expanded the aperture of applications that we could accelerate with our accelerator.
[11:39] The question is: how do we attract developers to CUDA? Because a computing platform is all about developers, and developers don't come to a computing platform just because it can perform something interesting.
[12:01] They come to a computing platform because the install base is large.
[12:05] Because a developer like anybody else wants to develop software that reaches a lot of people.
[12:09] So the install base is in fact the single most important part of an architecture.
[12:14] The architecture could attract enormous amounts of criticism. For example, no architecture has ever attracted more criticism than the x86, as a less-than-elegant architecture, and yet it is the defining architecture of today.
[12:33] It gives you an example: in fact, so many RISC architectures, beautifully architected and incredibly well designed by some of the brightest computer scientists in the world, largely failed. So I've given you two examples, where one is elegant and the other is barely aesthetic, and yet x86 survived.
[12:58] Install base is everything. Install base defines an architecture; everything else is secondary.
[13:04] Okay. And so there were other architectures at the time.
[13:07] When CUDA came out, OpenCL was here. There were several other competing architectures, but the good decision we made was that we said: hey, look, ultimately it's about installed base, and what is the best way we could get a new computing architecture into the world?
[13:26] By that time, GeForce had become successful.
[13:31] We were already selling millions and millions of GeForce GPUs a year.
[13:35] And we said, you know, we ought to put CUDA on GeForce and put it into every single PC, whether customers use it or not, and use it as a starting point for cultivating our installed base.
[13:50] Meanwhile, we'll go and attract developers. And we went to universities and wrote books and taught classes and put CUDA everywhere.
[13:59] And eventually people would discover it. At the time, the PC was the primary computing vehicle; there was no cloud. And we could put a supercomputer in the hands of every researcher, every scientist, every engineering student in school, and eventually something amazing would happen.
[14:15] Well, the problem was that CUDA increased our cost of that GPU, which is a consumer product, so tremendously that it completely consumed all of the company's gross profit dollars.
[14:29] And so at the time, the company was probably worth, I don't know, was it like $8 billion, or $6 or $7 billion, something like that.
[14:43] After we launched CUDA, I recognized that it was going to add so much cost, but it was something we believed in.
[14:51] You know, our market cap went down to like $1.5 billion. And so we were down there for a while, and we clawed our way back slowly, but we carried CUDA on GeForce.
[15:05] I always say that Nvidia is the house that GeForce built, because it was GeForce that took CUDA out to everybody.
[15:13] Researchers, scientists: they discovered CUDA on GeForce because many of them were gamers. Many of them built their own PCs anyway in a university lab, and many of them built clusters themselves using PC components. And so that's kind of how we got going.
[15:31] And then that became the platform, the foundation for the deep learning revolution.
[15:35] That was also another great, great observation, yeah.
[15:38] >> That existential moment, do you remember what those meetings were like? What were those discussions like, deciding as a company, risking everything?
[15:48] >> Well, I had to make it clear to the board what we were trying to do.
[15:54] And the management team knew our gross margins were going to get crushed.
[16:00] So you could imagine a world where GeForce would carry the burden of CUDA and none of the gamers would appreciate it.
[16:08] And none of the gamers would pay for it. They only pay a certain price, and it doesn't matter what your cost is.
[16:14] And so, you know, we increased our cost by 50%, and that consumed it; we were a 35% gross-margin company.
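The arithmetic behind that squeeze is worth making explicit; a back-of-the-envelope sketch using the two numbers from the conversation, with the selling price normalized to 100 (the normalization is purely illustrative):

```python
# Back-of-the-envelope margin arithmetic: a 35% gross-margin product
# whose unit cost rises 50% while the selling price stays fixed.
price = 100.0                  # normalize the selling price
cost = price * (1 - 0.35)      # 35% gross margin -> cost of 65
new_cost = cost * 1.5          # CUDA adds ~50% to the cost -> 97.5
new_margin = (price - new_cost) / price
print(f"{new_margin:.1%}")     # gross margin collapses from 35% to 2.5%
```

On these numbers, nearly all of the gross profit disappears, which matches the description of CUDA consuming the company's gross profit dollars.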
[16:25] And so it was quite a difficult decision to make. But you could imagine that someday this could go into workstations and supercomputers, and in those segments maybe we could capture more margin.
[16:38] So you could reason your way into being able to afford this. But it still took a decade.
[16:45] >> But that's more like the conversation with the board, convincing them.
[16:50] What about you, psychologically? Because Nvidia has continued to make bold bets that predict the future and, in part, especially now, define the future.
[17:02] So I'm almost looking for wisdom about how you were able to make those decisions, to make leaps like that as a company.
[17:14] >> Well, first of all, I'm informed by a lot of curiosity.
[17:18] At some point there's a reasoning system that convinces me, so clearly, that this outcome will happen. And so I believe it in my mind.
[17:39] And when I believe it in my mind, you know how it is.
[17:42] You manifest a future.
[17:45] And that future is so convincing, there's no way it won't happen.
[17:48] There's a lot of suffering in in between, but you've got to believe what you believe.
[17:53] >> So you envision the future.
[17:55] Yeah.
[17:57] And you essentially from a sort of engineering perspective manifest it.
[17:59] Yeah.
[18:01] >> And you reason about how to get there. You reason about why it must exist.
[18:08] And, you know, we all reason here. The management team will reason about it; all of us spend a lot of time reasoning about it.
[18:14] The next part of it is probably a skill thing, which is: oftentimes in leadership, the leadership stays quiet, or they learn about something and then they do some manifesto. It's a brand new year, and somehow next year we're going to have a brand new plan, a big huge layoff this way, a big huge organization change this way, a new mission statement, brand new logos, you know, that kind of stuff.
[18:45] I've just never done things that way.
[18:47] When I learn about something and it's starting to influence how I think, I'll make it very clear to everybody near me that, you know, this is interesting.
[18:55] Um, this is going to make a difference.
[18:58] Uh, this is going to impact that.
[19:01] And I reason about things step by step by step.
[19:05] Oftentimes I've already made up my mind, but I'll take every possible opportunity: external information, new insights, new discoveries, new engineering revelations, new milestones. I'll take those opportunities and use them to shape everybody else's belief system, and I'm doing that literally every single day.
[19:30] I'm doing that with my board, I'm doing that with my management team, I'm doing that with my employees. I'm trying to shape their belief system such that, when the day comes that I say, "Hey, let's buy Mellanox,"
[19:43] It's completely obvious to everybody that we absolutely should.
[19:48] On the day that I said, "Hey guys, let's go all in on deep learning, and let me tell you why," I'd already been laying down the bricks in different organizations inside the company.
[20:02] Every organization, many of the people, might have heard everything; most of the company, of course, hears pieces of it. And on the day that I announce it, everybody's kind of bought into many pieces of it. In a lot of ways, I like to announce these things and imagine that the employees are kind of saying, "You know, Jensen, what took you so long?" And in fact, I've been shaping their belief system for some time, and therefore leadership sometimes looks like leading from behind.
[20:39] >> But you've been shaping their belief system.
[20:41] >> To the point where, on the day that I declared it, 100% buy-in. But that's what you want: you want to bring everybody along. Otherwise, we announce something about deep learning and everybody goes, "What are you talking about?" You announce, "Let's go all-in on this thing," and your management team, your board, your employees, your customers are kind of like, "Where's this coming from? This is insane."
[21:05] And so GTC, in fact, if you go back in time and look at the keynotes, I'm also shaping the belief system of my partners and the industry, and I'm using that to shape the belief system of my own employees. And so by the time that I announce something, like, for example, we just announced Grock, I've been talking about the stepping stones for two and a half years. You can just go back: oh my gosh, they've been talking about it for two and a half years. And so I've been laying the foundation step by step by step, so when the time comes and you announce it, everybody's, you know, "What took you so long?"
[21:44] >> But it's not just inside the company. You're shaping the landscape, the broader global landscape of innovation. By putting those ideas out there, you really are manifesting reality.
[21:53] >> We don't build computers. We actually don't build clouds. As it turns out, we're a computing platform company, and so nobody can buy anything from us; that's the weird thing. We vertically integrate to design and optimize, but then we open up the entire platform at every single layer to be integrated into other companies' products and services and clouds and supercomputers and OEM computers. And so the amazing thing is, I can't do what I do without having convinced them first. Most of GTC is about manifesting a future such that, by the time my product is ready, they're going, "What took you so long?" Yeah.
[22:40] >> So one of the things you've been a believer in for a long time is scaling laws, broadly defined. Are you still a believer in the scaling laws?
[22:49] >> Yeah, we have more scaling laws now.
[22:51] >> So I think you've outlined four of them: pre-training, post-training, test-time, and agentic scaling. When you think about the future, the deep future and the near-term future, what are the blockers that you're most concerned about, that keep you up at night, that you have to overcome in order to keep scaling?
[23:12] >> Well, we can go back and reflect on what people thought were blockers.
[23:16] >> Mhm.
[23:17] >> So in the beginning there was the first one, the pre-training scaling law. People thought, rightfully so, that the amount of high-quality data that we have will limit the intelligence that we achieve. And that scaling law was a very important scaling law: the larger the model, correspondingly more data results in a smarter AI. So that was pre-training. And Ilya Sutskever said we're out of data, or something like that; pre-training is over, or something like that. The industry panicked, you know, that this is the end of AI.
[23:54] you know, that this is the end of AI. And of course, of course, that's that's
[23:56] And of course, of course, that's that's obviously not true. Um, we're going to
[23:58] obviously not true. Um, we're going to keep on scaling the amount of data that
[23:59] keep on scaling the amount of data that we h have to to train with. A lot of
[24:02] we h have to to train with. A lot of that data is probably going to be
[24:03] that data is probably going to be synthetic. And that also confused
[24:06] synthetic. And that also confused people, you know, and and what people
[24:08] people, you know, and and what people don't realize is they've kind of
[24:10] don't realize is they've kind of forgotten that most of the data that
[24:12] forgotten that most of the data that that we are training uh that we teach
[24:15] that we are training uh that we teach each other with, inform each other with
[24:16] each other with, inform each other with this is synthetic. You know, I it's
[24:19] this is synthetic. You know, I it's synthetic because it didn't come out of
[24:22] synthetic because it didn't come out of nature. You created it. I'm consuming
[24:25] nature. You created it. I'm consuming it. I modify it, augment it, I
[24:30] it. I modify it, augment it, I regenerate it, somebody else consumes
[24:32] regenerate it, somebody else consumes it. And so so we've now reached a level
[24:35] it. And so so we've now reached a level where AI is able to
[24:39] where AI is able to take ground truth, augment it,
[24:43] take ground truth, augment it, enhance it, synthetically generate an
[24:46] enhance it, synthetically generate an enormous amount of data and that part of
[24:49] enormous amount of data and that part of post training um continues to scale. And
[24:51] post training um continues to scale. And so the amount of data that we could use
[24:53] so the amount of data that we could use that is human generated will be smaller
[24:56] that is human generated will be smaller and smaller and smaller. the amount of
[24:58] and smaller and smaller. the amount of data that we use to uh train model uh uh
[25:02] data that we use to uh train model uh uh is going to continue to scale to the
[25:04] is going to continue to scale to the point where we're no longer limited
[25:06] point where we're no longer limited training is no longer limited by data is
[25:09] training is no longer limited by data is now limited by compute and the reason
[25:11] now limited by compute and the reason for that is most of the data is
[25:12] for that is most of the data is synthetic then the next phase is uh test
[25:16] synthetic then the next phase is uh test time and um I I still remember people
[25:20] time and um I I still remember people people telling me that inference oh yeah
[25:22] people telling me that inference oh yeah that's easy pre pre-training that's hard
[25:25] that's easy pre pre-training that's hard these are giant systems that people are
[25:26] these are giant systems that people are talking about inference must be easy and
[25:29] talking about inference must be easy and so inference chips are going to be
[25:30] so inference chips are going to be little tiny chips and you know they're
[25:32] little tiny chips and you know they're not they're not like Nvidia's chips oh
[25:34] not they're not like Nvidia's chips oh those are going to be complicated and
[25:36] those are going to be complicated and expensive and you know we could make and
[25:38] expensive and you know we could make and this is and in the future inference is
[25:41] this is and in the future inference is going to be the biggest market and it's
[25:42] going to be the biggest market and it's going to be easy and we're going to
[25:43] going to be easy and we're going to commoditize and you know everybody can
[25:45] commoditize and you know everybody can build their own chips and and and that
[25:48] build their own chips and and and that was always illogical to me because
[25:51] was always illogical to me because inference is thinking and I think
[25:54] inference is thinking and I think thinking is hard thinking is way harder
[25:57] thinking is hard thinking is way harder than reading.
[25:59] >> You know, pre-training is just memorization and generalization, looking for patterns and relationships. You're reading. And reading versus thinking: reasoning, solving problems, taking unexplored experiences, new experiences, and breaking them down, decomposing them into solvable pieces that we then go work through, either through first-principles reasoning, or through previous examples and prior experiences, or just exploration and search, trying different things. That whole process of test-time scaling, inference, is really about thinking. It's about reasoning. It's about planning. It's about search. So how could that possibly be easy to compute? And we were absolutely right about that. So test-time scaling is intensely compute-intensive.
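Why test-time compute scales can be sketched with a toy best-of-N loop: sample the model many times and majority-vote the answers, so reliability is bought with extra inference calls. This is purely an illustrative sketch; `sample_answer` is a hypothetical stand-in for a real model call, not anything described in the conversation.

```python
import random
from collections import Counter

def sample_answer(question: str) -> str:
    """Hypothetical stand-in for one stochastic LLM inference call.
    This toy 'model' is right only 40% of the time."""
    return "42" if random.random() < 0.4 else random.choice(["41", "43", "44"])

def best_of_n(question: str, n: int) -> str:
    """Test-time scaling in miniature: n inference calls, then a
    majority vote. Compute grows linearly with n; so does reliability."""
    votes = Counter(sample_answer(question) for _ in range(n))
    return votes.most_common(1)[0][0]

random.seed(0)
# A single sample is wrong more often than not; 101 samples almost
# always converge on the plurality answer.
print(best_of_n("What is 6 x 7?", 101))
```

With per-answer probabilities of 0.4 versus 0.2, the voted answer stabilizes as n grows, which is exactly why inference, done well, is compute-hungry rather than cheap.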
[27:01] Then the question is: okay, now we're at inference and we're at test-time scaling. What's beyond that? Well, obviously, we have now created, you know, one agentic person, and that one agentic person has a large language model that we've now developed. But during test time, that agentic system goes off and does research, bangs on databases, uses tools, and one of the most important things it does is spin off and spawn a whole bunch of sub-agents, which means we're now creating large teams. It's so much easier to scale Nvidia by hiring more employees than it is to scale myself.
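The "large teams" idea, one orchestrator spawning many sub-agents that work in parallel and report back, can be sketched as a fan-out/fan-in pattern. Everything here is hypothetical (`research_topic`, the topic list); a real sub-agent would call model and tool APIs where this toy just formats strings.

```python
from concurrent.futures import ThreadPoolExecutor

def research_topic(topic: str) -> str:
    """Hypothetical sub-agent: a real one would query databases,
    call an LLM, and use tools before reporting back."""
    return f"summary of {topic}"

def orchestrate(task: str, topics: list[str]) -> str:
    """Orchestrating agent: spawn one sub-agent per topic, run them
    in parallel, then merge the reports. Scaling the 'team' is just
    raising the worker count."""
    with ThreadPoolExecutor(max_workers=len(topics)) as pool:
        reports = list(pool.map(research_topic, topics))  # order-preserving
    return f"{task}: " + "; ".join(reports)

print(orchestrate("market study", ["GPUs", "HBM", "packaging"]))
# -> market study: summary of GPUs; summary of HBM; summary of packaging
```

The point of the sketch is that the orchestrator's cost is fixed while the team's throughput scales with the number of workers, which is the agentic analogue of hiring more employees.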
[27:44] >> And so the next scaling law is the agentic scaling law. It's kind of like multiplying AI. Multiplying AI: we can spin off agents as fast as you want to spin off agents. And so, you know, you have four scaling laws. And as we use these agentic systems, they're going to create a lot more data. They're going to create a lot of experiences. Some of it we're going to say, "Wow, this is really good. We ought to memorize this."
[28:12] >> That data set then comes all the way back to pre-training. We memorize and generalize it. We then refine it and fine-tune it back into post-training. Then we enhance it even more with test time, in the agentic systems, and put it out into the industry. And so this loop, the cycle, is going to go on and on and on. It kind of comes down to, basically: intelligence is going to scale by one thing, and it's compute. But there's a tricky thing there that you have to anticipate and predict, which is that some of these components require different kinds of hardware to really do it optimally. So you have to anticipate where the AI innovation is going to lead. For example, mixture of experts with sparsity.
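Mixture of experts with sparsity is what makes that hardware bet tricky: the system must hold every expert's weights in fast memory, yet each token's compute touches only a few of them. A minimal top-k routing sketch with toy numbers (plain Python, hypothetical scoring function; no relation to any shipped model):

```python
NUM_EXPERTS, TOP_K = 8, 2  # toy sizes; production MoEs use far more experts

def expert(i: int, x: float) -> float:
    """Hypothetical stand-in for expert i's feed-forward network."""
    return (i + 1) * x

def router_scores(x: float) -> list[float]:
    """Hypothetical gating network: one score per expert."""
    return [float((i * 7 + 3) % NUM_EXPERTS) * x for i in range(NUM_EXPERTS)]

def moe_forward(x: float) -> float:
    scores = router_scores(x)
    # Sparse routing: only the top-k experts run; the rest sit idle,
    # though their weights must still live in memory.
    top = sorted(range(NUM_EXPERTS), key=lambda i: scores[i], reverse=True)[:TOP_K]
    total = sum(scores[i] for i in top)
    # Combine the chosen experts, weighted by normalized router score.
    return sum(scores[i] / total * expert(i, x) for i in top)

print(moe_forward(1.0))  # only TOP_K of NUM_EXPERTS experts did any work
```

The memory-heavy, compute-sparse shape of this loop is why a large coherent domain (many GPUs sharing one fast interconnect) suits MoE better than one big chip.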
[28:59] >> Perfect.
[29:00] >> With hardware, you can't just pivot on a week's notice. You have to anticipate what that's going to look like.
[29:07] >> That's so scary and difficult to do, right? For example, these AI model architectures are being invented about once every six months.
[29:16] >> Yeah, right. And system architectures and hardware architectures, kind of every three years. And so you need to anticipate what is likely going to happen, you know, two, three years from now. And there are a couple of ways that you can do that. First of all, we can do research internally ourselves. That's one of the reasons why we have basic research, we have applied research, we create our own models. And so we have hands-on, live experience right here. This is part of the co-design that I'm talking about.
[29:48] >> We're also the only AI company in the world that works with literally every AI company in the world. And to the extent that we can, we try to get a sense of what the challenges are that people are experiencing.
[29:58] >> So you're listening to the whispers across the industry.
[30:02] >> That's right. You've got to listen and learn from everybody. And then the last part is to have an architecture that's flexible, that can adapt and move with the wind. One of the benefits of CUDA is that it's, on the one hand, an incredible accelerator; on the other hand, it's really flexible. And so that incredible balance between specialization (otherwise we can't accelerate beyond the CPU) and generalization (so that we can adapt to changing algorithms) is really, really important. That's the reason why CUDA has been so resilient on the one hand, and yet we continue to enhance it. We're at CUDA 13.2, and we're evolving the architecture so fast that we can stay with the modern algorithms.
[30:57] For example, when mixture of experts came out, that's the reason why we had NVLink 72 instead of NVLink 8. We could now take an entire 4 trillion, 10 trillion parameter model and put it in one computing domain, as if it's running on one GPU. People probably didn't notice I said it, but if you look at the architecture of the Grace Blackwell racks, it was completely focused on doing one thing: processing the LLM.
[31:30] All of a sudden, one year later, you're looking at a Vera Rubin rack. It has storage accelerators. It has this incredible new CPU called Vera. It has Vera Rubin and NVLink 72 to run the LLMs. It also has this new additional rack called Gro. And so this entire rack system is completely different from the previous one, and it's got all these new components in it. And the reason for that is that the last one was designed to run large language model inference, and this one is to run agents, and agents bang on tools.
[32:08] And obviously, the design of the system had to have been done before Claude Code, Codex, Open Claw. So you were anticipating the future, essentially. And that comes from what? From the whispers, from understanding what all the state of the art is.
[32:26] >> No, it's easier than that. You just reason about it. First of all, just reason: no matter what happens, at some point, in order for that large language model to be a digital worker (let's just use that metaphor; let's say that we want the LLM to be a digital worker), what does it have to do? It has to access ground truth. That's our file system. It has to be able to do research. It doesn't know everything, and I don't want to wait until this AI becomes universally smart about everything past, present, and future before I make it useful. And so, therefore, I might as well let it go do research. And obviously, if it wants to help me, it's got to use my tools. You know, a lot of people would say AI is going to completely destroy software: we don't need software anymore, we don't even need tools anymore. That's ridiculous.
[33:20] Let's use a thought experiment. You could just sit there, enjoy a glass of whiskey, think about all these things, and it would become completely obvious. If I were to create the most amazing agent that we can imagine in the next ten years, let's say it would be a humanoid robot: if that humanoid robot were to be created, is it more likely that the humanoid robot comes into my house and uses the tools that I have to do the work that it needs to do? Or does its hand turn into a ten-pound hammer in one instance, turn into a scalpel in another instance, and, in order to boil water, beam microwaves out of its fingers? Or is it more likely just to use the microwave? And the first time it goes up to the microwave, it probably doesn't know how to use it. But that's okay. It's connected to the internet. It reads the manual of this microwave, reads it instantly, becomes an expert, and so uses it.
[34:26] >> And so I think I just described, in fact, almost all of the properties of Open Claw.
[34:34] >> Mhm.
[34:35] >> You know, that it's going to use tools, that it's going to access files, that it's going to be able to do research, that it has an I/O subsystem. And when you're done reasoning through it in that way, then you say, "Oh my gosh, the impact on future computing is deeply profound." And the reason for that is, I think we've just reinvented the computer. And then you say, "Okay, when did we reason about that? When did we reason about Open Claw?" If you take the Open Claw schematic that I used at GTC, you will find it two years ago. Literally two years ago at GTC, I was talking about agentic systems that exactly reflect Open Claw today. And of course, the confluence of many things had to happen. First of all, we needed Claude and GPT and all of these models to reach a level of capability. So their innovation and their breakthroughs and their continual advances were really important. And then, of course, somebody had to create an open-source project that was sufficiently robust and sufficiently complete, that we can all put to work. And I think Open Claw did for agentic systems what ChatGPT did for generative systems. And I just think it's a very big deal.
[36:02] >> Yeah, it's a really special moment. I'm not exactly sure why it captured so much of the world's attention, but it did, more than Claude Code and Codex and so on, because consumers could reach it.
[36:13] >> Sure. Yeah. But there's also... so much of this is vibes. And Peter, I had a podcast with him; he's a wonderful human being. So part of it is also the humans that represent the thing. Part of it is memes.
[36:27] >> Because we're all trying to figure it out. There are really serious and complicated security concerns about, when you have such powerful technology, how do you hand over your data so it can do useful stuff? But then there are scary things associated with that. And we, as individual people and as a civilization, are figuring out how to find that right balance.
[36:44] >> Yeah, we jumped on it right away, and we sent a bunch of security experts this way.
[36:49] >> And we did this thing called Open Shell. It's already been integrated into Open Claw.
[36:55] >> And Nvidia put forward Nemo Claw.
[36:58] >> Yep. Exactly.
[36:59] >> The install is super easy. It makes sure that it's secure.
[37:03] >> We give you two out of three rights. Agentic systems can access sensitive information, they can execute code, and they can communicate externally.
[37:11] >> Mhm.
[37:14] >> We can keep things safe if we give you two out of those three capabilities at any time, but not all three. And on top of those two-out-of-three capabilities, we also give you access control based on whatever rights you're given by the enterprise. And then we connect it to a policy engine that all these enterprises already have. And so we're going to try to do our best to help Open Claw become a better claw.
[37:39] >> So you eloquently explained how we
[39:43] have a long history of blockers that we thought were going to be blockers, and we overcame them. But now, looking into the future, what do you think might be the blockers, now that it's clear that agents will be everywhere? So obviously we're going to need compute. What is going to be the blocker for that scaling? Power is a concern, but it's not the only concern. But that's the reason why we're pushing so hard on extreme co-design: so that we can improve the tokens per second per watt by orders of magnitude every single year. And so, in the last 10 years, Moore's law would have progressed computing about a hundred times. We progressed and scaled up computing by a million times in the last 10 years. And we're going to keep on doing that through extreme co-design. So energy efficiency per watt completely affects the revenues of a company. It affects the revenues of a factory. And we're just going to push that to the limit, so that we can keep on driving token cost down as fast as we can. You know, our computer price is going up, but our token generation effectiveness is going up so much faster that token cost is coming down. It's coming down an order of magnitude every year.
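That last point sounds paradoxical but is just arithmetic: system price and cost per token can move in opposite directions as long as token throughput per watt (and per dollar) improves faster than the price grows. A toy calculation; the roughly 10x-per-year efficiency figure is taken from the conversation, while the 1.5x price growth is purely an assumption for illustration.

```python
# Toy token-factory economics. Only the ~10x/year efficiency figure
# comes from the conversation; everything else is assumed.
price_growth_per_year = 1.5      # assumed: the system gets 50% pricier each year
efficiency_gain_per_year = 10.0  # tokens/sec/watt improvement, ~10x/year

cost_per_token = 1.0             # normalized to 1.0 at year zero
for year in (1, 2, 3):
    # Price makes each token dearer; efficiency makes it cheaper.
    cost_per_token *= price_growth_per_year / efficiency_gain_per_year
    print(f"year {year}: relative cost per token = {cost_per_token:.6f}")
# under these assumptions, cost falls ~6.7x per year even as the machine costs more
```

Under these made-up numbers, cost per token drops to about 0.15, 0.0225, and 0.003375 of the baseline over three years, which is the order-of-magnitude-per-year trajectory described above.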
[39:04] >> So power, that's an interesting one. The way to try to get around the power blocker is, with tokens per second per watt, to make it more and more efficient. Of course, there's the question: how do we get more power?
[39:16] >> We should also get more power.
[39:17] >> That's a really complicated one. And you've talked about small modular nuclear power plants; there are all kinds of ideas for energy. How much does it keep you up at night, the bottlenecks in the supply chain of AI: ASML with EUV lithography machines, TSMC with advanced packaging like CoWoS, SK Hynix with high-bandwidth memory?
[39:39] >> All the time. And we're working on it all the time. No company in history has ever grown at the scale that we're growing while accelerating that growth. It's incredible.
[39:50] >> Yeah.
[39:50] >> And it's hard for people to even understand this. In the overall world of AI computing, we're increasing share. And so supply chain, upstream and downstream, are really important to us.
[40:00] I spend a lot of time informing all the CEOs that I work with about the dynamics that are going to cause the growth to continue or even accelerate.
[40:15] It's part of the reason why, to the entire right-hand side of me, were CEOs of practically the entire IT industry upstream and practically the entire infrastructure industry downstream.
[40:29] >> Mhm.
[40:32] >> There were several hundred CEOs, and I don't think there's ever been a keynote where several hundred CEOs show up.
[40:38] And part of it is I'm telling them about our business condition now, I'm telling them about the growth drivers in the very near future and what's happening, and I'm also describing where we're going to go next, so that they can use all of this information and all of the dynamics that are here to inform how they want to invest.
[41:01] And so I inform them that way, like I inform my own employees. And then, of course, I make trips out to them and make sure that, hey, listen, I want you to know: this quarter, this coming year, this next year, these things are going to happen.
[41:16] And if you look at the CEOs of the DRAM industry: the number one DRAM in the world was DDR memory for CPUs in data centers.
[41:31] About three years ago, I was able to convince several of the CEOs that even though at the time HBM memory was used quite scarcely, you know, and barely, by supercomputers, this was going to be a mainstream memory for data centers in the future.
[41:46] And at first it sounded ridiculous, but several of the CEOs believed me and decided to invest in building HBM memories.
[41:56] Another memory that was rather odd to put into a data center is the low-power memory that we use for cell phones. And we wanted them to adapt it for supercomputers in the data center.
[42:07] And they go, cell phone memory for supercomputers? And I explained to them why. Well, look at these two memories, LPDDR5 and HBM4.
[42:17] The volumes are so incredible. All three of them had record years in history. And these are 45-year-old companies.
[42:28] And so, you know, that's part of my job: to inform and shape, inspire, you know.
[42:36] >> So you're not just manifesting the future and maybe inspiring Nvidia, the different engineers of the company. You're manifesting the supply chain of the future. So you're having conversations with TSMC, with ASML,
[42:50] >> Upstream, downstream.
[42:51] >> Upstream, downstream. So that's the thing.
[42:53] >> GE Vernova, Caterpillar.
[42:56] >> Yeah, that's downstream from us. Yeah. There you go.
[42:59] >> Yeah, the whole thing. I mean, but that's so
[43:02] >> There's so much incredibly difficult engineering that happens in the entire semiconductor industry. And it just feels scary how intricate the supply chain is, how many components there are, but it works somehow.
[43:18] >> Exactly. The deep science, the deep engineering, the incredible manufacturing, and so much of the manufacturing is already robotics. But we have a couple of hundred suppliers that contribute the technology that goes into our 1.3-million-component rack.
[43:35] >> Mhm.
[43:36] >> Each rack is 1.3, one and a half million components. There are 200 suppliers across the Vera Rubin rack.
[43:45] >> So it's interesting that you don't list that as the thing that keeps you up at night, in the list of blockers.
[43:49] >> But I'm doing all the things necessary to
[43:52] >> Okay.
[43:53] >> See, I can go to sleep because I checked it off. I said, okay, I can go to sleep. And I go, well, let's reason about this. What's important for us?
[44:04] Because we changed the system architecture from the original DGX-1 that you remember to NVLink 72 rack-scale computing.
[44:15] >> Mhm.
[44:16] >> What does that mean? What does that mean to software? What does that mean to engineering? What does that mean to how we design and test? And what does that mean to the supply chain?
[44:26] Well, one of the things that it meant was we moved supercomputer integration at the data center into supercomputer manufacturing in the supply chain.
[44:41] >> Mhm.
[44:42] If you're doing that, you also have to recognize what that means for power. Take the total footprint of whatever data center you're going to build: let's say you would like to have 50 gigawatts of supercomputers that are running simultaneously, and it takes, call it, a year to manufacture that 50 gigawatts of supercomputers. Then each week in the supply chain, the supercomputers are going to need a gigawatt of power.
[45:12] And so we're going to need the supply chain to increase the amount of power it has, to build and test the supercomputers in the supply chain, before I ship them.
[45:24] supply chain before I ship it. >> Well, MVLink72 literally builds
[45:26] >> Well, MVLink72 literally builds supercomputers in the supply chain and
[45:28] supercomputers in the supply chain and ships them two, three tons at a time per
[45:31] ships them two, three tons at a time per rack. It used to be come they used to
[45:34] rack. It used to be come they used to come in parts and we used to assemble
[45:36] come in parts and we used to assemble them inside the data center. But that's
[45:38] them inside the data center. But that's impossible now because MVLink 72 is so
[45:40] impossible now because MVLink 72 is so dense. And so that's an example. And I
[45:42] dense. And so that's an example. And I would have to go into, you know, I fly
[45:45] would have to go into, you know, I fly into the supply chain, go meet my
[45:47] into the supply chain, go meet my partners, and hey, I said, guess what?
[45:49] partners, and hey, I said, guess what? So here's what we're going to do with
[45:51] So here's what we're going to do with this is the way we used to build our
[45:53] this is the way we used to build our DGXs. We're going to build them this
[45:55] DGXs. We're going to build them this way. This is going to be so much better
[45:56] way. This is going to be so much better because we're going to need them for
[45:58] because we're going to need them for inference. The market for inference is,
[46:00] inference. The market for inference is, you know,
[46:02] you know, coming. The inflection point for
[46:03] coming. The inflection point for inference is coming. It's going to be a
[46:04] inference is coming. It's going to be a big market. And so I first explain to
[46:06] big market. And so I first explain to them what's going on, why it's going to
[46:08] them what's going on, why it's going to happen, and then I then I ask them to
[46:12] happen, and then I then I ask them to make several billion dollars of capital
[46:15] make several billion dollars of capital investments each
[46:17] investments each and because they, you know, they trust
[46:19] and because they, you know, they trust me and and I I I'm very respectful of
[46:22] me and and I I I'm very respectful of them and I I give them every opportunity
[46:24] them and I I give them every opportunity to question me and I spend time to
[46:26] to question me and I spend time to explain things to people and I reason
[46:28] explain things to people and I reason about it. I draw them pictures and I
[46:30] about it. I draw them pictures and I reason about it in first principles and
[46:32] reason about it in first principles and by by the time I'm done with them
[46:34] by by the time I'm done with them there's no what to do.
[46:35] there's no what to do. >> So it's a lot of is about relationships
[46:37] >> So it's a lot of is about relationships and building a shared view of the
[46:39] and building a shared view of the future.
[46:40] future. >> Yeah.
[46:41] >> Yeah. >> Uh but do you worry about certain
[46:44] >> Uh but do you worry about certain bottlenecks? I mean what are the biggest
[46:45] bottlenecks? I mean what are the biggest bottlenecks in the supply chain? Are are
[46:47] bottlenecks in the supply chain? Are are you worried about it ASML V tooling? Are
[46:49] you worried about it ASML V tooling? Are you are you worried about the the
[46:51] you are you worried about the the packaging co-as packaging of TSMC about
[46:54] packaging co-as packaging of TSMC about how fast it could scale? like you said,
[46:56] how fast it could scale? like you said, you're not only growing incredibly fast,
[46:59] you're not only growing incredibly fast, you're accelerating a growth. So it it
[47:01] you're accelerating a growth. So it it it feels like every everybody in the
[47:03] it feels like every everybody in the supply chain and those are certainly
[47:05] supply chain and those are certainly bottlenecks would have to scale up.
[47:07] bottlenecks would have to scale up. >> Are you having conversations with them
[47:09] >> Are you having conversations with them like how can you scale up faster?
[47:12] like how can you scale up faster? >> Do you worry about it?
[47:13] >> Do you worry about it? >> No.
[47:13] >> No. >> Okay.
[47:14] >> Okay. >> Because because I told them what I
[47:16] >> Because because I told them what I needed, they understood what I need.
[47:18] needed, they understood what I need. They told me what they're going to go do
[47:20] They told me what they're going to go do and I believe in what they're going to
[47:21] and I believe in what they're going to do.
[47:22] do. >> Interesting. That's great to hear. So
[47:24] >> Interesting. That's great to hear. So maybe if we can just linger on the power
[47:26] maybe if we can just linger on the power for a little bit. Uh what are your hopes
[47:28] for a little bit. Uh what are your hopes for how to solve the energy problem? One
[47:30] for how to solve the energy problem? One of the areas le that I'm um that I would
[47:34] of the areas le that I'm um that I would love I would love love us to talk about
[47:36] love I would love love us to talk about and just get the message out. You know
[47:39] and just get the message out. You know um our our our power grid is designed
[47:44] um our our our power grid is designed for the worst case condition with some
[47:47] for the worst case condition with some margin.
[47:49] margin. Well, 99% of the time we're nowhere near
[47:52] Well, 99% of the time we're nowhere near the worst case condition because the
[47:53] the worst case condition because the worst case condition is a few days in
[47:55] worst case condition is a few days in the winter, a few days in the summer and
[47:58] the winter, a few days in the summer and extreme weather. Most of the time we're
[48:01] extreme weather. Most of the time we're nowhere near the worst case condition
[48:03] nowhere near the worst case condition and we're probably running around call
[48:05] and we're probably running around call it 60% of peak. And so 99% of the time
[48:12] it 60% of peak. And so 99% of the time our power grid has excess power and
[48:15] our power grid has excess power and they're just sitting idle. But they have
[48:17] they're just sitting idle. But they have to be there sitting idle because just in
[48:19] to be there sitting idle because just in case when the time comes hospitals have
[48:21] case when the time comes hospitals have to be powered and you know
[48:22] to be powered and you know infrastructure has to be powered and
[48:24] infrastructure has to be powered and airports have to run and so on so forth.
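As a rough illustration of the headroom argument (using only the approximate figures quoted here, not utility data):

```python
# If the grid is sized for the worst-case peak but typically runs at
# ~60% of peak, then most of the time ~40% of its capacity sits idle,
# reserved for a few extreme days a year.

typical_load_fraction = 0.60  # "running around, call it, 60% of peak"
idle_headroom = 1.0 - typical_load_fraction

print(f"Idle headroom most of the time: {idle_headroom:.0%} of grid capacity")
```

A data center sized to soak up that headroom would only need to shed load on the rare worst-case days, which is the contractual arrangement described next.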
[48:26] airports have to run and so on so forth. And so the question that I have is
[48:28] And so the question that I have is whether we could go and um help them
[48:32] whether we could go and um help them understand and create contractual
[48:34] understand and create contractual agreements and design computer
[48:36] agreements and design computer architecture systems, data centers such
[48:38] architecture systems, data centers such that when they need
[48:42] that when they need um the maximum power for infrastructure
[48:45] um the maximum power for infrastructure in society that the data centers would
[48:48] in society that the data centers would get less.
[48:49] get less. >> But that's in a very rare instance
[48:50] >> But that's in a very rare instance anyways. And during that time, we either
[48:52] anyways. And during that time, we either have our backup generator for that
[48:54] have our backup generator for that little part of it or we just have our
[48:56] little part of it or we just have our computers shift the workload somewhere
[48:57] computers shift the workload somewhere else or we have the computers just run
[49:00] else or we have the computers just run slower. You know, we could degrade our
[49:02] slower. You know, we could degrade our performance, reduce our power
[49:05] performance, reduce our power consumption and provide for, you know,
[49:08] consumption and provide for, you know, slightly longer latency response, you
[49:10] slightly longer latency response, you know, when somebody asks for, you know,
[49:12] know, when somebody asks for, you know, asked for an answer. And so I think that
[49:14] asked for an answer. And so I think that that that way of using computers of
[49:17] that that way of using computers of building data centers instead of
[49:20] building data centers instead of expecting 100% uptime
[49:22] expecting 100% uptime and these contracts that are really
[49:24] and these contracts that are really really quite rigorous it's putting a lot
[49:27] really quite rigorous it's putting a lot of pressure on the grid to be able to
[49:29] of pressure on the grid to be able to now they're going to have to increase
[49:31] now they're going to have to increase from their maximum. I just want to use
[49:33] from their maximum. I just want to use their excess. It's just sitting there.
[49:36] their excess. It's just sitting there. Yeah. That's not talked about enough. So
[49:38] Yeah. That's not talked about enough. So what's what's this what's stopping
[49:39] what's what's this what's stopping there? Is it regulation? Is it
[49:42] there? Is it regulation? Is it bureaucracy?
[49:43] bureaucracy? >> I think it's it's a throughway problem.
[49:45] >> I think it's it's a throughway problem. Uh it starts with the end customer. The
[49:47] Uh it starts with the end customer. The end customer puts puts requirements on
[49:51] end customer puts puts requirements on the data centers that they can never
[49:55] the data centers that they can never not be available. Okay. So that the end
[49:58] not be available. Okay. So that the end customer expects perfection. Now in
[50:00] customer expects perfection. Now in order to deliver that perfection, you
[50:02] order to deliver that perfection, you need a combination of backup generators
[50:05] need a combination of backup generators and your grid power supplier to deliver
[50:08] and your grid power supplier to deliver on perfection. And so everybody's got to
[50:11] on perfection. And so everybody's got to have 69s.
[50:13] have 69s. >> Well, I think first of all, right now,
[50:16] >> Well, I think first of all, right now, we ought to have everybody understand
[50:17] we ought to have everybody understand that when the customer asks for these
[50:19] that when the customer asks for these things, you got somebody, you have
[50:21] things, you got somebody, you have somebody in your data center operations
[50:23] somebody in your data center operations team disconnected from the CEO. I bet
[50:25] team disconnected from the CEO. I bet the CEO doesn't know this. I'm going to
[50:27] the CEO doesn't know this. I'm going to talk to all the CEOs. The CEOs are
[50:29] talk to all the CEOs. The CEOs are probably not paying any attention to the
[50:32] probably not paying any attention to the contracts that are being signed. And so
[50:34] contracts that are being signed. And so everybody wants to sign the best
[50:36] everybody wants to sign the best contract of course and they go down to
[50:38] contract of course and they go down to the cloud service providers and the
[50:40] the cloud service providers and the contract the the two contract
[50:42] contract the the two contract negotiators that are you I could just
[50:45] negotiators that are you I could just see them now
[50:46] see them now >> you know negotiating these multi-year
[50:47] >> you know negotiating these multi-year contracts both sides want you know the
[50:51] contracts both sides want you know the best contract as a result
[50:54] best contract as a result the CSPs then have to go down to the
[50:57] the CSPs then have to go down to the utilities and they expect the nine the
[50:59] utilities and they expect the nine the 69s and so I think I think the first
[51:01] 69s and so I think I think the first thing is just make sure that that all of
[51:04] thing is just make sure that that all of the customers, the CEOs of the customers
[51:06] the customers, the CEOs of the customers realize what they're asking for. Now,
[51:09] realize what they're asking for. Now, the second thing is we have to build
[51:10] the second thing is we have to build data centers that gracefully degrade.
[51:13] data centers that gracefully degrade. And so, if the power, if the utility of
[51:15] And so, if the power, if the utility of the grid tells us, listen, we're going
[51:17] the grid tells us, listen, we're going to have to back you down to about 80%.
[51:19] to have to back you down to about 80%. We're going to say that's no problem at
[51:21] We're going to say that's no problem at all.
[51:21] all. >> Mhm.
[51:21] >> We're just going to move our workload around. We're going to make sure that data is never lost, but we can reduce the computing rate and use less energy. The quality of service degrades a little bit; the critical workloads, I shift somewhere else right away, so I don't have that problem. And so, you know, whichever data center it lands in, it still has 100% uptime.
[51:42] >> And so how difficult of an engineering problem is that, the smart, dynamic allocation of power in the data center?
[51:49] >> As soon as you can specify it, you can engineer it.
[51:50] >> Beautifully put.
[51:54] >> So long as it obeys the laws of physics, on first principles, I think we're good.
[51:58] on first principles I think we're good >> what was the third thing you were
[51:59] >> what was the third thing you were mentioning um so the Second thing is the
[52:02] mentioning um so the Second thing is the the data centers
[52:03] the data centers >> and the third thing is we need the
[52:05] >> and the third thing is we need the utilities
[52:06] utilities to also recognize that this is an
[52:09] to also recognize that this is an opportunity
[52:10] opportunity >> and and instead of instead of saying
[52:12] >> and and instead of instead of saying look um it's going to take me 5 years to
[52:15] look um it's going to take me 5 years to increase my grid capability uh if you if
[52:19] increase my grid capability uh if you if you have if you're willing to take power
[52:21] you have if you're willing to take power of this level of guarantee
[52:24] of this level of guarantee I can make them available for you next
[52:26] I can make them available for you next month and at this price and so if
[52:30] month and at this price and so if utilities He's also offered more
[52:33] utilities He's also offered more segments of power delivery promises,
[52:36] segments of power delivery promises, then I think everybody will figure out
[52:38] then I think everybody will figure out what to do with it. Yeah. But there's
[52:39] what to do with it. Yeah. But there's just way too much waste in the in the
[52:41] just way too much waste in the in the grid right now. We we should go after
[52:43] grid right now. We we should go after it.
[52:43] it. >> Uh you've uh highly lauded Elon and uh
[52:47] >> Uh you've uh highly lauded Elon and uh Xi's accomplishment in Memphis in
[52:49] Xi's accomplishment in Memphis in building um Colossus Supercomputer
[52:53] building um Colossus Supercomputer probably in record time in just 4
[52:54] probably in record time in just 4 months. It's now at 200,000 GPUs and
[52:57] months. It's now at 200,000 GPUs and growing very quickly. Is there something
[53:00] growing very quickly. Is there something that you could speak to the understand
[53:02] that you could speak to the understand about his approach that's instructive to
[53:04] about his approach that's instructive to the broadly to all the data center
[53:06] the broadly to all the data center creators that's um that enabled that
[53:09] creators that's um that enabled that kind of accomplishment his approach to
[53:11] kind of accomplishment his approach to engineering his approach to the whole
[53:13] engineering his approach to the whole management of construction everything
[53:15] management of construction everything first of all Elon is deep in so many
[53:18] first of all Elon is deep in so many different topics um uh yet he's also a
[53:22] different topics um uh yet he's also a really good systems thinker
[53:24] really good systems thinker >> and so he's able to think through
[53:26] >> and so he's able to think through multiple disciplines and and um uh he
[53:30] multiple disciplines and and um uh he obviously
[53:32] obviously uh pushes things questions everything
[53:35] uh pushes things questions everything whether number one is it necessary
[53:39] whether number one is it necessary number two does it have to be done this
[53:40] number two does it have to be done this way and number you know does it have
[53:43] way and number you know does it have does it have to take this long and and
[53:46] does it have to take this long and and so so he he has he has the he has the
[53:49] so so he he has he has the he has the ability uh to question everything uh to
[53:53] ability uh to question everything uh to the point where everything is down to
[53:55] the point where everything is down to its minimal amount that's necess
[53:57] its minimal amount that's necess necessary. You can't take anything else
[53:59] necessary. You can't take anything else out and and yet yet the the uh the the
[54:03] out and and yet yet the the uh the the the the necessary um capabilities of the
[54:07] the the necessary um capabilities of the product retains, you know, and so he's
[54:09] product retains, you know, and so he's he is as minimalist as you could
[54:11] he is as minimalist as you could possibly imagine and he does it at a
[54:13] possibly imagine and he does it at a system system scale. Um I I also love
[54:16] system system scale. Um I I also love the fact that he he is um he is
[54:19] the fact that he he is um he is represented he he is he is present at
[54:23] represented he he is he is present at the point of action.
[54:24] the point of action. >> Mhm. you know, he'll just go there and
[54:27] >> Mhm. you know, he'll just go there and if there's a problem, he'll just go
[54:29] if there's a problem, he'll just go there and show me the problem. You know,
[54:31] there and show me the problem. You know, when you do all of this in combination,
[54:34] when you do all of this in combination, you overcome a lot of previous this is
[54:37] you overcome a lot of previous this is just the way we do it.
[54:38] just the way we do it. >> Um, you know, I'm I'm waiting for them.
[54:42] >> Um, you know, I'm I'm waiting for them. I, you know, I mean, just everybody has
[54:44] I, you know, I mean, just everybody has a lot of excuses. And and so and then
[54:47] a lot of excuses. And and so and then and then the last thing is when when you
[54:49] and then the last thing is when when you act personally with so much urgency, uh
[54:52] act personally with so much urgency, uh it causes everybody else to act with
[54:53] it causes everybody else to act with urgency, you know, and and every
[54:56] urgency, you know, and and every supplier has a lot of customers going
[54:57] supplier has a lot of customers going on. Every supplier has a lot of projects
[54:59] on. Every supplier has a lot of projects going on. And he he make it he made it
[55:02] going on. And he he make it he made it he makes it his business that he's the
[55:05] he makes it his business that he's the top priority of everybody else's, you
[55:06] top priority of everybody else's, you know, projects. And so he does that by
[55:08] know, projects. And so he does that by demonstrating it.
[55:09] >> Yeah, I've been in a bunch of those meetings. It's fun to watch, because really not enough people ask the question: okay, so can this be done a lot faster, and why does it have to take this long?
[55:20] >> Yeah.
[55:22] >> And then that often becomes an engineering question. And yes, I think you get the ground truth. Actually, I remember one of the times I was hanging out with him: he was literally going through the entire process of how to plug cables into a rack, and he was working with the engineer on the ground who was doing that task, just trying to understand what that process looks like so it can be less error-prone.
[55:46] And just building up that intuition from every single task involved in putting together the data center, you start to immediately get a sense, at the detailed scale and at the broad system scale, of where the inefficiencies are, so you can make it more and more efficient. Plus, you have the big hammer of being able to say, "Let's do it totally differently."
[56:08] >> Yeah.
[56:08] >> And remove all possible blockers.
[56:10] >> That's right.
[56:11] >> Are there parallels in the Nvidia extreme systems co-design approach that you see in the way Elon approaches systems engineering?
[56:18] >> Well, first of all, co-design is the ultimate systems engineering problem, and so we approach the work that we do from that first principle. The other thing that we do, and this is a philosophy, a state of mind, I guess a method, that I started 30 years ago, is called the speed of light. The speed of light is not just about the speed. Speed of light is my shorthand for: what's the limit of what physics can do? And so everything that we do is compared against the speed of light. Memory speed, math speed, power, cost, time, effort, number of people, manufacturing cycle time. And when you think about latency versus throughput, when you think about cost versus throughput, cost versus capacity, all of these things, you test against the speed of light to achieve all of these different constraints separately.
[57:28] And then when you consider it together, you have to make compromises, because a system that achieves extremely low latency and a system that achieves very high throughput are architected fundamentally differently. But you want to know: what's the speed of light of a system that achieves high throughput? What's the speed of light of a system that achieves low latency? And then, when you think about the total system, you can make trade-offs. And so I force everybody to think about the first principles, the limits,
[58:01] >> the physical limits,
[58:03] for everything, before we do anything. And we test everything against that, and so that's a good frame of mind. I don't love the other method, which is continuous improvement.
[58:19] >> The problem with continuous improvement: first of all, you should engineer something from first principles, with speed-of-light thinking, limited only by physics. And after that, of course, you would improve it over time. But I don't like going into a problem where somebody says, "Hey, you know, it takes 74 days to do this today, and we can do it for you in 72 days." I'd rather strip it all back to zero. So first of all, explain to me why it's 74 in the first place. Let's think about what's possible today: if I were to build it completely from scratch, how long would it take? Oftentimes you'd be surprised, and it might come to 6 days. Now, the rest, from 6 days to 74, could be very well-reasoned compromises and cost reductions and all kinds of different things, but at least you know what they are. And now that you know that 6 days is possible, the conversation from 74 to 6 is surprisingly much more effective.
[59:30] >> In such incredibly complex systems as the ones you're working with, is simplicity sometimes a good heuristic to reach for? I mean, the Vera Rubin pod that you announced is just incredible. We're talking about seven chip types, five purpose-built rack types, 40 racks, 1.2 quadrillion transistors, nearly 20,000 Nvidia dies, over 1,100 Rubin GPUs, 60 exaflops, 10 petabytes per second of scale-up bandwidth. And that's all just one
[01:00:03] >> That's just one pod.
[01:00:04] >> That's just
[01:00:06] >> Yeah, that's just one pod.
[01:00:07] >> I mean, so you have that, and then even the NVL72 rack alone is 1.3 million components, 1,300 chips, 4,000 pounds crammed into a single 19-inch-wide rack.
[01:00:20] >> And Lex, we'll probably crank out about 200 of these pods a week, just to put it in perspective.
[01:00:25] >> The amount of different components. I suppose simplicity is impossible, but is that a metric that you reach for in trying to design things?
[01:00:35] >> You know, the phrase that I use most often is: we need things to be as complex as necessary, but as simple as possible. And so the question is, is all that complexity necessary? We ought to test for that, and we ought to challenge that. And then everything above that, you know, is gratuitous.
[01:00:56] >> But it's some of the most incredible engineering, in the semiconductor industry broadly, but especially what Nvidia is doing: some of the greatest engineering in history. These systems are just truly marvels of engineering.
[01:01:10] >> It is the most complex computer the world has ever made.
[01:01:12] >> Yeah, the engineering teams. I mean, it's not a competition, but if there were an Olympics of engineering teams... TSMC does incredible engineering, like I said, ASML, at every scale, but Nvidia is going to give them a run for their money. Just incredible, incredible teams.
[01:01:29] >> Gold medalists in every single sport, all assembled right here.
[01:01:33] >> And they have to work together and report directly to you. This is wonderful. You've recently traveled to China.
[01:01:40] you've recently traveled to China. Uh so it's interesting to ask you uh
[01:01:44] Uh so it's interesting to ask you uh China's been incredibly successful in
[01:01:46] China's been incredibly successful in building up its technology sector. What
[01:01:48] building up its technology sector. What do you understand about um how China is
[01:01:52] do you understand about um how China is able to over the past 10 years build so
[01:01:54] able to over the past 10 years build so many incredible world-class companies,
[01:01:57] many incredible world-class companies, world-class engineering teams and just
[01:01:59] world-class engineering teams and just this technology ecosystem
[01:02:01] this technology ecosystem >> that produces so many um incredible
[01:02:04] >> that produces so many um incredible products. whole bunch of reasons for
[01:02:06] products. whole bunch of reasons for well first of all let's let's start
[01:02:07] well first of all let's let's start let's start with some facts 50% of the
[01:02:10] let's start with some facts 50% of the world's AI researchers are Chinese
[01:02:13] world's AI researchers are Chinese plus or minus and they're mostly in
[01:02:17] plus or minus and they're mostly in China still we have many of them here
[01:02:20] China still we have many of them here but there's amazing researchers still in
[01:02:22] but there's amazing researchers still in China um they their tech industry showed
[01:02:27] China um they their tech industry showed up at precisely the right time at the
[01:02:30] up at precisely the right time at the time of the mobile cloud era uh their
[01:02:33] time of the mobile cloud era uh their way of contributing was software and So
[01:02:35] way of contributing was software and So this is a country's in incredible
[01:02:37] this is a country's in incredible science and math. Uh really well
[01:02:40] science and math. Uh really well educated kids. Um uh their tech industry
[01:02:44] educated kids. Um uh their tech industry was created during the era of software.
[01:02:48] was created during the era of software. They're very comfortable with modern
[01:02:49] They're very comfortable with modern software.
[01:02:52] China is not one giant economic country. It has many provinces and cities, with mayors all competing with each other. That's the reason there are so many EV companies. That's the reason there are so many AI companies. That's the reason there are so many of every company you could imagine. And as a result, they have insane competition internally, and what remains is an incredible company. They also have a social culture where it's family first, friends second, and company third.
[01:03:34] And so the amount of conversation that goes back and forth between them... they're essentially open source all the time. So the fact that they contribute more to open source is so sensible, because they're probably thinking: what are we protecting? You know, my engineers' brothers are in that company, their friends are in that company, and they're all schoolmates. You know the schoolmate concept: a schoolmate is your brother for life. And so they share knowledge very, very quickly, and there's no sense keeping technology hidden; you might as well put it on open source. And so the open source community then amplifies and accelerates the innovation process. So you get incredibly great talent, rapid innovation because of open source and the nature of friendship, and insane competition among the companies, and what emerges is incredible stuff. And so this is the fastest-innovating country in the world today. Everything that I've just said is fundamental to just how the kids were raised: the fact that they have excellent education, the fact that their parents want them to do well in school, the fact that their culture is that way. These are just the things about their country, and they showed up at precisely the time when technology is going through that exponential.
[01:05:09] >> Plus, culturally, it's pretty cool to be an engineer. It connects to all the components that you're mentioning.
[01:05:16] >> It's a builder nation.
[01:05:18] >> It's a builder nation.
[01:05:19] >> Yeah, it's a builder nation. Our country's leaders are incredible, but they're mostly lawyers; they're our country's leaders because they're trying to keep us safe: rule of law, governing. Their country was built out of poverty, and so most of their leaders are incredible engineers, some of the brightest minds.
[01:05:43] To take a small tangent, because you mentioned open source, I have to go to Perplexity here, which you have been a fan of for a long time.
[01:05:51] >> I love it. Yeah.
[01:05:52] >> And thank you for releasing open source Nemotron 3 Super, which you can also use inside Perplexity to look stuff up.
[01:05:59] >> Yeah.
[01:05:59] >> Which is a 120-billion-parameter open-weight model. What's your vision with open source? So you mentioned China, with DeepSeek, with MiniMax, with all these companies really pushing forward the open-source AI movement, and Nvidia is really leading the way in close-to-state-of-the-art open-source LLMs. What's your vision there?
[01:06:28] >> First, if we're going to be a great AI computing company, we have to understand how AI models are evolving. One of the things that I love about Nemotron 3 is that it's not just a pure transformer model; it's transformer and SSM. And we were early in developing conditional GANs, then progressive GANs, which led step by step to diffusion. And so the fact that we're doing basic research in model architecture and in different domains gives us visibility into what kind of computing systems would do a good job for future models, and so it is part of our extreme co-design strategy.
[01:07:14] Second, I think we rightfully recognize that, on the one hand, we want world-class models as products, and they should be proprietary. On the other hand, we also want AI to diffuse into every industry and every country, every researcher, every student. And if everything is proprietary, it's hard to do research, and it's hard to innovate on top of it or around it. And so open source is fundamentally necessary for many industries to join the AI revolution.
[01:07:54] Nvidia has the skills, the scale, and the motivation to build, and to continue to build, these AI models for as long as we shall live. And so therefore we ought to do that. We can open up, we can activate every industry, every researcher, every country to be able to join the AI revolution.
[01:08:21] There's a third reason, which is recognizing that AI is not just language. These AIs will likely use tools and models and sub-agents that were trained on other modalities of information. Maybe it's biology or chemistry, or the laws of physics, or fluids and thermodynamics, and not all of it is in language structure. And so somebody has to make sure that weather prediction, AI for biology, physical AI, all of that stuff can be pushed to the limits, pushed to the frontier. We don't build cars, but we want to make sure every car company has access to great models. We don't discover drugs, but I want to make sure that Lilly has the world's best biology AI systems, so that they can use them for discovering drugs. And so, these three fundamental reasons: recognizing that AI is not just language, that AI is really broad; wanting to engage everybody in the world of AI; and also the co-design of AI.
[01:09:32] >> Well, I have to say once again, thank you for open sourcing, really truly open sourcing, Nemotron 3. And
[01:09:39] >> Yeah, I appreciate you saying that. We open source the models, we open source the weights, we open source the data, we open source how we created it.
[01:09:46] >> Yeah, it's pretty amazing. It's really incredible.
[01:09:51] You're originally from Taiwan and have a close relationship with TSMC, so I have to ask: TSMC, I think, is also a legendary company, in terms of its engineering teams and the incredible engineering work that they do. What do you understand about TSMC's culture and their approach that explains how they're able to achieve this singular, unmatched success in everything they're doing with semiconductors?
[01:10:18] >> You know, first of all, the deepest misunderstanding about TSMC is that their technology is all they have; that somehow they have a really great transistor, and if somebody shows up with another transistor, game over. It is the technology, and of course I don't mean just the transistor: the metallization systems, the packaging, the 3D packaging, the silicon photonics, all of the technology that they have. Their technology really is what makes the company special.
[01:10:59] the company special but their ability to orchestrate
[01:11:04] but their ability to orchestrate the the demands the the dynamic demands
[01:11:08] the the demands the the dynamic demands of hundreds of companies in the world as
[01:11:11] of hundreds of companies in the world as they're moving up, shifting out, you
[01:11:14] they're moving up, shifting out, you know, increasing, decreasing, push
[01:11:16] know, increasing, decreasing, push pushing out, pulling in, um changing
[01:11:20] pushing out, pulling in, um changing from customer to customer, uh wafer
[01:11:23] from customer to customer, uh wafer starting, wafer stopping,
[01:11:26] starting, wafer stopping, uh emergency wafer starts, you know, all
[01:11:29] uh emergency wafer starts, you know, all of this dynamics of the world's
[01:11:32] of this dynamics of the world's complexity as the world is shapeshifting
[01:11:35] complexity as the world is shapeshifting all the time and somehow they're running
[01:11:38] all the time and somehow they're running a factory with high throughput, high
[01:11:41] a factory with high throughput, high yields, really great costs, excellent
[01:11:44] yields, really great costs, excellent customer service. They they take their
[01:11:47] customer service. They they take their work ser they take their promises
[01:11:49] work ser they take their promises seriously. when your wafer because they
[01:11:51] seriously. when your wafer because they know that you're help they're helping
[01:11:52] know that you're help they're helping you run your company when the wafers
[01:11:54] you run your company when the wafers when the wafers were promised to show up
[01:11:56] when the wafers were promised to show up the wafers show up you know so that you
[01:11:58] the wafers show up you know so that you could run your company appropriately and
[01:12:00] could run your company appropriately and so their system their manufacturing
[01:12:02] so their system their manufacturing system is completely miraculous I would
[01:12:05] system is completely miraculous I would say then the second thing is their
[01:12:06] Then the second thing is their culture. This culture is simultaneously technology-focused on one hand, advancing technology, and simultaneously customer-service-oriented on the other hand. A lot of companies are very customer-service-oriented, but they're not technology-excellent; they're not at the bleeding edge of technology. Or a lot of companies are at the bleeding edge of technology, but they're not the best customer-service-oriented company. Somehow they've balanced these two, and they're world class at both.
[01:12:40] And then probably the third thing, the technology that I most value in them, is that they created this intangible called trust. I trust them to put my company on top of them. That's a very big deal.
[01:12:54] >> But that trust, I mean, there's a really close relationship there that you've established, and that trust is established based on many years of performance. But there's human relationships involved there as well.
[01:13:05] >> Three decades. I don't know how many tens, hundreds of billions of dollars of business we've done through them, and we don't have a contract.
[01:13:14] >> That's pretty great. Amazing. Okay.
[01:13:16] There's a story that in 2013 the founder of TSMC, Morris Chang, offered you the chance to become TSMC's chief executive. And you said you already had a job. Is this story true?
[01:13:30] >> The story is true. I didn't dismiss it. Yeah. But I was deeply honored, and of course I knew then, as I know now: TSMC is one of the most consequential companies in history.
[01:13:44] >> Yeah.
[01:13:48] And Morris is one of the highest-regarded executives, and a business and personal friend that I've had in my life. And for him to ask, I was humbled and really honored.
[01:14:05] But the work that I'm doing here is really important, and I've seen, in my mind's eye anyway, what Nvidia was going to be and the impact that we could have. It was really important work, and it's my responsibility, my sole responsibility, to make this happen. And so I declined it, not because it wasn't an incredible offer. It's an unbelievable offer. But I simply couldn't take it.
[01:14:41] >> I think both Nvidia and TSMC are two of the greatest companies in the history of human civilization. Running either one, I'm sure, is an incredibly complicated effort, and you have to truly be all-in.
[01:14:52] >> Yeah.
[01:14:52] >> Everybody at every scale, not just at the CEO level. Everybody is really truly all-in.
[01:14:57] >> Yeah.
[01:14:58] >> To accomplish this kind of complexity.
[01:15:00] >> See, now I can help both companies.
[01:15:02] >> Exactly. So, Nvidia is now the most valuable company in the world. I have to ask: what is Nvidia's biggest moat, as the folks in the tech sector say?
[01:15:15] >> Mhm.
[01:15:15] >> The edge you have that protects you from the competition.
[01:15:22] Our single most important property as a company is the installed base of our computing platform. Our single most important thing today is the installed base of CUDA. Now, 20 years ago, of course, there was no installed base. But if somebody came up with a "GUDA" or a "TUDA" today, it wouldn't make any difference at all. And the reason for that is because it's never been just about the technology. The technology, of course, was incredible, visionary. But it's the fact that the company was dedicated to it, stuck with it, expanded its reach. It wasn't three people that made CUDA successful. It was 43,000 people that made CUDA successful, and the several million developers that believed in us, that trusted that we were going to continue to make CUDA 1, 2, 3, up through 13, so that they decided to port and dedicate their software on top of it, their mountain of software on top of it.
[01:16:31] And so the installed base is the number one most important advantage. That installed base, when you amplify it with the velocity of our execution at the scale that we're talking about: no company in history had ever built systems of this complexity, period. And then to build it once a year is impossible.
[01:16:58] And that velocity, combined with the installed base, is just going to take the developer's mind. From the developer's perspective: if I support CUDA, tomorrow it will be 10 times better. I just have to wait six months on average. Not only that, if I develop it on CUDA, I reach a few hundred million computers. I'm in every cloud. I'm in every computer company. I'm in every single industry. I'm in every single country.
[01:17:31] So if I created an open source package and I put it on CUDA first, I get both of these attributes simultaneously. And not only that, I trust 100% that Nvidia is going to keep CUDA around and maintain it and improve it and keep optimizing the libraries for as long as they shall live. You could take that to the bank. And that last part, trust. You put all that stuff together: if I were a developer today, I would target CUDA first. I would target CUDA most. And that's the reason, I think, in the final analysis, that's our first core advantage.
[01:18:19] Our second one is our ecosystem. The fact that we vertically integrated this incredibly complex system, but we integrated horizontally into every single company's computers. We're in the Google cloud, we're in Amazon, we're in Azure.
[01:18:33] We're ramping up AWS like crazy right now. We're in new companies like CoreWeave and Nscale. We're in supercomputers at Lilly. We're in enterprise computers. We're at the edge in radio base stations. It's just crazy: one architecture is in all these different systems. We're in cars, we're in robots, we're in satellites, we're out in space. And so the fact that you have this one architecture, and the ecosystem is so broad, it basically covers every single industry in the world.
[01:19:02] >> Well, how does the CUDA installed base evolve into the future, with AI factories as a moat? Do you think it's possible that the Nvidia of the future is all about the AI factory?
[01:19:15] >> Well, the unit of computing used to be a GPU to us. Then it became a computer. Then it became a cluster. Now it's an entire AI factory. When I see a computer, when I see what Nvidia builds, in the old days I would visualize the chip. And then when I announced a new product, a new generation, "Ladies and gentlemen, we're announcing Ampere today," I pick up the chip.
[01:19:39] >> Yeah.
[01:19:40] >> That was my mental model of what I was building.
[01:19:43] Today, picking up the chip is kind of still adorable, but it's not my mental model of what I'm doing. My mental model is this giant gigawatt thing that has power generation. It's connected to the grid. It's got cooling systems and networking of incredible monstrosity. You know, 10,000 people are in there trying to install it. Hundreds of networking engineers in there. Thousands of engineers behind it trying to power it up.
[01:20:14] >> You know, powering up one of those factories, as you know, it's not somebody going, "It's on now."
[01:20:20] It takes thousands of people to bring it up.
[01:20:22] >> So mentally, when you're thinking about a single unit of compute, literally when you go to bed at night, you're thinking now about a collection of racks. So pods, not individual chips.
[01:20:33] >> The entire infrastructure. And I'm hoping my next click is, when I'm thinking about building computers, it's planetary scale. That would be the next click.
[01:20:41] >> What do you think about the space angle that Elon has talked about, doing compute in space? It makes some of the energy issues, in terms of scaling energy, easier.
[01:20:56] Cooling issues. It's not easy.
[01:20:58] >> Cooling. Well, there's a large number of engineering complexities involved with that.
[01:21:03] >> So, you know, Nvidia has also announced that you're already thinking about that.
[01:21:08] >> you're already thinking about that. >> Yeah, we're already there. Uh, Nvidia
[01:21:11] >> Yeah, we're already there. Uh, Nvidia GPUs are the first GPUs in space and um
[01:21:15] GPUs are the first GPUs in space and um I I didn't realize it was it was so
[01:21:17] I I didn't realize it was it was so interesting to I would have declared it
[01:21:19] interesting to I would have declared it maybe we're in space, you know, little
[01:21:23] maybe we're in space, you know, little little astronaut suit on one of our
[01:21:24] little astronaut suit on one of our GPUs.
[01:21:27] GPUs. Um but but we've been in space. Uh it's
[01:21:29] Um but but we've been in space. Uh it's the right place to do a lot of imaging.
[01:21:31] the right place to do a lot of imaging. >> Mhm.
[01:21:32] >> Mhm. >> You know, because those satellites have
[01:21:33] >> You know, because those satellites have really high resolution imaging systems
[01:21:36] really high resolution imaging systems and they're sweeping the Earth, you
[01:21:37] and they're sweeping the Earth, you know, continuously now. And um uh you
[01:21:41] know, continuously now. And um uh you want you know centimeter scale you know
[01:21:43] want you know centimeter scale you know imaging that is done continuously uh for
[01:21:47] imaging that is done continuously uh for the world so that you know you'll
[01:21:49] the world so that you know you'll basically have real time telemetry of
[01:21:50] basically have real time telemetry of everything. Uh you don't want to beam
[01:21:54] everything. Uh you don't want to beam that back down to earth. It's just you
[01:21:56] that back down to earth. It's just you know pabytes and pabytes of data. You
[01:21:59] know pabytes and pabytes of data. You got to just do AI right there at the
[01:22:00] got to just do AI right there at the edge. Throw away everything you don't
[01:22:02] edge. Throw away everything you don't need. You've seen before didn't change
[01:22:04] need. You've seen before didn't change and then just keep the stuff that that
[01:22:06] and then just keep the stuff that that you need. And so AI ought to be done at
[01:22:08] you need. And so AI ought to be done at the edge. Um obviously we have we have
[01:22:11] the edge. Um obviously we have we have uh 24/7 solar if we put it at the polars
[01:22:15] uh 24/7 solar if we put it at the polars and um uh
[01:22:19] and um uh but you know there's no conduction, no
[01:22:21] but you know there's no conduction, no convection and so you know you're pretty
[01:22:24] convection and so you know you're pretty much just radiation
[01:22:26] much just radiation and um uh but you know space is big I
[01:22:29] and um uh but you know space is big I guess. You know we're just going to put
[01:22:30] guess. You know we're just going to put big giant radiators out there.
[01:22:32] big giant radiators out there. >> How crazy of an idea do you think it is?
[01:22:33] >> How crazy of an idea do you think it is? Like is this is this 5 years out, 10
[01:22:35] Like is this is this 5 years out, 10 years out, 20 years out? So, uh, we're
[01:22:39] years out, 20 years out? So, uh, we're talking about blockers for AI scaling.
[01:22:41] talking about blockers for AI scaling. You know, I'm just so much more
[01:22:42] You know, I'm just so much more practical. I I look for where where um I
[01:22:46] practical. I I look for where where um I next next bucket of opportunities are
[01:22:49] next next bucket of opportunities are first.
[01:22:51] first. Meanwhile, I'm cultivating space. And
[01:22:54] Meanwhile, I'm cultivating space. And so, I send I send engineers uh to go
[01:22:56] so, I send I send engineers uh to go work on the problem. We're we're
[01:22:58] work on the problem. We're we're starting to we're learning a lot about
[01:22:59] starting to we're learning a lot about it. Um, how do we deal with radiation?
[01:23:01] it. Um, how do we deal with radiation? How do we deal with degrading
[01:23:03] How do we deal with degrading performance? How do we deal with um uh
[01:23:05] performance? How do we deal with um uh continuous uh testing and addestation of
[01:23:09] continuous uh testing and addestation of of um def defects and and um you know
[01:23:12] of um def defects and and um you know how do we deal with redundancy and how
[01:23:14] how do we deal with redundancy and how do we degrade uh gracefully and things
[01:23:16] do we degrade uh gracefully and things like that and so we could we could do uh
[01:23:18] like that and so we could we could do uh what what about software? How do you
[01:23:20] what what about software? How do you think about software and and redundancy
[01:23:22] think about software and and redundancy and performance out in space? Uh make it
[01:23:25] and performance out in space? Uh make it so that so that the computer never
[01:23:28] so that so that the computer never breaks. It just gets slower, you know.
[01:23:31] breaks. It just gets slower, you know. And um I so we could start doing a lot
[01:23:34] And um I so we could start doing a lot of engineer exploration up front, but in
[01:23:36] of engineer exploration up front, but in the meantime, my my favorite answer is
[01:23:39] the meantime, my my favorite answer is get eliminate waste.
[01:23:41] get eliminate waste. >> You know, we've we've got all that idle
[01:23:43] >> You know, we've we've got all that idle power. I want to evacuate it as fast as
[01:23:45] power. I want to evacuate it as fast as possible.
[01:23:47] possible. >> Yeah. There Yeah, there's a lot of low
[01:23:49] >> Yeah. There Yeah, there's a lot of low hanging fruit here on Earth uh that we
[01:23:51] hanging fruit here on Earth uh that we can utilize uh for the AI scaling. Uh
[01:23:54] can utilize uh for the AI scaling. Uh quick pause, quick 30 second thank you
[01:23:57] quick pause, quick 30 second thank you to our sponsors. Check them out in the
[01:23:59] to our sponsors. Check them out in the description. It really is the best way
[01:24:01] description. It really is the best way to support this podcast. Go to
[01:24:03] to support this podcast. Go to lexfreedman.com/sponsors.
[01:24:06] We've got Perplexity for curiosity-driven knowledge exploration, Shopify for selling stuff online, Element for electrolytes, Fin for customer service AI agents, and Quo for a phone system: calls, texts, contacts for your business. Choose wisely, my friends. And now, back to my conversation with Jensen Huang.
[01:24:28] Do you think Nvidia may be worth 10 trillion at some point? Let's ask it this way: what does the future of the world look like where that's true?
[01:24:45] that's true? I think that Nvidia's growth is is um
[01:24:51] I think that Nvidia's growth is is um uh extremely likely and in my mind
[01:24:54] uh extremely likely and in my mind inevitable. And let me explain why.
[01:24:57] inevitable. And let me explain why. We're the largest computer company in
[01:24:58] We're the largest computer company in history.
[01:25:00] history. That alone should beg the question why.
[01:25:04] That alone should beg the question why. And the reason for of course uh two
[01:25:06] And the reason for of course uh two reasons. First two foundational
[01:25:08] reasons. First two foundational technical reasons. The first reason is
[01:25:11] technical reasons. The first reason is that computing went from being a
[01:25:13] that computing went from being a retrievalbased file retrieval system.
[01:25:16] retrievalbased file retrieval system. Almost everything is a file. We we pre
[01:25:19] Almost everything is a file. We we pre pre-write something, we pre-record
[01:25:21] pre-write something, we pre-record something, you know, we we draw
[01:25:23] something, you know, we we draw something, we put it on the web, we put
[01:25:24] something, we put it on the web, we put in a file, and we we use a recommener
[01:25:27] in a file, and we we use a recommener system, some smart filter to figure out
[01:25:30] system, some smart filter to figure out what to retrieve for you. And so we were
[01:25:32] what to retrieve for you. And so we were a pre-recording, human pre-recording and
[01:25:34] a pre-recording, human pre-recording and file retrieving system. That's what a
[01:25:37] file retrieving system. That's what a computer is largely
[01:25:39] computer is largely to now AI computers are contextually
[01:25:42] to now AI computers are contextually aware which means that it has to process
[01:25:45] aware which means that it has to process and generate tokens in real time. So we
[01:25:47] and generate tokens in real time. So we went from a retrievalbased computing
[01:25:49] went from a retrievalbased computing system to a generativebased computing
[01:25:52] system to a generative-based computing system. We're going to need a lot more processing in this new world than in the old world. We needed a lot of storage in the old world. We need a lot of computation in this new world. And so that's the first part of it. We fundamentally changed computing and the way computing is done. The only thing that would cause it to go back is if this way of computing, generating information that's contextually relevant, situationally aware, grounded on new insight before it generates information, this computation-intensive way of doing computing, turns out not to be effective. So if, over the last 10 to 15 years of working on deep learning, at any single moment I had come to the conclusion that, you know what, this is not going to work out, I think this is a dead end, or it's not going to scale, it's not going to solve this modality, it's not going to be used in this application, then of course I would feel very differently about it. But the last five years have given me more confidence than the previous 10 years.
[01:27:05] The second idea is computers: because the computer was a storage system, it was largely a warehouse. We're now building factories. Warehouses don't make much money. Factories directly correlate with a company's revenues. And so the computer did two things. Not only did it change the way it did things, its purpose in the world changed. It's no longer a computer, it's a factory. A factory is used for generation of revenues. We're now seeing not only is this factory generating products, commodities that people want to consume, we're seeing that the commodities are so interesting, so valuable to so many different audiences, that the tokens are starting to segment like iPhones.

[01:28:03] >> Mhm.
[01:28:04] >> You have free tokens, you have premium tokens, and you have several tokens in the middle. And so intelligence, as it turns out, you know, is a scalable product. There are extremely high intelligence products, tokens that are used for specialized things. People will be willing to pay. You know, the idea that somebody's willing to pay $1,000 per million tokens is just around the corner. It's not if, it's only when. And so now we're seeing that the commodity that this factory makes is actually valuable, and is revenue generating and profit generating. Now the question is, how many of these factories does the world need? How many tokens does the world need? And, um, how much is society willing to pay for these tokens? And what would happen to the world's economy if productivity were to improve so substantially?
[01:29:07] What would happen? Are we going to discover new drugs, new products, new services? And so when you take these things in combination, I am absolutely certain that the world's GDP is going to accelerate in growth. I'm absolutely certain the percentage of that GDP that will be used for computation will be a hundred times more than in the past, because it's no longer a storage unit. It's a product generation unit. And so when you look at it in that context, and then you back into what Nvidia does, and how much of that new economics, that new industry, we would have the benefit to address, I think we're going to be a lot, lot bigger. And then the rest of it, to me, is: is it possible for Nvidia to be a, you know, $3 trillion revenues company in the near future?
[01:30:08] The answer is of course yes. And the reason for that is because it's not limited by any physical limits. There's nothing that I see that says, you know, gosh, um, $3 trillion is not possible. And as it turns out, with Nvidia's supply chain, the burden is shared by 200 companies, and we scale out on the backs of, with the partnership of, this ecosystem. The question is, do we have the energy to do so? And surely we will have the energy to do so. And so, all of these things combined, that number is just a number, you know. And I still remember the first time Nvidia crossed a billion dollars. I was reminded of a CEO who told me, you know, Jensen, it's theoretically impossible for a fabless semiconductor company to exceed a billion dollars. And, um, I won't bore you with why, but of course it's illogical, and there's a lot of evidence we're not. And then somebody told me, you know, Jensen, you'll never be more than $25 billion because of some other company. Somebody told me that you'll never be, you know, because... and so those aren't principled, first-principles ways of thinking. The simple way to think about it is: what is it that we make, and how large is the opportunity that we can create? Now, Nvidia is not in the market share business. Almost everything that I just talked about doesn't exist.

[01:31:47] >> Mhm.
[01:31:48] >> That's the part that's hard. You know, if Nvidia was a $10 billion company trying to take Nvidia's share, then it's easy for shareholders to see that, oh yeah, if they could just take 10% share, they could be this much larger. But it's hard for people to imagine how large we could be, because there's nobody I could take share from,

[01:32:13] >> you know, and so I think that's one of the challenges for the world: um, the imagination of the future. But I've got plenty of time, and I'll keep reasoning about it, and I'll keep talking about it, and every single GTC it will become more and more real,

[01:32:27] >> you know, and then more and more people will talk about it, and one of these days, you know, we'll get there. But I'm 100% sure we'll get there.
[01:32:34] >> Yeah. This view of, uh, you know, token factories, essentially, this tokens per second per watt, and every token having value, like it's an actual thing that brings value, and it brings different kinds of value, different amounts of value, to different people. But it's value, and the actual product could really, loosely, be thought of as the token. And so you have a bunch of token factories, and it's very easy from first principles to imagine a future, given all the potential things that AI can solve, where you're going to need an exponential number more of token factories.
[01:33:05] >> Yeah.

[01:33:06] >> And what's really interesting, the reason why I was so excited about it: the iPhone of tokens arrived.

[01:33:11] >> What do you call... Wait, are you saying OpenClaw is the iPhone?

>> Yeah,

>> that's interesting. Uh...

>> Agents.

>> Yeah, agents. True.

[01:33:18] >> Agents in general. The iPhone of tokens arrived. Uh, it is the fastest growing application in history. It went straight up.

>> Yeah,

>> went straight up.

>> That says something.

[01:33:27] >> Yep. There's no question OpenClaw is the iPhone of tokens. Yeah, there's something truly, as you know, something truly special happening from about December, where people really woke up to the power of Claude Code, of Codex, of OpenClaw. Um, I mean, I'm embarrassed to admit that on the way here, in the airport, this is the first time I've done this in public, I was programming, quote unquote, by talking to my laptop, and I was embarrassed because I was pretending like I'm talking to a human colleague.

[01:34:03] Mhm.
[01:34:03] >> Uh, I'm not sure how I feel about the future where everybody

[01:34:07] >> is walking around talking to their AI, but it's such an efficient way to get stuff done

[01:34:13] >> and it's more likely that your AI is bothering you all the time. And the reason for that is because it's getting stuff done so fast.

>> Yeah.

[01:34:22] >> It's reporting back to you: I got that done. You know, what do you want me to do next? You know, that's the part that I think most people don't realize: the person who's going to be chatting with them, texting them most, is their claw, or lobster.
[01:34:37] >> What an incredible future. Uh, I read that you attribute a lot of your success to your ability to work harder than anyone and withstand more suffering than anyone. So, we can list many of the things that entails. I mean, dealing with failure, the constant engineering problems we've talked about, the human problems, uncertainty, responsibility, exhaustion, embarrassment, the near-death company moments that you've mentioned, um, but also the pressure now, as the CEO of this company that economies and nations strategize around, uh, plan their, um, financial allocations around, plan their AI infrastructure around. How do you deal with this much pressure? What gives you strength, given how many nations and peoples depend on you?
[01:35:37] I'm conscious of the fact that, um, Nvidia's success is very important to the United States. We generate enormous amounts of tax revenues. Uh, we establish technology leadership for our nation. Technology leadership is important for national security, and not just one aspect of national security, all aspects of national security. When our country is more prosperous, we can do a better job with domestic policies and helping social benefits, because we're generating so much re-industrialization in the United States. We're creating mountains of jobs. We're helping shift, um, how we build things back to the United States, in so many different plants, chips, computers, and of course these AI factories. I'm completely aware of that. And I have the benefit, and this is a real, real gift, uh, of mainstream investors, teachers, policemen, who have somehow, for whatever reason, invested in Nvidia, or, because they watch Jim Cramer, um, bought some stock and now are millionaires.
[01:36:57] >> Mhm.

[01:37:02] And, um, I am completely aware of that circumstance. I'm aware of the circumstance that Nvidia, uh, is central to a very large network of ecosystem partners behind us and downstream from us. And so the way I deal with that is exactly what I just did: I reason about it. What is it that we're doing? Um, what is it causing? What's the impact it has, that other people benefit from, you know, positively, or even, um, uh, through great burden, for example the supply chain. And the question is, uh, therefore, what are you going to do about it? And almost everything that I feel, I break down. I reason about: okay, what's the circumstance, what has changed, what's hard, um, and what am I going to do about it? I break it down, decompose the problem. And the decomposition of these circumstances turns them into manageable things that I can do. And the only thing after that I can ask is: did you do it? Did you either do it, or did you get somebody else to do it? And if you reasoned that you need to do it, and you didn't do it, and you didn't get anybody else to do it, then stop crying about it, you know. And so, so I'm fairly, I'm fairly, uh,

[01:38:31] >> uh, tough on myself. But I also break things down so that, um, uh, I don't panic. Uh, I can go to sleep, because I've made the list of things that needed to be done. And I've made sure that everything that could put our company in harm's way, could put my partners in harm's way, put our industry in harm's way, I've told somebody. Everything that I feel could put anybody in harm's way, I've told someone. And I've told that someone who could do something about it. And so I've gotten it off my chest, or I'm doing something about it. And so after that, Lex, what else can you do?
[01:39:10] >> So given all the insane, intense amount of suffering on the journey of building up Nvidia, have you hit low points psychologically?

[01:39:22] >> Oh yeah. Oh yeah, sure. All the time. All the time.

[01:39:27] >> And there you just break down the problem

>> into pieces.

>> Yeah.

>> See what you can do about it.
[01:39:33] >> And part of... And, you know, Lex, part of it is forgetting. One of the most important attributes of AI learning, as you know, right, is systematic forgetting. You need to know when to forget some things. You can't memorize everything. You can't keep everything. And, you know, you don't want to carry everything. One of the things that I do very quickly is I decompose the problem. I reason about the problem, and I share the load.

When I say I tell everybody, I'm essentially sharing that burden.

>> Yeah.

[01:40:04] >> As quickly as possible. Whatever worries me, I tell somebody else. Don't just keep it, you know. Decompose... don't freak them out. Decompose the problem into smaller parts, and get people to, and inspire them to, be able to go do something about it. But part of it is just forgetting, you know. A lot of it is, you've got to be tough on yourself, you know. Just come on, stop crying about it, let's get going, you know. And then you get out of bed. And then the other part is, um, you're attracted to the next shiny light, the next future, you know, the next opportunity, the next... Okay, that's behind us. What's next?
[01:40:43] It's a lot. I think, you know, you watch this with great athletes. They, um, just worry about the next point.

>> Mhm.

[01:40:50] >> The last point is behind them. The embarrassment, you know, the setback, you know. And then, because I do so much of my job publicly,

[01:41:01] >> you know, Lex, you do a fair amount of your job publicly, too. And so I do a lot of my job publicly. And so, um, you know, I say a lot of things that seem sensible at the time, or funny at the time. Mostly it's just because it's funny to me at the time, and then, you know, you reflect on it and it's less funny, but...
[01:41:20] >> Yeah. No, trust me, I know. But you basically allow yourself to be pulled by the light of the future. Forget the past and just keep...

>> That's right.

[01:41:29] >> Keep working towards that. I mean, you did say there's this kind of famous thing you said, that, um, if you knew how hard it would be to build Nvidia, and it turned out to be, what is it, a million times harder than you anticipated, that you wouldn't do it?

>> Yeah.

[01:41:47] >> But isn't, you know, when I hear that, that's probably true about everything worth doing, right?
[01:41:53] >> Exactly. That is, by the way, what I was trying to explain: there's an incredible superpower in, um, having the mind of a child.

>> Yeah.

[01:42:07] >> You know, and I say to myself oftentimes, when I look at something, and almost everything, um, my first thought is: how hard can it be?

[01:42:19] >> You know, and so you get yourself into that mode. How hard could it be? And nobody's ever done it. It looks gigantic. It's going to cost hundreds of billions of dollars. It's going to take, you know, all this. And you just go, "Yeah, but how hard could it be?" You know, how hard could it be?
[01:42:37] >> And so, so you've got to get yourself into that state of mind. You don't want to actually over-simulate everything, all the setbacks and all the trials and tribulations and all the disappointments. You don't want to simulate all that in advance. You don't want to know that. You want to go into a new experience thinking it's going to be perfect. It's going to be great. It's going to be incredibly fun. And then while you're there, you know, you need to have endurance. You need to have grit, so that when the setbacks actually happen, and those setbacks are going to surprise you, the disappointments are going to surprise you, you know, the embarrassments are going to surprise you, the humiliations are going to surprise you. Um, you just can't let... Now you've just got to turn on the other bit, which is: just forget about it. Move on, keep moving. And to the extent that my assumptions about the future, and why the future is going to manifest,

so long as those assumptions and that
[01:43:38] so long as those assumptions and that input
[01:43:40] input doesn't change or didn't change
[01:43:42] doesn't change or didn't change materially, then I should expect that
[01:43:44] materially, then I should expect that the output won't change. And so my
[01:43:46] the output won't change. And so my simulated output of the future is still
[01:43:50] simulated output of the future is still going to happen. And if it's still going
[01:43:51] going to happen. And if it's still going to happen, I'm still going to go after
[01:43:54] to happen, I'm still going to go after it. I believe it's going to, you know,
[01:43:55] it. I believe it's going to, you know, and so there's a combination of two or
[01:43:58] and so there's a combination of two or three human characteristics.
[01:44:01] three human characteristics. The ability to go into a into an
[01:44:03] The ability to go into a into an experience fresh-minded,
[01:44:05] experience fresh-minded, the ability to forget the setbacks,
[01:44:08] the ability to forget the setbacks, the ability to believe in yourself,
[01:44:11] the ability to believe in yourself, you know, to believe what you believe
[01:44:13] you know, to believe what you believe and stay stay true to that belief. Um,
[01:44:16] and stay stay true to that belief. Um, but you're constantly re-evaluating.
[01:44:19] but you're constantly re-evaluating. >> Mhm. This combination of three, four,
[01:44:22] >> Mhm. This combination of three, four, five things I think is is really
[01:44:25] five things I think is is really important for resilience. And and um
[01:44:30] important for resilience. And and um and you know, I I'm I'm fortunate that
[01:44:32] and you know, I I'm I'm fortunate that that whatever whatever life experience
[01:44:33] that whatever whatever life experience has led to this, I've got kind of those
[01:44:36] has led to this, I've got kind of those four or five things. You know, I'm
[01:44:38] four or five things. You know, I'm always curious, always learning. I'm
[01:44:40] always curious, always learning. I'm always learning from everybody. You
[01:44:42] always learning from everybody. You know, I'm always asking my and because
[01:44:44] know, I'm always asking my and because I'm humble about about about everything,
[01:44:47] I'm humble about about about everything, I'm always thinking, gosh, they did that
[01:44:49] I'm always thinking, gosh, they did that so nicely. They did that so wonderfully.
[01:44:52] so nicely. They did that so wonderfully. You know, I wonder what they're thinking
[01:44:54] You know, I wonder what they're thinking through. How do they, you know, so I'm
[01:44:56] through. How do they, you know, so I'm simulating everybody in a lot of ways,
[01:44:59] simulating everybody in a lot of ways, you know, I'm emulating almost everybody
[01:45:00] you know, I'm emulating almost everybody I watch, right? you're you're empathetic
[01:45:02] I watch, right? you're you're empathetic towards towards everything that they do
[01:45:04] towards towards everything that they do that that you're observing and respect
[01:45:06] that that you're observing and respect and and so you you're constantly
[01:45:09] and and so you you're constantly learning and you know
[01:45:11] >> You're now one of the wealthiest people on earth, one of the most successful humans on earth. Is it harder to be humble? Do you feel the effect of money and power and fame making it harder for you to be wrong in your own head enough to hear out an opinion from somebody else when it disagrees with you, and to learn from them? Those kinds of things.
[01:45:41] >> Um, surprisingly, no. And I would actually go the other way, because I do so much of my work publicly. When I'm wrong, pretty much everybody sees it.
[01:45:53] >> You get humbled.
[01:45:54] >> Yeah. And when I'm wrong, or it didn't turn out that way, or... you know, I mean, most of the things that I say outside, I'm fairly certain about. And the reason for that is that it's going to impact somebody else, and I want to be quite concerned about that and quite circumspect about that. For stuff that I'm reasoning about inside a meeting, you know, a lot of things could turn out differently. But it doesn't ever stop me from reasoning. The way that I manage and lead, you know, I'm constantly reasoning in front of people. Even when I'm talking to you, you can kind of see me reasoning through things.
[01:46:35] >> And I want to make sure that you understand what I'm saying, not because I told you...
[01:46:40] >> Because I'm so humble about what I'm about to tell you.
[01:46:43] >> I kind of show you the steps by which I got there.
[01:46:46] >> And then you can decide whether you believe what I said in the end. And so I'm doing that all day long, in meetings with all of my employees. I'm constantly reasoning through: let me tell you how I see it. And I reason through it. It gives everybody the opportunity to intercept and say, "I disagree with that part."
[01:47:04] >> The nice thing about reasoning through things, and letting people interact with it, is that they don't have to disagree with your outcome. They can disagree with your reasoning steps, and they can pull me in different directions, and then we can reason forward. And so it's kind of, you know, a collective path-charting method, and it's really fantastic.
[01:47:29] >> Yeah. You have this way about you: when you're explaining stuff, I can feel you actually reasoning on the spot about it, with a constant open-mindedness, where I felt like I could steer your thinking. And that's really beautiful, that you've been able to maintain that after so many years of success and pain. I think sometimes pain makes you close down a bit.
[01:47:55] >> Yeah.
[01:47:56] >> And I think to maintain tolerance for embarrassment, I think, is...
[01:48:00] >> That's the tolerance. I mean, that's a real thing.
[01:48:03] >> Yeah. There are many years of embarrassing yourself, even those meetings, knowing that there are people around you, where you declared one idea and it was shown that that idea was wrong, and being able to admit that and to grow from that. That's very difficult on a human level.
[01:48:17] >> Yeah. Well, you know, they knew that my first job was, you know, cleaning toilets. So...
[01:48:25] >> I'm glad you've maintained that same spirit of Denny's, the work. I mean, that was beautiful. Your whole journey starting from Denny's is a beautiful one. Let me ask you about video games. So, I'm a big gaming fan.
[01:48:40] >> Yeah.
[01:48:41] >> So I have to say thank you to Nvidia for many years of incredible graphics.
[01:48:47] >> By the way, GeForce is still, to this day...
[01:48:51] >> Yeah.
[01:48:51] >> ...our number one marketing strategy. Right? People learn about Nvidia while they're in their teenage years.
[01:48:59] >> Mhm.
[01:48:59] >> And then they go to college, and they know who Nvidia is. And in the beginning it's just, you know, playing Call of Duty, you know, Fortnite, and then later they're using CUDA, and then later they're using Nvidia and, you know, Blender and so on.
[01:49:16] >> I should say, I mentioned to a friend that I'm talking with you. He said, "Oh, they make great gaming GPUs."
[01:49:25] >> Yeah, exactly. Exactly. You know, there's more to it, but yeah.
[01:49:30] >> Yeah, people really love it. It really brought a lot of joy to a lot of people. The hardware really brings these worlds to life.
[01:49:38] >> There was some controversy around this, with DLSS 5.
[01:49:43] >> Yeah.
[01:49:44] >> Can you explain to me the drama around this? I guess people, gamers online, were concerned that it makes games look like AI slop.
[01:49:53] >> Yeah.
[01:49:54] >> What do you think of this drama?
[01:49:56] >> Yeah, I think their perspective makes sense, and I can see where they're coming from, because I don't love AI slop myself. You know, all of the AI-generated content increasingly looks similar, and it's all beautiful, and so I'm empathetic towards what they're thinking. That's just not what DLSS 5 is trying to do. I showed several examples of it, but DLSS 5 is 3D-conditioned, 3D-guided. It's guided by ground-truth structure data. And so the artists determine the geometry; we are completely truthful to the geometry, maintained in every single frame. It's conditioned by the textures, the artistry of the artist. And so every single frame, it enhances, but it doesn't change anything.
[01:50:55] Now, the question is about enhancing. DLSS 5 also lets you, because the system is open, train your own models, and you could even, in the future, prompt it: you know, I want it to be a toon shader, I want it to look like this kind of thing. So you can even give it an example, and it would generate in the style of that, all consistent with the artistry, you know, the style, the intent of the artist. And so all of that is done for the artist, so that they can create something that is more beautiful, but still in the style that they want.
[01:51:37] I think they got the impression that games are going to ship the way they do today, and then we're going to post-process them. That's not what DLSS is intended to do. DLSS is integrated with the artist. And so it's about giving the artist the tool of AI, the tool of generative AI. They can decide not to use it, you know.
[01:52:01] >> I think people are very sensitive to human faces.
[01:52:03] >> Yeah. And we're now living in this moment, which I think is a beautiful one, which is: people are sensitive to AI slop.
[01:52:10] >> Yeah.
[01:52:11] >> It puts a mirror to ourselves, to help us realize that what we seek is sometimes imperfections, sometimes not perfect graphics. It helps us understand what we find compelling in the worlds we create.
[01:52:25] >> And that's beautiful, as long as it's tools that help us create those worlds.
[01:52:28] >> Yeah, that's right.
[01:52:29] >> It's wonderful.
[01:52:30] >> That's right. It's yet another tool. And if they want the generative models to generate the opposite of photoreal...
[01:52:38] >> Mhm.
[01:52:38] >> Yeah, it'll do that too. And so it's just yet another tool. I think the gamers might also appreciate that in the last couple of years we introduced skin shaders to the game developers, and many of those games have skin shaders that include subsurface scattering, which makes skin look more skin-like. And so the industry's game developers are looking for more and more tools to express their art. And so this is just one more tool; they can decide what to use.
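[Editor's note: for readers unfamiliar with the subsurface-scattering mention above, a classic cheap approximation is "wrap" diffuse lighting. This sketch shows the general technique only, not any specific vendor's skin shader; the `wrap` parameter value is illustrative. Plain Lambertian diffuse cuts to zero the instant a surface faces away from the light, which reads as hard and plastic; wrapping lets some light leak past that terminator, mimicking light scattering inside skin.]

```python
# Sketch of "wrap" diffuse lighting, a classic cheap stand-in for subsurface
# scattering (illustrative only, not any specific skin shader implementation).

def lambert_diffuse(n_dot_l):
    """Plain Lambertian term: hard cutoff at the terminator (n.l = 0)."""
    return max(0.0, n_dot_l)

def wrap_diffuse(n_dot_l, wrap=0.5):
    """Wrapped diffuse: light reaches slightly past the terminator.
    wrap=0 reduces to plain Lambert; larger wrap gives a softer, waxier look."""
    return max(0.0, (n_dot_l + wrap) / (1.0 + wrap))

# Just past the terminator (surface tilted slightly away from the light):
grazing = -0.2
hard = lambert_diffuse(grazing)         # 0.0: abrupt black edge
soft = wrap_diffuse(grazing, wrap=0.5)  # positive: light appears to bleed through
```

Production skin shaders go much further (diffusion profiles, per-channel falloff so shadow edges shift toward red), but this one-liner is the core intuition behind making skin look "more skin-like."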
[01:53:16] >> Ridiculous question: what do you think is the greatest or most influential game ever made? Maybe from Nvidia's perspective.
[01:53:24] >> Doom.
[01:53:25] >> Doom?
[01:53:25] >> Unquestionably. That was the start of 3D. I would say Doom, from the intersection of the cultural implications, as well as the industry turning the PC into a gaming device. That was a very important moment. Now, of course, flight simulation companies were before it, but they just didn't have the popularity that Doom did, to have made the industry turn the PC from an office automation tool into a personal computer for families and gamers and things like that. And so Doom was really impactful there. From an actual game technology perspective, I would say Virtua Fighter. And we're great friends with both of them, you know. And then there are games more recently. I mean, Cyberpunk 2077: really nice GPU-accelerated graphics, fully ray traced.
[01:54:17] >> Fully ray traced. Also, I personally am a huge fan of Skyrim, the Elder Scrolls, and, you know, it was released a long, long time ago, but people release mods, and, I mean, it's like a different game. It just allows me to replay the game over and over, and it makes you realize you can re-experience, in a totally new way, a world you already love.
[01:54:44] >> So I do that all the time. One of my favorite games: just walking around Skyrim.
[01:54:48] >> We created this thing called RTX Remix.
[01:54:50] >> Uh-huh.
[01:54:51] >> Yeah. It's a modding tool.
[01:54:52] >> Awesome.
[01:54:53] >> And it allows the community to inject the latest technology into an old game.
[01:55:00] >> Of course, what makes a great video game is not just graphics. It's also story and character development.
[01:55:06] >> That's right.
[01:55:06] >> But beautiful graphics can add to the immersion, the feeling that it's another place you're transported to.
[01:55:16] You said, I think accurately, that the AGI timeline question rests on your definition of AGI. So let me ask you about possible timelines here. Here's a ridiculous definition, perhaps, of what AGI is, but: an AI system that's able to essentially do your job. So, run... no: start, grow, and run a successful technology company that's worth...
[01:55:52] >> A good one? Or a...
[01:55:54] >> No, it has to be worth more than a billion... more than a billion dollars. So, you know how hard it is to do all those components. How far away are we from that? We're talking about OpenClaw doing all the incredibly complex stuff that's required to, first of all, innovate, to find customers, to sell to them, to manage, to build a team of some agents, some humans, all that kind of stuff. Is this 5, 10, 15, 20 years away?
[01:56:31] >> I think it's now. I think we've achieved AGI.
[01:56:34] >> You think you can have a company run by an AI system like this?
[01:56:37] >> Possible. And the reason for that is this: you said a billion, and you didn't say forever. And so, for example, it is not out of the question that a Claw was able to create a web service, some interesting little app, that all of a sudden, you know, a few billion people used, and then it went out of business again shortly after. Now, we saw a whole bunch of those types of companies during the internet era, and most of those websites were not anything more sophisticated than what OpenClaw could generate today.
[01:57:20] >> Interesting. Achieve virality, and monetize that virality.
[01:57:24] >> Yeah. It's just that I don't know what it is. But I couldn't have predicted any of those companies at the time either, you know.
[01:57:30] >> You're going to get a lot of people excited with that statement.
[01:57:32] >> Yeah. It's like, "What do you mean? I can just launch an agent and make a lot of money?" Well, by the way, it's happening right now, right? You know that when you go to China, you're going to see a whole bunch of people getting their Claws to try to go out and look for jobs and, you know, do work, make money. And I wouldn't be surprised if some social thing happened, or somebody created a digital influencer, super, super cute, or some social application that, you know, feeds your little Tamagotchi or something like that, and it became, out of the blue, an instant success. A lot of people use it for a couple of months, and it kind of dies away. Now, the odds of, you know, 100,000 of those agents building Nvidia: 0%.
[01:58:28] building Nvidia 0%. And and then and then the the one part
[01:58:30] And and then and then the the one part that I will I will do um and I and I I
[01:58:34] that I will I will do um and I and I I want to make sure we all do is to
[01:58:36] want to make sure we all do is to recognize that people are really worried
[01:58:38] recognize that people are really worried about their jobs
[01:58:40] about their jobs and and um I just want to remind them
[01:58:43] and and um I just want to remind them that the purpose of your job and the
[01:58:47] that the purpose of your job and the tasks and the tools that you use to do
[01:58:50] tasks and the tools that you use to do your job are related, not the same. I've
[01:58:53] your job are related, not the same. I've been doing my job for 33 years. I'm the
[01:58:55] been doing my job for 33 years. I'm the longest running tech CEO in the world.
[01:58:57] longest running tech CEO in the world. 34 years and the tools that I've used to
[01:59:00] 34 years and the tools that I've used to do my job has changed
[01:59:03] do my job has changed continuously in the last 34 years and
[01:59:06] continuously in the last 34 years and sometimes quite dramatically you know
[01:59:09] sometimes quite dramatically you know over the course of a couple two three
[01:59:10] over the course of a couple two three years and and the the the one story that
[01:59:13] years and and the the the one story that I I I really want to make sure that
[01:59:14] I I I really want to make sure that everybody hears is the story the the
[01:59:18] everybody hears is the story the the first job that every that computer
[01:59:20] first job that every that computer scientists said AI researchers said was
[01:59:22] scientists said AI researchers said was going to go away was radiology
[01:59:25] going to go away was radiology because computer vision was going to
[01:59:26] because computer vision was going to achieve superhuman levels and it did. CV
[01:59:31] achieve superhuman levels and it did. CV computer vision was superhuman in 2019
[01:59:35] computer vision was superhuman in 2019 20 maybe maybe a little bit later 2020.
[01:59:38] 20 maybe maybe a little bit later 2020. >> Mhm.
[01:59:39] >> Okay. And so it's been a long time since computer vision has been superhuman. And so the prediction was that radiologists would go away, because studying radiology scans was a thing of the past; AI would do that. Well, they were absolutely right: computer vision is completely superhuman. Every radiology platform and package today is driven by AI. And yet the number of radiologists grew. And so the question is, why? We now have a shortage of radiologists in the world. And so, one, the alarmist warning went too far and has scared people away from a profession that is so important to society. And so it did harm. Now, why was it wrong? The reason is that the purpose of a radiologist is to diagnose disease, and to help patients and doctors diagnose disease. And because we're able to study scans so much faster now, you can study more scans. You can diagnose better. You can get to patients faster. We can see more people. The hospitals are making more money. You have more patients in the hospital. You need more radiologists. I mean, the amazing thing is, it's so obvious this was going to happen. The number of software engineers at NVIDIA is going to grow, not decline. And the reason for that is because the purpose of a software engineer and the task of a software engineer, coding, are related, not the same. I wanted my software engineers to solve problems. I didn't care how many lines of code they wrote. The purpose of their job didn't change: solving problems, working as a team, diagnosing problems, evaluating the result, looking for new problems to solve, innovation, connecting dots. None of that stuff is going to go away.
[02:01:39] >> So, you think it's possible that, let's even take coding, you think the number of programmers in the world might increase, not decrease?
[02:01:47] >> And the reason for that is this: what is the definition of coding? I believe the definition of coding as of today is simply writing a specification, and maybe, if you want to be rather directive, you could even give it an architecture of the software that you wanted to write. So the question is, how many people could do that? Describe a specification for a computer, telling the computer what to go build. How many people? I think we just went from 30 million to probably 1 billion. And so every carpenter in the future will be a coder. Except a carpenter with AI is also an architect. They just increased the value that they could deliver to the customer. Their artistry just elevated tremendously. I believe that every accountant is, you know, also your financial analyst, also your financial adviser. So all of these professions have just been elevated, and if I were a carpenter and I see AI, I would just completely go berserk, you know, with the services I can bring to my clients. If I were a plumber, completely go berserk.
[02:03:03] >> And the people who are currently programmers and software engineers, I think they're at the cutting edge of understanding intuitively how to communicate with the agents using natural language in order to design the best kind of software.
[02:03:20] >> That's right.
[02:03:22] >> So over time they'll converge, but I think there's still value in learning how to program: learning what programming languages are, the old kind of programming, what good practices for programming languages are, what the design principles are for programming languages and for large software systems.
[02:03:43] >> And the reason for that, Lex, and I just say this for the audience: the goal of specification, the artistry of specification, is going to depend on what problem you're trying to solve. When I'm thinking about giving the company strategies and formulating corporate directions and things that we should do, I describe it at a level that is sufficiently specific that people generally understand the direction and it's actionable, specific enough that they can take action on it, but I underspecify it on purpose, to enable 43,000 amazing people to make it even better than I imagined. And so when I'm working with engineers, when I'm working with people, I think about what problem I am trying to solve and who I am working with, and the level of specification, the level of architecture definition, relates to that. And so everybody's going to have to learn where in the spectrum of coding they want to be. Writing a specification is coding. And so you might decide to be quite prescriptive, because there's a very specific outcome you're looking for. You might decide that this is an area where you want to be much more exploratory, and so you might underspecify and enable yourself to go back and forth with the AI, to even push your own boundaries of creativity. And so this artistry of where you are in the spectrum, this is the future of coding.
[02:05:31] >> But just to linger on it: outside of coding, I think a lot of people, rightfully so, are worried about their jobs, have a lot of anxiety about their jobs, especially in the white-collar sector. I don't think any of us know what to do with the tumultuous times that always come when automation and new technology arrive. First of all, I think we all need to have compassion, and the responsibility to feel the burden of what the actual suffering feels like for the individual people and families that lose their jobs. Whenever you have transformative technology, like what's coming with artificial intelligence, there's going to be a lot of pain, and I don't know what to do about that pain. Hopefully it creates many more opportunities for those same people, for the same kind of job, as the tooling evolves and makes them more productive and makes the work more fun, hopefully, as it does in programming. I've been having so much fun programming, I have to say; I've never had this much fun. So hopefully it automates the boring parts of the job and makes the creative parts the ones that the human beings are responsible for. But still, there's going to be a lot of pain and suffering. So, my first recommendation, and this is how I deal with anxiety; in fact, we just talked about it earlier.
[02:06:58] >> Mhm.
[02:06:59] >> Enormous anxiety about the future, enormous anxiety about the pressure, enormous anxiety about uncertainty. I first break it down, and then I tell myself: okay, there are some things you can do something about, and there are some things you can't do anything about. But for the stuff that you can do something about, let's reason about it and let's go do it.
[02:07:20] >> If we were to hire a new college graduate today, and I have a choice between two, one that has no clue what AI is and one that is an expert in using AI, I would hire the one who's an expert in using AI. If I had an accountant, a marketing person: the one that is expert in using AI. Supply chain, customer service, a salesperson, business development, a lawyer: I would hire the one who is an expert in using AI. And so I would advise every college student, and every teacher should encourage their students, to go use AI. Every college student should graduate an expert in AI. And everybody: if you're a carpenter, if you're an electrician, go use AI. Go see what it can do to transform your current job. Elevate yourself. If I were a farmer, I would absolutely use AI. If I were a pharmacist, I would use AI. I want to see what it could do to elevate my job, so that I could be the innovator who revolutionizes this industry myself.
[02:08:36] >> And so that would be the first thing that I would do. And then I would also help them. It is the case that the technology will dislocate and will eliminate many tasks, because it will automate them. If your job is the task, then you're very likely going to be disrupted. If your job's purpose includes certain tasks...
[02:09:07] >> Mhm.
[02:09:08] >> Then it's vital that you go learn how to use AI to automate those tasks. And then there's a whole spectrum in between.
[02:09:14] >> And by the way, the beautiful thing about AI, the chatbot versions: you have anxiety, and you can break down the problem by talking to it. It's really just incredible how much you can think through your life's problems, and I don't mean therapy problems; I mean, very practically: okay, literally, I'm worried about my job. What are the skills? What are the steps I need to take? How do I get better at AI? Everything you just said, you can literally ask, and it's going to give you a point-by-point answer. I mean, it's just a great life coach, period.
[02:09:52] >> "I don't know how to use AI." And the AI goes, "Well, let me show you."
[02:09:54] >> Exactly. It's very meta, but it's kind of incredible. So, people definitely should.
[02:10:00] >> You can't walk up to Excel and say, "I don't know how to use Excel."
[02:10:02] >> You're done. I mean, that's really what AI has done for me in all walks of life: that initial friction of being a beginner, of using a thing for the first time. I can literally ask about any single thing.
[02:10:14] >> What are the first steps I need to take?
[02:10:16] >> That's right.
[02:10:17] >> And that handholding that it does, removing the friction of all the experiences that the world offers. Like I mentioned to you offline, I'm going to China and Taiwan.
[02:10:30] >> So awesome for you.
[02:10:32] >> Where do I go? How do I...? All those questions, immediately answered, and it's beautiful.
[02:10:37] >> Well, when you go to Taiwan, just ask AI: what are Jensen's favorite restaurants in Taiwan?
[02:10:44] >> Yeah.
[02:10:44] >> And it'll actually... Oh, yeah. Yeah.
[02:10:46] >> Is it accurate? Okay.
[02:10:46] >> Yeah. Yeah. All right.
[02:10:47] >> It's all over Taiwan.
[02:10:50] >> Well, you're a rock star over there, and like we also mentioned offline, maybe our paths will cross, which would be really wonderful, at Computex, GTC Taiwan.
[02:11:01] Computex GTC Taiwan. Uh do you think there are some things
[02:11:04] Uh do you think there are some things about human nature about human
[02:11:06] about human nature about human consciousness
[02:11:08] consciousness that is
[02:11:10] that is fundamentally non-computational
[02:11:12] fundamentally non-computational maybe something a chip no matter how
[02:11:14] maybe something a chip no matter how powerful uh can never replicate? I don't
[02:11:18] powerful uh can never replicate? I don't know if the chip will ever get nervous
[02:11:20] know if the chip will ever get nervous and that's the you know of course the
[02:11:22] and that's the you know of course the conditions by which uh that causes
[02:11:26] conditions by which uh that causes anxiety or nervousness or whatever
[02:11:28] anxiety or nervousness or whatever emotion um I believe that AI will be
[02:11:34] emotion um I believe that AI will be able to recognize those and understand
[02:11:37] able to recognize those and understand those. I don't think my chips will feel
[02:11:40] those. I don't think my chips will feel those and therefore the how how that
[02:11:44] those and therefore the how how that anxiety, how that feeling, how that
[02:11:46] anxiety, how that feeling, how that excitement, how that how that you know
[02:11:50] excitement, how that how that you know all of those feelings manifest in human
[02:11:53] all of those feelings manifest in human performance for example extremely
[02:11:56] performance for example extremely amazing human performance, athletic
[02:11:57] amazing human performance, athletic performance, you know, average or lesser
[02:12:00] performance, you know, average or lesser than average. um that that entire
[02:12:02] than average. um that that entire spectrum of human performance that comes
[02:12:05] spectrum of human performance that comes out of exactly the same circumstances
[02:12:08] out of exactly the same circumstances for different people manifesting in
[02:12:11] for different people manifesting in different outcome
[02:12:13] different outcome manifesting in different performance. I
[02:12:16] manifesting in different performance. I I don't think there's anything about
[02:12:19] I don't think there's anything about anything that we're building that would
[02:12:20] anything that we're building that would suggest that two different computers
[02:12:24] suggest that two different computers being presented with all of exactly the
[02:12:26] being presented with all of exactly the same context would per of course it
[02:12:29] same context would per of course it would produce statistically different
[02:12:31] would produce statistically different outcomes but it's not because it felt
[02:12:33] outcomes but it's not because it felt different.
[02:12:34] different. >> Yeah. The subjective
[02:12:36] >> Yeah. The subjective boy there's something truly special
[02:12:38] boy there's something truly special about the subjective experience
[02:12:41] about the subjective experience that we humans feel. Like I mentioned to
[02:12:43] that we humans feel. Like I mentioned to you, I was I was I was pretty nervous
[02:12:46] you, I was I was I was pretty nervous talking to you like I mentioned to you
[02:12:48] talking to you like I mentioned to you that the hope the fear the anxiety and
[02:12:51] that the hope the fear the anxiety and just life itself the richness of life
[02:12:54] just life itself the richness of life how amazing everything is how deeply we
[02:12:56] how amazing everything is how deeply we fall in love how deeply our hearts get
[02:12:58] fall in love how deeply our hearts get broken how afraid we are of death and
[02:13:01] broken how afraid we are of death and how much pain we feel when our loved
[02:13:03] how much pain we feel when our loved ones pass away all of that the whole
[02:13:06] ones pass away all of that the whole thing I don't it's very hard to think AI
[02:13:10] thing I don't it's very hard to think AI being able to a computational device
[02:13:12] being able to a computational device being able to do that but there's so
[02:13:14] being able to do that but there's so many mysteries about this whole thing
[02:13:16] many mysteries about this whole thing that we're yet to uncover that I am open
[02:13:19] that we're yet to uncover that I am open to be surprised.
[02:13:21] to be surprised. >> I've been surprised a lot over the past
[02:13:23] >> I've been surprised a lot over the past >> few months and few years. Scaling can
[02:13:26] >> few months and few years. Scaling can create some incredible miracles in the
[02:13:28] create some incredible miracles in the space of intelligence
[02:13:31] space of intelligence >> has been truly marvelous to watch. So
[02:13:32] >> has been truly marvelous to watch. So I'm open to surprise
[02:13:34] >> And it's just really important to break down what intelligence is, that word we use all the time. It's not a mysterious word. Intelligence has a meaning, you know.
[02:13:46] >> And it's a system that... it's something that we do that includes perception and understanding and reasoning and the ability to plan. And that loop, that loop is fundamentally what intelligence is. Intelligence is not one word that is exactly equal to humanity. And I think it's really important to separate the two. We have two words for that. I don't over-fantasize and I don't over-romanticize about intelligence. People have heard me say it before: I actually think intelligence is a commodity.
[02:14:29] I'm surrounded by intelligent people, people more intelligent than I am in each one of the spaces that they're in. And yet I have a role in that circle. It's actually kind of interesting. They're more educated than I am. They went to better schools than I did. They're deeper than I am in the field that they're in. All of them. I have 60 of them. They're all superhuman to me.
[02:14:59] >> And somehow I'm sitting in the middle orchestrating all 60 of them. And so you've got to ask yourself, what is it about a dishwasher that allows that dishwasher to sit in the middle of superhumans? Does that make sense?
[02:15:15] >> And so, but that's my point. My point is intelligence is a functional thing. Humanity is not specified functionally. It's a much, much bigger word. And our life experience, our tolerance for pain, our determination, those are different words than intelligence. And so the thing that I want to help the audience understand, if I could give them one thing, is that intelligence is a word that we've elevated to a very high form over time. The word we should really elevate is humanity: character, compassion, generosity, all of the things that you said just now.
[02:16:03] >> I believe those are superhuman powers, and that now intelligence is going to be commoditized. Because we've spoken about it: the most important thing is your education. Now, even when they said the most important thing is your education, when you went to school, there's more than just knowledge that you gained.
[02:16:22] >> And so, but unfortunately, our society has put everything into one single word. And life is more than one word. And I'm just telling you, my life would suggest that being lower on the intelligence curve than everybody around me doesn't change the fact that I'm the most successful. And so I'm trying, hopefully, to inspire everybody else: don't let this democratization of intelligence, this commoditization of intelligence, you know, cause you anxiety. You should be inspired by that.
[02:17:00] >> Yeah, I think AI will help us celebrate humans more. And I'm certainly humanity- and human-first. I think what makes this world incredible is humans, and forever it will be so. And AI is just this incredible tool that makes us...
[02:17:18] >> That's exactly right.
[02:17:19] >> ...humans more powerful.
[02:17:20] >> That's exactly right.
[02:17:21] >> Uh, so much of the success of Nvidia, and the lives of the millions of people that I mentioned, depend on you. But you're just one human, like we mentioned, mortal like all of us. Do you think about your mortality? Are you afraid of death?
[02:17:42] afraid of death? >> I really don't want to die. Um, I have a
[02:17:45] >> I really don't want to die. Um, I have a great life. I have a great family.
[02:17:48] great life. I have a great family. I have really important work.
[02:17:53] I have really important work. Uh,
[02:17:54] Uh, this is this is not a once in a once in
[02:17:58] this is this is not a once in a once in a lifetime experience suggests that it
[02:18:02] a lifetime experience suggests that it has been experienced by many people just
[02:18:05] has been experienced by many people just not one person. Uh this is a once in a
[02:18:08] not one person. Uh this is a once in a humanity experience what I'm going
[02:18:10] humanity experience what I'm going through. Uh Nvidia is one of the most
[02:18:13] through. Uh Nvidia is one of the most consequential technology companies in
[02:18:14] consequential technology companies in history. We're doing very important
[02:18:16] history. We're doing very important work. I take it very seriously.
[02:18:19] work. I take it very seriously. Um
[02:18:20] Um, and so some of the things that, of course, are practical things, like how do we think about succession planning. I'm famous for saying that I don't believe in succession planning. And the reason for that isn't because I'm immortal. The reason is because, if you're worried about succession planning, all that anxiety of succession planning, then what should you do about it? You break it all the way back down. The most important thing you should do today, if you care about the future of your company post you, is to pass on knowledge, information, insight, skills, experience, as often and continuously as you can. Which is the reason why I continuously reason about everything in front of my team. Every single meeting is a reasoning meeting. Every moment I spend inside the company, outside the company, is about passing on knowledge to people as fast as I can. Nothing I learn ever sits on my desk longer than, you know, a fraction of a second. I'm passing that information, that knowledge on: oh my gosh, this is cool. Before I even finish learning all of it myself, I'm already pointing it to somebody else: get on this, this is so cool, you're going to want to learn this. And so I'm constantly passing knowledge, empowering people, elevating the capability of everybody around me, so that the outcome that I seek, that I hope for, is that I die on the job, you know. And hopefully I die on the job instantaneously, and there's no long periods of suffering, you know.
[02:20:04] >> Well, from a fan
[02:20:07] suffering, you know. Well, from a fan perspective, given your your uh
[02:20:10] perspective, given your your uh extremely
[02:20:13] extremely um your enormous positive impact on on
[02:20:16] um your enormous positive impact on on civilization, of course, I hope you keep
[02:20:18] civilization, of course, I hope you keep going, but also it's just fun to watch
[02:20:20] going, but also it's just fun to watch what is doing. You're, you know, it's
[02:20:23] what is doing. You're, you know, it's just the rate of innovation and I'm a
[02:20:25] just the rate of innovation and I'm a huge fan of engineering. It's so much
[02:20:27] huge fan of engineering. It's so much incredible engineering is continuously
[02:20:29] incredible engineering is continuously being done by Nvidia. It's just fun to
[02:20:31] being done by Nvidia. It's just fun to watch. It's a celebration of humanity.
[02:20:33] watch. It's a celebration of humanity. is a celebration of great builders, a
[02:20:35] is a celebration of great builders, a celebration of great engineering. So it
[02:20:37] celebration of great engineering. So it represents something special. Uh so I
[02:20:40] represents something special. Uh so I hope uh you and Nvidia keep going. What
[02:20:42] hope uh you and Nvidia keep going. What gives you hope about this whole thing we
[02:20:45] gives you hope about this whole thing we got going on about humanity? About the
[02:20:47] got going on about humanity? About the future of humanity when you look out and
[02:20:49] future of humanity when you look out and you think about the future quite a bit
[02:20:51] you think about the future quite a bit when you look out 10, 20, 50, 100 years
[02:20:53] when you look out 10, 20, 50, 100 years from now, what gives you hope? I I've
[02:20:56] from now, what gives you hope? I I've always had I've always had uh uh great
[02:20:59] always had I've always had uh uh great confidence in in the in the kindness
[02:21:06] confidence in in the in the kindness uh the generosity
[02:21:09] uh the generosity uh
[02:21:10] uh um the compassion, the human capacity.
[02:21:14] um the compassion, the human capacity. I've always been extremely confident of
[02:21:18] I've always been extremely confident of that. sometimes um
[02:21:23] more so than I should and and I I get
[02:21:26] more so than I should and and I I get taken advantage of. But it doesn't it
[02:21:28] taken advantage of. But it doesn't it doesn't ever cause me not to. I start
[02:21:32] doesn't ever cause me not to. I start with always
[02:21:34] with always uh that that people want want to do
[02:21:36] uh that that people want want to do good. People want to um uh help others
[02:21:41] good. People want to um uh help others and
[02:21:43] and uh vastly I am proven right,
[02:21:48] uh vastly I am proven right, constantly proven right and and often
[02:21:53] constantly proven right and and often uh exceeds my expectations
[02:21:56] uh exceeds my expectations and and so I have complete confidence in
[02:21:59] and and so I have complete confidence in the human capacity.
[02:22:01] I think the things that give me incredible hope are what I see now is possible, and, as I extrapolate based on the things that we're doing, what will very likely happen.
[02:22:22] >> And there are so many things that we want to solve, so many problems we want to solve, so many things that we want to build. There are so many good things that we want to do that are now within our reach, and within the reach of my lifetime. You just can't possibly not be romantic about that. You know what I'm saying?
[02:22:46] >> Yeah. What an exciting time to be alive.
[02:22:48] >> Yeah.
[02:22:48] >> Like truly, truly.
[02:22:50] >> So how can you not be romantic about that? The fact that it's a reasonable thing to expect the end of disease. It's a reasonable thing to expect that pollution will be drastically reduced. It's a reasonable thing to expect that traveling at the speed of light is actually in our future, you know, not for long distances, but short distances. You know, people ask me how... well, first of all, very soon I'm going to put a humanoid on a spaceship, my humanoid, and we're going to send it out as soon as possible, and it's going to keep improving and enhancing along the flight.
[02:23:36] >> And then, when it's time, all of my consciousness has already been... you know, so much of my life has been uploaded to the internet. Take all my inbox, take everything that I've done, everything I've said; it's been collected and is becoming my AI. And, um, when the time comes, we just send that at the speed of light to catch up with my robot.
[02:23:59] >> Oh, that's brilliant. I mean, for me, that's sort of the application-focused perspective, but also, from the curiosity-maxing perspective, there are just all of those mysteries. There are so many fascinating scientific questions there. Understanding the biological machine is right around the corner. It's not 10 years; it's 5 years, probably.
[02:24:20] >> And then your biological machine, the human mind, and cracking theoretical physics open. It's so exciting.
[02:24:26] >> Explaining consciousness, that one would be awesome.
[02:24:29] >> And it's all within our reach.
[02:24:30] >> Yeah.
[02:24:31] >> Uh, Jensen, thank you so much for everything you've done over the years. Thank you for everything you're doing for the world. Thank you for being who you are. I can tell you're a great human being, and I wish you incredible success this year. As a fan, I can't wait to see what you do next, and hopefully I'll see you in Taiwan. Thank you so much for talking today.
[02:24:52] >> Thank you, Lex. I had a great time. And also, if I could just say one more thing: thank you for all the interviews that you do, the depth, the respect that you go through them with, and the research that you do to reveal, you know, for all of us, the amazing people that you've interviewed over the years. I've enjoyed them immensely. And, as an innovator, to have created this long form is unbelievable, and yet, you know, it's just captivating. So anyways, thank you for everything you do.
[02:25:26] >> It means the world. Thank you, Jensen.
[02:25:27] >> Thank you, Lex.
[02:25:30] Thank you for listening to this conversation with Jensen Huang. To support this podcast, please check out our sponsors in the description, where you can also find links to contact me, ask questions, give feedback, and so on. And now, let me leave you with some words from Alan Kay: the best way to predict the future is to invent it. Thank you for listening, and hope to see you next time.