Episode 49

NVIDIA's AI Empire: The $1 Trillion Stack

NVIDIA built a trillion-dollar AI empire spanning five layers. Is this the greatest tech moat ever, or a bubble waiting to pop?

At the World Economic Forum in Davos in January 2026, Jensen Huang — NVIDIA’s leather-jacketed CEO — sat down on the main stage with Larry Fink, CEO of BlackRock (the world’s largest asset manager), and laid out a blueprint for where trillions of dollars are headed over the next decade. Not millions. Not billions. Trillions, plural. And he did it through a deceptively simple metaphor: a five-layer cake.

The framework Huang presented isn’t just corporate storytelling. It’s genuinely the clearest map available for understanding why the AI buildout is happening, where the money is going, and whether any of it makes sense. Whether you’re an investor, a policymaker, a student deciding what to study, or just someone wondering why your electricity bill keeps going up — the five-layer cake is the place to start.

The Five-Layer Cake

Huang broke the AI economy into five interdependent layers, each building on the one below it.

Layer 1: Energy. At the very foundation sits raw power generation. AI systems process and generate intelligence in real time, and that takes enormous amounts of electricity. Before the fancy algorithms, before the chips, before anything — someone has to keep the lights on.

Layer 2: Chips and Computing Infrastructure. This is NVIDIA’s home turf — the GPUs, processors, and silicon that actually perform the mathematical operations underlying AI. The “crunchy part of the cake,” as Fortune put it.

Layer 3: Cloud Infrastructure. The data centers and cloud services that package all that computing power and deliver it on demand. This is where companies like Amazon Web Services, Microsoft Azure, and Google Cloud operate.

Layer 4: AI Models. The systems people actually interact with — ChatGPT, Claude, DeepSeek, Gemini. The intelligence layer, trained on the infrastructure below it.

Layer 5: Applications. Healthcare, finance, manufacturing, robotics, drug discovery — where the economic benefit is ultimately realized. Huang emphasized that this is the layer that transforms the investment in layers 1 through 4 into real-world value.

The critical insight is dependency. You can’t have applications without models. Can’t have models without cloud infrastructure. Can’t have cloud without chips. And can’t have chips without energy. Every single layer must be built and operated. Skip one, and the layers above it collapse.

It’s like Maslow’s hierarchy of needs, but for artificial intelligence.
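
If you want the dependency spelled out, here’s a toy sketch in Python. The layer names follow the article, but the code structure is purely illustrative, not anything Huang presented:

```python
# Toy model of the five-layer stack: a layer only functions if it
# and every layer beneath it are built and running.

LAYERS = ["energy", "chips", "cloud", "models", "applications"]

def operational(built: set[str]) -> list[str]:
    """Return the layers that actually function, bottom-up."""
    running = []
    for layer in LAYERS:
        if layer not in built:
            break  # a missing layer stalls everything above it
        running.append(layer)
    return running

# Skip layer 3 (cloud) and layers 4 and 5 collapse with it:
print(operational({"energy", "chips", "models", "applications"}))
# -> ['energy', 'chips']
```

Remove any one layer from the set and everything above it goes dark, which is the whole point of the cake metaphor.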

The Scale of the Buildout

The numbers at the lower layers are genuinely staggering. TSMC — the Taiwanese company that manufactures most of the world’s advanced chips — announced it’s building twenty new chip fabrication plants. Foxconn, Wistron, and Quanta are constructing thirty new computer manufacturing plants — not to make laptops, but to build the machines that populate AI factories. Micron has committed $200 billion to memory chip production in the United States alone.

Two hundred billion dollars from a single company in a single country. Physically, this means massive fabrication facilities, clean rooms the size of football fields, and thousands of construction workers. Huang specifically highlighted the human element: salaries for tradespeople building these facilities have nearly doubled. Six-figure incomes for plumbers, electricians, steelworkers, and network technicians.

The first wave of jobs created by AI isn’t coders — it’s hard hats.

Huang described what’s happening as “the largest infrastructure buildout in human history.” And the kicker: we’re only a few hundred billion dollars into what will eventually be a multi-trillion-dollar project.

The Big Four hyperscalers — Microsoft, Alphabet, Amazon, and Meta — are projected to spend a combined $500 billion in capital expenditure in 2026 alone. Half a trillion from four companies in one year. Meanwhile, the semiconductor industry is pacing toward $1 trillion in annual revenue by the end of 2026 — a milestone analysts didn’t expect until the 2030s.

The Bubble Test

Every time someone says “trillions of dollars” and “largest in human history,” the bubble radar starts pinging. And the skeptics have real ammunition. Throughout 2025, big names like Jeff Bezos, Goldman Sachs CEO David Solomon, and even Satya Nadella warned about potential AI overcapacity. An MIT study found 95% of generative AI pilots at companies were failing to generate a return on investment.

Ninety-five percent. That’s a terrifying number.

But Huang offered an elegant counter-test. He said: go try to rent an NVIDIA GPU right now. Just try it. The spot price of GPU rentals is going up — not just for the latest-generation chips, but for hardware that’s two generations old. H100 equivalents were sitting around $2.27 per hour in mid-January, with European rates pushing $3 or more.

In a bubble, you’d expect oversupply. Prices would be cratering. Instead, there’s scarcity across the board. Even old GPUs are in demand — the opposite of what a bubble looks like.
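
The economics behind that test are easy to sanity-check with a back-of-envelope payback calculation. The $2.27 per hour spot rate comes from the reporting above; the hardware cost and utilization figures below are round-number assumptions for illustration, not reported data:

```python
# How long does it take a rented-out GPU to pay for itself at spot rates?
HOURS_PER_YEAR = 24 * 365

def payback_years(spot_rate_usd_hr: float,
                  hardware_cost_usd: float,
                  utilization: float = 0.7) -> float:
    """Years of rental income needed to recoup the purchase price."""
    annual_revenue = spot_rate_usd_hr * HOURS_PER_YEAR * utilization
    return hardware_cost_usd / annual_revenue

# Assumed ~$30,000 acquisition cost per H100-class GPU (illustrative).
print(f"{payback_years(2.27, 30_000):.1f} years")  # -> 2.2 years
```

In a genuine glut, spot rates would fall until the payback period stretched past the hardware’s useful life. A roughly two-year payback, under these assumptions, is what scarcity looks like, not oversupply.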

Huang pointed to Eli Lilly as a concrete example: the pharmaceutical giant is actively shifting R&D budget from traditional wet labs to AI supercomputing. That’s not speculative hype. That’s an established company fundamentally restructuring how it does science because AI has become more productive than traditional methods for certain research tasks.

By the end of their conversation, Fink actually flipped the question. He said: “What I’m hearing is, we’re far from an AI bubble. The question is — are we investing enough?”

The BlackRock CEO wondering if the world is under-investing in AI is a significant perspective shift.

Huang backed it up by noting that 2025 was the largest year for global venture capital investment on record — over $100 billion worldwide, with most flowing into what he called “AI-native companies” in healthcare, robotics, manufacturing, and financial services. These aren’t speculative moonshots. They’re firms building real products on top of AI models that are finally mature enough to support them.

Physical Intelligence: AI Beyond Language

Perhaps the most fascinating part of Huang’s presentation was his description of three breakthroughs from 2025 that are reshaping what AI can do.

Breakthrough 1: Reliable Reasoning. AI models went from “novel and interesting” to “reliable and down-to-earth.” The key leap was chain-of-thought reasoning, where models learned to break down complex problems step by step, form research plans, and reason about scenarios they’d never encountered in training data. Instead of just pattern-matching from what they’ve read, models began genuinely thinking through problems. This enabled agentic AI systems — models that don’t just answer questions but can take on tasks, conduct research, and execute multi-step plans.
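
The agentic pattern is simple enough to sketch. Here’s a minimal, hypothetical loop in Python; call_model stands in for any hosted LLM API, and nothing here is a specific vendor’s interface:

```python
def call_model(prompt: str) -> str:
    """Stand-in for a real LLM call (e.g., an HTTP request to a hosted model)."""
    return f"[model output for: {prompt[:40]}...]"

def run_agent(task: str, max_steps: int = 3) -> list[str]:
    """Draft a plan, then execute steps, feeding each result back as context."""
    plan = call_model(f"Break this task into steps: {task}")
    history = [f"PLAN: {plan}"]
    for step in range(1, max_steps + 1):
        # Chain-of-thought style: the model's prior output becomes its next input.
        context = "\n".join(history)
        result = call_model(f"Task: {task}\nSo far:\n{context}\nDo step {step}.")
        history.append(f"STEP {step}: {result}")
    return history

for line in run_agent("survey GPU spot-price trends"):
    print(line)
```

The plumbing is trivial; the leap was models becoming reliable enough that their own intermediate output can safely drive the next step of a plan.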

Breakthrough 2: Open Models. Huang specifically credited DeepSeek with catalyzing this shift. When DeepSeek released a powerful open-source reasoning model, it lowered the barrier for everyone — companies, universities, startups. Suddenly you didn’t need a billion-dollar training budget to build domain-specific AI. Democratization of the model layer meant more people could “bake at layer four.”

Breakthrough 3: Physical Intelligence. This is where it gets genuinely mind-bending. AI is no longer limited to understanding human language. It’s starting to learn the language of nature itself — protein structures, chemical reactions, fluid dynamics, particle physics, quantum mechanics. These natural phenomena follow patterns and rules — a kind of grammar — and AI systems are learning to read and write in that grammar.

The Eli Lilly partnership exemplifies this perfectly. They realized AI has become so good at understanding protein and chemical structures that researchers can essentially “talk” to proteins the way we talk to ChatGPT. What happens if I modify this amino acid sequence? How will this molecule interact with that receptor? The model can simulate millions of interactions before anyone touches a test tube.

This is what Huang means by AI moving from the digital world to the physical world. It’s not just about chatbots anymore. Manufacturing, pharmaceutical R&D, materials science, climate modeling — these fields are being transformed by AI that understands physics, not just language. Robotics is part of it — Huang called it “a once-in-a-generation opportunity” — but physical intelligence is broader: AI that can model and predict real-world phenomena before a single prototype is built.

AI as National Infrastructure

Huang made bold claims about AI as national infrastructure — comparable to roads, electrical grids, or telecommunications networks. His exact words: “AI is infrastructure. You should have AI as part of your infrastructure.” He urged every country to build its own AI capabilities, drawing on local language, culture, and data.

On the workforce question — the fear that AI will destroy jobs — Huang offered radiology as a case study. Over the past decade, AI has diffused into every aspect of radiology: reading scans, flagging anomalies, prioritizing cases. You’d think radiologists would be out of work. Instead, there are more radiologists now than before AI entered the field.

Huang’s logic: the purpose of a radiologist isn’t to stare at scans. The purpose is to diagnose disease and help patients. Studying scans is just a task. When AI handles that task faster, doctors spend more time with patients, see more cases, and hospitals hire more staff.

“Distinguish between the purpose and the task.” That’s a framework that applies to almost every profession.

He made the same argument about nursing. The U.S. faces a shortage of roughly five million nurses, partly because nurses spend nearly half their time on documentation and charting. Companies like Abridge are building AI tools that handle transcription and charting automatically. Hospitals get more productive, outcomes improve, and they hire more nurses — not fewer.
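
The arithmetic behind that productivity claim is worth making explicit. The "nearly half their time" figure is from Huang; the 80% automation rate below is an assumption for illustration:

```python
# Rough capacity math for AI-assisted nursing documentation.
doc_share = 0.5   # fraction of a shift spent charting ("nearly half")
automated = 0.8   # assumed share of charting an AI scribe absorbs (illustrative)

patient_time_before = 1 - doc_share                   # 0.50 of a shift
patient_time_after = 1 - doc_share * (1 - automated)  # 0.90 of a shift
gain = patient_time_after / patient_time_before - 1
print(f"Patient-facing capacity: +{gain:.0%}")        # -> +80%
```

Under those assumptions, each nurse gains most of a second nurse’s worth of patient-facing time, which is why hospitals can get more productive and still hire more, not fewer.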

Huang even joked about it with Fink: if someone watched the two of them doing their jobs, they’d probably think they were typists. Automating typing wouldn’t eliminate their jobs because typing isn’t their purpose.

The Crucial Distinction

The dot-com bubble comparison looms large over any conversation about AI investment. But Huang’s counter-argument highlights a crucial difference. A lot of the dot-com bubble consisted of financial assets built on top of financial assets — paper value with nothing underneath. What Huang describes is concrete. Steel, silicon, copper wire, electricity generation, fabrication plants, construction workers earning six figures.

Even if the application layer stumbles — if 95% of AI pilots continue to fail, if the killer apps take longer than expected — those lower layers have durable value. The chips still compute. The data centers still operate. The energy infrastructure still generates power. The cake might not taste perfect yet, but the oven and the kitchen aren’t going anywhere.

This doesn’t mean the AI buildout is risk-free. The history of infrastructure booms includes plenty of overbuilding — railroad bubbles, telecom overbuild in the 1990s, fiber-optic gluts. But even those busts left behind physical infrastructure that eventually got used. The railroads still run. The fiber still carries data.

The Takeaway

Huang’s five-layer framework is genuinely useful regardless of your relationship to AI. Every conversation about the technology should start with: which layer are we talking about? Energy and chip manufacturing are not the same conversation as model capabilities, which is not the same conversation as applications in healthcare.

The scale is real. The investment is unprecedented. The infrastructure being built is physical and durable. Whether the returns justify the spending — whether this is the greatest industrial transformation since electrification or the biggest misallocation of capital in history — remains an open question.

But here’s the test Huang offered, and it’s worth remembering: go try to rent a GPU. If you can’t get one at a reasonable price, the demand is real. If demand is real, it’s not a bubble.

It’s infrastructure. And we’re only a few hundred billion into a multi-trillion-dollar build.
