Jensen Huang: AI Is a "Five-Layer Cake" and Still Requires Trillions of Dollars in Investment (Full Text Included)

Nvidia CEO Jensen Huang wrote on the Nvidia blog on Tuesday (the 10th), comparing the development of artificial intelligence (AI) to a "five-layer cake" and stating that trillions of dollars of AI infrastructure remain to be built.

Huang said that AI is one of the most powerful forces shaping the world today. AI runs on real hardware, energy, and economic systems, transforming raw materials into intelligent capability that operates at large scale. Every company will use AI, and every country will develop AI.


AI transformation doesn’t require a PhD in computer science

Huang breaks the AI architecture into five layers, from bottom to top: energy, chips, infrastructure, models, and applications. He states that AI development has only just begun: only hundreds of billions of dollars have been invested so far, and trillions of dollars of infrastructure remain to be built. The workforce needed for this construction is enormous, spanning electricians, plumbers, pipefitters, steelworkers, network technicians, installers, and operators. These are highly skilled, well-paid jobs that are currently in short supply, and "participating in this transformation does not require a PhD in computer science."

Huang also points out that AI is increasing productivity across the knowledge economy. Productivity creates capacity, and capacity drives growth. For example, in radiology, AI can assist in interpreting scans, but the demand for radiologists continues to grow. This is not a contradiction.

Huang said: "The role of radiologists is to care for patients. Interpreting images is just one part of their work. As AI takes on more routine tasks, radiologists can focus on judgment, communication, and care. Hospital productivity rises, allowing hospitals to serve more patients and hire more staff."

Huang’s original text:

AI is a five-layer cake

AI is one of the most powerful forces shaping the world today. It is not just smart applications or individual models; it is essential infrastructure, like electricity and the internet.

AI operates on real hardware, real energy, and real economic systems. It transforms raw materials into intelligent capabilities that can operate at large scale. Every company will use AI, and every country will build AI.

To understand why AI is developing this way, it helps to start from first principles and observe the fundamental changes happening in the computing field.

From pre-recorded software to real-time intelligence

Throughout most of the history of computing, software has been "pre-recorded." Humans write the algorithms, and computers execute them. Data must be carefully structured, stored in tables, and retrieved through precise queries. SQL (Structured Query Language) became indispensable because it is what makes this kind of computing work.
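To make the contrast concrete, here is a minimal sketch of that "pre-recorded" paradigm using Python's built-in `sqlite3` module. The table and data are invented for illustration; the point is that the computer returns only what was explicitly stored and explicitly asked for.

```python
import sqlite3

# Traditional computing as Huang describes it: data lives in rigid,
# pre-defined tables, and a human-written query retrieves exactly
# what was stored in advance.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE scans (id INTEGER, patient TEXT, scan_date TEXT)")
conn.execute("INSERT INTO scans VALUES (1, 'Alice', '2024-01-10')")

# The query must match the schema precisely; the machine executes the
# instruction literally and does not interpret intent or context.
rows = conn.execute(
    "SELECT patient FROM scans WHERE scan_date = '2024-01-10'"
).fetchall()
print(rows)  # [('Alice',)]
```

A generative model, by contrast, would answer a free-form question about the same data by producing a new response on the fly rather than looking one up.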

AI breaks this pattern.

For the first time, we have computers capable of understanding unstructured information. They can recognize images, read text, listen to sounds, and understand their meaning; they can also reason about context and intent. More importantly, they can generate intelligence in real time.

Every response is newly generated; every answer depends on the context you provide. This is no longer software retrieving from pre-stored instructions but software reasoning in real time and generating intelligence on demand.

Because intelligence is generated instantly, the entire underlying computing stack must be redesigned.

AI as infrastructure

From an industry perspective, AI can be viewed as a five-layer stack.

Energy

The bottom layer is energy. Real-time generated intelligence requires real-time power. Each token generated involves electron flow, heat management, and energy conversion into computational capacity. There is no abstraction below this. Energy is the first principle of AI infrastructure and the fundamental constraint on how much intelligence the system can produce.

Chips

Above energy are chips. These processors are designed to efficiently convert energy into computational power at large scale. AI workloads demand massive parallel processing, high-bandwidth memory, and high-speed interconnects. Progress in chip technology determines how quickly AI can scale and how low the cost of intelligence can go.

Infrastructure

Above chips is infrastructure. This includes land, power delivery, cooling, construction, networking, and systems that coordinate tens of thousands of processors into a single machine. These systems are not for storing information but for manufacturing intelligence.

Models

Above infrastructure is the model layer. AI models can understand many types of information, including language, biology, chemistry, physics, finance, medicine, and the real world itself. Language models are just one type. The most transformative advances are happening in protein AI, chemistry AI, physics simulation, robotics, and autonomous systems.

Applications

The top layer is applications, where economic value is created. Examples include drug-discovery platforms, industrial robots, legal copilots, and autonomous vehicles. Autonomous vehicles are AI embodied in machines; humanoid robots are AI embodied in human form. The same stack, different outcomes.

This is the five-layer cake: Energy → Chips → Infrastructure → Models → Applications

Every successful application depends on the layers below, extending down to the power plants that support its operation.

This construction has only just begun. We have invested only hundreds of billions of dollars, but trillions of dollars of infrastructure remain to be built.

Around the world, we are seeing chip factories, computer assembly plants, and AI factories being built on an unprecedented scale. This is becoming the largest infrastructure project in human history.

The workforce needed to support this construction is enormous. AI factories require electricians, plumbers, pipefitters, steelworkers, network technicians, installers, and operators.

These are highly skilled, well-paid jobs, and they are in short supply. Participating in this transformation does not require a PhD in computer science.

At the same time, AI is increasing productivity across the knowledge economy. For example, AI can assist in interpreting scans in radiology, but the demand for radiologists continues to grow. This is not a contradiction.

The role of radiologists is to care for patients. Interpreting images is just one part of their work. As AI takes on more routine tasks, radiologists can focus on judgment, communication, and care. Hospital productivity rises, allowing hospitals to serve more patients and hire more staff.

Productivity creates capacity, and capacity drives growth.

What has changed in the past year?

Over the past year, AI has crossed an important threshold: for the first time, model capabilities are good enough for large-scale application. Reasoning ability has improved, hallucinations have decreased significantly, and grounding has greatly improved. AI-based applications have also begun to generate real economic value for the first time.

In drug discovery, logistics, customer service, software development, and manufacturing, AI applications have demonstrated strong product-market fit. These applications demand high performance from every layer below.

Open-source models play a key role. Many of the world's most widely used models are freely available, and researchers, startups, companies, and even entire countries rely on open-source models to participate in advanced AI development. When open-source models reach the cutting edge, they not only change software but also trigger demand across the entire tech stack.

DeepSeek-R1 is a powerful example. By making strong reasoning models widely available, it accelerates application adoption and increases demand for training compute, infrastructure, chips, and energy.

What does this mean?

When you see AI as essential infrastructure, its implications become clear.

The starting point of AI may be a large language model (LLM) based on the Transformer architecture, but it is much more than that. It’s an industry transformation that is reshaping how energy is produced and used, how factories are built, how work is organized, and how economies grow.

The reason AI factories are being built is that intelligence can now be generated instantly. Chips are being redesigned because efficiency determines how fast intelligence can scale. Energy is at the core because it limits the overall potential for producing intelligence. The acceleration of applications is driven by the fact that the underlying models have crossed a threshold and are now truly practical at large scale.

Each layer reinforces the others.

That is why the scale of construction is so enormous, why it touches so many industries, and why it will not be limited to a single country or sector. Every company will use AI, and every country will build AI. We are still in the very early stages. Much infrastructure remains to be built, many workers need training, and many opportunities are yet to be realized.

But the direction is clear.

AI is becoming the most fundamental infrastructure of the modern world. The choices we make now, the speed at which we build, the scope of our participation, and how responsibly we deploy AI will shape the future of this era.
