Nvidia’s (NVDA) Memory Shortage Could Cost Google (GOOGL) Billions

Nvidia (NVDA) is buying up high-bandwidth memory (HBM) for its AI chips at a pace that is triggering a global shortage and driving up costs for cloud providers across the industry. Alphabet (GOOGL), which also depends on this memory for its AI infrastructure, could take the hardest hit, since it will have to pay more to keep its cloud servers running.


Nvidia’s Memory Demand Surges

According to reports from The Kobeissi Letter on X, HBM, the specialized type of Dynamic Random-Access Memory (DRAM) used in AI hardware, is becoming a critical constraint for the global semiconductor industry. Nvidia is reported to be sourcing next-generation HBM4 memory from Samsung (SSNLF) and SK Hynix (HXSCL) for its upcoming Rubin AI accelerator, which is expected to launch in the second half of 2026.

The Rubin chip requires up to 288GB of HBM4 memory per GPU, far more than the 8–12GB typical of PCs or smartphones. These massive requirements are straining suppliers, all of whom report that HBM production is fully booked through 2026. Industry forecasts project that demand for high-bandwidth memory from AI accelerators such as Nvidia’s GPUs could rise by about 70% in 2026.

The scale of Nvidia’s purchases is also driving sharp price increases for the conventional memory used in PCs and servers, with spot prices for 16GB DDR4 RAM jumping 2,352% year-over-year to $76.90, and 8GB DDR4 rising 1,873% to $28.90. Even Nvidia is feeling the pressure, with CEO Jensen Huang urging memory manufacturers to expand production capacity to keep up with demand from AI chips.

AI Memory Shortage Pressures Google and Others

The global shortage of HBM will affect major AI developers, including Google, Advanced Micro Devices (AMD), and OpenAI, all of which require large amounts of advanced memory to run AI systems. Google provides a clear example, as the company relies on custom Tensor Processing Units (TPUs) to power its AI workloads. However, its chips still depend on the same high-end memory that Nvidia is securing in large volumes, forcing rivals to compete for a limited supply.

DeepMind CEO Demis Hassabis has described the shortage as a “choke point,” highlighting how constrained memory availability can slow the expansion of AI infrastructure across the industry. The supply crunch is also pushing up technology costs more broadly, affecting products such as PCs, gaming consoles, and smartphones. As newer generations of Nvidia’s AI chips are expected to require increasingly larger memory capacity, demand will place growing pressure on the already limited global supply.

Is Nvidia a Strong Buy Now?

According to 40 Wall Street analysts tracked by TipRanks, Nvidia currently holds a “Strong Buy” consensus, with 39 Buy ratings and one Hold, and a 12-month average price target of $272.16. Meanwhile, Google has 32 analyst ratings overall, including 26 Buy and six Hold recommendations. To see how these major AI and semiconductor stocks may perform amid tightening memory supply, investors can compare forecasts using the TipRanks Stock Comparison tool.
