In a recent thematic investment report, Barclays analysts discussed the energy demand that is set to accompany the rise of AI technologies, with a particular focus on NVIDIA (NASDAQ:NVDA)’s role in this landscape.
According to analysts, the expected power needs associated with advances in artificial intelligence highlight a crucial aspect of NVIDIA’s market outlook.
Barclays’ analysis suggests that data centers could consume more than 9% of current U.S. electricity demand by 2030, driven largely by the power demands of AI. The analysts noted that “AI embedded in the NVIDIA consensus” is a key factor behind this high power forecast.
The report also notes that while AI efficiency continues to improve with each new generation of GPUs, the size and complexity of AI models are growing at a rapid pace. For example, the size of major large language models (LLMs) has been increasing by about 3.5 times per year.
Despite these improvements, overall power demand is expected to rise as AI applications expand. Each new generation of GPUs, such as NVIDIA’s Hopper and Blackwell series, is more power-efficient than the last, but larger and more complex AI models require significantly more computational power.
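To see why efficiency gains alone are not expected to offset this growth, consider a purely illustrative sketch: the 3.5x annual model-size growth is taken from the report, while the performance-per-watt improvement below is a hypothetical placeholder (not a Barclays figure), and required compute is assumed to scale roughly with model size.

```python
# Purely illustrative: model sizes growing ~3.5x per year (per the report)
# versus a HYPOTHETICAL 2x-per-year gain in performance per watt.
# Assumes required compute (and hence power) scales roughly with model size.

MODEL_GROWTH_PER_YEAR = 3.5       # LLM size growth cited in the report
EFFICIENCY_GAIN_PER_YEAR = 2.0    # hypothetical perf/W improvement (assumption)

relative_power = 1.0
for year in range(1, 4):
    relative_power *= MODEL_GROWTH_PER_YEAR / EFFICIENCY_GAIN_PER_YEAR
    print(f"Year {year}: relative power demand ~{relative_power:.1f}x")

# Even with efficiency doubling every year, net power demand still grows
# by roughly 1.75x annually under these assumptions.
```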
“Large language models require massive computational power to achieve real-time performance,” the report says. “The computational demands of large language models also translate into higher power consumption as more and more memory, accelerators, and servers are needed to scale, train, and infer from these models.”
“Organizations aiming to deploy real-time reasoning LLMs will have to address these challenges,” Barclays added.
To illustrate the magnitude of this power demand, Barclays projects that running nearly 8 million GPUs would require about 14.5 gigawatts of power, equivalent to about 110 terawatt-hours of energy. This projection assumes an average load factor of 85%.
About 70% of these GPUs are expected to be deployed in the US by the end of 2027, which equates to more than 10 gigawatts and 75 terawatt-hours of AI power and energy demand in the US alone over the next three years.
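As a rough check on that arithmetic, the sketch below reproduces the figures from Barclays’ stated inputs; treating the 110 terawatt-hours as an annual total is an assumption made here for the conversion.

```python
# Back-of-the-envelope check of the figures cited above.
# Inputs (14.5 GW, 85% load factor, 70% US share) come from the report;
# treating the energy total as annual is an assumption for this conversion.

PEAK_POWER_GW = 14.5       # projected draw of ~8 million GPUs
LOAD_FACTOR = 0.85         # average load factor assumed by Barclays
HOURS_PER_YEAR = 8_760     # 24 hours x 365 days

# GW * hours = GWh; divide by 1,000 to get TWh.
annual_energy_twh = PEAK_POWER_GW * LOAD_FACTOR * HOURS_PER_YEAR / 1_000

US_SHARE = 0.70            # share of GPUs expected in the US by end of 2027
us_power_gw = PEAK_POWER_GW * US_SHARE
us_energy_twh = annual_energy_twh * US_SHARE

print(f"Global: {PEAK_POWER_GW} GW, ~{annual_energy_twh:.0f} TWh per year")
print(f"US:     ~{us_power_gw:.1f} GW, ~{us_energy_twh:.0f} TWh per year")
# Prints roughly 108 TWh globally and about 10 GW / 76 TWh for the US,
# close to the ~110 TWh, >10 GW and 75 TWh figures cited above.
```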
“Nvidia’s market cap suggests this is just the beginning of AI power demand,” analysts said. The chipmaker’s continued development and deployment of GPUs is expected to drive a significant increase in power consumption across data centers.
Furthermore, data centers’ reliance on grid power underscores the importance of meeting peak power demand: they operate continuously and require a steady, balanced power supply.
The report cites a prominent statement by Sam Altman, CEO of OpenAI, at the World Economic Forum in Davos, where he said: “We need a lot more energy in the world than we thought we needed before… I think we still underestimate the energy requirements of this technology.”