Nvidia CEO Jensen Huang gives AI warning Wall Street can’t ignore

TheStreet

Faizan Farooque

Sun, January 25, 2026 at 2:47 PM EST

Nvidia CEO Jensen Huang didn't deliver the same old "dog and pony show" at Davos. Everyone expected another long talk about chips; instead, he flipped the script.

The charismatic leader of the world's biggest chipmaker took the opportunity to touch upon electricity, construction, bond issuance, and the part of AI that turns infrastructure into economic value.

In a mainstage World Economic Forum conversation with BlackRock CEO Larry Fink, the Nvidia (NVDA) founder and CEO described AI in stark terms.

Nvidia quick facts at a glance

  • Revenue (Q3 FY26): $57.0B (up 22% Q/Q, up 62% Y/Y)

  • Data center revenue (Q3 FY26): $51.2B (up 25% Q/Q, up 66% Y/Y)

  • GAAP gross margin (Q3 FY26): 73.4%

  • Q4 FY26 revenue outlook: $65.0B (±2%)

  • Shareholder returns (first nine months FY26): $37.0B returned; $62.2B remaining under buyback authorization

Huang’s “five-layer cake” with the money trail attached

Huang’s Davos “five-layer cake” is essentially a scoreboard. Let's take a look at the stack and review the numbers connected with each layer.

Layer 1: Energy/power

  • The IT load in U.S. data centers is projected to rise from about 80 GW in 2025 to more than 150 GW in 2028, according to Bloom Energy's industry assessment from January 2026.

  • The same report states that by 2030, around one in five data center campuses will be bigger than a gigawatt, and by 2035, that number will rise to one in three.

Layer 2: Chips + computing infrastructure

  • Nvidia’s data center business is now so large it’s effectively “macro”: $51.2B in a single quarter.

  • Nvidia’s own earnings commentary has been blunt about demand: “Blackwell sales are off the charts…”

Layer 3: Cloud data centers

  • Goldman Sachs Research cites a $527B consensus estimate for 2026 hyperscaler capex, up from $465B earlier in the earnings season, and notes estimates are revised upward repeatedly.

  • Those hyperscalers spent $106B on capex in Q3 alone (AI and non-AI), up 75% year over year, per Goldman’s summary.

Layer 4: Models

Huang's case for adoption is that AI is becoming "default software." He remarked, “AI is super easy to use — it’s the easiest software to use in history.”

Layer 5: Applications

Huang made a key payoff claim: “This layer on top, ultimately, is where economic benefit will happen.”

That last layer is what can transform capex from a "cycle" into something that needs to happen continuously.

The most important AI layer isn’t chips; it’s power

Here is where the Davos talk gets very specific.

A January 2026 data center power report says the industry is crossing a threshold: Power availability is no longer a "planning variable." Rather, it now determines the success or failure of certain markets.

Some eye-popping specifics on AI and power needs:

  • Texas is expected to surpass 40 GW of data center capacity by 2028, representing nearly 30% of total U.S. demand, which is a steep 142% jump in market share versus today.

  • Time-to-power expectations are diverging: Utilities say delivery timelines are about 1.5 to 2 years longer than hyperscalers and colocation providers assume.

  • Grid operators are significantly raising their load projections. ERCOT, for example, lifted its forecast for data center load in 2030 from 29 GW to 77 GW.

  • Onsite generation is going “permanent”: The share of respondents expecting fully onsite-powered campuses by 2030 rose 22% in six months to roughly one-third of data centers.

Huang's thesis, now supported by real numbers: The rest of the stack can scale only as fast as the bottom layer.

Big Tech is even tapping the bond market to keep building AI capacity

This is where the story takes a turn that is only possible in 2026.

As the prices of AI infrastructure rise, big tech companies are borrowing money at a record pace. This is not because they can't afford capital expenditures, but because they want flexibility and, some would say, to protect shareholder returns.

In Q4 2025, IT companies sold $108.7B in bonds (nearly twice as much as in the previous quarter). In early 2026, they sold $15.5B more, The Washington Post reported, citing Moody's Analytics.

That’s an underrated tell. When the bond market becomes part of the AI supply chain, the "buildout" starts to look like an industrial cycle rather than a gadget trend.

What investors should watch next in the AI buildout story

If Huang is right that AI is a five-layer stack, "model excitement" is no longer the main swing factor; physical constraints and spending capacity are.

Here’s the practical watch list:

  • Power bottlenecks: Are time-to-power timetables continually being pushed back (and do they force buildouts into "power-advantaged" areas)?

  • Capex revisions: Do the projections for 2026 spending increase again from the $527B consensus that Goldman cites?

  • Nvidia guidance cadence: The market worries more about the slope of guidance than the headline quarter; Nvidia previously guided $65.0B for the following quarter.

  • Proof of application: Are businesses really making money off of AI at the workflow level, which Huang believes is where "economic benefit will happen"?

Investors can summarize Huang's Davos thesis as follows: AI is not a single trade.

It’s a stack of bottlenecks, and right now, the bottleneck that counts most may be the one Wall Street can't "code" its way past.

Related: Bank of America drops bold call on Roblox, sees massive upside

This story was originally published by TheStreet on Jan 25, 2026, where it first appeared in the Economy section.
