Stock Markets March 17, 2026

Nvidia Elevates Inference Opportunity to $1 Trillion as GTC Shifts Focus to Deployment

Company outlines expanded inference strategy and product roadmap as analysts call the $1T figure a baseline for growing demand

By Maya Rios

At its annual GTC developer conference, Nvidia presented an enlarged revenue view for its AI chips driven by a move from model training toward inference and mass deployment. The company raised the potential addressable market to at least $1 trillion through 2027, introduced new processors and system integrations including technology licensed from Groq, and detailed product roles for Rubin and Groq chips in inference workloads. Analysts broadly welcomed the update, describing the $1 trillion figure as a floor rather than a cap and signaling potential upside to consensus estimates.

Key Points

  • Nvidia increased its AI chip revenue opportunity to at least $1 trillion through 2027, up from a prior $500 billion through 2026 for specific chip lines.
  • The company is prioritizing inference computing with new processors, a system incorporating Groq technology, and agentic AI tools like NemoClaw integrated with OpenClaw.
  • Analysts view the $1 trillion figure as a conservative floor, noting the metric covers only Blackwell and Rubin platforms plus networking and may understate total datacenter demand.

Nvidia used its annual GTC developer conference to emphasize a transition in enterprise AI spending: from the heavy compute of model training toward the high-volume, real-time demands of inference and large-scale deployment. The company portrayed that shift as a major growth driver for its chips over the next several years.

At the event, Nvidia’s chief executive described demand for graphics processing units as "skyrocketing," pointing to a roughly millionfold increase in computing requirements over the past two years as inference workloads scale up. That surge underpins the company’s revised revenue view for its AI hardware.

Nvidia told attendees that the revenue opportunity for its artificial intelligence chips could reach at least $1 trillion through 2027. That new projection represents an increase from the prior $500 billion opportunity through 2026 the company had cited for its Blackwell and Rubin families of processors.

As part of the announcements, Nvidia introduced a new central processor and unveiled an AI system that incorporates technology licensed from Groq, the chip start-up whose designs Nvidia acquired rights to for $17 billion in December. The moves are framed as steps to expand Nvidia’s footprint in inference computing, the process of answering queries, an area that now faces more direct competition from central processing units and custom inference chips developed by other firms.

Historically, Nvidia has been dominant in model training, which required the massive parallel compute of GPUs. The company emphasized at GTC that inference is becoming the next major phase for AI infrastructure. "The inference inflection has arrived," the company said during the keynote. "And demand just keeps on going up."

Nvidia provided details on how it expects specific chips to be deployed within inference workflows. The company said its Vera Rubin chips will address the "prefill" step, which transforms user inputs into tokens that feed AI systems, while Groq-derived chips will handle the "decode" stage that generates the model responses.
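For readers unfamiliar with the two stages, the split Nvidia described can be sketched in a few lines of toy Python. This is purely illustrative: the tokenizer and next-token rule below are stand-ins, not Nvidia or Groq software; the point is only that prefill runs once over the whole input, while decode loops once per generated token.

```python
def prefill(prompt: str) -> list[int]:
    """'Prefill' stage: convert the user's input into tokens in one batch pass.
    Nvidia said its Vera Rubin chips would target this stage.
    (Stand-in tokenizer: one token per character.)"""
    return [ord(c) for c in prompt]

def decode(context: list[int], max_new_tokens: int = 5) -> list[int]:
    """'Decode' stage: generate the response one token at a time, each new
    token depending on everything produced so far. Nvidia said Groq-derived
    chips would target this stage. (Stand-in next-token rule.)"""
    out = list(context)
    for _ in range(max_new_tokens):
        out.append((out[-1] + 1) % 128)  # hypothetical "model" step
    return out[len(context):]

tokens = prefill("hi")   # one pass over the full input
reply = decode(tokens)   # one loop iteration per output token
print(len(tokens), len(reply))
```

The asymmetry is why the stages can favor different hardware: prefill is a single large parallel pass, while decode is a long sequence of small dependent steps.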

Looking beyond Rubin, Nvidia referenced a Feynman roadmap expected in 2028 following Rubin Ultra, with limited public detail other than that future generations will include both AI and networking chips. The company also announced initiatives aimed at autonomous AI agents. NemoClaw, which integrates with the OpenClaw platform, was presented as a tool to add privacy and safety controls to agentic systems capable of executing tasks with minimal human direction.

Nvidia’s shares briefly ticked higher after the conference before trimming gains to finish the day up 1.65%.


Analysts react to GTC and the $1 trillion outlook

Wall Street research teams parsed the $1 trillion projection as a notable signal about demand trajectory and product positioning. Several of the firms that commented framed the disclosure as a conservative baseline that leaves room for upside.

Wolfe Research characterized the updated disclosure as an increase to the prior $500 billion estimate and said the new figure "suggests upside to CY27 revenue, and the company noted that demand was still growing." Wolfe added: "We consider this revenue disclosure to be ambiguous enough so as to not reflect firm guidance, yet still provides significant room for upside vs. consensus. As such, we consider this revenue level to be a floor, not a ceiling."

Bernstein noted that the $1 trillion number, like the earlier $500 billion figure, represents a snapshot with several quarters remaining before CY27 ends. The firm said: "More importantly, Colette confirmed to us that the number includes ONLY Blackwell and Rubin (and associated networking); it does NOT include any other products (such as Groq LPUs, CPX, CPU racks etc). Hence, we suspect datacenter will come in well above this $1T target, and well above expectations." Bernstein added that Nvidia’s roadmap looks strong and that new offerings should help secure its inference position as it already dominates training.

Goldman Sachs said Nvidia provided visibility into a strong 2027 growth outlook consistent with its estimates and well above the Street. The firm indicated that Nvidia’s introduction of Groq’s LPX rack reinforces the company’s commitment to inference, a critical and increasingly competitive segment within AI infrastructure.

Morgan Stanley emphasized cost-per-token leadership for Nvidia-based inference, which the firm expects to improve with Rubin. Its checks led it to the view that Nvidia’s market share will be more stable than some expect and that AI spending strength should persist. Morgan Stanley continues to rate Nvidia as its Top Pick in semiconductors.

Stifel highlighted the headline disclosure of $1 trillion in cumulative purchase order visibility for the Grace Blackwell and Vera Rubin platforms through CY2027, interpreting the figure as confirmation of accelerating demand and continued "AI Factory" build-out. The firm pointed to strategic elements including the unbundling of CPU and networking stacks, integration of Groq LPUs to capture inference edges, and the launch of OpenClaw/NemoClaw, which it described as an "HTTPS moment" for Agentic AI.


Implications for the market

Taken together, Nvidia’s announcements and the analyst reactions suggest that investors and customers should expect an expanding market for inference-focused hardware and systems. The company’s positioning across GPUs, licensed Groq technology, CPUs and networking could influence demand patterns across datacenter equipment and cloud infrastructure.

Analysts cautioned that the $1 trillion figure is narrowly defined to include specific platforms and networking and does not incorporate other product families. Several research teams interpreted that omission as a source of potential upside for datacenter spending beyond the $1 trillion baseline.

The market response was positive but measured, reflecting both enthusiasm about the extended opportunity and the recognition that the disclosed figure is intended as a conservative starting point rather than definitive guidance.


Summary of key takeaways

  • Nvidia raised its addressable revenue outlook for AI chips to at least $1 trillion through 2027, up from a previously cited $500 billion through 2026 for specific chip families.
  • The company signaled a strategic push into inference computing with new processors, an AI system integrating Groq technology, and platform work around agentic AI via NemoClaw and OpenClaw.
  • Analysts largely view the $1 trillion disclosure as a conservative floor that leaves room for upside, while noting it covers only certain products and associated networking.

Risks and uncertainties

  • Scope of the $1 trillion figure: the disclosure explicitly includes Blackwell and Rubin platforms plus networking but excludes other Nvidia products, which leaves ambiguity about total datacenter revenue outcomes.
  • Competition in inference: custom CPUs and alternative inference chips from other firms present a growing competitive challenge in the market Nvidia seeks to expand into.
  • Market reaction variability: initial share gains following the announcements were pared back, underscoring that investor responses may stay mixed until the disclosed visibility is converted into firm guidance.

