Stock Markets February 17, 2026

Nvidia inks multiyear supply pact to provide Meta with millions of AI chips and CPUs

Agreement covers current and upcoming Blackwell and Rubin AI accelerators plus standalone Grace and Vera data-center CPUs

By Maya Rios

Nvidia has agreed to a multiyear arrangement to supply Meta Platforms with millions of its existing and next-generation artificial intelligence accelerators and central processors. The deal, whose financial terms were not disclosed, covers Nvidia's Blackwell and upcoming Rubin AI chips as well as standalone installations of its Arm-based Grace and Vera CPUs - chip lines Nvidia has been deploying since 2023 as companions to its AI accelerators. The move underscores Nvidia's push to broaden the role of its CPUs into database and AI-agent workloads, even as Meta continues developing its own silicon and holds discussions with Google about using TPUs.

Key Points

  • Nvidia signed a multiyear agreement to sell Meta millions of AI chips and standalone CPUs, covering Blackwell and upcoming Rubin accelerators as well as Grace and Vera central processors.
  • The Grace and Vera CPUs are Arm-based chips introduced beginning in 2023 as companions to Nvidia's AI accelerators and are being positioned for data-center workloads such as databases and AI agents.
  • Meta continues to develop its own AI chips and is in talks with Google about using TPUs, creating parallel sourcing options that could affect procurement and infrastructure strategies.

Nvidia said on Tuesday it has entered a multiyear agreement to sell Meta Platforms millions of its current and future artificial intelligence chips, along with standalone central processing units that compete with offerings from Intel and Advanced Micro Devices. The companies did not disclose a monetary value for the arrangement.

The contract covers Nvidia's existing Blackwell family of AI accelerators and its forthcoming Rubin AI chips. It also includes separate installations of Nvidia's Grace and Vera central processors. Nvidia introduced Grace and Vera - CPUs built on Arm Holdings technology - beginning in 2023 as companions to its AI accelerators.

The announcement signals an effort by Nvidia to extend the use of those Arm-based CPUs beyond companion roles and into emerging workloads such as running AI agents, as well as into more conventional server tasks including database operations.

Meta, meanwhile, is advancing its own chip development and has been in discussions with Google about the possible use of that company's Tensor Processing Units, or TPUs, for AI workloads. Those parallel efforts by Meta represent an alternative path for some of its infrastructure needs.

Ian Buck, general manager of Nvidia's hyperscale and high-performance computing unit, said Nvidia's Grace central processors have demonstrated the ability to use roughly half the power for certain common tasks - citing database workloads as an example - and that additional efficiency gains are expected with the next-generation Vera processors.

"It actually continues down that path and makes it an excellent data center-only CPU for those high-intensity data processing back-end operations," Buck said. "Meta has already had a chance to get on Vera and run some of those workloads. And the results look very promising."

Nvidia has not disclosed the size of its sales to Meta. Analysts widely believe Meta is one of four customers that together accounted for 61% of Nvidia's revenue in its most recent fiscal quarter. Commentators say Nvidia likely promoted the deal to underscore that it has maintained substantial business with Meta while also showing traction for its central processor lineup.


Context and implications

The pact aligns Nvidia's strategy of positioning Arm-based CPUs as data-center-focused processors for intensive back-end tasks, while keeping the company in supply conversations with a major hyperscaler that is simultaneously pursuing in-house silicon and exploring other third-party accelerators.

Risks

  • Financial terms of the Nvidia-Meta agreement were not disclosed, creating uncertainty about the deal's revenue impact for Nvidia and, by extension, for the broader semiconductor and cloud infrastructure markets.
  • Meta's parallel efforts to build its own chips and its discussions with Google about TPUs create competitive sourcing risk that could limit the volumes Meta buys from Nvidia in the future, with implications for hyperscalers and other AI hardware suppliers.
  • Nvidia's revenue concentration risk remains a concern, as analysts believe four customers made up 61% of its revenue in the most recent fiscal quarter, leaving the company exposed to changes in spending by major cloud and social-media customers.
