Nvidia’s (NVDA) own Big Tech customers are poised to capture a sizable slice of the AI chip market — a dynamic that could eventually eat into the chipmaker’s profit margins.
Tech giants have been making moves to bolster their in-house chipmaking businesses. ChatGPT developer OpenAI (OPAI.PVT) — a big customer of Nvidia’s chips by way of renting cloud space in Microsoft’s (MSFT) and CoreWeave’s (CRWV) data centers — said it will begin designing its own custom chips in partnership with Broadcom.
Meta (META) announced a plan in late September to acquire chip startup Rivos as it bolsters its own in-house chip efforts. Amazon (AMZN) said this summer that its massive data center project, Project Rainier, in which hundreds of thousands of its Trainium2 chips will be used by AI developer Anthropic, is “well underway,” and analysts have said demand from Anthropic for the chips already deployed in data centers has risen significantly.
While Nvidia’s GPUs (graphics processing units) hold the lion’s share of the AI chip market, tech companies led by Alphabet-owned Google (GOOGL, GOOG), Amazon, and Microsoft design custom chips in partnership with chipmakers Broadcom and Marvell Technology (MRVL).
The chips are cheaper and better optimized for those companies’ software, analysts explained. While these Big Tech cloud providers don’t sell physical, standalone chips to other companies like Nvidia does, the firms internally run their AI models off their own chips, and their cloud customers also have the option to run AI workloads using those custom chips at a lower cost.
JPMorgan said in a research note in June that custom chips designed by companies like Google, Amazon, Meta, and OpenAI will account for 45% of the AI chip market by 2028, up from 37% in 2024 and 40% in 2025. The rest of the chip market is held by producers of GPUs, namely Nvidia and its rival Advanced Micro Devices (AMD).
The “Magnificent Seven” names have a good reason to design their own chips.
“In the case of all the hyperscalers looking at custom silicon, the strategic angle here is they don’t want to be stuck behind an NVIDIA monopoly,” Seaport analyst Jay Goldberg said. The incredibly high cost of Nvidia’s AI chips means cloud providers earn lower profits renting out those chips than they could renting out their own, analysts explained.
“Nvidia now has to compete with its customers,” Goldberg added.
While tech firms’ custom chips are broadly used to run internal AI workloads, Google reportedly began physically selling its AI chips called TPUs (tensor processing units) to a cloud provider in September — a move that would see it directly compete with the likes of Nvidia. DA Davidson analyst Gil Luria estimated Google’s TPU business, combined with its DeepMind AI segment, to be worth $900 billion and said it is “arguably one of Alphabet’s most valuable businesses.”
“Google’s TPUs remain the best alternative to NVIDIA, with the gap between the two closing significantly over the past 9-12 months,” Luria wrote in a note to clients in September. “[S]hould Google sell its systems externally to customers, demand would be there and specifically from notable frontier AI labs.”
Overall, Seaport’s Goldberg said he expects “a lot of activity around custom silicon” in 2026 based on his conversations throughout the AI chip supply chain.
Big Tech companies are at different stages in the evolution of their chip businesses. Google has been developing its AI chips called TPUs for more than a decade and is a clear leader among its peers, analysts told Yahoo Finance. Amazon started its in-house chip journey a year after Google launched its first TPU, acquiring chip startup Annapurna Labs in 2015, and released its first Trainium chip in 2020. Meanwhile, Microsoft launched its first custom Maia AI chip in 2023 and has fallen behind its peers.
While the custom chips may be cheaper to use, AI developers often prefer Nvidia’s chips because of the software stack that goes with them.
Though Nvidia is a clear leader today, Futurum Group analyst David Nicholson said the tech giants’ custom chip efforts will eventually dig into Nvidia’s profits: “Over time, the margins that Nvidia can command right now get degraded … [it will be] sort of death by a thousand cuts because you have all of these different custom silicon accelerators [chips] that exist because there’s such an opportunity.”
When asked about the competitive threat of custom chips in a recent podcast appearance, Nvidia CEO Jensen Huang appeared to dismiss the concern, saying Nvidia is more than just an AI chipmaker, as it provides full-scale server systems, not just individual GPUs. Whereas its customers are, in many cases, building single chips, Nvidia designs massive server racks with multiple in-house chips, from its Blackwell GPUs to its Arm-based central processing units (CPUs) and networking products, which allow the chips to talk to one another.
“We’re the only company in the world today that builds all of the chips inside an AI infrastructure,” Huang told the “BG2” podcast in September.
Some analysts also argue that the market for AI chips is so big that there’s room for tech firms to expand their custom chip efforts without crowding out Nvidia.
Bank of America’s Vivek Arya and DA Davidson’s Gil Luria, in separate interviews, used the same words to describe the growing market share of custom chips: it “doesn’t matter.”
That’s because Nvidia has “managed to consistently expand the size of the market,” Arya explained. Nvidia has invested heavily in the AI ecosystem and in “neocloud” companies that rival its own customers, making $47 billion worth of venture capital investments in AI companies from 2020 through September of this year, per PitchBook data.
“The growth and demand is so substantial,” Luria said. “We’re going to need a lot more compute and the [AI] models are getting more useful, which is to say the pie is gonna get a lot bigger over the next couple of years.”
“Nvidia won’t grow as fast as the market, but because the market is growing so fast, they’ll still be able to grow,” he added.
Also diminishing the concern for Nvidia: Not all tech giants embarking on custom chip efforts will present an equal threat to Nvidia’s dominance.
“The drawback of doing your own silicon, though, is that it’s hard,” Goldberg said. “I think ultimately what will happen is not all of them will succeed.”
Laura Bratton is a reporter for Yahoo Finance. Follow her on Bluesky @laurabratton.bsky.social. Email her at laura.bratton@yahooinc.com.