Both sides are publicly playing it cool. Huang says he’s keen to invest in OpenAI’s next round, no questions asked. Altman seems baffled by “all this insanity”. Yet their actions tell a different story. Nvidia is hedging its bets by backing OpenAI’s rival, Anthropic. Meanwhile, OpenAI is cosying up to a who’s who of Nvidia’s competitors, including AMD, Broadcom, and Cerebras. So what’s really going on? This isn’t a simple disagreement; it’s a power play that reveals the brittle foundations of the current AI gold rush.
A Market of One (and a Half)
Let’s be brutally honest. When we talk about the hardware for training large AI models, we’re mostly talking about one company: Nvidia. They aren’t just leading the market for AI chips; for the past few years, they have been the market. Since ChatGPT ignited the AI boom, Nvidia’s quarterly revenue has rocketed from a respectable $6 billion to an eye-watering $57 billion. That’s the kind of growth that makes financial analysts weep with joy.
This explosion in demand has given them an iron grip on over 90% of the AI GPU market and catapulted their valuation to a peak of $5 trillion; it now sits at around $4.4 trillion. They’ve built the digital equivalent of the M25 ring road around London, and every AI company is stuck in traffic, trying to get its delivery vans through.
And who’s the biggest customer with the most vans? OpenAI. With 800 million weekly active users on ChatGPT and a projected $20 billion in annual revenue, they are the primary driver of the demand that made Nvidia a multi-trillion-dollar behemoth. This creates a fascinating, if slightly awkward, dynamic where AI chip partnerships are less about collaboration and more about managing a critical dependency.
The $100 Billion Handshake That Wasn’t
So why is this landmark deal stalled? Publicly, everyone is full of praise. Huang told CNBC, “We are looking forward to Sam closing it and he’s doing terrifically. And we will invest in the next round. There is no question about that.” Altman, in return, stated, “We hope to be a gigantic customer for a very long time.”
But behind the scenes, the story is about diversification and leverage. For OpenAI, being solely reliant on Nvidia is a strategic nightmare. Imagine trying to build the world’s most ambitious construction project when only one company on the planet sells the steel girders you need. The price, the availability, the delivery schedule—it’s all dictated by them. This is why OpenAI is so keen to explore other GPU supply chains with partners like AMD’s Lisa Su. They need to create competition, even if it’s just a sliver of what Nvidia offers, to gain some negotiating power.
For Nvidia, the concern is different. Is OpenAI, with its complex governance and a business model still racing towards profitability (expected by 2030), the only horse to back? Investing in Anthropic isn’t just a dalliance; it’s a diversification strategy. It sends a clear message to the market and to OpenAI: our chips power everyone, and we won’t be held captive by a single customer’s destiny.
The Unbreakable Tether of AI Hardware
This brings us to the core of the issue: the intense hardware dependencies that define the AI industry. Building and running models like GPT-4 requires a staggering amount of computational power. As Sachin Katti, a senior vice president at Intel (another player in this space), puts it, “The world needs orders of magnitude more compute.” This isn’t an optional extra; it is the foundation on which all large-model infrastructure rests.
This reality tethers OpenAI to Nvidia, whether they formalise a $100 billion deal or not. OpenAI’s entire computing fleet runs on Nvidia GPUs. They simply cannot switch to another provider overnight, or even over a year. The software, the architecture, the expertise—it’s all built around Nvidia’s CUDA platform.
This codependence is the central theme of today’s tech industry alliances. It’s a complicated dance where partners are also potential hostages. Nvidia needs OpenAI’s insatiable appetite for chips to justify its astronomical valuation and continued growth. OpenAI needs Nvidia’s hardware to keep pushing the boundaries of AI and serve its massive user base. They are, for now, locked in a multi-trillion-dollar embrace.
What Does the Future Hold?
The delay in this mega-deal is less about friction and more about strategic positioning for the next phase of AI. OpenAI is trying to build a moat around its models, but its biggest vulnerability is the hardware it runs on. A valuation of $500 billion, with ambitions to reach $800 billion, depends on a secure and scalable supply of chips.
Nvidia, on the other hand, wants to maintain its role as the indispensable arms dealer in the AI war. By supplying all sides, they ensure their own dominance regardless of which AI lab comes out on top. They’re not just selling picks and shovels in a gold rush; they own the patent on the shovel itself.
Ultimately, this stalemate won’t last forever. The market is moving too quickly. Either the partnership will be redefined, or the diversification efforts from both sides will begin to bear real fruit, fracturing Nvidia’s near-monopoly. For now, they continue their symbiotic, slightly resentful, and enormously profitable dance.
What do you think? Is OpenAI wise to court other chipmakers, or is it a futile gesture against Nvidia’s dominance? Let me know your thoughts in the comments below.