Let’s get one thing straight: the numbers being tossed around in the world of artificial intelligence right now are, to use the technical term, completely bonkers. When Sam Altman, the man at the centre of the AI circus, starts talking about needing to raise sums that could comfortably buy a small country, you have to wonder if we’re witnessing the birth of a new economic reality or the inflation of the biggest financial bubble since the dot-com era. The sheer scale of AI infrastructure investment isn’t just reshaping the tech industry; it’s testing the very limits of our economic sanity. Is this the moment we build the future, or are we just funding a very, very expensive hangover?
The New Gold Rush: Trillions, Not Billions
It’s one thing to say AI is expensive; it’s another to put a figure on it. And the figures are staggering. We’re not just talking about a few billion here and there. According to reports like one from News Ghana, we’re entering a new stratosphere of spending. Take the “Stargate” project, a reported Microsoft-OpenAI collaboration with a cool price tag of around $100 billion. The goal? To build a supercomputer that would guzzle gigawatts of power to train the next generation of AI models. Then there’s Altman’s even grander vision, a rumoured fundraising effort of between $5 and $7 trillion.
Let that sink in. Seven trillion dollars. That’s roughly the combined GDP of the United Kingdom and Germany. The objective is to overhaul the global semiconductor industry to produce the sheer volume of chips AI requires. While the trillion-dollar figures remain speculative, the AI infrastructure investment we can confirm is already colossal. The spending spree is predicated on a simple yet terrifying assumption: the demand for artificial intelligence will be so insatiable that any amount of investment today will seem like a bargain tomorrow.
This isn’t your typical venture capital punt. This is nation-state-level spending, driven by a handful of tech giants who believe that cornering the market on compute—the raw processing power AI models need—is the ultimate strategic advantage. It’s an all-in bet on a future that hasn’t arrived yet, and everyone from the Bank of England to JPMorgan Chase is starting to look on with a mixture of awe and anxiety.
Echoes of Bubbles Past?
If all this sounds a bit familiar, it should. We’ve seen this movie before. Remember the late 1990s? The mantra was “get big fast”. Companies with flimsy business plans but a “.com” in their name were achieving astronomical valuations. Investors poured billions into laying a global network of fibre-optic cables, convinced that internet traffic would grow without limit. The problem wasn’t that they were wrong about the internet’s importance; it was that their timing and exuberance were catastrophically off. The result? A spectacular crash that wiped out trillions in shareholder value and left miles of useless “dark fibre” underground.
Today, the parallels are hard to ignore. We are again seeing what many analysts are calling classic tech bubble patterns. OpenAI, the poster child for the AI boom, is projected to hit $4.5 billion in revenue in the first half of 2025. A fantastic number, no doubt. The catch? The company isn’t profitable. It’s burning through cash at an alarming rate to pay for the immense compute resource allocation needed to run models like GPT-4. Stanford’s Anat Admati, a sharp critic of corporate governance, has pointed out the dangers of this kind of speculative frenzy, warning that systemic risks could spill over into the broader economy.
The dark fibre of the 2000s has a modern equivalent: the Nvidia GPU. Just as the telecom boom was a bet on bandwidth, the AI boom is a bet on processing power. The question is, are we building essential infrastructure for the future, or are we creating a glut of hyper-specialised data centres that could become the next generation of expensive paperweights?
The Altman-Nvidia Engine: A Self-Fulfilling Prophecy?
To understand the core of this frenzy, you have to look at the relationship between the key players, particularly Sam Altman and Nvidia’s Jensen Huang. It’s a dynamic that raises more than a few eyebrows, resembling a kind of circular financing scheme. Here’s a simplified analogy: imagine you’re a baker who makes the world’s best, most sought-after bread (Nvidia). Your biggest customer, a sandwich shop (OpenAI), wants to expand globally but needs a massive loan. So, you lend them an enormous sum of money, on the condition that they use it to buy your bread, and only your bread, to stock their new shops.
This is a crude but not entirely inaccurate representation of what’s happening. Big Tech and venture funds pour money into AI start-ups. What’s the first thing these start-ups do? They turn around and spend a huge chunk of that cash on Nvidia’s chips. This drives Nvidia’s revenues and stock price through the roof, which in turn makes AI look like an even hotter investment, attracting more capital into the ecosystem. Is this a virtuous cycle of innovation or a brilliantly engineered feedback loop that primarily benefits one chipmaker?
The Altman investment strategy is the logical, if terrifying, conclusion of this dynamic. He seems to have realised that being beholden to a single supplier for the one resource you can’t live without is a precarious position. His multi-trillion-dollar ambition isn’t just about building more AI; it’s a strategic move to break the dependency on Nvidia by fundamentally remaking the supply chain. It’s a breathtakingly audacious plan to control his own destiny. But it also concentrates an unimaginable amount of power and risk in the hands of one man and one company. What happens if his vision of endless AI demand doesn’t quite pan out?
The Unseen Costs: Power, Water, and Profitability
Beyond the financial acrobatics, there are two stubbornly persistent real-world problems: profitability and environmental impact. For all the talk of exponential growth, the path to sustainable profit for many AI companies remains blurry. The cost of compute resource allocation is immense. Training a single large model can cost tens of millions of dollars in electricity and processing time alone. Running it for millions of users costs even more. Unless the value these models create dramatically outpaces their operating costs, the business model remains fundamentally broken.
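To see how a training run reaches “tens of millions of dollars”, here is a back-of-envelope sketch. Every number in it (cluster size, run length, all-in hourly rate) is an illustrative assumption of mine, not a reported figure for any real model:

```python
# Illustrative training-cost estimate. All inputs are assumptions
# chosen for the arithmetic, not reported figures.

gpus = 10_000              # GPUs in the training cluster (assumed)
days = 60                  # length of the training run (assumed)
cost_per_gpu_hour = 2.00   # all-in dollars per GPU-hour, compute + power (assumed)

gpu_hours = gpus * days * 24
total_cost = gpu_hours * cost_per_gpu_hour

print(f"GPU-hours: {gpu_hours:,}")            # 14,400,000
print(f"Estimated run cost: ${total_cost:,.0f}")  # $28,800,000
```

Even with these deliberately conservative inputs, a single run lands near $30 million before you count staff, failed experiments, or the far larger bill for serving the model to users.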
This frantic build-out also has a significant environmental footprint. Data centres are incredibly thirsty for power and water. The Stargate project alone is projected to consume up to 5 gigawatts of power—the equivalent of several nuclear power plants. In a world already grappling with climate change and resource scarcity, is a headlong rush to build power-hungry infrastructure without a clear return on investment truly a wise move? Tech leaders have historically been able to wave away these concerns with promises of future efficiency gains, but the sheer scale of the current AI infrastructure investment makes that a much harder sell.
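To put that 5-gigawatt figure in context, a quick calculation helps. The reactor output and the continuous-load assumption below are mine, for illustration only:

```python
# Back-of-envelope: what does 5 GW of continuous draw amount to?
# Assumptions (mine): one large nuclear reactor produces roughly
# 1 GW, and the data centre runs around the clock all year.

STARGATE_POWER_GW = 5       # reported projected draw
REACTOR_OUTPUT_GW = 1       # rough output of one large reactor (assumed)
HOURS_PER_YEAR = 24 * 365   # continuous operation (assumed)

reactors_needed = STARGATE_POWER_GW / REACTOR_OUTPUT_GW
annual_energy_twh = STARGATE_POWER_GW * HOURS_PER_YEAR / 1000  # GWh -> TWh

print(f"Equivalent reactors: {reactors_needed:.0f}")           # 5
print(f"Annual energy at full load: {annual_energy_twh:.1f} TWh")  # 43.8 TWh
```

Roughly 44 terawatt-hours a year is on the order of what a mid-sized European country consumes, which is why the “promises of future efficiency gains” line is getting harder to sell.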
The rush to build can lead to cutting corners, not just financially but also environmentally. If this AI boom busts, we won’t just be left with empty balance sheets, but with a legacy of concrete server farms that consumed vast resources for a future that never fully materialised.
So, where does this leave us? The excitement around AI is genuine, and its potential is undeniable. But the current investment climate feels less like a calculated strategy and more like a fever dream. The tech bubble patterns are blinking red, from sky-high valuations disconnected from profit to a concerning concentration of power and risk.
The crucial question we should all be asking is not if AI is the future, but whether the path we are taking to get there is sustainable. Is the Altman investment strategy a visionary blueprint for a new industrial revolution, or is it a symptom of speculative mania? Are we building the foundations of tomorrow’s economy, or are we simply funding the world’s most elaborate and expensive game of pass-the-parcel?
What do you think? Are the warnings of an AI bubble overblown, or are we ignoring the lessons of history at our peril? Let me know your thoughts in the comments.


