There’s a certain flavour of audaciousness that only the tech world can serve up. Elon Musk, never one to shy away from a grand pronouncement, recently claimed that “by far the cheapest place to put AI will be space in 36 months or less.” It’s a bold, headline-grabbing statement that conjures images of servers humming quietly amongst the stars. But before we all get swept up in this sci-fi dream, it’s worth asking a rather blunt question: have you seen the bill? The economics of putting data centres into orbit aren’t just challenging; as a recent TechCrunch analysis points out, they are downright brutal.
The vision is undeniably seductive. Put your power-hungry AI chips in space, where, in the right orbit, solar panels can soak up unfiltered sunlight almost around the clock, generating five to eight times more power than their terrestrial counterparts. This is the core idea behind Google’s Project Suncatcher, a play on energy arbitrage. Yet, turning this vision into a viable business model runs headfirst into the colossal wall of space infrastructure costs.
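To see roughly where that five-to-eight-times figure comes from, here’s a back-of-envelope capacity-factor sketch. The two capacity factors below are illustrative assumptions, not figures from Project Suncatcher:

```python
# Rough sketch of the orbital solar advantage via capacity factors.
# Both factors are illustrative assumptions, not Suncatcher figures.
ORBITAL_CAPACITY_FACTOR = 0.97      # near-continuous, unfiltered sunlight
TERRESTRIAL_CAPACITY_FACTOR = 0.18  # night, weather, atmosphere

ratio = ORBITAL_CAPACITY_FACTOR / TERRESTRIAL_CAPACITY_FACTOR
print(f"Orbital energy advantage: ~{ratio:.1f}x per unit of panel")
# ≈ 5.4x — squarely inside the five-to-eight-times range cited above
```

Tweak the terrestrial factor toward a sunny desert site and the ratio drops toward five; toward a cloudy one and it climbs toward eight.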
What Exactly Are We Paying For?
When we talk about space infrastructure costs, we’re not just talking about the price of a rocket launch, although that’s a hefty line item. It’s a cascade of expenses that make even the most lavish terrestrial projects look like a bargain.
Think of it this way: building a data centre on Earth is like building a warehouse. You need land, construction, and a solid connection to the power grid. Building a data centre in space is like trying to build a Swiss watch inside a blast furnace that’s also hurtling around the planet at 17,000 miles per hour. Every single component, from the processors to the cooling systems, must be custom-built and battle-hardened.
The breakdown is staggering. A new analysis suggests that a one-gigawatt orbital data facility would cost an eye-watering $42.4 billion. For context, its Earth-based equivalent costs around $14.4 billion. That’s nearly three times the price for the same compute power, a premium that would make any CFO’s blood run cold. This isn’t just a minor cost overrun; it’s a fundamentally different economic proposition.
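That near-threefold premium is simple arithmetic on the figures quoted above (both taken as all-in capital costs for a one-gigawatt facility):

```python
# Cost premium for a 1 GW data centre, using the figures quoted above.
ORBITAL_COST_USD = 42.4e9      # estimated orbital build-out
TERRESTRIAL_COST_USD = 14.4e9  # Earth-based equivalent

premium = ORBITAL_COST_USD / TERRESTRIAL_COST_USD
print(f"Orbital premium: {premium:.2f}x")  # ≈ 2.94x, i.e. nearly 3x
```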
The Devil in the (Orbital) Details
So, where does all that money go? First, the satellites themselves. These aren’t your typical off-the-shelf servers. They need specialised, radiation-hardened chips to survive the constant bombardment of cosmic rays. As Planet Labs’ co-founder Mike Safyan notes, you also need enormous radiators “to dissipate heat into the blackness of space,” because there’s no air to help with cooling.
Then there’s the launch. While SpaceX’s Starship aims to drastically lower launch costs to a miraculous $200 per kilogram, we are not there yet. Current costs are still orders of magnitude higher. And even with cheaper launches, the sheer mass required is immense. This isn’t a one-and-done launch; it’s a constant, ferociously expensive orbital construction project.
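To put “orders of magnitude” into dollars, here’s a hypothetical launch-bill sketch. The facility mass is an illustrative assumption; $200 per kilogram is the Starship target quoted above, and ~$3,000 per kilogram is a rough stand-in for present-day costs to low-Earth orbit:

```python
# Launch-bill sketch. The mass figure is an illustrative assumption;
# $200/kg is the Starship target, ~$3,000/kg approximates today's rates.
FACILITY_MASS_KG = 4e6            # assumed mass for a GW-class facility
STARSHIP_TARGET_USD_PER_KG = 200
CURRENT_USD_PER_KG = 3000         # rough present-day cost to LEO (assumed)

print(f"At today's rates:   ${FACILITY_MASS_KG * CURRENT_USD_PER_KG / 1e9:.1f}B")
print(f"At Starship target: ${FACILITY_MASS_KG * STARSHIP_TARGET_USD_PER_KG / 1e9:.1f}B")
```

Even under these generous assumptions, the launch bill alone runs into the billions at today’s prices, and that’s before building a single radiation-hardened server.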
Finally, there’s the inconvenient truth of solar panel degradation. Those super-efficient space solar panels? They are steadily degraded by radiation, and their output drops significantly over about five years. According to Starcloud CEO Philip Johnston, after that point, “the dollars per kilowatt-hour doesn’t produce a return.” Your multi-billion dollar asset has a depressingly short shelf life.
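A quick sketch makes the shelf-life problem concrete. The roughly five-year useful life is from the article; the 10%-per-year degradation rate is an illustrative assumption, not a published figure:

```python
# Hypothetical sketch of radiation degradation eroding array output.
# The 10%-per-year rate is an illustrative assumption.
DEGRADATION_PER_YEAR = 0.10  # assumed fractional output loss per year

output = 1.0  # relative output at launch
for year in range(1, 6):
    output *= (1 - DEGRADATION_PER_YEAR)
    print(f"Year {year}: {output:.0%} of launch output")
# After 5 years the array delivers roughly 59% of its initial power
```

Compounding losses like these are why the revenue side of the ledger collapses well before the hardware itself stops working.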
The Two Faces of Satellite AI
Inside this challenging economic picture, there’s a crucial distinction to be made in the world of satellite AI: the difference between inference and training.
– Inference is the act of using a pre-trained AI model to make a prediction or a decision. It’s the AI equivalent of answering a question.
– Training is the immensely power-intensive process of building the model in the first place, feeding it vast datasets until it learns.
Philip Johnston of Starcloud believes that “almost all inference workloads will be done in space.” This makes some sense. You could perform relatively lightweight tasks on a satellite, like analysing Earth observation data on the fly. However, training a foundational model like GPT-4 in orbit is a whole other beast. It requires massive clusters of GPUs working in perfect concert, something that’s nearly impossible with today’s inter-satellite communication speeds, which top out at around 100 Gbps—a fraction of the speeds inside a terrestrial data centre.
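A rough gradient-exchange sketch shows why that bandwidth gap is fatal for training. The model size and the terrestrial link speed below are illustrative assumptions; only the ~100 Gbps inter-satellite figure comes from the discussion above:

```python
# Why 100 Gbps inter-satellite links cripple orbital training:
# time to exchange one round of gradients between nodes.
# Model size and data-centre link speed are illustrative assumptions.
MODEL_PARAMS = 1e12      # a GPT-4-class model (assumed)
BYTES_PER_PARAM = 2      # fp16 gradients
SAT_LINK_BPS = 100e9     # ~100 Gbps inter-satellite link (from above)
DC_LINK_BPS = 3.2e12     # ~3.2 Tbps per node in a modern cluster (assumed)

grad_bits = MODEL_PARAMS * BYTES_PER_PARAM * 8
print(f"Via satellite link: {grad_bits / SAT_LINK_BPS:.0f} s per exchange")
print(f"In a data centre:   {grad_bits / DC_LINK_BPS:.1f} s per exchange")
```

When each synchronisation step takes minutes instead of seconds, your multi-million-dollar GPUs spend most of their lives waiting for the network.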
The Brutal Economics of Starcloud and Suncatcher
This brings us to the grand plans of companies like Starcloud. Their vision for low-Earth-orbit computing involves a constellation of up to 80,000 satellites flying in precise formation to create a cohesive computing fabric. The scale is mind-boggling, and the Starcloud economics are equally so.
The core problem remains power cost. On Earth, a kilowatt-year of power for a data centre costs between $570 and $3,000. The estimated cost for that same kilowatt-year in orbit? A colossal $14,700. No amount of sunshine can erase that disparity.
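The size of that gap is worth spelling out, using the per-kilowatt-year figures quoted above:

```python
# Power-cost gap per kilowatt-year, using the figures quoted above.
EARTH_KWY_LOW, EARTH_KWY_HIGH = 570, 3000  # terrestrial range (USD)
ORBIT_KWY = 14_700                         # estimated orbital cost (USD)

print(f"vs cheapest Earth power: {ORBIT_KWY / EARTH_KWY_LOW:.0f}x")
print(f"vs priciest Earth power: {ORBIT_KWY / EARTH_KWY_HIGH:.0f}x")
# Orbit is ~26x the cheapest terrestrial power and ~5x the priciest
```

Even against the most expensive terrestrial power, orbit loses by a factor of five; against cheap hydro or nuclear, it loses by more than an order of magnitude.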
As Amazon Web Services’ Matt Gorman bluntly put it, “There are not enough rockets to launch a million satellites yet… It is just not economical.” The industry is caught in a classic chicken-and-egg problem. Costs won’t come down without mass production, but no one can afford the mass production until the costs come down.
Is There a Path Forward?
The entire premise of orbital AI hinges on a revolution in space infrastructure costs. Elon Musk’s predictions are banking on SpaceX’s Starship not just succeeding, but becoming so routine and cheap that it fundamentally rewrites the laws of space economics.
If—and it’s a big if—launch costs plummet and satellite manufacturing can achieve true economies of scale, the numbers might start to look a little less terrifying. But we’re a long way from an assembly line churning out data-centre-grade satellites like they’re iPhones. The technical hurdles around radiation, thermal management, and data connectivity are not trivial engineering problems; they are fundamental physics challenges.
For now, the logic seems clear. The dream of AI in the heavens is a powerful one, but the balance sheet is firmly planted on Earth. The promise of Project Suncatcher and others is a testament to human ingenuity, but the immediate future of heavy AI computation will remain terrestrial.
The question isn’t whether we can put data centres in space—with enough money, you can do almost anything. The real question is whether we should. Is the astronomical price tag simply the cost of admission for the next frontier of computing, or is it a clear sign that, for now, our ambitions are writing cheques our rockets can’t cash? What do you think?