Let’s be honest, the AI gold rush is in full swing. Everyone, from the goliaths in Silicon Valley to plucky start-ups in Shoreditch, is scrambling to build the next great model, the next world-changing application. We’re all mesmerised by the alchemy of turning data into intelligence. But amidst all this frantic coding and breathless hype, we’re collectively ignoring the incredibly unglamorous, yet existential, problem looming in the background: power. AI is fantastically, terrifyingly thirsty for electricity, and the creaking infrastructure propping it all up is starting to groan under the strain.
The conversation is shifting, albeit slowly, from simply ‘can we build it?’ to ‘how can we build it without boiling the oceans?’ This brings us to the real elephant in the room: the quest for sustainable AI data centres. It’s not a topic that will get you a million views on TikTok, but it’s the one that will determine whether this entire AI revolution is built on a solid foundation or a house of cards. The industry is desperately searching for breakthroughs, not just in algorithms, but in the nuts and bolts of a building’s electrical room. And that’s where the story gets interesting, because the answer might not come from some exotic new physics, but from a quiet reinvention of a battery chemistry we’ve known for over a century.
The Unquenchable Thirst of AI Workloads
First, let’s get a handle on why this is suddenly such a crisis. For years, data centres have hummed along with a relatively predictable rhythm. Traditional IT workloads—running websites, databases, enterprise software—draw power in a steady, constant hum. You can plan for it, provision for it, and manage it. The power draw is consistent, like a well-behaved stream.
Enter the GPU. The engines of modern AI, these graphics processing units, don’t hum; they roar. An AI training workload is less like a stream and more like a flash flood. When a cluster of thousands of GPUs spins up to train a model, the power demand spikes violently and instantaneously. We’re not talking about a gentle ramp-up. We’re talking about a demand surge that, according to a recent report in Network World, can hit 150% of a data centre’s uninterruptible power supply (UPS) rated capacity. Imagine plugging a dozen vacuum cleaners into one socket at the exact same second. Now scale that up a thousand-fold.
This spiky, unpredictable demand profile is a nightmare for power infrastructure. The grid itself isn’t built for such erratic behaviour, and the internal systems of a data centre—the transformers, the switchgear, the backup systems—are pushed to their absolute limits. The default solution? Massive over-provisioning. You build a power system capable of handling the absolute worst-case scenario, which means that 95% of the time, billions of pounds’ worth of equipment sits idle. It’s colossally inefficient and wildly expensive, throwing any pretence of energy efficiency out of the window.
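To put rough numbers on that inefficiency, here is a small illustrative sketch. Every figure in it is invented for the example (the megawatt values, the share of time the cluster spends mid-burst), but it shows how little of a peak-rated electrical plant a bursty AI profile actually uses on average.

```python
# Illustrative only: toy numbers, not measurements from any real facility.
# Models a GPU cluster that alternates between compute bursts and
# communication/idle lulls, on top of a steady "traditional IT" baseline.

steady_it_load_mw = 4.0    # assumed constant enterprise workload
gpu_burst_mw = 12.0        # assumed draw during a synchronous training burst
gpu_lull_mw = 2.0          # assumed draw between bursts (all-reduce, I/O, job gaps)
burst_fraction = 0.3       # assumed share of time spent mid-burst

peak_demand = steady_it_load_mw + gpu_burst_mw
average_demand = steady_it_load_mw + (
    burst_fraction * gpu_burst_mw + (1 - burst_fraction) * gpu_lull_mw
)

# If the electrical chain is sized for the worst case, this is how much of
# that capacity the facility actually uses on average.
utilisation = average_demand / peak_demand

print(f"Peak demand:    {peak_demand:.1f} MW")
print(f"Average demand: {average_demand:.1f} MW")
print(f"Utilisation of peak-rated plant: {utilisation:.0%}")
```

In this made-up scenario the facility uses barely more than half of the capacity it paid for in an average hour, and that is before adding headroom for the 150% transients mentioned above.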
A Battery That Does More Than Just Wait
This is the problem that Oregon-based ZincFive believes it has cracked. The company is betting the farm on nickel-zinc (NiZn) batteries, a technology that has been around since the early 1900s but has been given a thoroughly modern makeover for the AI era. Most data centres today rely on lithium-ion batteries for their backup power. They sit there, fully charged, waiting for the day the grid fails. They are the silent sentinels, purely defensive.
ZincFive’s approach with its new BC 2 AI cabinet is fundamentally different. It’s a dual-mode system, which is a rather dry way of describing something quite clever. It acts as both a traditional battery backup and an active energy management tool. It’s not just a sentinel; it’s a shock absorber.
Think of it like the suspension on a rally car. A normal car’s suspension is designed for a smooth motorway. A rally car’s suspension is built to absorb the violent, unpredictable shocks of a dirt track, keeping the car stable and in control. The BC 2 AI cabinet does this for a data centre’s power. When a GPU cluster suddenly demands a tidal wave of electricity, the nickel-zinc battery system instantly discharges to satisfy that peak, absorbing the shock before it can overwhelm the building’s main power supply or send a disruptive signal back to the grid. It smooths out the ‘bumpy road’ of AI power demands into a manageable, predictable flow.
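For the programmatically minded, that shock-absorber behaviour can be sketched as a simple peak-shaving loop. This is a minimal illustration of the general technique, not ZincFive’s actual control logic, and the function name, grid cap, and battery size are all invented for the example: whenever demand exceeds a chosen grid-draw cap the battery discharges to cover the difference, and during lulls it quietly tops itself back up.

```python
# Minimal peak-shaving sketch (illustrative; not ZincFive's controller).
# The battery covers any demand above `grid_cap_mw` and recharges whenever
# demand falls below the cap and it has spare capacity.

def smooth_power(demand_trace_mw, grid_cap_mw, battery_mwh, step_hours=1 / 3600):
    """Return the grid draw after peak shaving, one value per time step."""
    stored = battery_mwh                  # start with a full battery
    grid_draw = []

    for demand in demand_trace_mw:
        if demand > grid_cap_mw:
            # Discharge to absorb the spike (limited by remaining charge).
            shave = min(demand - grid_cap_mw, stored / step_hours)
            stored -= shave * step_hours
            grid_draw.append(demand - shave)
        else:
            # Recharge using whatever headroom exists below the cap.
            recharge = min(grid_cap_mw - demand, (battery_mwh - stored) / step_hours)
            stored += recharge * step_hours
            grid_draw.append(demand + recharge)

    return grid_draw

# Toy demand trace in MW, one sample per second: quiet, spike, quiet.
trace = [8, 8, 16, 16, 16, 9, 8, 8]
print(smooth_power(trace, grid_cap_mw=10, battery_mwh=0.05))
```

In this toy run the upstream supply never sees more than the 10 MW cap, even though the racks briefly pull 16 MW; that capped figure, not the raw spike, is what the transformers and switchgear have to be rated for.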
The Strategic Advantages of Nickel-Zinc
So, what makes nickel-zinc so special for this task?
– Incredible Power Density: NiZn chemistry can discharge huge amounts of power very, very quickly. That’s precisely what’s needed to handle those transient GPU loads. While lithium-ion is great at storing a lot of energy and releasing it over time (perfect for an electric car), NiZn excels at delivering immense power in an instant.
– Safety and Sustainability: This is a big one. Unlike lithium-ion, which carries the risk of thermal runaway (the self-accelerating overheating that can end with a battery bursting into flames), nickel-zinc is a water-based chemistry. It’s non-flammable and far safer to operate in a dense data centre environment. Furthermore, both nickel and zinc are more abundant and easier to recycle than the cobalt and lithium in many batteries, which gives it a much stronger claim to sustainability.
– Space Efficiency: This might be the feature that gets CFOs most excited. According to ZincFive, their solution takes up just one-quarter to one-half the physical footprint of competing battery systems designed to handle the same AI power surges. In a data centre, floor space is money. Every square metre devoted to batteries is a square metre that can’t be used for revenue-generating servers and GPUs. A smaller footprint directly translates to a better bottom line (a rough sizing sketch just after this list suggests why a power-optimised battery can get away with so little space).
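Why can a power-optimised battery get away with so little space? A back-of-the-envelope sketch (my own illustrative numbers, not ZincFive’s specifications) shows that a surge which is enormous in power terms is tiny in energy terms when it only lasts seconds:

```python
# Back-of-the-envelope sizing (illustrative numbers only).
# A spike that is huge in power is small in energy if it lasts seconds,
# which is why a chemistry optimised for power delivery rather than
# energy storage can occupy far less floor space.

surge_power_mw = 5.0       # assumed extra draw above the normal load
surge_duration_s = 10.0    # assumed length of a single training-step spike

energy_needed_kwh = surge_power_mw * 1000 * (surge_duration_s / 3600)
print(f"Energy to ride through one surge: {energy_needed_kwh:.1f} kWh")
# -> roughly 14 kWh: about the usable capacity of a domestic storage unit,
#    even though the instantaneous power rivals a small neighbourhood's.
```

The bottleneck is how fast the cells can push current out, not how much energy they hold, which is exactly the axis on which nickel-zinc is claimed to shine.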
As ZincFive CEO Tod Higinbotham put it, they are delivering a “safe, sustainable, and future-ready power solution designed to handle the most demanding AI workloads while continuing to support traditional IT backup.” It’s this dual-purpose capability that moves the battery from a simple insurance policy to an active, value-generating part of the power infrastructure.
Reworking the Economics of AI
The implications of this shift are profound. By effectively decoupling the spiky demand of AI from the main power supply, you no longer need to massively over-provision your entire electrical chain. This has a direct impact on capital expenditure (CAPEX). A data centre operator can now potentially avoid millions of pounds in upfront costs for bigger transformers, thicker cables, and more robust switchgear. This could dramatically lower the barrier to entry for companies wanting to build their own dedicated AI infrastructure.
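To see how that shows up in the capital budget, here is a deliberately crude, illustrative calculation. The per-megawatt cost is a round number invented purely for the example (not a real quote), and the peak figures reuse the toy trace from earlier:

```python
# Purely illustrative: invented round-number costs, not real quotes.
raw_peak_mw = 16.0           # what the racks can momentarily demand
shaved_peak_mw = 10.0        # what the grid sees once the battery absorbs spikes
cost_per_mw_gbp = 1_500_000  # assumed installed cost of transformers, cabling, switchgear

capex_without_shaving = raw_peak_mw * cost_per_mw_gbp
capex_with_shaving = shaved_peak_mw * cost_per_mw_gbp

print(f"Plant sized for the raw peak:    £{capex_without_shaving:,.0f}")
print(f"Plant sized for the shaved peak: £{capex_with_shaving:,.0f}")
print(f"Avoided upfront spend:           £{capex_without_shaving - capex_with_shaving:,.0f}")
```

Even with these toy figures the avoided spend runs to millions of pounds, and the same cabinets still do double duty as the traditional backup system.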
Furthermore, it changes the relationship between the data centre and the electrical grid. A data centre that constantly sends shockwaves back to the grid is a problem customer. A data centre that internally manages its own peaks and presents a smooth, predictable load is a model citizen. In the future, these well-behaved data centres could even participate in demand-response programmes, getting paid by utility companies to help stabilise the grid by using their batteries to absorb excess power or supply it back during times of high demand. This creates a symbiotic relationship, turning a data centre from a simple power consumer into an active grid asset.
This is the cornerstone of building truly sustainable AI data centres. It’s not just about using renewable energy sources; it’s about being smarter with the power you have, improving energy efficiency from the inside out and integrating seamlessly with the broader energy ecosystem.
The Unsung Heroes of the Revolution
The story of the AI revolution is currently being written by the people creating the models and the companies fabricating the chips. But the next chapter will be defined by the engineers solving these less glamorous, but arguably more critical, infrastructure challenges. Technologies like ZincFive’s nickel-zinc batteries represent a crucial piece of the puzzle. They are the enablers, the quiet heroes working behind the scenes to ensure the entire enterprise doesn’t collapse under its own weight.
As we move forward, the performance of an AI data centre won’t just be measured in petaflops; it will also be measured by its power usage effectiveness (PUE), its grid interaction, and its overall carbon footprint. The companies that figure this out first will not only be more sustainable, but they will also have a significant economic advantage.
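For anyone who hasn’t met the metric, PUE is simply the ratio of total facility power to the power consumed by the IT equipment itself; a value of 1.0 would mean every watt goes to compute. A quick illustration with made-up numbers:

```python
# PUE = total facility power / IT equipment power (the ideal is 1.0).
# Figures below are made up for illustration.

it_power_mw = 10.0     # servers, GPUs, storage, network
overhead_mw = 3.5      # cooling, power conversion losses, lighting

pue = (it_power_mw + overhead_mw) / it_power_mw
print(f"PUE: {pue:.2f}")   # -> 1.35; lower is better
```

Every point shaved off that ratio is power going to revenue-generating compute rather than overhead.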
The shift is underway. The “boring” stuff is suddenly becoming the most strategic part of the tech stack. So, while we watch the rock stars of AI take their victory laps, it’s worth paying attention to the roadies backstage who are making sure the whole show doesn’t come to a screeching, powerless halt.
What other “unsexy” technologies do you think will become unexpectedly critical as the AI industry matures?


