AI’s Power Problem and the Rise of Green Storage

Green Technology | By 3L3C

AI data centres are stressing the US grid—but they’re also creating a massive opening for battery energy storage to scale as a core green technology.

Tags: AI data centres, battery energy storage, green technology, grid modernization, renewable energy, energy policy


Most people hear “AI” and think about chatbots and image generators. Grid operators think about megawatts.

By 2030, AI and cloud data centres are projected to account for several percent of US electricity demand, with individual AI campuses drawing as much power as a mid-sized nuclear plant. That’s no longer an abstract trend; projects like the 1GW “Barn” campus in Michigan make it painfully concrete.

Here’s the thing about that surge in demand: it can either lock us into more fossil generation for decades, or it can accelerate one of the most powerful green technologies we’ve got—grid-scale energy storage.

This post looks at how AI’s power problem is quickly turning into the US energy storage market’s biggest opportunity, and what that means if you care about clean energy, resilient grids, or investing in green technology.


How AI Data Centres Are Reshaping Power Demand

AI data centres are already changing how the grid operates because their electricity needs are huge, fast-growing, and extremely spiky.

A traditional cloud data centre already pulls a lot of power, but AI workloads push that to another level. The planned OpenAI–Oracle–Related Digital campus in Saline Township, Michigan, is a good example:

  • Over 1GW of computing capacity
  • About 1.4GW of electricity demand at full load, roughly 25% of utility DTE Energy’s current generation capacity
  • Comparable to the output of a mid-sized nuclear reactor

And that’s just one site.

Why AI load is so hard for the grid

AI data centres don’t just use a lot of power; they swing rapidly as workloads ramp up and down. Industry leaders describe these as “intense power fluctuations unlike anything we’ve seen in cloud data centres previously.”

For grid planners, that creates three problems:

  1. Capacity – ensuring there’s enough power available when AI loads spike.
  2. Flexibility – responding in seconds or minutes to big swings in demand.
  3. Reliability – keeping voltage and frequency stable under heavy, volatile load.

Traditional solutions—gas peaker plants and overbuilt transmission—are slow and expensive. This is where battery energy storage systems (BESS) come in.


Why Energy Storage Is Becoming the Default Solution

Energy storage is quietly becoming the go-to tool for managing AI-driven load growth because it solves several problems at once: capacity, flexibility, and decarbonisation.

Under recent US climate policy updates, solar and wind saw their incentives trimmed and timelines altered. Energy storage, however, kept its investment and production tax credits through the mid-2030s, albeit with new restrictions on equipment linked to “foreign entities of concern,” particularly China.

Despite the sourcing constraints, storage gained something crucial: recognition as “dispatchable” capacity, alongside nuclear and geothermal. That’s a big deal, because dispatchable resources are valued more highly in capacity markets and grid planning.

Why storage beats more fossil peakers

For AI and data centre developers, batteries are increasingly the fastest, cleanest, and most cost-effective option for new capacity:

  • Speed – BESS projects can move from design to operation in 18–36 months, much faster than new thermal plants.
  • Flexibility – Batteries can charge when power is cheap or renewable-heavy, then discharge during expensive, carbon-intensive peaks.
  • Cost trajectory – Lithium-ion costs have fallen dramatically over the last decade, and competition is pushing prices down further.
  • Policy tailwinds – ITCs and PTCs still make the financial math compelling, especially when paired with renewables.

The result: even if federal climate policy zigzags over the next few years, the basic economics of matching AI growth with BESS are strong and getting stronger.
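As a toy illustration of the charge-cheap/discharge-peak economics above, here is a back-of-envelope arbitrage estimate. Every figure below is a hypothetical assumption for illustration, not market data or project economics from the source:

```python
# Toy battery arbitrage estimate -- all inputs are hypothetical illustrations.
capacity_mwh = 400        # e.g. a 200 MW / 2 h system (assumed)
round_trip_eff = 0.88     # assumed lithium-ion round-trip efficiency
off_peak_price = 25       # $/MWh charging price (assumed)
peak_price = 95           # $/MWh discharge price (assumed)
cycles_per_year = 300     # roughly one full cycle per day (assumed)

# Energy bought off-peak comes back out at peak, minus round-trip losses.
revenue_per_cycle = capacity_mwh * round_trip_eff * peak_price
cost_per_cycle = capacity_mwh * off_peak_price
annual_margin = (revenue_per_cycle - cost_per_cycle) * cycles_per_year

print(f"Annual arbitrage margin: ${annual_margin:,.0f}")
```

Real project revenue stacks also include capacity payments and ancillary services, which is part of why the dispatchable-capacity designation matters.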


Case Study: “The Barn” and Grid‑Connected Green AI

The Michigan project known as The Barn is a template for how AI, utilities, and green technology can work together if they’re smart about it.

How the project is structured

The Barn is a multi-billion-dollar AI data centre campus developed by OpenAI, Oracle, and Related Digital. Key features:

  • Over 1GW of computing on site
  • 1.4GW of electricity demand at full use
  • A closed-loop cooling system to keep water consumption comparable to an office building
  • Planned LEED certification for the three 550,000 square-foot buildings
  • More than 2,500 union construction jobs and hundreds of long-term operations roles

On the power side, utility DTE Energy will:

  • Supply 100% of the project’s power using existing resources
  • Add new battery storage capacity funded entirely by the project
  • Structure it so there’s no impact on existing customers’ energy supply or rates

That last point is what many communities worry about—will a giant AI campus raise my bills or strain my local grid? DTE is betting that if the data centre pays for its own battery energy storage, the project can actually help spread grid fixed costs over more load, benefiting other customers rather than hurting them.

Why this matters for green technology

From a green technology lens, The Barn checks several boxes:

  • Decoupling AI growth from new fossil build-out by relying on batteries and existing generation
  • Designing for water efficiency via closed-loop cooling in a sector notorious for water use
  • Showcasing storage as “grid infrastructure,” not just a niche add-on

If this template spreads—AI load plus utility-scale BESS plus efficiency—it can significantly speed up storage deployment across the US.


Inside the New Storage Stack: Prevalon, Emerson, EPC Power, ON.energy

The AI–storage story isn’t just about big campuses; it’s about the technology stack that makes grid-scale batteries reliable enough for mission-critical computing.

Prevalon + Emerson: Brains for the battery

Prevalon Energy, a dedicated BESS company spun out of Mitsubishi Power Americas, is teaming up with Emerson to support hyperscale, colocation, and enterprise data centres.

They’re combining:

  • Prevalon’s HD5 energy storage platform
  • insightOS, Prevalon’s energy management system (EMS)
  • Emerson’s Ovation automation platform, widely used in power and industrial assets

Here’s why that matters: large AI data centres may use 500–600MW of batteries with roughly two hours of duration, sitting beside gas turbines running at base load. The turbines stay steady; the data centre load swings wildly; the batteries “ride the wave,” matching real-time demand.
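The “ride the wave” behaviour can be sketched as a toy dispatch loop: turbines hold a flat base load, and the battery covers the gap between spiky data-centre demand and that base load. All numbers here are hypothetical, and a real EMS such as insightOS is far more sophisticated:

```python
# Toy dispatch loop: battery absorbs the difference between volatile
# data-centre load and a constant gas-turbine base load.
# All values are illustrative assumptions, not from any real EMS.
turbine_base_mw = 900
battery_power_mw = 550        # max charge/discharge rate (assumed)
battery_energy_mwh = 1100     # ~2 h duration at full power (assumed)
soc_mwh = 550                 # start half charged
dt_h = 5 / 3600               # 5-second control step, in hours

def dispatch(load_mw, soc):
    """Return (battery_mw, new_soc); positive = discharging, negative = charging."""
    target = load_mw - turbine_base_mw
    # Clip to the battery's power rating...
    p = max(-battery_power_mw, min(battery_power_mw, target))
    # ...and to what the state of charge allows in this step.
    p = min(p, soc / dt_h)                           # can't discharge below empty
    p = max(p, (soc - battery_energy_mwh) / dt_h)    # can't charge above full
    return p, soc - p * dt_h

for load in [900, 1250, 1400, 700, 950]:             # spiky AI load profile (MW)
    p, soc_mwh = dispatch(load, soc_mwh)
    print(f"load={load:5.0f} MW  battery={p:+7.1f} MW  soc={soc_mwh:7.2f} MWh")
```

Even this toy version shows why control reliability matters: if the loop misjudges the swing, the shortfall lands on the grid or the turbines instead.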

For that to work, the EMS and automation layer have to be boringly reliable. As Prevalon’s CEO puts it, the biggest differentiator is simple: “It works.” In high-stakes environments, stability beats hype.

Prevalon’s track record isn’t theoretical. Its 80MW/320MWh Happy Valley BESS for Idaho Power is already in commercial operation, with a larger 200MW/800MWh project scheduled for 2026. Those kinds of deployments build confidence that similar systems can anchor AI loads.

EPC Power + ON.energy: Making AI “grid-safe”

On the hardware side, converter manufacturer EPC Power is partnering with ON.energy, a system integrator and project developer that started in Latin America and has expanded into the US.

ON.energy is building an “AI UPS” platform that:

  • Scales from megawatts to gigawatts
  • Provides up to 8 hours of runtime, far beyond typical backup UPS systems
  • Handles rapid AI load swings, voltage sags, and frequency events
  • Protects both grid-connected and off-grid AI infrastructure

Under the hood, EPC Power’s converters sit between batteries and the grid, managing power quality, ride-through, and interconnection requirements. In markets like Texas—where ON.energy is deploying its 160MWh Palo de Agua portfolio—this kind of grid-friendly behaviour is essential.
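The runtime claim is ultimately an energy-sizing identity: stored energy must cover load times duration, plus a reserve. The function below is a generic back-of-envelope sketch with assumed inputs, not ON.energy’s sizing method:

```python
# Back-of-envelope battery sizing for long-duration backup.
# Inputs and the depth-of-discharge reserve are illustrative assumptions.
def required_energy_mwh(load_mw, runtime_h, depth_of_discharge=0.9):
    """Nameplate energy needed to carry `load_mw` for `runtime_h` hours,
    keeping a reserve set by the usable depth of discharge."""
    return load_mw * runtime_h / depth_of_discharge

# An 8-hour ride-through for a hypothetical 100 MW campus:
print(required_energy_mwh(100, 8))
```

Scaling that identity to gigawatt campuses is why multi-hour AI backup implies battery fleets far larger than conventional UPS rooms.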

The pattern here is clear: AI isn’t just buying batteries; it’s buying integrated storage, control, and power electronics solutions that can keep mission-critical workloads running through grid disturbances.


Environmental Risks: Water, Carbon, and the AI Bubble Question

This isn’t a clean, simple success story. AI data centres come with environmental baggage that can’t be ignored.

Water use and local impacts

The Environmental and Energy Study Institute estimates that:

Large data centres can consume up to 5 million gallons of water per day, similar to a town of 10,000–50,000 people.

That’s a serious concern in water-stressed regions. Projects like The Barn, which use closed-loop cooling to keep water use comparable to an office building, show a better path—but not every project is that careful.
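The town-sized comparison follows from simple per-capita arithmetic. The per-person figures below are an assumed range (roughly residential-only use up to total municipal withdrawals), not numbers from the EESI estimate:

```python
# Population equivalent of a data centre's daily water draw.
# Per-capita range is an assumption for illustration, not an EESI figure.
data_centre_gal_per_day = 5_000_000
per_capita_low, per_capita_high = 100, 500   # gal/person/day (assumed range)

pop_high = data_centre_gal_per_day / per_capita_low    # larger equivalent town
pop_low = data_centre_gal_per_day / per_capita_high    # smaller equivalent town

print(f"Equivalent town size: {pop_low:,.0f}-{pop_high:,.0f} people")
```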

For communities, the right questions to ask are:

  • How much water will this data centre actually use, and in what cooling configuration?
  • Where is that water coming from, and what’s the impact in drought years?
  • Are there commitments to alternative cooling or efficiency upgrades over time?

Will the AI bubble burst—and strand assets?

There’s also a macro risk: what if AI demand has been overhyped, and we’re in a bubble?

Here’s where energy storage has an advantage. If AI demand plateaus or drops, a well-sited BESS is still valuable as grid infrastructure. Batteries can:

  • Shift solar and wind energy into evening and peak periods
  • Provide frequency regulation, spinning reserve, and black start
  • Defer transmission upgrades by managing local congestion

In other words, storage attached to AI is not a single-use asset. Unlike a bespoke compute chip, a grid-connected battery can be repurposed for other loads and services if AI growth cools.


What This Means for Utilities, Businesses, and Green Tech Strategy

The core dynamic is simple: AI is forcing the grid to modernise faster, and energy storage is becoming the keystone technology in that transition.

If you’re a utility or grid planner

  • Treat AI projects as an opportunity to co-fund BESS capacity that also serves broader system needs.
  • Prioritise storage configurations that can handle both fast local fluctuations and system-level services.
  • Structure tariffs so large AI loads pay their fair share of grid upgrades without punishing existing customers.

If you’re a data centre or AI developer

  • Assume you’ll be expected to bring your own flexibility—typically through BESS.
  • Look for integrated solutions (like Prevalon/Emerson or ON.energy/EPC Power) that couple hardware with robust EMS and controls.
  • Design for water efficiency and transparency, especially if you want community and regulatory support.

If you’re investing in green technology

  • Watch the convergence of AI, storage, and grid modernisation rather than treating them as separate markets.
  • Focus on companies that can solve both the physical power problem and the software/control problem.
  • Expect demand for dispatchable, low-carbon capacity (including storage) to stay strong even if AI hype cools.

For our broader Green Technology series, AI-powered data centres are a litmus test. They can either become symbols of reckless energy use—or proof that digital growth can align with a cleaner, smarter grid.

The direction we choose depends on how aggressively we pair AI growth with energy storage, intelligent controls, and resource-efficient design.


AI’s power problem isn’t going away. But if we get this right, the surge in AI demand could accelerate deployment of battery energy storage systems, strengthen the grid, and create a faster path to a low-carbon energy system.

The next few years will decide whether AI becomes a drag on the energy transition—or one of its most unlikely accelerators.