AI data centers are straining the US grid, but they’re also creating a massive opening for battery energy storage to power growth while keeping it green and reliable.

AI data centers are on track to become some of the hungriest power users on the US grid, with single campuses now planning to draw more electricity than mid-sized nuclear plants can produce. That’s not a future scenario — it’s being designed into projects breaking ground before 2026.
Here’s the thing about that “AI power problem”: for green technology, it’s also an enormous opportunity. Battery energy storage systems (BESS) are moving from “nice-to-have grid flexibility” to core infrastructure for AI, cloud, and hyperscale data centers. And if you work in clean energy, infrastructure, or digital services, the choices made over the next 2–3 years will shape both your carbon footprint and your competitiveness for a decade.
This post looks at how AI-driven load growth is reshaping the US energy storage market, why storage has quietly dodged some of the policy hits renewables took, and how real projects in Michigan, Texas, and beyond show what practical, grid-safe AI infrastructure looks like.
AI’s Power Problem: Why Data Centers Need Storage Now
AI data centers are creating a new category of load: huge, spiky, and always-on. Traditional cloud data centers already pushed utilities; AI takes that stress and multiplies it.
Industry engineers are seeing AI facilities with “intense power fluctuations” unlike anything in previous cloud data centers. Think multi-hundred-megawatt swings in load as AI clusters ramp up or down. Conventional generation and transmission weren’t built for that kind of volatility.
Battery energy storage solves three immediate problems for AI campuses:
- Stabilizing wild load swings: Storage can absorb and inject power in milliseconds, smoothing AI demand so the grid "sees" a calmer, more predictable load instead of constant spikes.
- Deferring or avoiding expensive grid upgrades: Instead of waiting years and spending billions on new peaker plants or transmission lines, utilities can deploy BESS to support local capacity faster and often at lower total cost.
- Enabling cleaner power portfolios: As more solar and wind come online, storage keeps those resources relevant to 24/7 AI operations by shifting clean energy into peak hours and providing dispatchable capacity.
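The first of those roles, load smoothing, can be sketched in a few lines. This is a toy simulation with made-up numbers (the load profile, target level, and battery sizing are all illustrative assumptions, not data from any real facility): the battery charges when demand dips below a flat target and discharges when it spikes above it, so the grid sees a nearly constant draw.

```python
# Toy sketch of BESS load smoothing -- illustrative assumptions only,
# not a model of any real AI campus.

def smooth_load(load_mw, target_mw, power_mw, energy_mwh):
    """Return the load the grid 'sees' after a battery absorbs/injects
    power to pull each 1-hour interval toward a flat target level."""
    soc = energy_mwh / 2          # start half full (assumption)
    grid = []
    for demand in load_mw:
        wish = demand - target_mw                  # + discharge, - charge
        p = max(-power_mw, min(power_mw, wish))    # power rating limit
        # respect state of charge (losses ignored for brevity)
        p = min(p, soc) if p > 0 else max(p, soc - energy_mwh)
        soc -= p
        grid.append(demand - p)
    return grid

spiky = [300, 600, 250, 650, 300, 550]   # MW, a swinging AI load
seen = smooth_load(spiky, target_mw=440, power_mw=200, energy_mwh=400)
print(seen)
```

With these numbers the raw load swings across a 400MW range, while the smoothed profile the grid sees stays within a few tens of megawatts, which is the whole point of putting storage between an AI campus and the wires.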
This is why storage has been labeled a “dispatchable” technology in US policy conversations, alongside nuclear and geothermal, rather than getting lumped in with variable solar and wind. That label matters because it’s shaping incentives, financing decisions, and project timelines.
Policy Reality: Storage Survives Where Solar and Wind Stumble
Most companies assume every form of clean energy rides the same policy roller coaster. That’s not what’s happening in the US.
While solar PV and wind have been hit by changes under the One Big Beautiful Bill Act (OBBBA) and tighter investment tax credit (ITC) structures, energy storage walked away in better shape:
- ITC/PTC support for BESS still extends into the mid-2030s.
- New rules restrict tax benefits if projects involve certain Chinese “foreign entities of concern” (FEOC), which is a serious challenge given how much of the battery supply chain is China-centric.
- Even with that constraint, storage retains a clearer, longer runway of incentives than many variable renewables.
Why? Because in a grid dominated by rising AI load, dispatchability is king. Storage can respond on sub-second timescales, turn "too much solar at noon" into "firm capacity at 7 p.m.", and mimic some of the grid-stabilizing roles that gas turbines and coal plants used to play — without locking in long-lived fossil assets.
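The "noon solar into 7 p.m. capacity" conversion is simple arithmetic once you account for round-trip losses. The figures below are illustrative assumptions (the surplus, the efficiency, and the peak window are not from any cited project):

```python
# Back-of-envelope sketch of shifting curtailed midday solar into
# evening firm capacity. All numbers are illustrative assumptions.

surplus_solar_mwh = 500    # midday excess that would otherwise curtail
round_trip_eff = 0.88      # plausible lithium-ion round-trip efficiency
evening_window_h = 2       # hours of evening peak to cover

firm_mwh = surplus_solar_mwh * round_trip_eff   # energy out of the battery
firm_mw = firm_mwh / evening_window_h           # sustained power over the window
print(f"{firm_mwh:.0f}MWh deliverable -> {firm_mw:.0f}MW firm for {evening_window_h}h")
```

In other words, 500MWh of otherwise-wasted noon solar becomes 220MW of dispatchable evening capacity under these assumptions — the kind of arithmetic that earns storage its "dispatchable" label.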
The economic reality is blunt:
Regardless of climate politics, demand growth is real, and storage is usually the fastest deployable, lowest-friction way to meet that demand without blowing up reliability metrics.
For developers, utilities, and data center operators, that means:
- BESS projects can still pencil out with strong tax support — if the supply chain meets FEOC rules.
- Long-term asset plans should assume storage as core infrastructure, not an optional add-on.
- Procurement teams need a strategy to diversify beyond Chinese-origin components over the next 3–7 years.
Case Study: “The Barn” AI Campus in Michigan
The most vivid example of AI-driven storage demand right now is the proposed 1GW+ data center campus in Saline Township, Michigan, backed by OpenAI, Oracle, and Related Digital.
What makes The Barn so important?
- Planned computing capacity: over 1GW
- Expected grid draw: about 1.4GW at full utilization
- That’s roughly 25% of utility DTE Energy’s entire current generation capacity, comparable to a mid-sized nuclear reactor.
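Those two figures imply the rough scale of DTE's fleet. A quick sanity check with the rounded numbers above (this is back-of-envelope inference, not a published capacity figure):

```python
# Sanity-checking the case-study numbers with rounded figures.
barn_draw_gw = 1.4     # expected grid draw at full utilization
share_of_dte = 0.25    # "roughly 25%" of DTE's generation capacity

implied_dte_capacity_gw = barn_draw_gw / share_of_dte
print(implied_dte_capacity_gw)   # implies a fleet of roughly 5-6GW
```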
You don’t plug that kind of project into the grid and hope for the best.
DTE’s approach is telling:
- The data center’s power needs will be met 100% by DTE using existing resources plus a new battery storage investment.
- That BESS investment is financed entirely by the project, not ratepayers.
- DTE argues that other customers will actually benefit as The Barn helps shoulder fixed grid costs.
This is the new template for grid-safe AI infrastructure:
- Massive load commits to local storage so it doesn’t destabilize the system.
- The utility gets new capital investment paid by private projects, not households.
- The campus itself targets high standards — in this case, LEED certification and a closed-loop cooling system that uses water more like an office building than a traditional mega–data center.
For green technology advocates, this isn’t perfect — water, land use, and embodied carbon are still real issues. But it’s a lot better than building another giant gas peaker just to feed GPUs.
What about water and sustainability concerns?
Groups like the Environmental and Energy Study Institute point out that large data centers can use up to 5 million gallons of water per day, matching the needs of a town of 10,000–50,000 people. AI-optimized campuses risk pushing those numbers even higher.
That’s why details like closed-loop cooling, location choice, and grid integration design matter. If your AI strategy ignores:
- Local water stress,
- Long-term emissions from backup generation,
- And community impacts,
…you’re building a backlash machine, not a future-proof asset.
How BESS Actually Supports AI: From Theory to Control Systems
Behind the scenes, the real enabler of AI-friendly, grid-friendly data centers isn’t just battery hardware. It’s control.
Prevalon + Emerson: Control Systems Built for AI Load
Prevalon Energy — a standalone BESS company spun out of Mitsubishi Power Americas — is pairing its HD5 energy storage platform and insightOS energy management system (EMS) with Emerson’s Ovation automation platform.
Their joint focus: hyperscale, colocation, and enterprise data centers.
Prevalon’s team is blunt about what sets their EMS apart: it works and it’s robust. That sounds simple, but in power systems, reliability under weird edge cases is everything.
In large AI-linked projects they’re targeting:
- Battery size: often 500–600MW with 2-hour durations
- Gas turbines at base load, providing steady generation
- Batteries handling the AI load swings, matching real-time demand as computing intensity changes
This hybrid architecture — steady thermal plus fast storage — is one stepping stone toward an eventual low-carbon grid where renewables plus storage do most of the balancing.
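The hybrid split above has a simple dispatch logic: gas holds a flat base, and the battery follows whatever the AI load does around it. Here is a minimal sketch using the article's rough sizing (a 600MW battery at 2-hour duration); the baseload level, the load profile, and the dispatch rule itself are illustrative assumptions, and state-of-charge tracking is omitted for brevity:

```python
# Sketch of the "steady thermal + fast storage" split. Sizing follows
# the 500-600MW / 2-hour figures above; everything else is assumed.

GAS_BASELOAD_MW = 800                  # turbines held at steady output
BATT_POWER_MW = 600                    # battery power rating
BATT_ENERGY_MWH = BATT_POWER_MW * 2    # 2-hour duration -> 1,200MWh

def dispatch(load_mw):
    """Gas serves a flat base; the battery follows the residual swings.
    (State of charge and losses ignored in this sketch.)"""
    schedule = []
    for demand in load_mw:
        residual = demand - GAS_BASELOAD_MW
        batt = max(-BATT_POWER_MW, min(BATT_POWER_MW, residual))
        schedule.append({"gas": GAS_BASELOAD_MW, "battery": batt,
                         "unserved": residual - batt})
    return schedule

# An AI cluster ramping between a 1,300MW peak and a 500MW trough
for step in dispatch([800, 1300, 500, 1200]):
    print(step)
```

Negative battery values are charging intervals; as long as the swings stay within the battery's power rating, the turbines never have to ramp.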
Prevalon’s track record reinforces that this isn’t just marketing:
- An 80MW/320MWh BESS is already operating at Idaho Power’s Happy Valley project.
- A 200MW/800MWh BESS is contracted with Idaho Power, with long-term maintenance and remote monitoring built in and deployment scheduled for 2026.
If you’re evaluating EMS vendors or grid integrations for data centers, the lesson here is clear: treat control software as strategically as the batteries themselves. Bad controls turn a 600MW BESS into a very expensive decoration.
EPC Power + ON.energy: Building the “AI UPS” Layer
Another angle is the power conversion and UPS layer. EPC Power is supplying power converters to ON.energy to support an “AI UPS” platform:
- Scalable from megawatts to gigawatts
- Supporting runtimes up to 8 hours
- Designed to absorb rapid load fluctuations while offering voltage and frequency ride-through
Why this matters:
- Traditional UPS systems were built for minutes of runtime and brief outages.
- AI data centers now need hours of resilience, both for grid disturbances and for complex power quality events.
- A storage-backed UPS can double as grid support, not just internal backup.
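The gap between a legacy UPS and an "AI UPS" is easiest to see as an energy calculation: stored energy is just critical load times runtime. The 100MW critical load below is an illustrative assumption, not a vendor spec:

```python
# Rough sizing sketch for a storage-backed UPS: energy = power x runtime.
# The 100MW critical load is an illustrative assumption.

def ups_energy_mwh(critical_load_mw, runtime_h):
    """Stored energy needed to carry a critical load for a runtime."""
    return critical_load_mw * runtime_h

legacy = ups_energy_mwh(100, 10 / 60)   # minutes-scale legacy UPS
ai_ups = ups_energy_mwh(100, 8)         # the 8-hour runtimes cited above
print(legacy, ai_ups)
```

Going from ten minutes to eight hours of runtime at the same load is a ~48x jump in stored energy — which is why this layer now looks like a grid-scale battery rather than a room of lead-acid strings.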
ON.energy is a good example of how storage players are shifting:
- Started in Latin American C&I storage
- Expanded into US distributed and utility-scale portfolios
- Operates as both a system integrator/EPC for others and a project owner-operator in front-of-the-meter markets
Its 160MWh Palo de Agua portfolio in Texas — financed with a US$77.6 million construction credit agreement — spreads BESS projects (9.9MW/20MWh each) across multiple sites. This distributed model is attractive for:
- Supporting data centers in regions with weak grid nodes
- Creating flexible capacity blocks that can be aggregated or monetized in ERCOT and similar markets
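The portfolio arithmetic shows how those blocks aggregate. The site count here is an assumption inferred by dividing the 160MWh total by the 20MWh per-site figure; the rest follows directly:

```python
# Sketch of aggregating a distributed BESS portfolio into one fleet.
# Site count is an inferred assumption (160MWh / 20MWh per site).

SITE_POWER_MW = 9.9
SITE_ENERGY_MWH = 20
TOTAL_ENERGY_MWH = 160

n_sites = TOTAL_ENERGY_MWH // SITE_ENERGY_MWH   # 8 sites, if split evenly
fleet_mw = n_sites * SITE_POWER_MW              # aggregated power block
duration_h = SITE_ENERGY_MWH / SITE_POWER_MW    # per-site duration
print(n_sites, fleet_mw, round(duration_h, 1))
```

Eight ~10MW sites behave, from a market perspective, like one ~79MW, roughly 2-hour resource — while staying small enough to site at weak grid nodes individually.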
What This Means for Businesses Building or Serving AI
Most companies get this wrong: they treat energy as an afterthought to AI strategy. That’s how you end up with stranded assets, furious regulators, and sky-high operating costs.
The reality? If you’re touching AI infrastructure in any serious way, you need a storage strategy. Here’s what that looks like in practice.
For data center operators and cloud providers
- Bake BESS into your master plan: Don’t treat storage as an optional Phase 3 add-on; assume multi-hundred-MW BESS requirements from day one.
- Partner early with utilities: Co-design projects like The Barn, where new storage is privately financed but grid-integrated from the start.
- Insist on EMS depth, not glossy dashboards: Ask vendors how their systems handle AI load transients, grid faults, and cyber-physical security.
For energy developers and utilities
- Position storage as AI-enabling infrastructure, not just a grid asset. That framing opens up new revenue models and partnerships.
- Strengthen non-Chinese supply chains: FEOC restrictions won’t loosen soon. Start qualifying alternative suppliers and chemistries now.
- Use AI loads to justify broader grid upgrades that also support electrification of transport and industry.
For sustainability and ESG leaders
- Don’t ignore AI in your climate strategy: A single 1GW campus can blow past your Scope 2 assumptions.
- Push for:
- 24/7 carbon-aware procurement,
- Storage-backed clean power contracts,
- And transparent reporting on water, land, and community impacts.
Where Green Technology Goes Next With AI and Storage
AI isn’t slowing down. Even if some of the hype cools or an “AI bubble” deflates, the projects already announced add gigawatts of new, persistent load to the US grid over the next decade.
That’s why, in this Green Technology series, I keep coming back to the same point: AI can either accelerate decarbonization or lock in another generation of fossil assets. The deciding factor is how we power it.
Battery energy storage systems — from utility-scale BESS in Idaho and Texas to AI-specific UPS platforms and EMS software — are turning AI’s power problem into a bridge toward a lower-carbon, more flexible grid. They make it possible to:
- Match volatile AI demand with stable supply,
- Scale renewables without sacrificing reliability,
- And design data centers that are powerful, profitable, and defensible from a climate perspective.
If your organization is planning AI capacity, expanding data centers, or investing in grid assets, this is the moment to act. Build storage into your roadmap, not your regrets.
The next wave of winners in both AI and energy won’t just be the ones with the fastest chips — they’ll be the ones with the most intelligent, resilient, and sustainable power systems behind them.