AI is becoming the quiet enabler of low-carbon steel—through clean power coordination, hydrogen optimization, and verifiable emissions data. Learn what utilities can do next.

AI’s Hidden Role in Scaling Low-Carbon Steel
Steel is responsible for about 8% of global greenhouse gas emissions, yet demand isn’t slowing down. Annual production sits around 1.8 billion metric tons, and projections put demand near 2.5 billion metric tons by 2050—roughly a 39% increase. Those numbers should land differently if you work in energy and utilities, because steel’s next chapter isn’t only a materials story. It’s a power system and data problem.
Most people frame “green steel” as a choice between hydrogen, electrification, or more recycling. That misses the operational reality: the winners will be the producers (and their energy partners) who can run new processes reliably, cheaply, and with verifiable emissions data. That’s where AI shows up—quietly, but decisively—through forecasting, optimization, predictive maintenance, and carbon measurement.
In this installment of our AI in Energy & Utilities series, I’ll connect what’s happening in next-gen steelmaking to the tools energy teams already use for grid optimization and asset performance. The steel sector is becoming a stress test for industrial decarbonization: high heat, tight margins, huge capex, and nonstop reliability expectations. If AI can make this work for steel, it can generalize across heavy industry.
Steel decarbonization is an energy system problem first
The key point: steel’s carbon footprint is dominated by coal-based blast furnaces, and replacing that pathway changes the energy inputs from “coal in a furnace” to “electricity and/or hydrogen in a plant.”
Today, integrated blast furnace–basic oxygen furnace (BF-BOF) facilities produce roughly 70% of global crude steel and generate around two tons of CO₂ per ton of steel. That’s not a rounding error—it’s structural. And because steel assets commonly run for decades, bad investment timing can lock emissions in for a generation.
When steelmakers shift toward electric arc furnaces (EAFs), direct reduced iron (DRI), hydrogen-DRI (H2-DRI), and novel low-temperature processes, they’re also shifting their dependency to:
- Clean firm power (EAFs are only as clean as the grid)
- Low-cost clean hydrogen (for H2-DRI and several novel processes)
- High-quality ore and pellets (DR-grade feedstock is constrained)
- Operational flexibility (to ride variable renewable output without ruining product quality)
Here’s the practical implication for utilities: steel is moving from being a “big industrial customer” to being a co-optimizer of the grid—a flexible load, a hydrogen offtaker, and sometimes a co-located generation partner.
Why 2026 matters (and why it’s not enough)
A landmark moment is expected in Northern Sweden in 2026, where a new entrant plans to sell commercial volumes of near-zero emissions steel. The project’s scale—5 million metric tons—sounds enormous until you compare it to global production. It’s a sliver of the total.
The signal isn’t the percentage. It’s the playbook: premium contracts tied to emissions intensity, offtake-backed financing, and a long list of real-world commercialization hurdles (including uncommitted future production). That “messy middle” is where AI can reduce cost and risk.
The three decarbonization pathways—and where AI fits
Steel decarbonization isn’t one technology. It’s a portfolio. A helpful way to think about it is Make Less, Make Better, Make New—and each category has AI opportunities that energy and utility teams recognize instantly.
Make Less: circularity, smarter design, better sorting
The fastest emissions reduction is often not producing a ton at all. Scrap-based steelmaking (secondary production) can be significantly lower-emissions, but scrap availability and quality vary by region. Steel products last 10–30 years, so scrap markets mature slowly.
AI shows up here in a very direct way: computer vision + sensors + optimization for scrap sorting and quality grading. Several innovators are pairing advanced optical sensors with AI models to:
- Increase sorting yields and reduce contamination (copper and tin are repeat offenders)
- Reduce manual inspection and safety exposure
- Create more uniform feedstocks for EAFs
From an energy angle, this matters because higher-quality scrap enables more predictable EAF operation—meaning more stable load profiles, fewer quality rejects, and less wasted electricity per ton.
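To make the sorting-and-grading idea concrete, here is a minimal sketch of the downstream decision logic: classifying a scrap batch from measured residual copper and tin. The thresholds and grade names are illustrative assumptions, not industry standards.

```python
# Hypothetical scrap-grading sketch: classify batches by residual
# contamination inferred from sensor readings. Thresholds below are
# illustrative, not industry standards.

def grade_scrap(copper_pct: float, tin_pct: float) -> str:
    """Assign a quality grade based on residual copper and tin.

    Copper and tin cannot be removed in the EAF, so batches above
    the limits must be blended down or rejected.
    """
    if copper_pct <= 0.10 and tin_pct <= 0.02:
        return "prime"        # suitable for demanding flat products
    if copper_pct <= 0.25 and tin_pct <= 0.05:
        return "standard"     # long products such as rebar
    return "blend-or-reject"  # needs dilution with cleaner feedstock

# Example: a shredded-scrap batch with 0.18% Cu, 0.03% Sn
print(grade_scrap(0.18, 0.03))  # standard
```

In a real deployment, the inputs would come from the vision and sensor models rather than manual assays, which is exactly why sorting accuracy feeds directly into EAF predictability.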
Make Better: upgrading existing assets without locking in coal
Retrofitting BF-BOF sites is tricky. There’s no credible path to net-zero blast furnace steelmaking, so retrofit options should be low-capital and low-disruption—otherwise you risk “coal lock-in” through sunk-cost thinking.
This is where AI earns its keep in heavy industry: operational efficiency and downtime avoidance. If you’re trying to reduce emissions at an existing plant while keeping throughput, you need:
- Advanced process control to trim energy use while holding quality targets
- Predictive maintenance to avoid unplanned downtime, where losses accrue by the minute
- Digital twins for “try it in software first” retrofit planning
Utilities see the same pattern on the grid. You don’t rebuild the network every time you add renewables—you squeeze more performance out of existing assets with forecasting, automation, and better controls.
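The anomaly-detection piece of predictive maintenance can be surprisingly simple to prototype. Here is a minimal sketch, assuming a trailing-window z-score test on a sensor stream; the window size and threshold are arbitrary choices for illustration, not tuned values.

```python
# Minimal predictive-maintenance sketch: flag sensor readings that
# deviate sharply from a rolling baseline. Window size and z-score
# threshold are assumptions for illustration.
from collections import deque
from statistics import mean, stdev

def detect_anomalies(readings, window=5, z_threshold=3.0):
    """Return indices where a reading deviates strongly from the
    trailing window's mean (simple z-score test)."""
    history = deque(maxlen=window)
    flagged = []
    for i, x in enumerate(readings):
        if len(history) == window:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(x - mu) / sigma > z_threshold:
                flagged.append(i)
        history.append(x)
    return flagged

# A bearing-temperature-like series with one sudden excursion
temps = [70.1, 70.3, 69.9, 70.2, 70.0, 70.1, 85.0, 70.2]
print(detect_anomalies(temps))  # [6]
```

Production systems use richer models, but the pattern is the same: learn normal behavior, then surface deviations early enough to schedule the intervention.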
Make New: low-temperature and modular iron reduction
The most interesting wave of innovation in steel is happening below the traditional temperature assumptions. Some emerging approaches reduce iron at under ~350°C, which makes them easier to pair with variable wind and solar. Low-temperature processes can tolerate cycling better than ultra-high-temperature systems.
This is a big deal for energy planners. It creates a world where certain “green steel” units can behave like:
- Flexible industrial loads that ramp with renewable output
- Co-located anchor customers for new solar/wind buildouts
- Distributed production nodes near ports, mines, or demand centers
Modularity is the second shift. Some innovators are targeting units around 50,000 metric tons per year, far smaller than conventional DRI plants. Smaller units can mean faster deployment and siting options that weren’t realistic before.
AI becomes the operating system for this model—because distributed assets don’t scale with headcount. They scale with:
- Remote monitoring and anomaly detection
- Fleet-level optimization across many similar units
- Automated quality control and dispatch scheduling
Clean firm power: the bottleneck utilities can actually solve
The key point: EAF-based steelmaking is constrained by clean electricity, not metallurgy.
Steelmakers like EAFs because they can reduce capex and operating cost, and they pair naturally with scrap recycling. But there’s a hard constraint: EAFs need substantial electricity and their thermal requirements don’t naturally align with intermittent supply without storage or balancing.
For utilities and energy providers, this is familiar territory. It’s the same system problem as high-renewables grids:
- Forecast the net load
- Manage congestion
- Dispatch storage and firming capacity
- Maintain reliability and power quality
Where AI slots into utility–steel collaboration
If you’re selling power to (or developing behind-the-meter power with) a steel producer, AI can support bankability and operations in concrete ways:
- Load forecasting at sub-hourly resolution: align procurement, hedging, and dispatch with real process needs.
- Demand response and flexibility orchestration: identify which parts of the process can shift without creating scrap, rejects, or safety issues.
- Power quality monitoring and root-cause analysis: EAFs can introduce harmonics and flicker; machine learning can flag patterns before they become penalties or equipment damage.
- Asset health for shared infrastructure: predictive maintenance for transformers, rectifiers, and high-duty-cycle switchgear pays back quickly at these load levels.
A sentence worth keeping: Green steel scales fastest where the grid operator and the plant operator share the same forecast.
Hydrogen-DRI: why it stalls—and how AI reduces the pain
H2-DRI is the only commercially available pathway that can reach net-zero direct emissions in the critical reduction step—if the hydrogen is produced with renewable electricity. But projects have slowed in parts of the US and EU due to hydrogen cost, hydrogen availability, and balance-sheet realities.
AI won’t magically make hydrogen cheap. What it can do is reduce the “extras” that make hydrogen projects fail in committee:
Lower the hydrogen requirement per ton
Several approaches aim to recycle or loop process gases back into the reduction system. In practice, that requires tight controls and continuous monitoring.
AI helps by:
- Optimizing recycle ratios in real time
- Detecting sensor drift and recalibrating models before off-spec product appears
- Predicting when feedstock changes (ore chemistry, pellet quality) will increase hydrogen demand
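A toy model makes the recycling payoff tangible: fresh hydrogen demand per ton falls as more off-gas is captured and looped back. The base demand and recovery efficiency below are assumed values for illustration, not measured plant data.

```python
# Toy model, purely illustrative: how gas recycling lowers fresh
# hydrogen demand per ton of DRI. base_kg and recovery are assumed
# values, not measured plant data.

def fresh_h2_per_ton(base_kg=55.0, recycle_ratio=0.0, recovery=0.9):
    """Fresh hydrogen (kg) needed per ton of DRI when a fraction of
    the off-gas is captured and looped back into the reduction
    shaft."""
    if not 0.0 <= recycle_ratio <= 1.0:
        raise ValueError("recycle_ratio must be in [0, 1]")
    return base_kg * (1.0 - recycle_ratio * recovery)

# No recycling vs. 50% of off-gas recycled at 90% recovery
print(fresh_h2_per_ton())                   # 55.0
print(fresh_h2_per_ton(recycle_ratio=0.5))  # 30.25
```

The AI layer's job is to hold the real plant near the right-hand case: keeping recycle ratios high without drifting off-spec as ore chemistry and pellet quality change.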
Make electrolyzers run like grid assets
Electrolyzers are increasingly treated like controllable loads. With the right controls, they can ramp to absorb low-cost renewable output and ramp down during scarcity.
AI supports:
- Price-aware dispatch (when paired with market access)
- Degradation-aware scheduling (minimize wear while meeting supply needs)
- Co-optimization with storage (battery and hydrogen storage acting as one portfolio)
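Price-aware dispatch can be sketched in a few lines: run the electrolyzer at full power in the cheapest hours needed to hit a daily hydrogen target. The power rating and conversion efficiency are illustrative assumptions, and a real scheduler would also respect ramp limits and degradation costs.

```python
# Hedged sketch of price-aware electrolyzer dispatch: run at full
# power in the cheapest hours needed to meet a daily hydrogen
# target. power_mw and kg_per_mwh are illustrative assumptions.

def schedule_electrolyzer(prices, h2_target_kg, power_mw=10.0,
                          kg_per_mwh=18.0):
    """Pick the lowest-price hours until the hydrogen target is met.
    Returns a 0/1 run schedule aligned with `prices`."""
    kg_per_hour = power_mw * kg_per_mwh
    hours_needed = -(-h2_target_kg // kg_per_hour)  # ceiling division
    cheapest = sorted(range(len(prices)), key=lambda h: prices[h])
    chosen = set(cheapest[:int(hours_needed)])
    return [1 if h in chosen else 0 for h in range(len(prices))]

# Six hourly prices ($/MWh); target 500 kg of hydrogen
prices = [60, 20, 15, 80, 25, 70]
print(schedule_electrolyzer(prices, 500))  # [0, 1, 1, 0, 1, 0]
```

The schedule concentrates production in the three cheapest hours, which is the same behavior a grid operator would want from any flexible load during renewable surplus.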
For utilities, this is the bridge: hydrogen for steel looks a lot like grid balancing with an industrial purpose.
Measurement and offtake: “green steel” needs proof, not promises
Steel decarbonization is increasingly driven by contracts: premium pricing for low-emissions material, procurement standards, and product-level claims. But buyers are getting sharper. They want auditable emissions data at the product level.
AI-enabled measurement, reporting, and verification (MRV) is becoming non-negotiable because the data is messy:
- Multi-step supply chains (ore beneficiation, pelletizing, transport)
- Variable electricity emissions factors by time and location
- Process emissions that depend on operating conditions
A practical MRV stack for low-carbon steel typically includes:
- Automated data capture from SCADA, historians, and meters
- Emissions models that translate operations into CO₂e by batch
- Anomaly detection to flag “too good to be true” readings
- Reporting outputs aligned to customer requirements
This parallels what progressive utilities are already building for hourly carbon accounting and clean energy claims. The steel sector is simply forcing the discipline faster.
What energy and utility leaders should do in 2026 planning cycles
Steel demand is growing fastest in regions where energy systems are still being built out. That’s not a threat; it’s an opening. The most useful stance for utilities and energy developers is to treat green steel as a long-lived anchor load that can justify new clean capacity.
Here are practical moves I’d prioritize if you’re building an AI roadmap for industrial decarbonization customers:
- Offer an "industrial flexibility package," not just an electricity tariff: bundled forecasting, curtailment logic, and availability guarantees can beat a cents-per-kWh race.
- Build a joint digital twin with the customer: model grid constraints, plant ramp limits, and storage options in one place. This reduces the friction that kills projects late.
- Design contracts around measured performance: tie incentives to verified load shifting, power factor compliance, uptime, or carbon-intensity windows.
- Treat data integration as first-order engineering: if you can't ingest plant data securely and in near real time, you can't optimize anything that matters.
- Plan for power quality from day one, especially with EAFs: monitoring plus mitigation beats retrofits after complaints and penalties.
If you want a single thesis for the board slide: AI reduces the cost of coordination between clean power, hydrogen, and industrial process control—and coordination is the real bottleneck.
The steel lesson for AI in Energy & Utilities
Green steel is often described as a technology race. I think it’s more accurate to call it an operations race—who can produce consistent quality at scale while power prices, hydrogen supply, and policy signals remain volatile.
That’s why this topic belongs in an AI in Energy & Utilities series. The same capabilities that keep grids stable—forecasting, optimization, predictive maintenance, and measurement—are now the enabling layer for industrial decarbonization.
If you’re an energy provider, the opportunity isn’t just selling electrons. It’s helping industrial customers run clean processes profitably, then proving the carbon claims with data. Which part of that stack—forecasting, flexibility, hydrogen optimization, or MRV—are you building first for your largest industrial loads in 2026?