AI data centers are exploding in number, and in water use. Here's how smart siting, dry renewables, and better cooling can make them truly green.

Most people know AI is energy-hungry. Fewer realize it's also water-hungry. In 2023, U.S. data centers used roughly 17 billion gallons of water on-site for cooling, but more than 211 billion gallons indirectly through the power plants that feed them. That imbalance is the real story: over 70 percent of a data center's water footprint usually comes from the grid, not the cooling towers.
This matters because AI adoption is exploding right now, and so is the global push for green technology. If your company is betting on AI and digital infrastructure, how you build and where you build will decide whether you're part of the climate solution or quietly worsening water stress.
The reality? Designing water-smart data centers is simpler than you think, if you stop treating water, energy, and location as separate decisions.
In this post, I'll break down what the latest research is telling us about "thirsty" data centers, why location and energy mix are the biggest levers you have, and what practical steps operators, policymakers, and sustainability teams can take to make AI infrastructure genuinely sustainable.
Why data centers use so much water
Data centers consume water in two main ways: direct cooling and indirect power generation. The second one is usually much bigger.
Direct water use: keeping servers alive
On-site, water is mostly used for cooling:
- High-density servers generate a lot of heat.
- Chilled water or evaporative cooling systems pull that heat away so equipment doesn't fail.
- Some sites also use water for humidification and building HVAC.
This is the part communities see: new data centers applying for water permits, or local headlines about millions of gallons per day. It's real, and in some regions it's already political.
Indirect water use: the hidden footprint in the grid
The larger slice of the pie is off-site water use embedded in electricity. Here's where the numbers get uncomfortable:
- In many regions, electricity still comes predominantly from thermoelectric power plants (coal, gas, nuclear).
- These plants burn fuel to heat water into steam, spin turbines, then cool and condense that steam. That cooling step takes huge amounts of water.
- Hydropower has its own water footprint: large reservoirs lose enormous volumes to evaporation.
Cornell researchers found that more than 70% of a typical data center's water use comes from electricity generation, not cooling towers on the property. So if you only optimize on-site water use, you're working on the smaller share of the problem.
If your sustainability dashboard only tracks "liters per kWh" on-site, you're undercounting your real impact.
For green technology to live up to its name, AI infrastructure has to account for both.
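To make the direct/indirect split concrete, here is a minimal accounting sketch. The function name and all water-intensity figures are hypothetical placeholders for illustration, not values from the Cornell study:

```python
# Sketch: splitting a data center's water footprint into direct (cooling)
# and indirect (embedded in purchased electricity) components.
# All intensity figures below are illustrative, not measured values.

def total_water_footprint_liters(
    energy_mwh: float,
    onsite_liters_per_mwh: float,
    grid_liters_per_mwh: float,
) -> dict:
    """Return direct, indirect, and total water use for a given load."""
    direct = energy_mwh * onsite_liters_per_mwh
    indirect = energy_mwh * grid_liters_per_mwh
    total = direct + indirect
    return {
        "direct_l": direct,
        "indirect_l": indirect,
        "total_l": total,
        "indirect_share": indirect / total if total else 0.0,
    }

# Hypothetical facility on a thermoelectric-heavy grid: the grid can
# easily embed more water per MWh than the cooling towers use on-site.
footprint = total_water_footprint_liters(
    energy_mwh=100_000,            # annual consumption (hypothetical)
    onsite_liters_per_mwh=1_800,   # evaporative cooling (illustrative)
    grid_liters_per_mwh=4_500,     # water-intensive grid (illustrative)
)
print(f"Indirect share: {footprint['indirect_share']:.0%}")  # -> Indirect share: 71%
```

With these placeholder intensities, the grid-embedded share comes out around 71%, which is exactly why optimizing only on-site water moves the smaller lever.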
The core insight: where you build changes water use by 100x
The Cornell study on U.S. data centers makes one thing crystal clear: location is a climate and water decision.
The researchers modeled direct and indirect water and energy use for data centers across the country and found that environmental impacts can vary by up to a factor of 100 between locations. Same size data center, same load, radically different outcomes.
Why "dry renewables" beat "wet power"
The lowest-water data centers share one trait: they sit on grids dominated by wind and solar, sometimes backed up by gas or other sources, but with minimal reliance on water-intensive plants.
Wind and solar are often called "dry renewables" because they produce electricity with virtually no ongoing water use. That's a massive advantage over:
- Coal and gas plants with once-through or recirculating cooling
- Nuclear plants with large cooling needs
- Hydropower with big reservoir evaporation losses
So the trick isn't just to use renewables. It's to use low-water, low-carbon renewables.
The surprising winners: West Texas and the "wind belt"
Cornell's analysis highlights some locations that don't fit the traditional data center playbook:
- West Texas: One of the lowest grid-related water footprints in the country, thanks to huge wind build-out and relatively low population density. Groundwater can support cooling if managed responsibly.
- Nebraska, South Dakota, Montana: A similar profile: high wind and solar potential, low current water stress in many areas, and space to grow.
From a pure energy + water efficiency perspective, these states are among the best places in the U.S. to host AI servers.
The counterintuitive losers: the hydropower-rich Northwest
By contrast, much of the Pacific Northwest scores poorly on water footprint despite cheap, low-carbon electricity. The culprit is hydropower's evaporation losses from large reservoirs.
So you end up with this trade-off:
- Carbon: Northwest hydropower looks great.
- Water: It's often worse than a wind-dominated grid.
This is where a mature green technology strategy has to move beyond single metrics. Carbon alone isn't enough; carbon + water + location is the new baseline.
How AI and cloud leaders are already shifting
The big players aren't ignoring this. If anything, they're a few steps ahead of most policymakers.
Siting around water stress
A separate study from Purdue University looked at Google's U.S. data centers and overlaid them with current and projected water stress. Their conclusion: most of Google's existing data centers are already in lower-stress regions.
That doesn't happen by accident. When sites represent billions in capital and decades of operation, you don't want to gamble on running out of water or facing community backlash.
I've seen more RFPs and site-selection briefs in the last two years that explicitly ask for:
- Long-term water availability and rights
- Projected water stress under climate scenarios
- Local groundwater and surface water governance
- Nearby renewable energy build-out potential
It's not just about risk avoidance either. There's a brand and talent dimension: "thirsty AI" is bad PR in 2025, especially if you're building in regions already battling drought.
Water-positive, not just carbon-neutral
The next frontier for credible green technology is going beyond neutral:
- Carbon-neutral → 24/7 carbon-free power
- Water-neutral → water-positive, where companies restore more water to stressed basins than they withdraw
Several hyperscalers are already committing to balancing or exceeding their water withdrawals through watershed restoration, agricultural efficiency projects, and urban reuse. Siting in low-stress, wind-heavy regions makes those commitments far cheaper to fulfill.
Practical steps to make data centers less thirsty
If you're planning, operating, or regulating AI and cloud infrastructure, you have more levers than you might think. Here's how to use them.
1. Treat water footprint as a first-class siting criterion
Site selection usually starts with land prices, tax incentives, fiber routes, and grid capacity. Those still matter. But the research is blunt: if you ignore water and energy mix, you're locking in avoidable impacts for decades.
At minimum, put these questions on page one of any site-selection brief:
- What's the current and projected water stress at this location?
- What's the grid mix now and in 10-20 years? How much is wind/solar vs. coal, gas, nuclear, hydro?
- What's the water use per MWh of the local grid compared with other candidate sites?
- Can we sensibly add new wind/solar capacity tied to this site (PPAs, direct investment, or co-location)?
If two sites are similar on cost and latency, the one with a drier grid is almost always the smarter long-term bet.
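As a rough sketch of how those questions could feed a side-by-side comparison, here is a toy scoring function. The site names, water intensities, stress values, and weights are all hypothetical; in practice you would plug in figures from your own grid-mix and water-stress assessments:

```python
# Sketch: ranking candidate sites on water risk once cost and latency
# are comparable. All numbers and weights are hypothetical.

candidate_sites = [
    {"name": "site_a", "grid_l_per_mwh": 600, "water_stress": 0.2},   # wind-heavy grid
    {"name": "site_b", "grid_l_per_mwh": 4200, "water_stress": 0.5},  # gas/coal mix
    {"name": "site_c", "grid_l_per_mwh": 9000, "water_stress": 0.3},  # hydro-heavy grid
]

def water_risk_score(site: dict, max_l_per_mwh: float = 10_000) -> float:
    """Blend grid water intensity and local water stress into one
    0-1 score (lower is better). Weights are arbitrary illustration."""
    intensity = min(site["grid_l_per_mwh"] / max_l_per_mwh, 1.0)
    return 0.6 * intensity + 0.4 * site["water_stress"]

ranked = sorted(candidate_sites, key=water_risk_score)
for site in ranked:
    print(site["name"], round(water_risk_score(site), 2))
```

Note how the hydro-heavy site scores worst here despite modest local stress: the grid-embedded water dominates, mirroring the Northwest trade-off above.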
2. Decarbonize and dry out your energy supply
Green technology strategies that focus only on carbon can unintentionally increase water risk, for example by leaning heavily on nuclear in arid regions or oversizing new reservoirs.
A better approach is to prioritize "dry" low-carbon power:
- Wind and solar as the primary growth sources where feasible
- Battery storage to firm those renewables instead of defaulting to new gas peakers
- Grid-aware workloads that shift flexible compute to regions with surplus wind/solar
For AI workloads, this is especially powerful. Training runs and some inference jobs are time-flexible and location-flexible. That's a gift: you can literally move part of your water footprint to a better grid.
3. Use smarter cooling, especially in dry, windy regions
Siting in West Texas or the northern plains doesn't give you a free pass on direct water use. You still have to be thoughtful.
Options that work particularly well in dry, windy climates:
- Air-side and adiabatic economization: Use outside air when conditions allow; add minimal water when you need extra cooling.
- Hybrid cooling systems: Switch between water-based and air-based operation depending on temperature, humidity, and grid conditions.
- Closed-loop water systems: Reduce withdrawals and avoid once-through cooling.
- Non-potable sources: Reclaimed wastewater, industrial effluent, or brackish groundwater instead of drinking water.
Pair those with tight monitoring, and you can support high-density AI racks without stressing local utilities.
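The hybrid-switching logic above can be sketched as a simple mode selector. The thresholds here are illustrative, not from any vendor spec; real controllers typically work from wet-bulb temperature and site-specific setpoints:

```python
# Sketch: picking a cooling mode from outdoor conditions at a dry-climate
# site. Thresholds are illustrative placeholders only.

def cooling_mode(dry_bulb_c: float, relative_humidity: float) -> str:
    """Return the preferred cooling mode for given outdoor conditions."""
    if dry_bulb_c <= 18:
        return "air_economizer"  # free cooling with outside air, no water
    if dry_bulb_c <= 28 and relative_humidity <= 0.4:
        return "adiabatic"       # small water spend for a big cooling boost
    return "mechanical"          # fall back to chillers when it's hot/humid

print(cooling_mode(12, 0.30))  # cool day: air_economizer
print(cooling_mode(24, 0.20))  # warm but dry: adiabatic
print(cooling_mode(35, 0.50))  # hot: mechanical
```

The point of the sketch: in a dry, windy climate, most hours of the year can fall into the two low-water branches, which is what makes these regions attractive for direct water use too.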
4. Make water, carbon, and energy visible to decisionâmakers
One of the biggest gaps I see is organizational: sustainability teams have water and carbon dashboards; infrastructure teams have uptime and cost dashboards; finance has an ROI sheet. They often don't talk to each other early enough.
Fix that by building integrated metrics that matter to all three:
- kWh per AI inference or per training run
- Grams of CO₂e per kWh at the actual time and place workloads run
- Liters of water per kWh, including grid-embedded water
- Projected water risk cost (e.g., probability-weighted cost of curtailment, permits, community pushback)
Once those numbers sit next to each other in board decks and capex approvals, siting and architecture decisions start to shift.
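As a toy illustration of putting the first three metrics on the same row, here is a sketch with entirely hypothetical coefficients:

```python
# Sketch: energy, carbon, and water for one workload, side by side.
# All grid coefficients are hypothetical placeholders.

def workload_footprint(
    kwh: float,
    grid_gco2e_per_kwh: float,
    grid_l_per_kwh: float,
    onsite_l_per_kwh: float,
) -> dict:
    """Return energy, carbon, and total water (grid + on-site) for a job."""
    return {
        "kwh": kwh,
        "kg_co2e": kwh * grid_gco2e_per_kwh / 1000,
        "water_l": kwh * (grid_l_per_kwh + onsite_l_per_kwh),
    }

# The same hypothetical training run on two different grids:
windy = workload_footprint(50_000, grid_gco2e_per_kwh=80,
                           grid_l_per_kwh=0.6, onsite_l_per_kwh=1.5)
thermo = workload_footprint(50_000, grid_gco2e_per_kwh=450,
                            grid_l_per_kwh=4.5, onsite_l_per_kwh=1.5)
print(windy["water_l"], thermo["water_l"])  # -> 105000.0 300000.0
```

Identical compute, identical on-site cooling, and still roughly a 3x gap in water, which only shows up once grid-embedded water is in the metric.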
5. Use policy and incentives to steer the boom
For policymakers in states like Nebraska, South Dakota, or Montana, the Cornell results are basically a strategic memo.
If you want green technology jobs without draining rivers:
- Streamline permitting for data centers that commit to dry renewables + water-efficient cooling.
- Offer tax incentives tied to water and carbon performance, not just headcount.
- Invest in transmission lines that unlock high-quality wind and solar resources near data center hubs.
- Require transparent reporting on water withdrawals, discharges, and grid mix.
Most companies building at scale will go where the policy environment is clear and supportive. If you design it around lowâwater AI infrastructure, thatâs what youâll attract.
Why this matters for the future of green technology
The green technology story isn't just about swapping fossil fuels for clean energy. It's about building digital infrastructure that respects physical limits, especially water.
AI is going to underpin smart grids, climate modeling, agriculture optimization, and more. It makes no sense for the core compute that drives those solutions to quietly worsen drought risk.
There's a better way to approach this:
- Put data centers where wind and solar are abundant and water stress is low.
- Design them with efficient, flexible cooling systems that use non-potable water where possible.
- Power them with dry renewables so both carbon and water footprints drop together.
- Align corporate strategies and public policy so the AI boom strengthens, instead of strains, local communities.
If your organization is scaling AI right now, this isn't a theoretical question. The siting decisions you make over the next 12-24 months will shape your environmental footprint for decades.
So the real question is: Will your AI infrastructure be part of the water problem, or a model for how green technology can grow without drying out the places we live?