AI workloads turn 100% of power into heat. Learn how efficient data centre cooling in South Africa cuts cloud costs and supports sustainable e-commerce growth.

AI-Ready Data Centres: Cooling That Cuts Cloud Costs
Most online businesses never see the single largest “feature” that determines how expensive (and how green) their digital services will be: how efficiently a data centre removes heat.
That matters in South Africa right now. December is peak season for e-commerce, customer support volumes spike, fraud attempts rise, and recommendation engines run hotter than usual. If your checkout, search, or WhatsApp support bot slows down, customers don’t wait—they bounce. Behind that reliability is a simple equation: every watt your servers consume turns into heat, and that heat must be removed.
A recent look inside Africa Data Centres (ADC) operations in Johannesburg highlights something many teams underestimate: cooling design, temperature strategy, and airflow control are now core enablers for AI-powered e-commerce and digital services in South Africa—not background facilities work.
Cooling is the hidden tax on every AI workload
Answer first: In a data centre, the electricity bill isn’t just “computing.” A meaningful portion goes to moving heat out of the room, and the efficiency of that process shows up in the metric PUE (Power Usage Effectiveness).
Here’s the reality I wish more product and growth teams internalised: if your AI features drive more GPU usage—personalised recommendations, real-time pricing, demand forecasting, image search—you’re also buying more cooling.
ADC’s regional executive Angus Hay makes a point that’s easy to repeat in a boardroom: 100% of the power that goes into electronics comes out as heat. If you bring 1MW into IT equipment, you must extract about 1MW of heat. The variable cost is how much extra energy you spend to do that extraction.
PUE in plain language (and why it hits your margins)
Answer first: PUE = total facility power / IT equipment power. If a site runs at 1.3 PUE, it uses about 30% extra energy on top of IT load to keep everything running (cooling, power conversion, fans, pumps, etc.).
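To make the arithmetic concrete, here's a minimal sketch in Python. The IT load, PUE values, and tariff are illustrative assumptions, not ADC or Eskom figures:

```python
# Illustrative PUE maths -- the IT load, PUE values, and tariff below are
# assumptions for the sake of example, not figures from ADC or Eskom.

def facility_energy_kwh(it_load_kw: float, pue: float, hours: float) -> float:
    """Total facility energy = IT energy * PUE."""
    return it_load_kw * hours * pue

it_load_kw = 1_000          # 1 MW of IT equipment (servers, GPUs, storage)
hours_per_month = 730
tariff_r_per_kwh = 2.50     # assumed blended tariff, R/kWh

for pue in (1.6, 1.3):
    total = facility_energy_kwh(it_load_kw, pue, hours_per_month)
    overhead = total - it_load_kw * hours_per_month   # cooling, power conversion, fans, pumps
    print(f"PUE {pue}: overhead {overhead:,.0f} kWh/month "
          f"(~R{overhead * tariff_r_per_kwh:,.0f})")

# In this illustration, moving a 1 MW IT load from PUE 1.6 to 1.3 removes
# roughly 219,000 kWh of overhead per month.
```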
So if an e-commerce platform is budgeting for a bigger AI footprint in 2026, the real question isn’t only “How many GPUs?” It’s also:
- What PUE will the workload land on?
- Can the provider keep that PUE stable during hot weeks and peak traffic?
- What’s the plan for renewable supply as Eskom constraints and tariffs continue to bite?
Lower PUE doesn’t just sound good in ESG reports. It’s a direct lever on cloud cost, gross margin, and feature profitability.
Design beats heroics: why modern cooling wins
Answer first: You can’t “operate your way” to great efficiency. Data centre energy performance is mainly locked in by design—especially the cooling system.
ADC’s approach reflects what top-tier facilities teams have been saying for years: operations tweaks help, but the big gains come from how chillers are selected, placed, shaded, and controlled, plus how airflow is contained.
A few design choices from the ADC example matter for anyone building digital services at scale:
1) Closed-loop cooling means near-zero water waste
Answer first: Closed-loop systems circulate water in a sealed loop; they don’t rely on evaporating water as a primary cooling method.
Globally, water use by data centres has become a reputational and regulatory flashpoint. Some facilities improve cooling efficiency by evaporating water—effective, but controversial in water-stressed regions.
ADC’s closed-loop approach positions African facilities (including South Africa) differently: minimal water waste becomes a selling point. For digital businesses with ESG commitments—or those selling to corporate and government buyers—this can become procurement-critical.
2) Shade your chillers, don’t cook them
Answer first: Putting chillers in direct sun makes them less efficient because they must remove additional heat absorbed from the environment.
ADC’s JHB1 facility uses a soft-shell roof to shade chillers. It sounds obvious, yet plenty of sites still leave chillers “at the mercy of the sun.” In practice, shading reduces the thermal load on the chillers, helping stabilise performance and lower energy spend.
For AI-heavy e-commerce platforms, that stability matters: cost predictability is part of uptime.
3) “Free cooling” is a Johannesburg advantage
Answer first: When the outside air is cool enough, facilities can reduce or switch off refrigeration compressors and use outside conditions to help cool the system.
ADC notes that when outside temperature is below about 17°C, it can turn off refrigeration units. Johannesburg reportedly gets around 180 days per year where free cooling can be used, translating to roughly 5%–10% less energy than always running chillers.
This is one of South Africa’s underappreciated infrastructure advantages: in the right climate zones, modern data centres can run AI workloads more efficiently for a large portion of the year.
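As a rough back-of-envelope check on that 5%–10% figure (the compressor share and reduction factor below are my assumptions, not ADC's published model):

```python
# Back-of-envelope estimate of free-cooling savings.
# All inputs except the day count are illustrative assumptions, not measured ADC data.

free_cooling_days = 180          # days/year cool enough to idle compressors (per the article)
compressor_share = 0.5           # assumed share of cooling energy used by compressors
compressor_reduction = 0.4       # assumed reduction on free-cooling days (partial, not always full shutdown)

fraction_of_year = free_cooling_days / 365
annual_cooling_saving = fraction_of_year * compressor_share * compressor_reduction
print(f"Estimated cooling-energy saving: {annual_cooling_saving:.0%}")
# ~10% of cooling energy under these assumptions, broadly in line with the 5%-10% cited above.
```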
Temperature set-points: stop treating data halls like fridges
Answer first: Running colder than needed is expensive. Modern facilities often target a mid-range set-point that keeps equipment safe while cutting cooling energy.
ADC targets around 23°C–24°C as a set-point, well within the commonly referenced ASHRAE envelope (the recommended range is often cited as 18°C–27°C, with wider allowable ranges depending on equipment class).
Older data centres sometimes run much colder—Hay describes them as “like fridges.” The problem is simple: the lower the set-point, the more work the cooling system must do, and the more energy you pay for.
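One simplified way to see why: the theoretical ceiling on chiller efficiency (the Carnot COP) falls as the gap between the cold side and the outside air widens. The sketch below treats the hall set-point as the cold-side temperature, which is a simplification, and the temperatures are illustrative:

```python
# Why colder set-points cost more: the ideal (Carnot) coefficient of performance
# drops as the temperature lift grows. Real chillers run well below this ceiling,
# but the trend is the same. Temperatures here are illustrative assumptions.

def carnot_cop(cold_c: float, hot_c: float) -> float:
    """Ideal cooling COP: T_cold / (T_hot - T_cold), in kelvin."""
    t_cold = cold_c + 273.15
    t_hot = hot_c + 273.15
    return t_cold / (t_hot - t_cold)

ambient_c = 30.0  # assumed hot Johannesburg afternoon
for set_point_c in (14.0, 18.0, 23.0):
    print(f"Set-point {set_point_c:.0f}°C -> ideal COP {carnot_cop(set_point_c, ambient_c):.1f}")
# The warmer the hall can safely run, the less work each unit of heat removal requires.
```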
For e-commerce and digital service leaders, this links directly to AI scalability:
- More AI inference (recommendations, search ranking, customer service copilots) increases compute density.
- Higher density increases heat load.
- If the facility insists on “fridge mode,” your cost per transaction rises.
A practical stance: when you’re choosing colocation or cloud regions, ask what set-points they run and how they manage hot spots—because your AI roadmap will amplify any inefficiency.
Airflow containment: the simplest way to waste less cooling
Answer first: Containment forces cold air through racks and prevents mixing with hot exhaust air. Without it, you’re cooling the room, not the servers.
Cold aisle/hot aisle containment is one of those topics that sounds like facilities trivia—until you realise it’s a multiplier on everything else. If the cold supply air leaks into the wrong places, you need more fan power, lower set-points, or more chiller effort to compensate.
ADC highlights two operational truths that are surprisingly common failure points:
- Gaps in racks must be sealed with blanking panels.
- Containment must be treated as a system, not a “nice-to-have” for a few cabinets.
If you’re running 300 racks, minor leakage becomes a major cost. For high-density AI racks, it becomes a risk.
Where AI and automation fit into containment
This is where the series theme comes in: AI isn’t only an application-layer story. It can also run the building.
Facilities teams increasingly use sensor data and control loops to automate:
- Dynamic fan speed control based on rack inlet temperatures
- Hot spot detection from thermal sensors
- Predictive maintenance for CRAC/CRAH units and pumps
- Automated alerts when a rack door is left open or a panel is missing
For South African operators facing tight energy constraints, this kind of automation is not about fancy dashboards. It’s about keeping PUE from drifting upward when conditions change.
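For illustration, here's a minimal sketch of what one such control loop could look like. The sensor readings, thresholds, and function names are hypothetical; a real deployment would hang off the site's BMS/DCIM integrations.

```python
# Hypothetical sketch of a containment-aware control loop: read rack inlet
# temperatures, nudge fan speed, and flag hot spots for inspection.
# Rack names, thresholds, and telemetry values are placeholders.

from statistics import mean

TARGET_INLET_C = 24.0      # aligned with the 23-24°C set-point discussed above
HOT_SPOT_C = 30.0          # alert threshold for a single rack inlet

def control_step(inlet_temps_c: dict[str, float], current_fan_pct: float) -> tuple[float, list[str]]:
    """Return a new fan speed (%) and a list of hot-spot alerts."""
    alerts = [rack for rack, t in inlet_temps_c.items() if t > HOT_SPOT_C]
    error = mean(inlet_temps_c.values()) - TARGET_INLET_C
    # Simple proportional nudge: warmer than target -> more airflow, cooler -> less.
    new_fan_pct = min(100.0, max(20.0, current_fan_pct + 5.0 * error))
    return new_fan_pct, alerts

readings = {"rack-01": 23.1, "rack-02": 24.8, "rack-17": 31.2}   # example telemetry
fan_pct, hot_spots = control_step(readings, current_fan_pct=55.0)
print(fan_pct, hot_spots)   # fans ramp up; rack-17 is flagged for inspection
```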
The ESG side: energy source is the next battleground
Answer first: Efficient cooling reduces total energy, but cleaner supply determines carbon impact. Both matter to customers and to regulators.
ADC frames its ESG reporting using Scope 1, 2, and 3 language:
- Scope 1: on-site emissions (e.g., running diesel generators)
- Scope 2: emissions from purchased electricity
- Scope 3: supply chain emissions (equipment manufacturing, logistics)
For South African e-commerce brands and fintechs, this isn’t academic. Enterprise customers increasingly ask for emissions reporting as part of vendor due diligence. If your AI stack relies on third-party infrastructure, you inherit part of that conversation.
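A first-order way to see what you inherit: estimate location-based Scope 2 emissions from your IT load, the facility's PUE, and a grid emission factor. The figures below are assumptions for illustration only; use your provider's reported factor and metered data for real reporting.

```python
# Rough location-based Scope 2 estimate for a colocated IT footprint.
# All figures are illustrative assumptions, not reported data.

it_load_kw = 200                 # your racks' average IT draw
pue = 1.3                        # provider's reported PUE
grid_factor_kg_per_kwh = 0.95    # assumed South African grid emission factor (kg CO2e/kWh)

annual_kwh = it_load_kw * pue * 8_760
annual_tonnes_co2e = annual_kwh * grid_factor_kg_per_kwh / 1_000
print(f"{annual_kwh:,.0f} kWh/year -> ~{annual_tonnes_co2e:,.0f} t CO2e (location-based Scope 2)")
```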
Wheeling and renewables: real progress, real friction
ADC notes it’s in RFP processes with renewable providers, and that the market has shifted: suppliers now actively pursue data centres as anchor customers. The harder part is implementation—especially wheeling arrangements involving municipalities and Eskom.
The takeaway for digital leaders: if your growth plan assumes “we’ll offset later,” you’re leaving risk unmanaged. Procurement timelines, wheeling agreements, and grid constraints move slower than product roadmaps.
What e-commerce and digital service teams should do next
Answer first: Treat data centre efficiency as a product dependency. Ask better questions before your AI roadmap forces you to.
If you’re building or buying AI-powered digital services in South Africa, here’s a practical checklist I’ve found useful when talking to hosting, colocation, or cloud partners:
- Ask for PUE (and how it's measured).
  - What's the PUE on new halls versus older halls?
  - How does it vary seasonally?
- Confirm temperature strategy.
  - What set-point do they run (e.g., 23°C–24°C)?
  - What's the allowable range during abnormal heat events?
- Understand cooling architecture.
  - Closed-loop or evaporative?
  - Any measures like shading or chiller placement optimisation?
- Check containment discipline.
  - Is containment standard?
  - How do they prevent leakage (blanking panels, audits, sensors)?
- Get an ESG-ready answer.
  - Renewable procurement plans?
  - Generator testing policies and reduction goals?
  - Can they share Scope 2 reporting data on request?
- Plan for AI density.
  - Can the facility support higher rack densities without “emergency cooling” workarounds?
  - What's the path to scaling from today's load to next year's peak season?
These aren’t “facilities questions.” They’re unit economics questions.
Why this matters for South Africa’s AI commerce boom
South Africa’s digital economy is pushing more workloads into data centres: payments, identity verification, fraud detection, personalisation, and customer support automation. The more AI you deploy, the more compute you consume—and the more heat you must remove.
Cooling innovation—free cooling, smarter containment, sensible set-points, water-wise systems, and renewables procurement—is what makes that growth financially sustainable. If your AI roadmap is ambitious (and it should be), your infrastructure choices can either keep costs under control or quietly inflate every transaction.
If you’re planning 2026 campaigns, new AI features, or a bigger move into omnichannel commerce, what would change if you evaluated infrastructure the same way you evaluate marketing spend: by measurable efficiency and predictable outcomes?