Data Centre Growth: The Quiet Driver of AI Profits

AI Business Tools Singapore · By 3L3C

Data centre growth is shaping AI economics. Learn how Singapore businesses can plan AI tools with infrastructure-first thinking for better ROI.

Tags: data-centres, ai-operations, digital-transformation, ai-governance, cloud-costs, business-strategy


A French electrical infrastructure company just raised its profit targets because of one thing: data centres.

On 12 Feb 2026, Legrand told investors that demand from data centres is strong enough to lift its medium-term profitability goals—aiming for average adjusted operating margins above 20% through 2030, up from “around 20%” in its 2024 plan. It’s not a flashy AI headline, but it’s a very real one.

For the AI Business Tools Singapore series, this matters because most AI adoption failures aren’t caused by “bad models.” They’re caused by boring constraints: power, cooling, uptime, security, integration, and cost control. The same infrastructure logic that’s helping Legrand hit higher margins is the logic Singapore businesses can use to scale AI without burning cash.

AI doesn’t scale on inspiration. It scales on infrastructure that stays up, stays secure, and stays within budget.

What Legrand’s data centre bet really signals

Legrand’s announcement is a signal about where money is flowing in the AI economy: into the plumbing.

The Reuters report (via CNA) highlights three concrete datapoints worth paying attention to:

  • Legrand’s 2025 revenue rose 9.6% to €9.48B.
  • Adjusted operating profit increased 10.5% to €1.96B.
  • Data centres contributed 26% of 2025 revenues, with potential to reach 40%.

Those aren’t “pilot project” numbers. That’s a company reshaping around the AI-driven buildout of compute infrastructure.

Why this isn’t just a hyperscaler story

It’s easy to think data centres only matter to cloud giants. The reality: data centre growth ripples outward into every supplier category—power distribution, UPS, racks, cabling, monitoring, building management systems, and security.

Legrand sits in that ecosystem, selling electrical and digital building infrastructure. When data centre demand spikes, it’s not just servers that get bought. It’s everything that keeps the servers stable.

For Singapore businesses, the parallel is straightforward: if you’re rolling out AI tools across customer service, sales ops, marketing automation, or internal analytics, you’re also increasing demand on your own “mini data centre”—your networks, identity systems, APIs, and governance.

The Singapore angle: AI adoption rises, so do infrastructure expectations

Singapore’s AI adoption trend keeps pushing upward across sectors (financial services, logistics, retail, professional services, public sector). But the less-talked-about story is what has to improve alongside that adoption.

If you’re using AI business tools in Singapore—whether it’s LLM-powered support, automated content workflows, forecasting, or fraud detection—you’ll feel these infrastructure pressures quickly:

  • Latency and reliability: AI features embedded into workflows can’t be “sometimes available.”
  • Security and compliance: More data pathways mean more risk.
  • Cost discipline: AI usage-based pricing is sensitive to waste (bad prompts, duplicate queries, messy knowledge bases).
  • Integration: AI tools create new dependencies across CRM, ERP, helpdesk, data warehouses, and identity providers.

The point of Legrand’s story isn’t “copy their acquisitions.” It’s: treat infrastructure as a profit lever, not an IT expense.

A practical translation: infrastructure choices shape AI ROI

Two companies can deploy the same AI tool and get opposite outcomes.

  • Company A has fragmented data, weak access controls, and manual processes. AI adds cost and risk.
  • Company B has clean data flows, role-based access, monitoring, and clear “what good looks like.” AI drives throughput and margin.

Legrand’s raised margin targets are the public-market version of Company B.

Data centres are growing because AI workloads behave differently

AI workloads don’t just increase compute usage; they change how compute is used.

Traditional enterprise systems are relatively predictable. AI inference and training bursts are not. That creates demand for:

  • Higher power density (more power per rack)
  • More rigorous cooling
  • Better redundancy and monitoring
  • Faster deployment cycles (because AI demand changes fast)

Legrand’s CEO Benoit Coquart put it bluntly: the company has “just scratched the surface of AI.” That’s not marketing language; it’s an infrastructure forecast.

What this means for Singapore SMEs (yes, SMEs)

Most SMEs won’t build data centres. But SMEs will still face the downstream effects:

  • Cloud providers pass on infrastructure costs through pricing.
  • AI tools become more “always-on,” raising expectations for uptime.
  • Customers get less tolerant of slow responses in chat, support, and order updates.

So the real question becomes: how do you scale AI-powered operations without scaling chaos?

A better way to plan AI investments: start from constraints

Most companies plan AI like a feature roadmap: “We’ll add an AI chatbot, then an AI sales assistant.” That’s backwards.

Plan AI like Legrand plans growth: from constraints and capacity.

Step 1: Map your AI workloads to business outcomes

Be specific. “Improve customer experience” is not a workload. Try:

  • Reduce first-response time from 4 hours to 15 minutes
  • Increase lead-to-meeting conversion from 1.5% to 2.2%
  • Cut finance reconciliation time by 30%

Then map each outcome to the AI workloads required (search, summarisation, classification, extraction, recommendation).
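That mapping can start as a table in code. The outcome names and workload assignments below are illustrative, lifted from the examples above, not a prescribed taxonomy:

```python
# Each target KPI mapped to the AI workloads it actually requires (illustrative).
outcome_map = {
    "first_response_time": {
        "target": "4 hours -> 15 minutes",
        "workloads": ["classification", "summarisation", "search"],
    },
    "lead_to_meeting_rate": {
        "target": "1.5% -> 2.2%",
        "workloads": ["extraction", "recommendation"],
    },
    "finance_reconciliation_time": {
        "target": "-30%",
        "workloads": ["extraction", "classification"],
    },
}

def workloads_to_provision(outcomes: dict) -> set:
    """Deduplicate the workloads needed across all target outcomes."""
    return {w for o in outcomes.values() for w in o["workloads"]}
```

Deduplicating across outcomes is the point: three KPIs here need only five workload types, which is what you actually budget and provision for.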

Step 2: Decide where the workload should run

For most Singapore businesses, you’ll mix:

  • SaaS AI tools (fastest time to value)
  • Cloud AI services (more control and integration)
  • On-prem or private compute (only when regulation, latency, or cost justifies it)

Data centre growth is partly driven by companies shifting workloads to where they can control performance and cost. You don’t need to follow the same path—but you do need to choose intentionally.
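A rough decision rule for that SaaS/cloud/on-prem choice can be written down explicitly; the inputs and thresholds here are assumptions you'd tune to your own regulatory and latency profile:

```python
def placement(regulated: bool, latency_sensitive: bool, high_volume: bool) -> str:
    """Rough, illustrative rule for where an AI workload should run."""
    if regulated:
        # Regulation, latency, or cost must justify private compute (see list above).
        return "on-prem/private"
    if latency_sensitive or high_volume:
        return "cloud AI services"
    return "SaaS AI tools"  # fastest time to value by default
```

Writing the rule down, even this crudely, forces the "choose intentionally" step instead of defaulting every workload to whichever tool was bought first.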

Step 3: Put governance in place before rollout

If you’re trying to generate leads with AI tools while keeping brand risk low, governance is not optional.

At minimum:

  • Define approved tools and data types (what can/can’t be pasted into an LLM)
  • Enforce identity and access controls (SSO, role-based permissions)
  • Log usage for cost and risk management
  • Set review rules for customer-facing outputs

Governance is how you avoid the “AI tax” of rework, incidents, and legal panic.
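A minimal sketch of the first two controls: an approved-tool gate plus a basic screen for Singapore NRIC-style identifiers before text leaves your boundary. The tool names are hypothetical, and a regex screen is a floor, not a substitute for real DLP tooling.

```python
import re

APPROVED_TOOLS = {"support-copilot", "crm-assistant"}  # illustrative names
NRIC_PATTERN = re.compile(r"\b[STFG]\d{7}[A-Z]\b")  # NRIC/FIN-style identifiers

def check_request(tool: str, text: str) -> tuple[bool, str]:
    """Gate an LLM request against the approved-tool list and a basic PII screen."""
    if tool not in APPROVED_TOOLS:
        return False, f"tool '{tool}' is not on the approved list"
    if NRIC_PATTERN.search(text):
        return False, "text appears to contain an NRIC; redact before sending"
    return True, "ok"
```

Running every outbound prompt through a check like this also gives you the usage log for free: record each call's tool, verdict, and cost.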

Legrand’s acquisition strategy offers a useful lesson for ops teams

Legrand said it completed seven acquisitions in 2025 and continues to review roughly 400 companies as potential targets, with about half in energy and digital transition. In 2026 it bought Green4T (Brazil) and Kratos Industries (US), focused on data centre infrastructure and power distribution.

You’re not acquiring companies, but you are assembling a stack.

The stack approach: build capability clusters, not random tools

Here’s what works when selecting AI business tools in Singapore:

  1. One “system of record” per function (CRM for sales, helpdesk for support, ERP/accounting for finance)
  2. One integration layer (iPaaS or strong API discipline)
  3. One analytics layer (single source of truth for metrics)
  4. AI on top (assistants, copilots, automation) that reads from governed data

Random tool buying creates duplicate data, conflicting answers, and inflated subscriptions.

If your AI tool can’t reliably access the right data, it won’t produce reliable work—no matter how good the model is.

“People also ask” (and the answers are practical)

Do we need a data centre strategy to use AI tools?

No—but you do need a compute and cost strategy. For most teams, that means deciding what stays in SaaS, what needs cloud APIs, and how you’ll control usage.

Why do data centres affect AI tool costs?

Because AI workloads drive infrastructure spend (power, cooling, redundancy). Cloud and SaaS pricing eventually reflects that. Cost control becomes a competitive advantage.
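Back-of-envelope arithmetic makes the cost point concrete. All volumes and the per-token price below are assumptions for illustration, not any provider's actual rates:

```python
def monthly_llm_cost(requests_per_day: int, tokens_per_request: int,
                     price_per_1k_tokens: float, duplicate_rate: float = 0.0) -> float:
    """Estimate 30-day spend; duplicate_rate models avoidable repeat queries."""
    tokens = requests_per_day * 30 * tokens_per_request * (1 + duplicate_rate)
    return tokens / 1000 * price_per_1k_tokens

# Illustrative: 2,000 requests/day at 1,500 tokens each, assumed $0.002 per 1k tokens.
base = monthly_llm_cost(2000, 1500, 0.002)                       # ~$180/month
wasteful = monthly_llm_cost(2000, 1500, 0.002, duplicate_rate=0.25)  # ~$225/month
```

A 25% duplicate rate adds a quarter to the bill for zero extra value, which is why waste control compounds as usage grows.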

What’s the first infrastructure upgrade to make AI work better?

Clean identity and access management (SSO + roles) and clean data foundations (a maintained knowledge base, consistent CRM fields). Those two reduce risk and waste quickly.

What to do next (especially if your 2026 plan includes AI)

Legrand expects 2026 sales growth of 10%–15% and an operating margin of 20.5%–21% after acquisitions. The detail isn’t the point; the mindset is: invest where demand is structural, and make margins a design goal.

If you’re leading growth or operations in Singapore, do this in the next 30 days:

  1. Pick one AI workflow tied to a measurable KPI (support deflection, lead qualification, renewal outreach).
  2. Audit the infrastructure dependencies (data source, permissions, integrations, monitoring, fallback process).
  3. Set a usage budget (token/seat/API limits) and a weekly review cadence.
  4. Instrument the outcome (baseline vs post-launch metrics).
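Item 3 in the checklist above (a usage budget plus a weekly review) can start as something this small; the cap and team names are placeholders:

```python
from collections import defaultdict

class UsageBudget:
    """Track AI/API spend per team against a monthly cap."""
    def __init__(self, monthly_cap: float):
        self.monthly_cap = monthly_cap
        self.spend = defaultdict(float)

    def record(self, team: str, cost: float) -> None:
        self.spend[team] += cost

    def total(self) -> float:
        return sum(self.spend.values())

    def over_budget(self) -> bool:
        return self.total() > self.monthly_cap

    def weekly_report(self) -> dict:
        # Feed this into the weekly review cadence from the checklist.
        return {"total": self.total(),
                "cap": self.monthly_cap,
                "remaining": self.monthly_cap - self.total(),
                "by_team": dict(self.spend)}
```

Once this exists, the weekly review is a five-minute read of `weekly_report()` rather than a month-end surprise on the cloud invoice.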

AI business tools are easiest to buy and hardest to scale. Infrastructure is what makes the scaling part predictable.

Where do you expect your biggest AI constraint to show up first in 2026—data quality, security/compliance, or cost control?
