By Kenya Engineer
As artificial intelligence continues its rapid march from research labs into mainstream business, the world’s data centres are undergoing one of the most significant transformations since the rise of cloud computing. According to a new report by Vertiv (NYSE: VRT), a global leader in critical digital infrastructure, the convergence of AI workloads, extreme densification, and unprecedented scaling requirements is fundamentally reshaping how data centres are designed, built, and operated.
The Vertiv™ Frontiers Report, released in January 2026, offers a forward-looking view into the technologies and macro forces redefining digital infrastructure. Drawing on Vertiv’s deep engineering expertise and global deployment experience, the report identifies a shift toward data centres that are no longer just buildings housing IT equipment, but integrated units of compute, engineered for speed, scale, and resilience in an AI-driven world.
Data Centres at a Tipping Point
For decades, data centres evolved incrementally—more racks, more servers, modest improvements in efficiency. AI has changed that trajectory entirely.
Large language models (LLMs), high-performance computing (HPC), and GPU-dense workloads are pushing power densities to levels that legacy architectures were never designed to handle. At the same time, organisations are demanding faster deployment timelines, greater energy autonomy, and higher reliability, even as grid constraints tighten across many regions of the world.
“The data centre industry is rapidly evolving how it designs, builds, operates, and services facilities in response to the density and speed demands of AI factories,” said Scott Armul, Vertiv’s Chief Product and Technology Officer. “Extreme densification is driving transformative changes—such as higher-voltage DC power architectures and advanced liquid cooling—that are essential for gigawatt-scale AI innovation.”
The Macro Forces Redefining Data Centre Design
The Vertiv Frontiers report highlights four major macro forces currently shaping the global data centre landscape:
1. Extreme Densification
AI and HPC workloads have dramatically increased rack-level power densities. Where 5–10 kW per rack was once standard, AI deployments now routinely run at 30–60 kW per rack, with future designs targeting even higher thresholds.
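To put those density figures in facility terms, the back-of-the-envelope sketch below multiplies rack density by rack count. The rack counts and the specific kW values chosen are illustrative assumptions, not figures from the Vertiv report:

```python
# Illustrative rack-density comparison. The specific values below are
# representative assumptions, not figures from the Vertiv report.
LEGACY_KW_PER_RACK = 8   # traditional enterprise rack (within the 5-10 kW range)
AI_KW_PER_RACK = 45      # GPU-dense AI rack (within the 30-60 kW range)

RACKS = 1000             # a hypothetical data hall

legacy_mw = LEGACY_KW_PER_RACK * RACKS / 1000
ai_mw = AI_KW_PER_RACK * RACKS / 1000

print(f"Legacy hall load: {legacy_mw:.1f} MW")
print(f"AI hall load:     {ai_mw:.1f} MW")
print(f"Density increase: {ai_mw / legacy_mw:.1f}x")
```

The same floor space that once drew single-digit megawatts can now demand several times that, which is why power delivery and cooling, not floor area, have become the binding constraints.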
2. Gigawatt Scaling at Speed
Hyperscale and AI-focused data centres are now being deployed at unprecedented scales—hundreds of megawatts, moving rapidly toward gigawatt-class campuses. Speed of deployment has become as critical as capacity itself.
3. Data Centre as a Unit of Compute
In the AI era, facilities must be engineered as tightly integrated systems where power, cooling, IT, and controls are co-designed and deployed together, rather than as loosely connected subsystems.
4. Silicon Diversification
Data centres must now support a wide range of compute architectures—from CPUs and GPUs to custom accelerators and specialised AI chips—each with unique power and thermal characteristics.
These forces, Vertiv argues, have directly given rise to five key technology trends that will define data centre infrastructure over the coming decade.
1. Powering Up for AI: Rethinking Electrical Architectures
Most existing data centres rely on hybrid AC/DC power distribution architectures, with electricity converted multiple times between the grid and the IT rack. While proven, this approach introduces inefficiencies that become increasingly problematic at higher power densities.
As AI workloads push power consumption upward, Vertiv points to a growing shift toward higher-voltage DC power architectures. By reducing current levels, these systems enable smaller conductors, fewer conversion stages, and improved overall efficiency. Centralising power conversion at the room or facility level also simplifies infrastructure and improves scalability.
While hybrid AC/DC systems remain widespread today, the report suggests that full DC architectures will become more common as standards mature and rack densities continue to rise. This transition is further accelerated by on-site power generation, microgrids, and energy storage systems—technologies that naturally align with DC-based distribution.
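The physics behind the higher-voltage argument is simple: for a fixed load, current falls as voltage rises (I = P / V), and resistive conduction loss falls with the square of current (I²R). The sketch below illustrates this with an assumed 50 kW rack and an assumed 2 milliohm distribution path; the voltage levels and resistance are hypothetical examples, not specifications from the report:

```python
# Why higher distribution voltage helps at high power: for a fixed load,
# current scales as I = P / V, and resistive loss as I^2 * R.
# All numbers below are illustrative assumptions.

def current_amps(power_w: float, voltage_v: float) -> float:
    """DC load current drawn at a given power and distribution voltage."""
    return power_w / voltage_v

def conduction_loss_w(current_a: float, resistance_ohm: float) -> float:
    """I^2 * R loss dissipated in the distribution path."""
    return current_a ** 2 * resistance_ohm

RACK_POWER_W = 50_000       # assumed 50 kW AI rack
PATH_RESISTANCE_OHM = 0.002 # assumed 2 milliohm busway/cable path

for v in (48, 400, 800):    # example low- vs higher-voltage DC levels
    i = current_amps(RACK_POWER_W, v)
    loss = conduction_loss_w(i, PATH_RESISTANCE_OHM)
    print(f"{v:>4} V: {i:7.1f} A, conduction loss {loss:9.2f} W")
```

Because loss scales with the square of current, even a modest step up in distribution voltage yields an outsized reduction in heat wasted in conductors, and allows thinner, cheaper copper for the same load.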
2. Distributed AI and the Rise of Private Inference
To date, billions of dollars have been invested in massive AI data centres designed to train and serve large-scale models. However, Vertiv believes the future of AI will be more distributed and contextual.
For many organisations—particularly in regulated sectors such as finance, defence, healthcare, and government—data residency, security, and latency concerns make exclusive reliance on public AI platforms impractical. These organisations are increasingly adopting private or hybrid AI environments, delivered through on-premise or edge data centres.
This shift creates new demand for flexible, high-density power and liquid cooling systems, capable of supporting AI inference workloads within existing facilities or through modular expansions. For regions like Africa, this trend also opens opportunities to localise AI infrastructure closer to users, reducing latency while maintaining regulatory compliance.
3. Energy Autonomy Moves from Backup to Strategy
Traditionally, on-site power generation in data centres served a narrow purpose: short-term backup during grid outages. Today, power availability itself has become a limiting factor for data centre growth.
Vertiv’s report highlights a growing move toward extended energy autonomy, particularly for AI data centres. Investments in natural gas turbines, alternative fuels, and hybrid energy systems are increasingly driven not just by resilience, but by the inability of existing grids to meet demand.
Concepts such as “Bring Your Own Power (and Cooling)” are gaining traction, allowing operators to deploy capacity independent of local grid constraints. While these approaches introduce new operational complexities, they also offer strategic advantages in speed of deployment and long-term scalability.
4. Digital Twins: Accelerating Design, Deployment, and Operations
As AI infrastructure grows more complex, traditional design and commissioning methods are proving too slow. Vertiv sees digital twin technology as a critical enabler of next-generation data centres.
Using AI-driven modelling tools, operators can create virtual replicas of entire facilities—integrating IT loads, power systems, cooling infrastructure, and controls into a single digital environment. These models allow engineers to test configurations, optimise performance, and identify bottlenecks before physical deployment begins.
Vertiv estimates that digital twin-driven design and prefabricated modular construction can reduce time-to-token by up to 50%, a critical advantage in a competitive AI landscape. At gigawatt scale, such efficiencies are not just beneficial—they are essential.
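The core idea of a digital twin, comparing candidate configurations in software before anything is built, can be caricatured in a few lines. The toy model below evaluates Power Usage Effectiveness (PUE) for two hypothetical designs; the configurations and all overhead figures are invented for illustration and do not come from Vertiv's tooling:

```python
# Toy flavour of digital-twin configuration testing: score candidate
# facility designs in software before deployment. All figures are
# invented for illustration only.

def facility_pue(it_load_mw: float, cooling_mw: float, power_loss_mw: float) -> float:
    """Power Usage Effectiveness = total facility power / IT power."""
    return (it_load_mw + cooling_mw + power_loss_mw) / it_load_mw

# Two hypothetical 100 MW IT-load designs with assumed overheads:
candidates = {
    "air-cooled, hybrid AC/DC": facility_pue(100, cooling_mw=35, power_loss_mw=8),
    "liquid-cooled, HVDC":      facility_pue(100, cooling_mw=15, power_loss_mw=4),
}

for name, pue in candidates.items():
    print(f"{name}: PUE {pue:.2f}")
```

A real digital twin integrates far richer physics (airflow, fluid dynamics, electrical transients, control behaviour), but the workflow is the same: iterate on the model, not the building.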
5. Adaptive, Resilient Liquid Cooling Takes Centre Stage
Liquid cooling has moved from a niche solution to a mission-critical technology for AI data centres. High-performance GPUs and accelerators generate heat densities that air cooling alone can no longer manage efficiently.
However, Vertiv notes that AI is not only driving the need for liquid cooling—it is also improving it. By integrating AI-driven analytics with advanced sensors and control systems, liquid cooling solutions can become adaptive and predictive.
These systems can anticipate failures, optimise fluid flow, and dynamically respond to changing workloads, significantly improving reliability and uptime. For operators managing high-value AI hardware, such capabilities are rapidly becoming non-negotiable.
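A minimal sketch of what "adaptive and predictive" means in practice: size coolant flow to the measured heat load, and flag drift between expected and measured temperature rise before it becomes a failure. The thresholds, delta-T target, and drift tolerance below are illustrative assumptions; real coolant distribution units use far richer sensor data and vendor-specific control logic:

```python
# Minimal sketch of an adaptive liquid-cooling loop: match coolant flow
# to heat load, and flag delta-T drift (e.g. from fouling or pump wear)
# before it becomes a failure. All parameters are illustrative.

SPECIFIC_HEAT_J_PER_KG_K = 4186  # approx. specific heat of water coolant
TARGET_DELTA_T_K = 10.0          # assumed target coolant temperature rise

def required_flow_kg_s(heat_load_w: float) -> float:
    """Mass flow needed to absorb the load at the target delta-T:
    Q = m_dot * c_p * dT  =>  m_dot = Q / (c_p * dT)."""
    return heat_load_w / (SPECIFIC_HEAT_J_PER_KG_K * TARGET_DELTA_T_K)

def check_drift(expected_dt_k: float, measured_dt_k: float, tol_k: float = 2.0) -> bool:
    """Flag when measured delta-T drifts beyond tolerance from the
    expected value, so maintenance can act pre-emptively."""
    return abs(measured_dt_k - expected_dt_k) > tol_k

# A hypothetical 60 kW rack under full AI load:
flow = required_flow_kg_s(60_000)
print(f"Required coolant flow: {flow:.2f} kg/s")
print("Drift alarm:", check_drift(TARGET_DELTA_T_K, measured_dt_k=13.1))
```

Production systems layer machine-learned forecasting on top of this kind of loop, adjusting setpoints ahead of workload spikes rather than reacting after temperatures rise.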
A Global Perspective with Local Implications
Vertiv operates in more than 130 countries, delivering power, thermal, and IT infrastructure solutions across data centres, communication networks, and industrial facilities. Its integrated portfolio—from cloud to edge—positions the company at the centre of the AI infrastructure transition.
For emerging markets like Kenya and the broader African region, the trends outlined in the Vertiv Frontiers report carry important implications. As AI adoption grows, the need for locally deployed, energy-efficient, and resilient data centres will increase—creating opportunities for engineers, policymakers, and investors to shape the continent’s digital future.
The Bottom Line
AI is not just changing what data centres compute—it is redefining what data centres are. From higher-voltage power systems and liquid cooling to digital twins and energy autonomy, the infrastructure of the AI era demands a holistic, systems-level approach.
As Vertiv’s report makes clear, the winners in this next phase will be those who recognise that the data centre is no longer just a facility—it is the engine of the AI economy.