Industry · 5 min read · Wired AI

The AI Race Is Pressuring Utilities to Squeeze More From Europe’s Power Grids

Redakcja Pixelift

Photo: Wired AI

Nearly 160 gigawatts of additional capacity: that is how much Europe will need by 2030 to keep pace with the rapid development of artificial intelligence, according to Goldman Sachs forecasts. The arms race in the AI sector is causing data centers to consume energy at a rate that traditional transmission networks cannot match. As a result, operators such as National Grid and TenneT face the necessity of radically modernizing infrastructure that dates back to the last century. The problem is not only a shortage of generating capacity but transmission bottlenecks: in some regions, the wait to connect a new data center to the grid has stretched to a decade or more.

To salvage the situation, energy companies are deploying Dynamic Line Rating (DLR) technologies, which use IoT sensors to push more power through existing lines on cooler or windier days, optimizing conductor performance in real time. For users and creators of creative technologies, this means an inevitable rise in operating costs. Pressure on the energy ecosystem will push giants like Microsoft and Google to invest in their own small modular reactors (SMRs) and energy storage systems. Without a fundamental shift in grid management, ambitions for generative artificial intelligence could be stalled by the physical limits of copper wires. The only way forward is a smart grid that learns as fast as the models it powers.

The race for dominance in artificial intelligence has ceased to be solely the domain of programmers and software engineers. Today its front line is shifting toward critical infrastructure, specifically toward European power grids. Data centers, the beating hearts of models such as GPT-4 or Claude 3, are exhibiting an unprecedented appetite for energy, pushing transmission system operators up against a logistical wall.

Across Europe, data center developers are lining up in massive queues for power allocations that are often already exhausted. Instead of waiting decades to build new high-voltage lines, grid operators are beginning to experiment with radical methods of "squeezing" additional efficiency out of existing, often outdated infrastructure. It is a technological gamble where the stake is maintaining the pace of innovation without causing blackouts.

The bottleneck of the digital revolution

The traditional approach to power grid planning assumed stable, predictable growth in demand. The emergence of generative artificial intelligence has disrupted this order. It is estimated that a query to an AI model consumes, on average, ten times more energy than a standard Google search. For grid operators, this means the necessity of connecting facilities with power consumption measured in hundreds of megawatts, equivalent to the demand of medium-sized cities.
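The scale of the mismatch is easy to check with back-of-envelope arithmetic. The figures below are illustrative assumptions (a traditional search at roughly 0.3 Wh, the article's ~10x multiplier for an AI query), not measured values:

```python
# Back-of-envelope energy math for the "10x per query" claim.
# All per-query figures are illustrative assumptions, not measurements.
SEARCH_WH = 0.3                  # assumed energy per traditional search (Wh)
AI_QUERY_WH = SEARCH_WH * 10     # the article's ~10x multiplier

def daily_energy_mwh(queries_per_day: float, wh_per_query: float) -> float:
    """Total daily energy in megawatt-hours for a given query volume."""
    return queries_per_day * wh_per_query / 1e6  # Wh -> MWh

# A hypothetical 100 million AI queries per day:
ai_load = daily_energy_mwh(100e6, AI_QUERY_WH)
print(f"AI query load: {ai_load:.0f} MWh/day")  # 300 MWh/day

# For comparison, a 300 MW facility running flat-out draws
# 300 MW * 24 h = 7,200 MWh/day -- on the order of a mid-sized city.
```

Even at these conservative assumptions, inference traffic alone adds city-scale loads, before counting model training.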

The problem lies in the fact that physical grid expansion—erecting new poles and laying cables—takes between 7 and 15 years due to bureaucracy and public protests. Meanwhile, tech companies want to launch their server rooms within 18-24 months. This time discrepancy forces utility providers to seek solutions in the area of Smart Grids and dynamic load management to avoid investment paralysis.

  • Connection queues: In some regions, the waiting time for grid access has extended to over a decade.
  • Infrastructure saturation: Major communication hubs in Amsterdam, Frankfurt, and London have reached the limits of technical transmission capacity.
  • AI pressure: Training next-generation models requires GPU clusters whose power density exceeds standard cooling and power supply designs.

Algorithms instead of copper and steel

Since new lines cannot be built quickly, operators are turning to Dynamic Line Rating (DLR) technology. Traditionally, power line capacity is determined based on rigid, conservative weather assumptions—the worst-case scenario of a hot, windless day. DLR systems use IoT sensors and real-time weather data to monitor wire temperature. If a cool wind is blowing, the line can safely transmit up to 30-50% more energy than its nominal value.
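The core of the DLR idea can be sketched in a few lines. This is a deliberately simplified heat-balance scaling, not the full IEEE 738 / CIGRE conductor model that real deployments use; the temperature and wind parameters are illustrative assumptions:

```python
import math

def dynamic_rating(static_amps: float, t_ambient: float, wind_speed: float,
                   t_conductor_max: float = 75.0,
                   t_ambient_static: float = 40.0,
                   wind_static: float = 0.6) -> float:
    """Simplified DLR sketch: Joule heating (I^2 * R) must balance cooling,
    which grows with the allowed temperature rise and (roughly) with the
    square root of wind speed, so ampacity scales with the square root of
    both. The static rating assumes a worst-case hot, near-windless day."""
    temp_headroom = (t_conductor_max - t_ambient) / (t_conductor_max - t_ambient_static)
    wind_boost = math.sqrt(max(wind_speed, 0.1) / wind_static)
    uplift = math.sqrt(max(temp_headroom, 0.0) * wind_boost)
    # Operators typically cap the uplift; ~+50% matches the figures cited above.
    return static_amps * min(uplift, 1.5)

# Static conditions reproduce the nominal rating:
print(dynamic_rating(1000, t_ambient=40.0, wind_speed=0.6))  # 1000.0
# A cool (10 C), breezy (4 m/s) day hits the capped +50%:
print(dynamic_rating(1000, t_ambient=10.0, wind_speed=4.0))  # 1500.0
```

In practice the sensor data (conductor temperature, sag, vibration) feeds a far richer model, but the principle is the same: weather headroom is converted directly into transmission capacity.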

Another innovative step is the implementation of so-called "flexible connection contracts." Instead of guaranteeing constant 24/7 power consumption, operators offer data center developers lower rates in exchange for the ability to limit supply during peak demand moments. For AI operators, who can shift less critical processes (e.g., long-term model training) to nighttime hours, this is an arrangement that allows them to jump the queue of applicants.
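The scheduling logic behind a flexible connection is simple to illustrate. The sketch below is a hypothetical scheduler, not any operator's real API: the operator publishes curtailment hours, and deferrable workloads (such as long-running training) are placed only in uncurtailed hours:

```python
def schedule_deferrable(job_hours: int, curtailed_hours: set[int]) -> list[int]:
    """Assign a deferrable workload (e.g. long-running model training) only
    to hours when the grid operator has NOT signalled curtailment.
    A toy sketch of the 'flexible connection contract' idea."""
    available = [h for h in range(24) if h not in curtailed_hours]
    return available[:job_hours]

# Operator curtails the 17:00-20:00 evening peak; an 8-hour training job
# lands in the earliest uncurtailed hours instead.
plan = schedule_deferrable(8, curtailed_hours={17, 18, 19, 20})
print(plan)  # [0, 1, 2, 3, 4, 5, 6, 7]
```

The business logic is the trade: the data center accepts a few interruptible hours per day, and in exchange connects years earlier than a firm-capacity applicant.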

Advanced Grid Enhancing Technologies (GETs) are also being implemented, using software to reroute energy flows from overloaded lines to underutilized ones. This is a digital management layer that allows the power grid to be treated like a packet network, optimizing every available unit of energy in the system.
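The packet-network analogy can be made concrete with a toy rebalancer. Real GETs solve a full optimal-power-flow problem subject to Kirchhoff's laws; this greedy shift between parallel corridors is only illustrative:

```python
def rebalance(flows: dict[str, float], limits: dict[str, float]) -> dict[str, float]:
    """Toy version of the GETs idea: move flow off overloaded lines onto
    parallel lines with spare capacity. Real tools solve an optimal
    power flow; this greedy shift merely illustrates the principle."""
    flows = dict(flows)  # don't mutate the caller's data
    for line in list(flows):
        excess = flows[line] - limits[line]
        if excess <= 0:
            continue  # line is within its rating
        for other in flows:
            if other == line:
                continue
            spare = limits[other] - flows[other]
            if spare <= 0:
                continue
            shift = min(excess, spare)
            flows[line] -= shift
            flows[other] += shift
            excess -= shift
            if excess <= 0:
                break
    return flows

# Line A is 20 MW over its rating; the surplus shifts to underused line B.
print(rebalance({"A": 120.0, "B": 60.0}, {"A": 100.0, "B": 100.0}))
# {'A': 100.0, 'B': 80.0}
```

Total flow is conserved; only its routing changes, which is exactly the "digital layer over existing copper" that the article describes.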

The efficiency paradox and local constraints

Despite technological optimism, physics remains relentless. Data centers are becoming increasingly efficient (PUE, the Power Usage Effectiveness ratio, is approaching the ideal value of 1.0), but their total number and scale are growing faster than the savings. Local communities are beginning to compete with tech giants for energy. In Dublin and the Amsterdam area, temporary moratoria on the construction of new facilities have already been introduced, forcing the industry to seek locations in regions with lower population density but better access to renewable energy.
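PUE itself is a simple ratio, which makes the paradox easy to state: overhead per facility shrinks toward zero, but total consumption still grows with the IT load. The figures below are illustrative:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy divided by the
    energy delivered to IT equipment. 1.0 is the theoretical ideal
    (zero cooling/power-conversion overhead)."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kwh / it_equipment_kwh

# A facility drawing 12 GWh/yr to power 10 GWh/yr of servers:
print(pue(12_000_000, 10_000_000))  # 1.2

# The efficiency paradox: even at an excellent PUE of 1.1, doubling the
# IT load doubles total draw -- fleet growth outruns per-site savings.
print(1.1 * 10_000_000, "vs", 1.1 * 20_000_000)
```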

The AI arms race is also forcing a change in the architecture of the data centers themselves. Increasingly, they are being integrated with BESS (Battery Energy Storage Systems). Large-scale energy storage allows data centers to act like giant power banks that stabilize the grid instead of merely taxing it. In this way, server rooms are ceasing to be seen as a problem and are beginning to play the role of active participants in the energy market, capable of feeding power back during crisis situations.
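A minimal dispatch rule shows how a co-located battery turns a data center into a grid participant. The price thresholds and step size below are invented for illustration; real BESS controllers optimize against forecasts and grid-service contracts:

```python
def dispatch_battery(grid_price: float, soc_mwh: float, capacity_mwh: float,
                     cheap: float = 50.0, expensive: float = 150.0,
                     step_mwh: float = 10.0) -> tuple[str, float]:
    """Toy BESS dispatch rule: charge when grid energy is cheap and the
    battery has room; discharge (or feed back to the grid) when energy is
    expensive and the battery holds charge. Thresholds are illustrative."""
    if grid_price <= cheap and soc_mwh < capacity_mwh:
        return "charge", min(step_mwh, capacity_mwh - soc_mwh)
    if grid_price >= expensive and soc_mwh > 0:
        return "discharge", min(step_mwh, soc_mwh)
    return "hold", 0.0

print(dispatch_battery(grid_price=40.0, soc_mwh=20.0, capacity_mwh=100.0))
# ('charge', 10.0) -- soak up cheap (often renewable-surplus) energy
print(dispatch_battery(grid_price=200.0, soc_mwh=20.0, capacity_mwh=100.0))
# ('discharge', 10.0) -- support the grid at the peak
```

Scaled to hundreds of megawatt-hours, this is the "giant power bank" behavior: absorbing surplus generation and feeding power back when the grid is stressed.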

"The power grid is no longer just a passive cable. In the age of AI, it is becoming an intelligent platform that must learn to predict load faster than the computers it powers."

The end of the era of unlimited power

In my assessment, Europe is on the threshold of a fundamental shift in the perception of digital infrastructure. The era in which a data center's location depended solely on the proximity of fiber optics and low taxes is gone forever. Now, it is energy topography that dictates the terms. The companies that will win the AI race are not just those with the best algorithms, but those that most quickly secure access to stable and flexible power sources.

I predict that within the next five years, we will see deep vertical integration of tech giants with the energy sector. It would not surprise me to see companies like Microsoft or Google investing directly in small modular reactors (SMRs) at their computing campuses. The public grid will remain the foundation, but for the most demanding AI tasks, it will become merely a backup rather than the primary source of power. This is the only way to avoid a scenario where the lack of electricity becomes a "safety brake" for the global development of artificial intelligence.

Source: Wired AI