The race to dominate artificial intelligence in 2026 is no longer confined to algorithms, model size, or breakthrough capabilities. A quieter, more physical constraint is coming into focus, and it hums beneath every data center and server rack: energy.
As AI systems grow more powerful, they are also becoming far more demanding. Training frontier models and running them at scale requires vast amounts of electricity, pushing infrastructure to its limits. What once felt like a purely digital competition is now colliding with the realities of power generation, grid capacity, and long-term sustainability.

Industry leaders are beginning to sound the alarm
Speaking at the CERAWeek conference in Houston, Google President and Chief Investment Officer Ruth Porat delivered a blunt assessment of the situation:
“We are concerned that we are not full throttle on energy.”
Her remarks, as reported by Reuters, underscore a growing concern across the tech sector. The US, despite its leadership in innovation, may not be expanding its energy infrastructure quickly enough to keep pace with AI’s accelerating demands. Data centers are multiplying, chips are becoming more power-hungry, and the grid is being asked to shoulder a load it was never designed to carry.
This is not a distant or abstract problem.
Modern AI workloads consume significantly more energy than traditional computing tasks. Training a single large model requires enormous computational cycles, and once deployed, these systems continue to draw power at scale as millions of users interact with them daily. Multiply that across hyperscale data centers worldwide, and the energy footprint becomes staggering.
The new race: power, not just performance
The tension is becoming impossible to ignore. Tech companies are investing billions into AI development, but without parallel investments in energy infrastructure, those ambitions risk hitting a hard ceiling.
What emerges is a shift in how we define leadership in the AI era. It is no longer just about who can build the smartest models. It is about who can sustain them. Reliable energy, grid resilience, and even access to clean power are becoming strategic advantages, as critical as talent or compute.
In that sense, the future of AI may hinge less on lines of code and more on lines of transmission.
The next chapter of artificial intelligence will not be written by software alone. It will be powered, quite literally, by whoever can keep the lights on.
Meanwhile, Kevin O’Leary is betting big on AI infrastructure with a massive data center project in Utah, signaling just how high the stakes have become in the race for power.