Feeding the AI Beast – Nov. 10, 2025
Data centers that power artificial intelligence require massive amounts of compute, which in turn requires massive amounts of electricity. Can utilities keep up?
Third-quarter earnings results from four of the world’s largest data center tenants—Alphabet, Amazon, Meta and Microsoft—each cited plans to become the top provider of compute power. “The key takeaway was AI-driven investments are accelerating, as top tenants move quickly and aggressively to secure data center capacity,” Green Street reported last week.
There’s no question that artificial intelligence is increasing the appetite for data center capacity. Morgan Stanley estimates that global spend on AI data centers could reach $3 trillion by 2028.
There’s also no disputing that the four tech giants are willing to spend billions on their own capacity: Meta’s $27-billion joint venture with Blue Owl Capital to develop a mega-campus in rural Louisiana isn’t even factored into its increased CapEx guidance, Green Street reported. And Microsoft plans to double its total data center capacity over the next two years.
What is in question is how to ramp up the energy infrastructure needed to keep those data centers humming. The Texas Tribune reported that the Electric Reliability Council of Texas (ERCOT), the state’s grid operator, sees large load demands increasing faster than traditional transmission planning can manage.
In September 2024, ERCOT tracked 56 gigawatts’ worth of large load interconnection requests. This past September, that figure had nearly quadrupled to 206 gigawatts.
“When you look at companies that want to come and consume more power than cities in a single location, it’s hard to forecast and plan for that in the future,” Kristi Hobbs, VP of system planning and weatherization at ERCOT, said at a meeting of the Public Utility Commission of Texas late last month.
The Trump administration is eager to solidify the U.S.’ status as a global leader in the AI boom. Reuters reported that David Sacks, the administration’s czar for AI and crypto, said on his personal X account that the U.S. wants to make permitting and power generation easier, with a goal of “rapid infrastructure buildout without increasing residential rates for electricity.”
Keeping utility costs down is a dilemma with no easy solutions, given the rapid scale-up in demand. We may see this conflict play out in Virginia, currently home to 600 data centers and likely the future location of many more. Its newly elected governor, Abigail Spanberger, wants to see data centers “pay their own way and their fair share” of costs. Inc. reported that some estimates project AI-driven increases in energy bills of 25% by 2030.
“The next governor has a chance not just to champion data-center growth, but to broaden that growth into a distributed, enterprise-grade compute foundation that powers the AI future,” Andrew Sobko, the founder and CEO of Argentum AI, a decentralized marketplace for compute, told Inc. “The election outcome matters for every business that’s thinking about tomorrow’s infrastructure strategy.” Experts say Virginia could serve as a model for other states in dealing with AI growth.
When it comes to managing energy costs while accommodating data center development, however, another jurisdiction has thrown its hat into the ring. Saudi Arabia has ambitions to become the world’s third-largest player in the AI space, behind only the U.S. and China, CNBC reported.
“Here, if you want renewable, you will find the lowest cost renewable,” Aramco CEO Amin Nasser told CNBC in an interview. “If you want [natural] gas, you will find the lowest cost gas. Energy is available and land is also available to build all these things.”