Only a fraction of proposed data centers will get built. Utilities are wising up.

The U.S. grid is flooded with data center proposals that will never get built. That’s making it much more difficult for utilities and grid operators to plan for the future.
“Conservatively, you’re seeing five to 10 times more interconnection requests than data centers actually being built,” said Astrid Atkinson, a former Google senior director of software engineering and now co-founder and CEO of grid optimization software provider Camus Energy.
Even relatively short-term data center load growth forecasts are all over the map.
Last year, RAND Corporation’s “upper confidence” forecast projected 347 GW of AI-sector power consumption by 2030. But Schneider Electric called that prediction “extreme” in a whitepaper on AI’s potential grid impacts last month, which cited more down-to-earth forecasts — under 100 GW — from other reputable observers.
Schneider’s own 2030 AI power demand scenarios range from 16.5 GW to 65.3 GW, with 33.8 GW the optimal outcome under a sustainable AI framework that balances AI growth with grid stability.
The wild divergence in near-term AI power demand forecasts hints at a fundamental challenge facing utilities, grid operators and power system regulators today: speculative load interconnection requests, or what Bianca Giacobone of Latitude Media in March called “phantom data centers.”
Experts like Atkinson advise power system stakeholders to take utility forecasts, like Exelon’s expectation for 11 GW of “high-probability” data center load over 10 years, with a grain of salt.
A 2018 Lawrence Berkeley National Laboratory study compared load forecasts and actual growth for 12 Western U.S. utilities in the mid-2000s and found that most overestimated future demand.
But experts say it’s very difficult for utilities to tell in advance which data center interconnection requests will pan out, or how much potential load to discount in the aggregate. This is a problem because, as Giacobone noted, excess requests sap utilities’ limited study resources, cause delays for others in the interconnection queue and distort long-range resource planning, raising the risk of costly system overbuilding.
Utilities are trying a few tactics to mitigate the risk. Some have rolled out standardized large-load interconnection processes. Others are asking data center developers for bigger financial commitments upfront. In some cases, utilities have asked state policymakers for help.
The phantom load problem is, in part, a problem of transparency.
Loath to tip off competitors or local NIMBYs, data center developers and their agents conceal land acquisitions and other early development activities behind vaguely named LLCs and non-disclosure agreements. Developers relentlessly winnow early-stage projects, but not to the point that every publicly announced proposal is a done deal, Atkinson said.
Microsoft, for example, has abandoned up to 2 GW of data center capacity reservations since January, while Tract killed a 30-building Phoenix-area proposal last year amid local opposition.
Even seasoned data center customers like Microsoft, Meta, Amazon and Google propose several times more projects than they’re likely to need due to uncertainty around power availability and permitting at any given site, Atkinson said. Less sophisticated developers abandon proposed projects at an even higher rate, she added.
Accurately assessing future data center power demand could get harder in the near future as lengthy waits for grid interconnection push developers and operators toward behind-the-meter primary power generation sources, Atkinson said.
Elon Musk’s Memphis-area xAI hub, where its Grok model trains, runs 35 gas turbines behind the meter, according to a lawsuit filed in April by an environmental group. Energy Secretary Chris Wright’s former company, Liberty Energy, could eventually deliver 1 GW of off-grid gas-fired generation to data centers and other large industrial loads at a planned business park near Pittsburgh. Data center customers account for about a third of gas turbine manufacturer GE Vernova’s 21-GW reservation pipeline, CEO Scott Strazik said in April.
Zachary Ruzycki, the director of resource planning for Minnesota-based cooperative Great River Energy, said the utility has received “more than a handful” of large-load interconnection requests recently, but he worries about sinking staff time into projects that might not materialize.
“How much work we want to undertake on it is something we’ve been thinking about,” Ruzycki said.
Still, the potential investment is driving plans for more generation. Great River Energy will use an $812 million federal grant to procure nearly 1.3 GW of renewable power to serve new load in the coming years, CEO David Saggau said in January.
The cooperative isn’t alone in facing this issue. Great River Energy shares its home state with investor-owned utilities like Xcel Energy. Together, Xcel and Great River Energy member cooperatives near the Minneapolis-St. Paul metro area have drawn proposals for at least 11 data center campuses since 2020, including one each from Amazon, Microsoft and Meta and three 500-MW schemes from Tract.
Some requests could be duplicates, but there’s no good way to tell which, Ruzycki said. Of the 11 proposals in or near their territories, only Meta’s had begun construction as of earlier this year, according to the Minnesota Star Tribune.
“This is a challenge across the industry,” said Patricia Taylor, director of policy and research at the American Public Power Association. Data center developers are “shopping around both within your community and next door.”
When it’s cheaper, “you’ll buy queue positions all day long”

Some load-serving entities are trying to keep data center power demand expectations in check, according to the Electric Power Research Institute.
Of 25 large utilities EPRI surveyed in September 2024, 48% expected data centers to account for at least 10% of peak load by 2030. Twenty-six percent expected double that share.
But the EPRI respondents were generally skeptical that all proposed data center load — or even close to all — would materialize. Of the 10 utilities that said aggregate data center requests accounted for 50% or more of present peak load, none expected an actual five-year share above 35% of peak load. That included respondents with the highest proportion of data center requests.
Utilities take different approaches to derating proposed data center loads, or assuming that they would use less than their proposed nameplate capacity, EPRI found. About 30% of respondents took proposed loads at face value but assumed they would ramp over time. Another 30% derated loads based on apparent project maturity, using benchmarks such as public announcements, land acquisitions, permitting progress, company maturity and signed load-serving agreements.
Those factors can help determine how “quote-unquote ‘real’ a project is,” said Great River Energy’s Ruzycki.
“But in a sense, it doesn’t matter [because] we have an obligation to serve,” he continued. “If they have the land and the ability to build and ramp, they can do that and we have to find the assets to serve them.”
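As a sketch of the maturity-based derating approach EPRI describes, a utility might weight each request’s nameplate capacity by how far the project has progressed. The weights and projects below are hypothetical illustrations, not figures from EPRI or any utility:

```python
# Hypothetical maturity-weighted derating of a large-load interconnection queue.
# Weights and project data are illustrative assumptions only.

MATURITY_WEIGHTS = {
    "announced": 0.10,                # public announcement only
    "land_acquired": 0.30,            # site control secured
    "permitted": 0.60,                # permitting progress made
    "service_agreement_signed": 0.90, # signed load-serving agreement
}

def derated_load_mw(requests):
    """Sum each request's nameplate MW scaled by its maturity weight."""
    return sum(mw * MATURITY_WEIGHTS[stage] for mw, stage in requests)

queue = [
    (500, "announced"),
    (500, "announced"),
    (300, "land_acquired"),
    (200, "service_agreement_signed"),
]

# 1,500 MW of nameplate requests derate to 370 MW of expected load.
print(derated_load_mw(queue))
```

The gap between the nameplate total and the weighted total mirrors the survey finding that utilities facing requests worth 50% of peak load expect a realized share closer to a third of that.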
Great River Energy also bills petitioners for the staff work related to vetting large-load requests so the cost doesn’t fall on members. APPA’s Taylor said utilities can protect existing customers by taking substantial deposits for interconnection studies and inking service agreements that ensure data centers pay their fair share for infrastructure upgrades — and even new generation resources — while guaranteeing minimum load.
Former Federal Energy Regulatory Commissioner Allison Clements and former Meta Director of Energy Strategy Peter Freed, in a February op-ed for Utility Dive, argued for a standardized process across the country that could reduce speculative data center requests and shorten interconnection wait times.
The process proposed by Clements and Freed could involve standardized interconnection queues across utilities within the same planning region and anonymized visibility into queued projects’ attributes and status. It could also require developers to meet commercial readiness tests, pay phased fees that increase as projects progress and include a mechanism for removing nonviable projects from the queue.
But even that may not be enough.
Data center developers are adept at playing utilities off one another to manufacture price elasticity, said Karl Rábago, principal at Rábago Energy and a former commissioner at the Texas Public Utility Commission.
“The phantom load problem arises because the cost of getting in a queue is lower than the weighted likelihood that they’ll want to use their position,” Rábago said. “When it’s cheaper to buy a queue position than not to use your queue position, you’ll buy queue positions all day long.”
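Rábago’s point is an expected-value argument: a queue position is a cheap option on future grid access. The dollar figures and probability below are hypothetical, chosen only to illustrate the incentive:

```python
# Illustrative expected-value arithmetic behind the queue-position incentive.
# All figures are hypothetical assumptions, not reported data.

def queue_position_value(p_build, value_if_built):
    """Expected value of holding an interconnection queue position."""
    return p_build * value_if_built

queue_fee = 100_000          # cost to enter and hold a queue position, $
p_build = 0.15               # chance this particular site is ultimately built
value_if_built = 5_000_000   # value of secured grid access at a viable site, $

ev = queue_position_value(p_build, value_if_built)  # $750,000
# Filing is rational whenever the expected value exceeds the fee,
# even when most positions will never be used.
print(ev > queue_fee)
```

Under these assumed numbers, a developer comes out ahead filing at every candidate site even with an 85% abandonment rate, which is the dynamic the phased fees and readiness tests described above aim to break.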
He was also skeptical of some state legislative efforts to address the issue.
A high-profile Texas bill that would require data center developers to pay some interconnection costs and disclose certain duplicative requests is not specific enough to have much impact, Rábago said. Instead, he favors a “reverse auction” framework to determine which data center “needs the fewest goodies” to connect to the grid, he said.
Recent moves by three utilities in Virginia, the country’s biggest data center market, hint at a possible path forward.
This year, Dominion Energy, Appalachian Power and Rappahannock Electric Cooperative all proposed new large-load rate classes that would apply to data centers. Dominion’s and Appalachian Power’s proposals would require data centers to pay for at least 60% and 80% of contracted demand, respectively, insulating existing ratepayers, according to the Virginia Mercury.
Member-owned Rappahannock’s proposal would require new data centers to put up collateral, cover some infrastructure upgrades, pay up to 100% of contracted load and deal directly with special-purpose subsidiaries to protect existing customers.
Utility Dive