
Odd Lots

The Utilities Analyst Who Says the Data Center Demand Story Doesn't Add Up

Feb 2, 2026 · 20 min read
Official episode page

Andy DeVries, the head of utilities and power at CreditSights, examines the massive energy demand forecasts for AI data centers.

He explains why utilities may end up building roughly twice the grid infrastructure that data center demand actually requires.

His analysis challenges popular investment narratives and highlights the financial risks facing the energy sector as it adapts to the AI boom.

Key takeaways

  • The utility sector has transitioned from a stable, bond-like investment into a secular growth story driven by the power demands of AI data centers.
  • Higher earnings growth from AI and data centers may help decouple utility stocks from their traditional sensitivity to interest rates.
  • Utility companies are currently working on connecting 110 gigawatts of capacity, which significantly exceeds third-party demand estimates of 50 gigawatts for data centers through 2030.
  • Data centers are narrowing the price gap between peak and off-peak electricity because they consume power constantly.
  • The forward curve for natural gas is inverted, which is counterintuitive given that export demand is expected to rise by 12 billion cubic feet per day.
  • Big tech companies are paying a significant premium of ninety-five dollars per megawatt hour to secure reliable and carbon-free electricity for data centers.
  • Large cash collateral postings from data center developers indicate that the projected surge in energy demand is material rather than speculative.
  • Utilities must create specific rate structures to ensure that big tech companies, rather than residential customers, bear the risk of infrastructure overbuilds.
  • The NiSource deal with Amazon serves as a gold standard for the industry by returning one billion dollars to local ratepayers over fifteen years.
  • Data centers take about three years to build while power plants take six to seven, creating a significant supply-demand mismatch.
  • Big tech companies are insensitive to rising energy costs because the cost of building a data center is over ten times higher than the cost of the power plant supporting it.
  • Energy availability is unlikely to be the primary factor that stops AI progress in the United States.
  • The US has significant latent capacity, with states like Texas potentially able to power nearly all current AI chips with existing infrastructure.
  • Tech companies use off-balance sheet financing for data centers to keep depreciation costs off their income statements and hide debt from quantitative screens.
  • The AI industry is seeing a return to circular vendor financing where companies like Nvidia invest in the same cloud startups that buy their chips.
  • Small modular reactors are unlikely to reach final investment decisions unless Big Tech companies provide both the capital and the guaranteed demand through equity investments.
  • AI may follow the historical pattern of being simultaneously underhyped in its long-term impact and overvalued in its current market state, similar to the internet and railroad bubbles.
  • AI efficiency gains are happening much faster than predicted, which may offset the massive power demand many analysts expect.
  • Off-balance sheet financing gives cash-rich tech firms an implicit call option to distance themselves from liabilities in the future.
  • Fixed income investors should prioritize capital preservation over sector hype since they do not share in the exponential upside of a growing industry.

The transformation of the utilities sector in the AI era

00:00 - 03:02

The utility sector is experiencing a significant shift in its market role. Historically, utilities were viewed as stable investments that functioned much like bonds. Analysts focused primarily on yields and how they compared to Treasuries. This perception is changing as the industry moves from being stable or cyclical to becoming a secular growth story.

One of the themes of the last few years has been old industries that were either stable or cyclical becoming secular in the way they grow.

The rapid build-out of AI data centers is the primary driver of this change. These facilities require immense amounts of energy, which places new demands on the power grid. As a result, utilities analysts are no longer working in a quiet corner of the market. They are now at the center of the conversation regarding technology and infrastructure. Their expertise is essential for understanding the energy constraints that could impact the AI build out. This transition is similar to how internet analysts became central to the market during the late 1990s boom.

The evolving landscape of utility investment analysis

03:02 - 05:36

Andy DeVries has spent twenty-five years analyzing the utilities sector. He began his career during the first PG&E bankruptcy and has witnessed the industry's most significant financial milestones. These include the Enron collapse and the largest private equity return ever seen with Calpine. For most of his career, utility analysis was a meticulous process of studying local news and dry regulatory documents.

Pre data centers, you're looking at a lot of rate cases, studying a lot of local news. You're looking at legislation, you're reading dry regulatory documents. And then you're tracking natural gas prices because that's setting the price of power.

The work requires understanding a complex patchwork of rules at both the state and federal levels. Analysts must track how renewables are displacing coal and how different pricing models affect the market. While utilities are often viewed as simple bond-like instruments, the reality involves navigating intricate policy shifts and significant financial volatility.

Utility growth driven by data centers

05:36 - 07:46

The mood in the utility sector is currently very optimistic. A recent industry conference was packed with investors and analysts. Growth in this sector usually sits between four and six percent a year. Now, some companies are seeing growth as high as eight percent because of the demand from data centers. This shift marks a change in how people view utilities.

Utilities have generally grown around 4 to 6% a year. Now, certain names are up to 8% a year, and that is driven by data center growth.

Historically, utility stocks behaved like bond proxies. They were very sensitive to changes in interest rates. When interest rates rose, these stocks often struggled. However, the surge in data center needs and the arrival of ChatGPT have introduced a new growth driver. Andy notes there is an ongoing debate about whether these higher earnings will make utilities less dependent on interest rate trends.

The industry argues that as this EPS growth rate goes up to the high single digits, it should not be as interest rate sensitive. That is the big debate going on with investors right now.

The gap between data center demand and utility supply

07:46 - 10:36

Andy uses simple math to compare data center power demand with the supply utilities are preparing to provide. Current data center consumption sits at around 45 gigawatts, with estimates suggesting that figure will reach 90 to 95 gigawatts by 2030 and potentially 160 gigawatts by 2035. While many analysts focus purely on the massive demand, the supply side offers a different perspective. Utilities are currently tracking about 140 gigawatts of near-term supply through connections to the grid.

You look at the supply and these utilities are tracking all these data centers connecting to the grid because they've got to do a lot of work, spend a lot of money in transmission, distribution, new substations, transformers. It's a lot of work, but it boosts their earnings growth. So they're happy to talk about this.

A critical part of this analysis involves distinguishing between firm, committed contracts and the general pipeline. Because developers often contact multiple utility companies for a single project, the pipeline often suffers from double or triple counting. To get an accurate picture, Andy focuses on the 140 gigawatts of firm supply. He also notes that grid connections must account for the Power Usage Effectiveness (PUE) of a facility. While many third-party estimates focus solely on raw compute power, utilities must provide enough capacity for the lights and cooling systems required to keep those servers running.
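
As a rough illustration of that PUE adjustment, the sketch below grosses up a raw compute estimate into a grid-connection figure. The PUE value of 1.2 is an assumption for illustration only; the episode does not cite a specific number, and it is not stated whether third-party demand estimates refer to IT load or total facility load.

```python
# Minimal sketch of the PUE adjustment: grid connection capacity is the
# IT (compute) load grossed up for cooling, lighting, and other overhead.
# The PUE of 1.2 is an illustrative assumption, not a figure from the episode.
it_load_gw = 50            # example: raw compute load, in gigawatts
pue = 1.2                  # total facility power / IT equipment power (assumed)

grid_connection_gw = it_load_gw * pue
print(f"{it_load_gw} GW of IT load at PUE {pue} needs about "
      f"{grid_connection_gw:.0f} GW of grid capacity")
# -> 50 GW of compute translates into roughly 60 GW of grid connections
```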

The risk of data center oversupply in the power market

10:36 - 13:39

There is a significant gap between demand estimates and the supply of power being prepared for data centers. Third-party estimates suggest that data center demand will grow by 50 gigawatts between now and 2030, moving from 45 to 95 gigawatts. However, utility companies are already working on connecting 110 gigawatts of capacity. This suggests that the industry may be headed toward a state of oversupply where capacity is built far beyond what the market actually requires.

The utilities are working on already connecting almost as much as you need by 2035. There is a lot of supply of data centers coming and it is very unclear if there is going to be demand for this. We are going to overbuild these things.

The Texas power market serves as a primary example of this aggressive growth. It is a walled-off market with an 87-gigawatt peak, and estimates indicate that an additional 30 gigawatts could be added by 2030. While some skeptics doubt these high demand figures, industry insiders like the CFO of Oncor suggest the final number will be much closer to 30 gigawatts than to zero. Andy notes that current forward power curves do not reflect this massive projected growth, suggesting that the market may be mispriced.

Texas is an 87 gigawatt peak market and the demand estimates are that they are going to add 30 gigawatts by 2030. The forward power curves do not reflect that at all. If they do not, then they are mispriced.

Tracking these trends requires a mix of automated alerts and manual research. Andy relies on a combination of digital notifications for demand-side tracking and a junior analyst to monitor utility calls and firm commitments. Firm commitments represent signed agreements to build capacity, providing a more concrete look at the supply pipeline than mere potential projects.
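
The oversupply argument in this section comes down to simple arithmetic. Here is a minimal sketch using only the figures quoted above (45 gigawatts of demand today, 95 gigawatts expected by 2030, and 110 gigawatts of connections already being worked on):

```python
# Rough arithmetic behind the oversupply concern.
demand_now_gw = 45          # current data center consumption
demand_2030_gw = 95         # third-party estimate for 2030
demand_growth_gw = demand_2030_gw - demand_now_gw   # 50 GW of new demand

connections_in_progress_gw = 110                    # capacity utilities are already connecting

overbuild_ratio = connections_in_progress_gw / demand_growth_gw
print(f"Demand growth to 2030:        {demand_growth_gw} GW")
print(f"Connections already in work:  {connections_in_progress_gw} GW")
print(f"Supply under way vs. growth:  {overbuild_ratio:.1f}x")
# -> roughly 2.2x, the basis for the claim that utilities may build
#    about twice the capacity that data center demand actually requires
```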

13:40 - 15:38

Forward power curves represent the prices traders expect for electricity in the future. These curves usually distinguish between peak and off-peak hours. In regions like Texas, the price difference between these times is shrinking because data centers operate 24/7. This constant demand makes power needs more consistent across the whole day.

The difference between peak and off-peak is actually narrowed because data centers run 24/7.

Natural gas is the primary factor influencing power prices. Curiously, the forward curve for natural gas is inverted, implying that prices will drift slightly lower by the end of the decade. That is counterintuitive because exports are projected to grow from 18 billion cubic feet per day to 30 billion. Andy points out that a curve reflecting such high demand would typically trend upward.

The forward curve for gas is inverted. It goes from 370 to 360 by the end of the decade. You would think that curve would be at least upward sloping by 25 or 30 cents.

Disconnects between power price curves and data center demand

15:38 - 20:20

Forward price curves for gas and electricity remain surprisingly flat despite clear signs of growing demand. The physical reality on the ground suggests a massive surge is coming. For example, Andy notes that the CFO of Oncor is holding two and a half billion dollars in cash collateral from companies wanting to connect data centers to the grid. These are not small startups. They are major firms putting up significant money to lock in their power needs.

The reason I was talking to the Oncor CFO and asking him about this is he said he is holding two and a half billion dollars of cash collateral postings from some of that demand. If you are posting two and a half billion dollars, that is real demand that is coming. It is material.

In Texas, the difference between market prices and what tech companies pay is striking. Market rates might sit around sixty dollars for peak times, but tech companies are signing deals for ninety-five dollars per megawatt hour. This premium covers the desire for carbon-free energy and the need for guaranteed supply. Even with improvements in chip efficiency, the total power required for data centers is massive. One gigawatt can power about one million homes, and the industry expects nearly one hundred gigawatts in additional capacity demand.

Vistra just did a deal for ninety-five dollars a megawatt hour for round the clock. Big tech is paying a very pretty penny. Some of that is for the CO2 free aspect of it, and some of it is just to lock in the supply.
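
To put that premium in context, here is a rough annualized cost for a single one-gigawatt campus. The $95 and $60 per megawatt hour figures come from this section; the assumption that the load runs flat for all 8,760 hours of the year is a simplification for the sketch.

```python
# Illustrative annual power cost for a 1 GW data center campus running 24/7.
capacity_mw = 1_000          # 1 gigawatt of load
hours_per_year = 8_760
contract_price = 95          # $/MWh, round-the-clock deal cited in the episode
market_price = 60            # $/MWh, rough peak market rate cited in the episode

annual_mwh = capacity_mw * hours_per_year
annual_bill = annual_mwh * contract_price
annual_premium = annual_mwh * (contract_price - market_price)

print(f"Annual energy: {annual_mwh:,} MWh")
print(f"Annual bill at $95/MWh:  ${annual_bill / 1e6:,.0f} million")
print(f"Premium over ~$60/MWh:   ${annual_premium / 1e6:,.0f} million per year")
# -> about $832 million a year in power, of which roughly $307 million
#    is the premium paid for firm, carbon-free supply
```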

Protecting ratepayers from data center infrastructure costs

20:20 - 23:05

There is a growing concern that average utility customers might end up paying for massive data center expansions. If a utility spends heavily on wires and transformers but the expected demand never arrives, residential rates could spike to cover the costs. This creates a significant political risk because regulators cannot allow ordinary families to effectively subsidize the infrastructure for massive tech companies.

Andy highlights that some companies are already solving this problem. He points to NiSource in Northern Indiana as the gold standard for protecting the public. They structured a deal with Amazon that will return one billion dollars to ratepayers over fifteen years. Instead of debating who pays for what, the utility ensured that the local community receives sixty-seven million dollars annually. This setup protects the public and provides them with a direct financial benefit from the tech presence.

The point is, if it turns out that there is an overbuild and there is not as much demand for it, someone's paying for it. It's either going to be the customers or perhaps utility shareholders. And the political risk of having mom and pop bail out Mark Zuckerberg or Jeff Bezos? You just can't have that happen.

While some utilities like PG&E and Ameren have similar protections in place, many others do not. This shift in focus is relatively new. Only six months ago, these issues were barely mentioned at the end of earnings calls. Now, utility CEOs are addressing ratepayer protections at the very beginning of their prepared remarks to satisfy regulators and the public.

Timing challenges in data center and energy expansion

23:05 - 24:14

Andy explains that the build out of data centers faces a significant timing challenge compared to the energy infrastructure needed to power them. While a data center typically takes two to three years to complete, building a new power plant is a much longer process that can take six to seven years. This mismatch creates a bottleneck where demand for power may far outpace the supply available on the grid.

A data center takes two, three years. Even if that slips to four or five years, the power plants take like six, seven years. And as you know, you can't get a GE Vernova gas turbine for years and years, which is obviously a bullish backdrop here.

Supply chain issues further complicate the situation. Essential equipment like gas turbines is currently very difficult to acquire, with backlogs lasting for years. There is also a financial risk if companies overshoot on production. Building too much capacity during a period of high inflation in the construction sector could lead to significant economic problems for energy providers.

The cost and capacity of powering data centers

24:15 - 27:25

The cost to build a combined cycle gas plant has surged from $1,200 to $3,000 per kilowatt over the last decade. While this price jump shocks utility analysts, it is a small fraction of the cost of the data centers themselves: a data center runs about $40,000 per kilowatt. Because the data center is so much more expensive than the plant serving it, big tech companies are happy to pay a premium for power to lock in their energy source. They are currently paying nearly $95 per megawatt hour when the market rate is closer to $60.

You could add 10 gigawatts to Texas tomorrow, which would be the equivalent of sending every single Nvidia chip for an entire year to Texas and running them 24/7. That is 10 gigawatts because you could run it right now on the existing grid and existing plants for all but 50 hours a year.

There is a mismatch between construction timelines: a data center takes roughly two to three years to build, while a new power plant takes six to seven. Despite this, the grid might be more resilient than expected. Andy notes that instead of spending billions on new wind farms, it is often more efficient to pay a refinery or chemical plant to stop running for the few hours a year when demand peaks. The current grid capacity and the steady addition of new solar and gas power suggest the system can handle the coming AI demand.
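
A quick ratio of the per-kilowatt figures quoted above shows why power prices are close to a rounding error for the data center owner:

```python
# Capital cost per kilowatt: data center versus the gas plant serving it.
gas_plant_cost_per_kw = 3_000       # combined cycle gas plant, today
data_center_cost_per_kw = 40_000    # data center build cost

ratio = data_center_cost_per_kw / gas_plant_cost_per_kw
print(f"Data center capex is about {ratio:.0f}x the power plant capex per kW")
# -> roughly 13x, which is why a ~$35/MWh power premium barely changes
#    the economics of the overall project
```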

Regional transmission challenges in the US power grid

27:25 - 28:31

The United States struggles with power transmission, and regional markets face different pressures. MISO, the Midcontinent Independent System Operator covering much of the Midwest, is likely in the worst shape because it retired a large amount of coal generation. Texas comes next. New England has the most expensive power prices in the country, and building a data center there is difficult because supply is so tight.

MISO retired the most coal so they are going to be in the worst shape. After that it is Texas. New England has far and away the most expensive power prices at seventy dollars. If anyone builds a data center in New England, they are going to be the tightest.

During a recent cold snap, New England had to rely on oil for 40 percent of its power. This highlights the fragility of the local system. Transmission is the key factor because renewable energy sources are often located far from where the power is needed. These sources must be connected to the grid to be effective.

Power constraints in the AI race

28:31 - 29:41

The debate over the AI race often centers on whether the US or China can better provide the massive power required for new technology. Some believe China has the advantage because they can build power grids more quickly. Andy disagrees with the idea that energy is a hard limit that will stop US progress. He points out that the US actually has a lot of existing capacity that could be utilized. In fact, Texas could theoretically power every existing Nvidia chip right now for all but a few hours of the year.

It is going to be a little tight, but I am not one of these doomsdayers. It is not the absolute gating factor where it is all going to stop.

While the situation might be tight, it is not a reason for doomsday predictions. Meanwhile, the reality of automation is already visible in places like Shenzhen, where robots are performing complex tasks like navigating elevators. The focus should be on how existing capacity is managed rather than assuming a lack of power will end the race.

The risks of private credit in the data center boom

29:56 - 31:50

Private credit is increasingly involved in data center construction, moving into territory once dominated by the public bond market. This trend accelerated after a notable deal where PIMCO made two billion dollars on the first day of lending to a Meta data center in Louisiana. This success has sparked a rush of interest from other private credit firms, which often signals a shift toward looser covenants and lower rates.

We all know how this ends. Covenants start falling, rates start falling. If you are big tech, who cares if you overspend? You think AI is the be-all, end-all, you are going to overspend.

While massive tech companies can absorb the costs of overspending on AI, smaller players face more scrutiny. Andy points out that companies like CoreWeave might have high market caps, but the bond market is demanding a ten percent yield for long-term loans. This suggests a disconnect between equity valuations and credit risk. Current forecasts suggest that while data center growth will continue for now, a significant oversupply could hit the market around 2030.

Off balance sheet financing for AI data centers

31:50 - 35:32

A recent deal between PIMCO and Meta highlights a curious trend in corporate finance. Meta is a highly rated company that could borrow cheaply in public markets. However, it chose a private credit arrangement with a much higher interest spread. Andy suggests this decision likely stems from a desire to keep large data center assets off the main balance sheet.

Did you kick it off your balance sheet because you didn't want to damage your balance sheet, but the agencies are imputing it? But maybe quant funds running their screens, they don't impute that, so maybe that helps. Or maybe you didn't want the depreciation running through your income statement, maybe that helps, or maybe you want to walk from this thing in five years.

The structure involves a separate vehicle rather than direct corporate debt. While credit agencies often count these leases as debt, the arrangement might still appeal to companies trying to manage their financial ratios or income statements. The credit documents show that Meta has guaranteed this debt even if the data center shuts down. However, that guarantee might disappear if the asset is sold. This creates unique risks that analysts must watch closely.

There is also a growing trend of circular financing within the AI industry. Major players like Nvidia and OpenAI are investing in the equity of neocloud companies, the specialized AI cloud providers. These cloud companies then use that capital to buy Nvidia chips or provide computing power back to OpenAI. This practice resembles the vendor financing models used by companies like Nortel decades ago.

OpenAI or Nvidia is going out and buying equity in these neocloud companies. So then they can go out and supply the compute to OpenAI and buy the chips from Nvidia. It is all very circular.

Big Tech and the rise of small modular reactors

35:32 - 39:46

Large nuclear power projects face immense financial risks, as seen with the Vogtle plant, which finished ten years late and significantly over budget. Consequently, utilities are unlikely to undertake such massive projects without a government or corporate backstop. The conversation is now shifting toward small modular reactors, or SMRs, as a more feasible alternative. Andy suggests that the success of SMRs depends on Big Tech companies stepping in to provide capital and guaranteed demand. These companies may need to invest equity directly into manufacturers to fund the high cost of building new capacity.

The only way we think a small modular reactor goes final investment decision is if big tech agrees to do two things. They agree to buy some SMRs and they invest equity in those SMR manufacturers to give them the capex to build.

This shift represents a significant change in the utility sector. While the United States has successfully used small reactors in nuclear submarines for decades, commercial adoption remains complex. Companies like NuScale and Oklo are currently central to these discussions, though their market valuations already reflect significant optimism. Joe reflects on how this AI-driven energy demand fits into historical patterns. Transformative technologies, like railroads or the internet, often become bubbles where they are overvalued in the short term even as they remain underhyped in their long-term ability to change the world. The challenge lies in the timing, as energy demand from data centers may not align perfectly with current production incentives and policies.

The gap between AI optimism and energy market reality

39:46 - 42:54

There is a strong belief among AI optimists that energy demand will only increase as software integrates with tools like Claude. However, these systems are becoming efficient faster than anyone anticipated. The cost of processing a token is dropping rapidly, and technological benchmarks are being met years ahead of schedule. While optimists often find themselves surprised by how quickly the technology advances, the markets do not seem to reflect this massive expected surge in energy consumption.

The pace of efficiency gains for the cost of processing a token is dropping faster than people expected. Even the optimists keep getting surprised to the upside.

A notable disconnect exists between the popular narrative of high energy needs and actual market signals. For instance, natural gas is projected to be cheaper in a few years than it is today, despite the rise of power-hungry data centers. It is difficult to make the math work when estimating total power demand. While high-profile tools like ChatGPT require significant energy, there are not enough similar projects to easily reach the massive numbers predicted by some analysts.

The strategic use of off-balance sheet financing

42:54 - 44:02

Large technology companies with significant cash reserves often choose to finance projects off-balance sheet to maintain future flexibility. This strategy can act like a call option, allowing a company such as Meta to potentially walk away from a liability several years down the line. Because of this, the legal documentation surrounding these deals is essential for investors to understand. These documents define the specific scenarios where a company might exit its obligations.

The suggestion there was, well, maybe at some point in the future, like five years down the line, you need to get rid of this liability. You don't want to deal with it.

Credit ratings agencies treat these financial structures with a specific nuance. While they may not officially categorize the financing as debt, they still use the information to judge the health of a company. Andy points out that these agencies back out lease costs to inform their perspective on a firm's overall credit sustainability.

The ratings agencies, while they don't look at it as debt, they do back out a lease cost and therefore it can inform their overall credit sustainability.

The risks of hype in data center debt

44:02 - 45:10

Investor demand for data center debt is rising, making the market significantly more competitive. In typical credit cycles, this increased competition leads to weaker documentation and fewer protections for the lender. Andy finds this trend surprising because debt does not offer the same upside as equity investments. While an equity investment in AI might grow a hundred times over, debt only offers a fixed yield.

I find the existence of hype cycles for debt to be a little bit weird. If I have a fixed income allocation, all I care about is minimizing downside and I don't really care what sector it is. I am not participating in the upside. I just do not want to lose my money.

Fixed income should focus on protecting capital rather than chasing trends. Since debt yields are fixed, there is no reason to get greedy over a few extra basis points if the underlying protections are being stripped away. A simple approach to fixed income prioritizes security over sector hype.
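
As a stylized illustration of that asymmetry, the sketch below compares expected payoffs for an equity holder and a lender. The scenario probabilities, the 10x equity outcome, and the 40 percent recovery rate are illustrative assumptions, not figures from the episode; only the ten percent yield echoes the CoreWeave example mentioned earlier.

```python
# Stylized payoff comparison: equity shares the upside, debt does not.
# All scenario numbers below are illustrative assumptions.
p_boom, p_bust = 0.5, 0.5

# Equity: say a 10x return if the AI build-out succeeds, zero if it fails.
expected_equity = p_boom * 10.0 + p_bust * 0.0

# Debt: capped at a 10% yield over five years, with an assumed 40% recovery
# of principal in the bust scenario.
debt_boom = 1.10 ** 5            # about 1.61x if everything goes well
debt_bust = 0.40
expected_debt = p_boom * debt_boom + p_bust * debt_bust

print(f"Expected equity multiple: {expected_equity:.2f}x")
print(f"Expected debt multiple:   {expected_debt:.2f}x")
# Debt's upside is capped, so the lender is paid mainly for avoiding the
# downside, which is why stripped covenants matter more than extra yield.
```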
