Industry

Meta, Microsoft, and Google Are Betting Big on Natural Gas to Power AI. They May Regret It.

Three of the world's largest technology companies are constructing dedicated natural gas power plants to meet the electricity demands of their AI data centers. Energy analysts warn the strategy locks in high-emission, high-cost infrastructure at precisely the moment when the economics of renewables are improving fastest — and when carbon regulation is accelerating.

D.O.T.S AI Newsroom

AI News Desk

4 min read

Meta, Microsoft, and Google have each committed to building new natural gas power plants dedicated to meeting the electricity demands of their AI data center buildouts, according to reporting from TechCrunch. The decisions reflect a pragmatic response to a real problem — AI workloads require more power, more reliably, than the current grid or existing renewable procurement strategies can provide. They also reflect a set of assumptions about energy economics and regulatory trajectory that may not age well.

The core logic is straightforward: AI training and inference require enormous, consistent power draw. Large language model training runs can consume hundreds of megawatts for months at a time. Renewable energy sources — solar, wind — are intermittent by nature, and battery storage at the scale required for large data center operations is not yet economically viable. Natural gas can be dispatched on demand, at the scale these operations require, with supply chains that are well-understood. For a data center operator who needs power now and cannot afford downtime, it is the path of least resistance.
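The scale described above is worth making concrete. The following sketch uses purely illustrative figures — a hypothetical 300 MW sustained draw over three months, not a number reported for any specific training run — to show why operators measure these workloads in gigawatt-hours:

```python
# Back-of-the-envelope check of the power figures above.
# All inputs are illustrative assumptions, not reported values.

def training_energy_gwh(power_mw: float, months: float) -> float:
    """Energy consumed by a sustained training run at constant draw."""
    hours = months * 30 * 24  # approximate a month as 30 days
    return power_mw * hours / 1000  # convert MWh to GWh

# A hypothetical 300 MW run sustained for 3 months:
print(f"{training_energy_gwh(300, 3):.0f} GWh")  # prints "648 GWh"
```

At that scale, a single run consumes roughly the annual output of a mid-sized gas peaker plant — which is why operators want dedicated, dispatchable generation rather than spot procurement.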

The Lock-In Problem

The decision to build dedicated natural gas generation rather than procure power through utilities or long-term renewable contracts creates a specific kind of strategic risk: lock-in. A power plant represents a capital investment with a 30-to-40-year expected operational life. The companies making these investments are betting that the economics of natural gas remain competitive — and that the regulatory environment does not impose costs that make early decommissioning necessary — across a multi-decade horizon.

That is a significant bet to make in 2026. The Inflation Reduction Act's incentive structures have accelerated renewable deployment and battery storage economics in ways that are compressing the cost advantage of gas-fired generation. Carbon pricing mechanisms, which were speculative five years ago, are now operational in multiple major markets and expanding. The companies building natural gas capacity today may find themselves holding stranded assets before those plants reach mid-life.

The Community Opposition Variable

A parallel TechCrunch analysis of public polling data shows that community opposition to data center siting is running higher than industry projections anticipated. Survey respondents said they would prefer an Amazon warehouse in their neighborhood over a data center — a counterintuitive finding that reflects concerns about noise, water consumption, visual impact, and the strain data centers place on local grids. Natural gas plants add air quality and emissions concerns on top of those existing objections.

Permitting timelines for new power generation capacity are already long and are lengthening as local opposition becomes better organized. The companies planning dedicated gas plants are counting on approval processes that may prove more contentious and time-consuming than their build schedules allow.

The Honest Assessment

The AI infrastructure buildout is consuming energy at a rate that the current grid cannot absorb cleanly. Natural gas is the near-term answer to a real problem. The question is whether "near-term" means two years or twenty — and whether the companies making these investments have built enough flexibility into their infrastructure strategies to pivot when the economics change. The ones that haven't will have bought themselves power at the cost of a long-term liability.


Related Stories

AWS Has Billions in Both Anthropic and OpenAI. Its Boss Explains Why That's Not a Problem.
Industry


Amazon Web Services CEO Matt Garman defended the company's parallel multi-billion dollar investments in both Anthropic and OpenAI in a wide-ranging interview this week. The explanation reveals a cloud strategy built on AI model agnosticism — and a bet that AWS wins regardless of which AI lab dominates, as long as the compute runs on its infrastructure.

D.O.T.S AI Newsroom
Anthropic Poaches Microsoft's Azure AI Chief to Fix Its Infrastructure Problem
Industry


Anthropic has recruited Eric Boyd, a senior Microsoft executive who led Azure AI services, as its new head of infrastructure. The hire is a direct response to the scaling bottlenecks that have limited Claude's availability during peak demand — and signals that Anthropic is treating infrastructure as a first-tier strategic priority heading into 2026.

D.O.T.S AI Newsroom
Intel's Nerdy Bet on Advanced Chip Packaging Could Decide Who Wins the AI Infrastructure Race
Industry


As the AI buildout pushes the limits of what individual chips can do, the unglamorous discipline of chip packaging — connecting multiple dies into a single system — is emerging as a genuine competitive moat. Wired reports that Intel is making an aggressive bet on advanced packaging technology that could position the company at the center of the next phase of AI hardware scaling, even as it struggles to compete on raw process technology.

D.O.T.S AI Newsroom