Industry

Meta Buys Tens of Millions of AWS Graviton 5 CPU Cores — a New Front in the AI-Infrastructure Arms Race

Meta has agreed to purchase tens of millions of AWS Graviton 5 ARM CPU cores from Amazon, becoming one of the world's largest Graviton customers. The deal — disclosed Friday alongside Meta's existing Nvidia Grace CPU rollout and its in-house "AGI CPU" project with ARM — signals that the next phase of the AI infrastructure buildout is about CPUs orchestrating agentic workloads, not just GPUs training models.

D.O.T.S AI Newsroom

4 min read

Meta and Amazon disclosed on Friday that Meta will purchase tens of millions of AWS Graviton 5 processor cores — Amazon's fifth-generation in-house ARM CPU — to power agentic AI workloads inside Meta's data centers. The deal makes Meta one of the largest Graviton customers in the world, on par with the very largest AWS internal services, and it is structured to be expandable as Meta's needs grow. Meta will run agentic-AI orchestration workloads on the Graviton 5 cores initially, with the option to migrate to its own ARM-based silicon — a project called the "AGI CPU" that Meta unveiled with ARM in March — as that program matures. Neither company disclosed pricing or contract duration.

The CPU Side of the AI Stack

The conventional narrative of the AI infrastructure boom is a GPU story: Nvidia, AMD, and a long tail of custom accelerators competing to train and serve frontier models. Friday's deal is a reminder that agentic AI — systems that plan, call tools, coordinate sub-agents, and run for minutes to hours of wall-clock time per task — creates a different bottleneck. The orchestration layer (planning, tool dispatch, memory management, context construction, retrieval) is predominantly CPU-bound, and ARM cores designed for cloud workloads fit it better, per watt and per dollar, than either GPUs or general-purpose x86 server chips. Meta's CPU strategy now spans three vendors: Nvidia's Grace ARM CPUs (deployed in February), AWS Graviton 5 (this deal), and its own AGI CPU with ARM (in development, with no shipping date yet). The company is hedging across all three because none is a perfect fit and because supply from any single vendor is constrained.
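The division of labor described above can be sketched in a few lines. In the illustrative loop below, every step except the model call itself — context construction, tool dispatch, memory management — is ordinary CPU work; only inference touches an accelerator. All function names here are hypothetical stand-ins, not Meta's or AWS's APIs.

```python
# Illustrative sketch of why agent orchestration is CPU-bound.
# Every name below (build_context, dispatch_tool, etc.) is hypothetical.

def build_context(memory: list[str], task: str) -> str:
    """Context construction: retrieval and string assembly -- pure CPU work."""
    return "\n".join(memory[-10:]) + "\n" + task

def dispatch_tool(name: str, arg: str) -> str:
    """Tool dispatch: routing, parsing, I/O -- also CPU work."""
    tools = {"echo": lambda a: a, "upper": lambda a: a.upper()}
    return tools[name](arg)

def call_model(prompt: str) -> dict:
    """Stand-in for the one accelerator-bound step (model inference)."""
    # A real system would call a GPU-backed inference endpoint here.
    return {"tool": "upper", "arg": prompt.split("\n")[-1]}

def run_agent(task: str, steps: int = 3) -> list[str]:
    """One agentic task: many CPU-side orchestration steps per model call."""
    memory: list[str] = []
    results: list[str] = []
    for _ in range(steps):
        prompt = build_context(memory, task)                    # CPU
        action = call_model(prompt)                             # accelerator
        result = dispatch_tool(action["tool"], action["arg"])   # CPU
        memory.append(result)                                   # CPU
        results.append(result)
    return results
```

The point of the sketch is the ratio: for each accelerator call there are several orchestration steps, and a long-running agent repeats that loop thousands of times — which is the workload profile that favors cheap, power-efficient cloud ARM cores.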

What This Says About Meta's Agent Roadmap

The scale of the Graviton 5 commitment — tens of millions of cores — is consistent with Meta planning a very large internal deployment of agentic systems, likely supporting both consumer-facing assistants across Meta's apps and internal automation workloads. CPUs at that scale also imply Meta expects most agentic compute to run on commodity hardware rather than on the GPU clusters reserved for frontier-model training. For the broader market, the deal is a significant validation of AWS's chip strategy: Amazon has spent eight years building Graviton against Intel and AMD on price-performance, and landing Meta as a flagship external customer at this scale puts Graviton on the same competitive footing as Nvidia's Grace and Google's Axion in the agentic-AI infrastructure race.


Related Stories

AWS Has Billions in Both Anthropic and OpenAI. Its Boss Explains Why That's Not a Problem.
Industry

Amazon Web Services CEO Matt Garman defended the company's parallel multi-billion dollar investments in both Anthropic and OpenAI in a wide-ranging interview this week. The explanation reveals a cloud strategy built on AI model agnosticism — and a bet that AWS wins regardless of which AI lab dominates, as long as the compute runs on its infrastructure.

D.O.T.S AI Newsroom
Anthropic Poaches Microsoft's Azure AI Chief to Fix Its Infrastructure Problem
Industry

Anthropic has recruited Eric Boyd, a senior Microsoft executive who led Azure AI services, as its new head of infrastructure. The hire is a direct response to the scaling bottlenecks that have limited Claude's availability during peak demand — and signals that Anthropic is treating infrastructure as a first-tier strategic priority heading into 2026.

D.O.T.S AI Newsroom
Intel's Nerdy Bet on Advanced Chip Packaging Could Decide Who Wins the AI Infrastructure Race
Industry

As the AI buildout pushes the limits of what individual chips can do, the unglamorous discipline of chip packaging — connecting multiple dies into a single system — is emerging as a genuine competitive moat. Wired reports that Intel is making an aggressive bet on advanced packaging technology that could position the company at the center of the next phase of AI hardware scaling, even as it struggles to compete on raw process technology.

D.O.T.S AI Newsroom