Anthropic’s announcement on April 6, 2026 was revealing not because it promised yet another incremental model upgrade, but because it exposed what the frontier-AI race is increasingly fought over: access to chips, data-center capacity, and above all electricity. The company said it had signed a new agreement with Google and Broadcom for “multiple gigawatts” of next-generation TPU capacity, expected to come online starting in 2027, while its revenue run rate had already climbed above $30 billion, up from about $9 billion at the end of 2025. Anthropic also said the vast majority of this new compute would be located in the United States, extending its earlier commitment to invest $50 billion in American computing infrastructure. (anthropic.com)
The scale is easier to grasp when one follows the numbers to their physical meaning. Broadcom’s April 6, 2026 filing says Anthropic will access approximately 3.5 gigawatts of next-generation TPU-based AI compute beginning in 2027. If such capacity ran continuously, that would equal about 30.66 terawatt-hours per year. As a rough comparison, Google reports that all of its data centers consumed 30.8 million megawatt-hours of electricity in 2024, or about 30.8 terawatt-hours. In other words, one customer’s future AI reservation is already in the same annual energy ballpark as Google’s entire reported data-center electricity use just two years earlier. (sec.gov)
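The conversion above is simple to verify. A minimal sketch, assuming continuous operation at the full rated 3.5 GW (an upper bound, since real utilization would be lower):

```python
# Back-of-envelope check: 3.5 GW of continuous draw, expressed in TWh/year.
POWER_GW = 3.5
HOURS_PER_YEAR = 8760  # 365 days * 24 hours

energy_twh = POWER_GW * HOURS_PER_YEAR / 1000  # GWh -> TWh
print(f"{energy_twh:.2f} TWh/year")  # 30.66

# Google's reported 2024 data-center consumption, for comparison:
google_2024_twh = 30.8e6 / 1e6  # 30.8 million MWh -> TWh
print(f"ratio to Google 2024: {energy_twh / google_2024_twh:.2f}")  # ~1.00
```

The ratio of essentially 1.0 is what makes the comparison striking: a single customer's forward reservation matching an entire hyperscaler's recent annual footprint.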
This is why the contest can no longer be described simply as “whose model is smarter.” Google’s Ironwood TPU, introduced in April 2025, was marketed not only as more powerful but as more power-efficient: Google says it is the company’s first TPU designed specifically for inference, scales to 9,216 liquid-cooled chips per pod spanning nearly 10 MW, and delivers roughly twice the performance per watt of Trillium. When power availability becomes a binding constraint, efficiency stops being a technical footnote and becomes a strategic weapon. (blog.google)
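The pod figures Google cites imply a rough per-chip power budget. This is an order-of-magnitude sketch only, since both "9,216 chips" and "nearly 10 MW" are approximate and the pod total includes cooling and networking overhead, not just the chips themselves:

```python
# Rough per-chip power implied by the Ironwood pod figures cited above.
POD_POWER_MW = 10      # "nearly 10 MW" per pod (approximate)
CHIPS_PER_POD = 9216   # liquid-cooled chips per pod

watts_per_chip = POD_POWER_MW * 1e6 / CHIPS_PER_POD
print(f"~{watts_per_chip:.0f} W per chip (incl. overhead)")  # ~1085
```

At roughly a kilowatt per accelerator slot, it becomes clear why a "2x performance per watt" claim is a headline metric rather than a footnote: at multi-gigawatt scale, efficiency gains translate directly into how many chips a fixed power allocation can host.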
The broader energy backdrop makes the point even sharper. The IEA said in April 2025 that global electricity consumption by data centers is projected to more than double to around 945 TWh by 2030, largely because of AI. So the real chokepoint in AI supremacy may not be algorithmic brilliance alone, but the ability to secure long-term supply chains for advanced accelerators and enough reliable power to keep them running. In that sense, Anthropic’s TPU deal is less a chip story than a geopolitical and industrial one. (iea.org)
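Putting the two headline numbers side by side shows how large a slice one company's reservation would be. A hedged sketch, again assuming full continuous utilization of the 3.5 GW figure against the IEA's roughly 945 TWh projection for 2030:

```python
# Share of projected 2030 global data-center electricity demand that
# Anthropic's TPU reservation would represent, if run continuously.
anthropic_twh = 3.5 * 8760 / 1000  # 3.5 GW at full utilization
iea_2030_twh = 945                 # IEA April 2025 projection

share = anthropic_twh / iea_2030_twh
print(f"{share:.1%} of projected 2030 demand")  # 3.2%
```

Roughly three percent of projected worldwide data-center demand attributable to one company's single compute agreement underscores why power procurement, not model architecture, is becoming the binding constraint.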