Google to Unveil Two TPU v8 Chips This Week — MediaTek for Inference, Broadcom for Training
What if the future of AI hardware isn't just about Nvidia?
This week, Google is set to launch not one but two TPU v8 variants, each built with a different silicon partner for a very specific job.
The first chip, co-developed with MediaTek, is optimized for inference — the everyday task of running AI models that power Search, Gemini, and billions of API calls. Think fast responses, low power, massive scale.
The second, built with Broadcom, is a training powerhouse designed to connect thousands of chips together to build the next generation of AI models from scratch.
🎯 Why this matters:
- Google is vertically integrating its AI stack — designing custom silicon instead of relying solely on Nvidia
- Splitting inference and training into separate chips means each can be optimized without compromise
- MediaTek and Broadcom gain a seat at the AI hardware table, diversifying the supply chain
- This signals a broader industry shift: hyperscalers are building their own chip ecosystems
Think of it like this: one chip is a delivery van — fast, efficient, serving millions of users. The other is a construction crane — building the AI factories of tomorrow.
As the AI chip war heats up, Google has made one thing clear: it's not just a customer — it's a competitor.
📄 Source: technews-tw