The chip war is outdated; the real competition in AI lies in "efficiency".

You are all looking in the wrong direction.

Investors are still debating who will win the AI chip war, but that is only the first half of the game. Nvidia dominates with GPUs, AMD trails behind, and Broadcom helps other companies build custom chips… It all looks very lively.
But here is the truth: energy efficiency is the next decisive battleground.

Why? Because power consumption is where the real money goes.
GPUs are great for the AI training phase, and training is essentially a one-time cost. But what happens after training? Inference runs around the clock, and that is where the long-term costs pile up and high power consumption really starts to hurt.
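To make the one-time-training versus always-on-inference point concrete, here is a rough back-of-envelope sketch in Python. Every number in it is an illustrative assumption chosen for the example, not a figure from this article.

```python
# Back-of-envelope comparison of one-time training energy vs. ongoing inference energy.
# All numbers below are illustrative assumptions, not figures from the article.

TRAINING_MWH = 10_000            # assumed one-time energy for a large training run (MWh)
ENERGY_PER_QUERY_WH = 0.5        # assumed energy per inference query (Wh)
QUERIES_PER_DAY = 500_000_000    # assumed daily query volume
PRICE_PER_MWH_USD = 80           # assumed industrial electricity price (USD per MWh)

training_cost = TRAINING_MWH * PRICE_PER_MWH_USD
daily_inference_mwh = ENERGY_PER_QUERY_WH * QUERIES_PER_DAY / 1_000_000  # Wh -> MWh
yearly_inference_cost = daily_inference_mwh * 365 * PRICE_PER_MWH_USD

print(f"One-time training electricity: ${training_cost:,.0f}")
print(f"One year of inference electricity: ${yearly_inference_cost:,.0f}")
```

With these made-up but plausible inputs, a single year of inference electricity costs roughly nine times the training run itself, which is why a chip that cuts watts per query compounds into real savings.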
Alphabet (Google) has a clear advantage here: it has spent more than a decade developing its own TPUs (Tensor Processing Units) and is now on the seventh generation. These chips are tailored to its own TensorFlow framework and deeply integrated with Google Cloud infrastructure. The result: stronger performance at lower power consumption.
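To illustrate what "deeply integrated with Google Cloud" means in practice, here is a minimal sketch of how a TensorFlow workload attaches to a Cloud TPU. It assumes the code already runs on a Cloud TPU VM (hence the tpu="local" resolver argument), and the tiny Keras model is just a placeholder; this is an illustration of the integration path, not Google's internal setup.

```python
import tensorflow as tf

# Discover and initialize the TPU attached to this machine.
# (tpu="local" assumes the script runs directly on a Cloud TPU VM.)
resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="local")
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)

# TPUStrategy replicates work across all TPU cores in the slice.
strategy = tf.distribute.TPUStrategy(resolver)

with strategy.scope():
    # Placeholder model: anything built inside this scope is placed on the TPU cores.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(10),
    ])
    model.compile(
        optimizer="adam",
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    )
```

Note that this path only exists inside Google's own cloud; there is no equivalent for hardware you buy and rack yourself.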
For comparison: OpenAI and Perplexity mainly rely on Nvidia GPUs, which are more expensive and consume more power. Google trains Gemini using its own TPUs, significantly reducing operational costs.
The strategy goes deeper
Even more telling: Alphabet does not sell TPUs to outsiders. If you want its chips, you have to run your business on Google Cloud, which hands Google multiple revenue streams at once: chips, cloud services, software… a complete ecosystem.
And Nvidia's recent flurry of investments and acquisitions? Word that OpenAI was testing Google's TPUs reportedly spooked Nvidia into quickly signing a big deal with OpenAI. That reaction is an indirect acknowledgment that the industry takes Alphabet's integrated approach seriously.
Who will win the next phase?
What does Google have now?
The Gemini 3 model (analysts say it beats same-generation rivals on certain benchmarks)
Vertex AI platform (enables customers to quickly build applications)
Proprietary fiber optic network (reduces latency)
A pending move to acquire the cybersecurity company Wiz
No other company's AI technology stack is this comprehensively integrated. From chips to software to infrastructure, it is a complete package.

That is the power of vertical integration: while others are fighting battles over single points, Alphabet is fighting an ecosystem war.