Google Reclaims Ground in the AI Competition
- VinVentures
For much of the past two years, the AI conversation has orbited around OpenAI, Anthropic, and Nvidia. Google, the original pioneer of the transformer architecture, was cast as the sleeping giant: powerful, well-funded, but slow. In 2025, that narrative is reversing.
Built on a decade of deep research investment, strengthened by rapid advances in foundation models, supported by massive infrastructure expansion, enabled by Google’s proprietary Tensor Processing Unit (TPU) platform, and unified through an end-to-end AI value chain, Google is re-emerging with a credible path to leadership across the full AI stack, from silicon to cloud to models to consumer applications. In the sections below, we take a closer look at each of these pillars and how they collectively shape Google’s position in the next phase of AI.

Image: Artificial Analysis
1. Advances in Foundation Models
Google’s latest AI model, Gemini 3, was released on Nov. 18 and drew immediate attention for improvements in reasoning, coding and overall reliability. Days after the launch, Bloomberg reported that Meta Platforms Inc. is in discussions to spend billions on Google’s AI chips, a move that would further validate Google’s hardware strategy and signal rising interest in alternatives to Nvidia Corp.
Before the release of Gemini 3, Google’s earlier flagship model, Gemini 2.5 Pro Experimental, had already shown competitive strength in reasoning benchmarks. In the Artificial Analysis Intelligence Index, which aggregates seven evaluations across math, coding, science, and logic, Gemini 2.5 scored 68, ahead of comparable models from OpenAI, Anthropic, Meta, and DeepSeek. This positioned Google near the top of the field even before its latest model upgrade. The launch of Gemini 3 now builds on that foundation, aiming to extend these gains with improved performance in complex reasoning and applied tasks.

2. Expansion of AI Infrastructure
Alphabet is planning a substantial expansion of its infrastructure spending, with capital expenditures now projected at $91–93 billion in 2025. The increase, from an earlier estimate of $85 billion, reflects growing demand from cloud and AI customers and the need to scale capacity for large training and inference workloads.
Alphabet reported third-quarter revenue of $102.34 billion, a 16% year-on-year increase. For context, the company’s capex was $32.25 billion in 2023, and its initial 2025 outlook of $75 billion as of February has been revised sharply upward, illustrating how quickly infrastructure needs are accelerating.
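The pace of that acceleration is easy to quantify from the figures above. A minimal sketch of the implied growth (the comparisons themselves are illustrative, using only the numbers reported in this section):

```python
# Alphabet capex figures reported above (USD billions)
capex_2023 = 32.25        # actual 2023 capital expenditure
capex_feb_outlook = 75.0  # initial 2025 outlook, as of February
capex_2025_low, capex_2025_high = 91.0, 93.0  # revised 2025 projection

# Implied multiple over 2023 spending
multiple_low = capex_2025_low / capex_2023
multiple_high = capex_2025_high / capex_2023
print(f"2025 capex is {multiple_low:.1f}x-{multiple_high:.1f}x the 2023 level")
# -> 2025 capex is 2.8x-2.9x the 2023 level

# Size of the within-year revision from the February outlook
revision_pct = (capex_2025_low / capex_feb_outlook - 1) * 100
print(f"Revised upward by at least {revision_pct:.0f}% since February")
# -> Revised upward by at least 21% since February
```

In other words, Alphabet's projected 2025 capex is nearly three times its 2023 level, and the figure was raised by more than a fifth within 2025 alone.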
Alphabet CFO Anat Ashkenazi said the company is “investing aggressively” to meet this demand and noted that capex is expected to rise further in 2026. The revised spending plan highlights a clear trend: AI adoption is driving a sustained increase in demand for Google’s compute and cloud capacity, pushing the company to expand its infrastructure faster than previously anticipated.
3. Strength in Proprietary Silicon: Google’s TPUs
Tensor Processing Units (TPUs) have been under development at Google for more than a decade, and the company’s latest generation, Ironwood, builds directly on that long-running effort. Ironwood is Google’s seventh-generation TPU and represents a significant step forward in both performance and efficiency. According to the company, the new chip delivers up to 10x the compute power of the previous generation and operates about 50% more efficiently, lowering overall cost per operation and reducing heat output compared with traditional GPU-based systems.
Initially introduced for testing in April, Ironwood is expected to become broadly available for public use in the coming weeks. The chip is designed to support a wide range of workloads, from large-scale model training to real-time inference for chatbots and AI agents. A single Ironwood pod can link up to 9,216 chips, a configuration intended to minimize data bottlenecks and enable the training and deployment of the largest, most data-intensive models.
Ironwood has already attracted major customers. Anthropic, the developer of the Claude model family, recently agreed to use up to 1 million TPUs under a multiyear deal valued in the tens of billions of dollars. Bloomberg has also reported that Meta Platforms Inc. is in discussions to spend billions on Google’s AI chips. These deals underscore growing interest in alternatives to Nvidia GPUs, which remain both costly and supply-constrained.
4. Deep Research Roots as the Long-Term Edge
Google’s AI capabilities didn’t emerge overnight. The company acquired DeepMind in 2014 and for years invested heavily in ambitious, non-commercial research, from protein folding (AlphaFold) to reinforcement learning. Some of these efforts contributed little to immediate revenue, but they built institutional knowledge that is now differentiating Google’s foundation models.
After restructuring all AI efforts under Demis Hassabis, Google aligned its scientific research, engineering, and compute infrastructure into a single vertical engine, something no other company has achieved at comparable scale.
5. Integration Across the Full AI Value Chain
A key component of Google’s position in the AI market is its ability to operate across the entire computing stack. Few companies can match this level of vertical integration:
- AI applications (Gemini, image-generation tools, Nano Banana)
- AI models (Gemini family)
- Cloud computing infrastructure (Google Cloud)
- Silicon (TPUs)
- Data flywheel (Search, Android, YouTube; much of this data Google keeps proprietary)
- Humanoid robotics (Gemini Robotics)
Google develops consumer-facing AI applications such as Gemini and its image-generation tools; it builds foundation models through the Gemini family; it operates one of the world’s largest cloud computing platforms; it designs its own custom silicon through the TPU program; and it benefits from a substantial data ecosystem spanning Search, Android, and YouTube, much of which remains proprietary and feeds directly into model improvement.

This integrated structure is reflected in Google’s operating scale. Gemini processes roughly 7 billion tokens per minute through its API, equivalent to handling the full text of the Library of Congress every six hours. The Gemini app has expanded rapidly, increasing from 450 million to 650 million monthly active users in a single quarter, a 44% rise. On the enterprise side, Google Cloud reports that nine out of ten AI research labs use its platform in some capacity.
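As a quick check on the user-growth figure above, the quarter-over-quarter increase works out as follows (a trivial sketch, using only the numbers reported in this section):

```python
# Gemini app monthly active users (millions), start and end of quarter
mau_start, mau_end = 450, 650

# Quarter-over-quarter growth rate
growth_pct = (mau_end - mau_start) / mau_start * 100
print(f"MAU growth: {growth_pct:.0f}%")  # -> MAU growth: 44%
```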
6. Conclusion: Google’s Compounding Advantage in AI
Google’s momentum in AI is being driven by a reinforcing flywheel rather than any single product launch. Years of deep research investment, spanning DeepMind’s scientific breakthroughs to Google’s long-term work in reinforcement learning, continue to strengthen the foundations of its models. Stronger models create demand for more compute; Google’s TPUs provide that compute at lower cost and higher efficiency; better performance attracts more enterprise adoption; enterprise usage produces more data; and that data feeds the next round of model improvement. Rising demand then justifies further capex, now projected at $91–93 billion for 2025, which expands the infrastructure that powers the entire cycle.
This dynamic is visible in the numbers. Gemini processes 7 billion tokens per minute, the Gemini app has grown to 650 million monthly active users, and nine out of ten AI labs rely on Google Cloud. These inputs reinforce one another, strengthening Google’s position across the stack.
With capabilities spanning custom silicon, cloud infrastructure, foundation models, applications, and proprietary data, Google is one of the few companies operating across the full value chain. As compute, data, and distribution become the defining constraints of the next AI wave, Google is increasingly positioned at the center of the ecosystem.
References:
Bergen, M. (2025, November 25). Google, the sleeping giant in the global AI race, is now fully awake. Bloomberg. https://www.bloomberg.com/news/articles/2025-11-25/google-the-sleeping-giant-in-global-ai-race-now-fully-awake
Eadicicco, L. (2025, October 31). Big Tech keeps splurging on AI: The pressure is ramping up to show why. CNN. https://www.cnn.com/2025/10/31/tech/microsoft-amazon-meta-google-earnings-ai
Parekh, M. (2025, June 29). AI: Google TPU sees notable OpenAI win vs Nvidia. RTZ #766. AI: Reset to Zero. https://michaelparekh.substack.com/p/ai-google-tpu-sees-notable-openai
Sigalos, M., & Elias, J. (2025, November 6). Google’s rolling out its most powerful AI chip, taking aim at Nvidia with custom silicon. CNBC. https://www.cnbc.com/2025/11/06/google-unveils-ironwood-seventh-generation-tpu-competing-with-nvidia.html

