Alphabet Inc. may not often be mentioned in the same breath as Nvidia when it comes to semiconductors, but analysts at D.A. Davidson argue the Google parent company could be a formidable domestic rival in the artificial intelligence (AI) accelerator market.
The firm suggests that Alphabet’s growing tensor processing unit (TPU) business, combined with its DeepMind AI research arm, could be worth as much as $900 billion if spun off — a significant increase from the $717 billion valuation it had estimated earlier this year.
According to D.A. Davidson analysts led by Gil Luria, Google’s custom-designed TPUs are attracting growing attention from researchers and engineers working at frontier AI labs.
The firm noted “positive sentiment” around these accelerators, which are purpose-built for machine learning and AI workloads.
Google’s sixth-generation Trillium TPUs, which were made widely available in December, are already in high demand.
The upcoming seventh-generation Ironwood TPUs, designed specifically for large-scale inference — the process of running AI models after training — are expected to further accelerate adoption.
Performance metrics underscore their appeal: TPU pods can scale up to 42.5 exaflops, and the chips benefit from expanded high-bandwidth memory capacity.
Analysts also noted the cost efficiency of the chips, which is driving adoption beyond Google’s internal infrastructure.
Alphabet has so far partnered exclusively with Broadcom Inc. to co-design and supply its TPUs.
However, reports suggest the company is exploring a collaboration with Taiwan-based MediaTek Inc. for its Ironwood generation.
Proximity to Taiwan Semiconductor Manufacturing Co., coupled with MediaTek’s ability to provide chips at lower costs, is seen as a key motivation behind the potential shift.
Meanwhile, notable AI firms are increasingly adopting TPUs.
Startup Anthropic has been hiring TPU kernel engineers, signaling potential diversification away from Amazon Web Services’ Trainium chips, despite AWS’s $8 billion investment in the company.
Elon Musk’s xAI has also shown interest, driven by improvements in JAX-TPU tooling that make the ecosystem more accessible outside Google’s internal environment.
JAX, developed by Google, is a Python library for high-performance numerical computing; it compiles array code through the XLA compiler, letting the same program run on CPUs, GPUs, or TPUs.
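To illustrate why that portability matters, here is a minimal sketch of JAX in practice. The function and values below are illustrative, not drawn from any particular lab's workload; the point is that the same jit-compiled code targets whatever accelerator backend is available, TPU included.

```python
import jax
import jax.numpy as jnp

# jax.jit compiles the function via XLA; on a TPU host the compiled
# kernel runs on the TPU, on a CPU-only machine it runs unchanged.
@jax.jit
def affine(w, x, b):
    # A simple affine transform, the building block of neural layers.
    return jnp.dot(w, x) + b

key = jax.random.PRNGKey(0)          # deterministic PRNG seed
w = jax.random.normal(key, (4, 3))   # random 4x3 weight matrix
x = jnp.ones((3,))                   # input vector
b = jnp.zeros((4,))                  # bias vector
y = affine(w, x, b)
print(y.shape)  # (4,)
```

Because the accelerator choice is a deployment detail rather than a code change, improvements in JAX-TPU tooling lower the switching cost for labs currently writing against Nvidia GPUs.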
D.A. Davidson’s data points to rising traction: developer activity around TPUs on Google Cloud grew nearly 96% between February and August, based on the firm’s DaVinci Developer Dataset.
Despite acknowledging Alphabet’s underappreciated position in AI semiconductors, D.A. Davidson analysts remain cautious on the likelihood of a near-term spinoff.
They argue that a “big-bang breakup” of Google could unlock shareholder value, but view such a move as unlikely in the current environment.
Absent a structural separation, Alphabet’s TPU and DeepMind operations remain embedded within its broader portfolio, which analysts believe leaves the business “well-undervalued.”
Still, D.A. Davidson raised its price target on Google shares to $190 while maintaining a neutral rating.
For now, Nvidia continues to dominate headlines and market share in AI hardware.
But as demand for large-scale inference and cost-efficient accelerators grows, Alphabet’s TPUs may represent a quietly emerging alternative for investors and AI developers alike.
The post Google’s TPU business seen as $900B opportunity amid growing AI demand appeared first on Invezz