January 28, 2026 /SemiMedia/ — Shipments of AI computing ASICs used in servers are expected to triple by 2027 compared with 2024 levels, as cloud companies expand the use of in-house chips to support growing AI workloads, Counterpoint Research said.
The research firm said demand is being driven by continued investment in Google’s TPU infrastructure, the expansion of AWS Trainium clusters, and rising shipments of internal chips developed by Meta and Microsoft as their product portfolios mature.
Google is expected to remain the largest supplier of AI server ASICs through 2027, supported by the growth of its Gemini models, Counterpoint said. However, its market share is forecast to decline as the overall market expands and other hyperscalers increase deployments of their own chips.
The broader AI server ASIC market is shifting away from a highly concentrated structure dominated by Google and AWS in 2024, toward a more diversified landscape. By 2027, shipments from Meta and Microsoft are expected to account for a larger share of the total.
Counterpoint said the trend highlights a strategic move by hyperscale data center operators to reduce reliance on off-the-shelf processors and focus on custom chips that can deliver better performance per watt for specific AI workloads.