The artificial intelligence data center buildout shows no signs of slowing down, and that creates a significant opportunity for leading AI chipmakers. Industry analysts point to two semiconductor companies positioned to deliver substantial returns over the next five years as demand for AI infrastructure continues to surge. With multiple tech giants preparing to invest hundreds of billions of dollars in AI infrastructure, the market for AI chips looks poised for sustained growth through the end of the decade.
Nvidia and Broadcom have emerged as the top contenders in the AI semiconductor space, each offering distinct technological advantages. According to industry reports, five major companies alone are expected to spend $700 billion on AI infrastructure in the current year, with spending projections climbing higher in subsequent years.
Nvidia Maintains Dominant Position in AI Infrastructure
Nvidia continues to hold its position as the dominant player in AI infrastructure through its graphics processing units, which remain the primary chips used for training AI models. The company has built a substantial competitive moat through its CUDA software platform, on which nearly all foundational AI code has been written. Additionally, Nvidia's proprietary NVLink interconnect allows its chips to communicate with one another at high speed, enabling them to function as a single, more powerful unit.
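For context on what that software moat looks like in practice, the snippet below is a minimal, generic CUDA kernel of the kind developers write against Nvidia's platform. It is an illustrative vector-addition sketch only, not code from Nvidia or from any AI framework.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// A CUDA kernel: a function that runs in parallel across thousands of GPU threads.
__global__ void vector_add(const float* a, const float* b, float* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // each thread handles one element
    if (i < n) {
        out[i] = a[i] + b[i];
    }
}

int main() {
    const int n = 1 << 20;              // one million elements
    const size_t bytes = n * sizeof(float);

    // Allocate memory accessible from both the CPU and the GPU.
    float *a, *b, *out;
    cudaMallocManaged(&a, bytes);
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&out, bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    // Launch enough thread blocks to cover all n elements.
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    vector_add<<<blocks, threads>>>(a, b, out, n);
    cudaDeviceSynchronize();

    printf("out[0] = %f\n", out[0]);    // expect 3.0
    cudaFree(a); cudaFree(b); cudaFree(out);
    return 0;
}
```

Large AI frameworks sit many layers above code like this, but they ultimately compile down to CUDA kernels and libraries, which is why switching away from Nvidia's platform is costly for developers.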
The AI chipmaker has not grown complacent with its current success. Flush with cash from strong revenue growth, Nvidia has been making strategic investments throughout the AI ecosystem. According to reports, one of the company's smartest moves has been licensing Groq's technology and hiring some of its employees, positioning Nvidia to compete more effectively in the inference market, where its CUDA advantage is less pronounced.
Broadcom Emerges as ASIC Technology Leader
Broadcom has established itself as a leader in application-specific integrated circuit (ASIC) technology, helping customers create custom hardwired AI chips designed for specific tasks. While these chips lack the flexibility of GPUs, they excel at their designated functions and offer superior energy efficiency. This specialization has made Broadcom an attractive partner for companies seeking optimized AI hardware.
The semiconductor company helped Alphabet develop its highly regarded tensor processing units, which now run most of the tech giant’s internal workflows. Moreover, Alphabet has begun offering its TPUs to large customers through Google Cloud for their AI workloads. According to industry reports, this success led Anthropic to place a $21 billion TPU order through Broadcom for delivery this year.
Explosive Growth Projections for AI Chipmakers
After generating just over $20 billion in total AI revenue in fiscal 2025, including networking revenue, Broadcom appears set for remarkable expansion. Citigroup analysts have projected the company’s AI revenue could increase fivefold to $100 billion in fiscal 2027. However, these projections remain subject to market conditions and actual customer demand materializing as anticipated.
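As a rough back-of-the-envelope check (our arithmetic, not Citigroup's), growing from roughly $20 billion in fiscal 2025 to $100 billion in fiscal 2027 implies an annualized growth rate well above 100%:

$$
20 \times g^{2} = 100 \;\Rightarrow\; g = \sqrt{5} \approx 2.24
$$

In other words, AI revenue would need to more than double, growing roughly 120% to 125%, in each of the next two fiscal years for that projection to be met.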
Separate from the GPU side of the market, Broadcom's Ethernet switches give it a strong position in data center networking. The company's dual focus on custom AI chips and networking infrastructure provides multiple revenue streams within the expanding AI ecosystem. Meanwhile, Broadcom's success with Alphabet's TPUs has encouraged other companies to partner with it to develop their own custom AI ASICs.
Both AI chipmakers stand to benefit from continued demand as the artificial intelligence infrastructure buildout progresses through the remainder of the decade. However, investors should monitor whether actual spending matches current projections and how competition evolves in both the GPU and ASIC markets over the coming years.