
Nvidia has agreed to acquire key assets from AI chip startup Groq in a transaction valued at approximately $20 billion, marking the largest deal in the company’s history and underscoring its determination to dominate the fast-growing market for artificial intelligence inference. While the structure stops short of a full corporate acquisition, the scale, strategic importance, and talent transfer make it one of the most consequential AI hardware deals to date.
The agreement centers on Groq’s inference technology and intellectual property, along with the integration of its senior leadership into Nvidia. Groq, founded by engineers behind Google’s Tensor Processing Unit, has been viewed as one of the most credible challengers to Nvidia’s dominance in AI acceleration.
At roughly $20 billion in cash, the deal dwarfs Nvidia’s previous largest acquisition, the nearly $7 billion purchase of Mellanox in 2019. The size of the transaction reflects both Nvidia’s immense financial firepower and the premium now placed on advanced AI hardware technologies.
As of late October, Nvidia held $60.6 billion in cash and short-term investments, up sharply from $13.3 billion in early 2023, giving it ample flexibility to pursue large strategic moves. This Groq transaction represents a significant deployment of that capital toward securing long-term leadership in AI infrastructure.
Despite the transaction's size, Nvidia has emphasized that it is not acquiring Groq as a company. Instead, it is purchasing assets and licensing core technology, a structure that mirrors Nvidia’s recent approach of absorbing innovation while minimizing integration risk.
According to investors close to the deal, Nvidia is acquiring Groq’s core inference assets, including proprietary chip designs and low-latency processing technology. These capabilities are designed to accelerate the “inference” stage of AI models, when trained systems generate real-time outputs such as text, images, or decisions.
Groq’s cloud business, GroqCloud, is excluded from the transaction and will continue operating independently. Groq itself will remain an independent company, now led by finance chief Simon Edwards as CEO.
Groq characterized the transaction as a non-exclusive licensing agreement, though the economic and operational reality is far more expansive. Founder and CEO Jonathan Ross, president Sunny Madra, and other senior leaders will join Nvidia, significantly bolstering its engineering and research bench.
In an internal message to employees, Nvidia CEO Jensen Huang described the deal as a way to expand Nvidia’s AI factory architecture and better serve real-time and low-latency workloads.
Nvidia plans to integrate Groq’s processors into its broader AI platform, which already powers data centers, cloud providers, and enterprise AI deployments worldwide. The move strengthens Nvidia’s position not just in training massive AI models, but also in inference, which is widely expected to account for a growing share of AI compute spending over the next decade.
Inference workloads are especially critical as AI moves from experimentation into everyday products, from chatbots and search to robotics and autonomous systems, where speed, efficiency, and cost per query matter as much as raw power.
Founded in 2016, Groq emerged from a team of former Google engineers, including Ross, who helped design Google’s TPU chips. The company positioned itself as a specialist in deterministic, low-latency AI inference, an area where even small performance gains can translate into major cost savings at scale.
Just three months ago, Groq raised $750 million at a valuation of roughly $6.9 billion. That round was led by Disruptive and included high-profile investors such as BlackRock, Neuberger Berman, Samsung, Cisco, Altimeter, and 1789 Capital. Including that round, investors have poured well over $1 billion into the company since inception.
Groq was targeting approximately $500 million in revenue this year, driven by surging demand for AI accelerators as enterprises race to deploy large language models in production environments. According to investors, the company was not actively seeking a sale when Nvidia approached.
The Groq deal fits squarely into a broader industry pattern. Rather than traditional acquisitions, major technology firms are increasingly using licensing agreements and talent absorption to secure critical AI capabilities.
Nvidia itself executed a similar, though smaller, transaction earlier this year, paying more than $900 million to bring in leadership and technology from AI hardware startup Enfabrica. Across the sector, companies such as Meta, Google, and Microsoft have spent billions to recruit elite AI teams through unconventional deal structures.
Nvidia has also expanded its influence through strategic investments, backing AI and energy infrastructure company Crusoe, AI model developer Cohere, and cloud provider CoreWeave. It has publicly discussed plans to invest up to $100 billion in OpenAI infrastructure and committed $5 billion to Intel as part of a strategic partnership, highlighting the scale of its ambitions.
By absorbing Groq’s core inference technology, Nvidia neutralizes a potential rival while strengthening its own product roadmap. Groq had been one of the few startups widely regarded as capable of challenging Nvidia’s GPUs in specific AI workloads.
The move also sends a clear signal to the market that Nvidia intends to remain the central platform for AI computing, even as specialized chipmakers attempt to carve out niches. Other startups, such as Cerebras Systems, have also sought to compete with Nvidia but face increasing pressure as consolidation accelerates.
Cerebras, which had filed for an IPO amid the AI boom, withdrew its offering after raising more than $1 billion privately, underscoring how volatile and capital-intensive the AI hardware race has become.
This $20 billion Groq asset deal represents more than a single transaction. It reflects Nvidia’s broader strategy of combining massive capital resources, elite talent acquisition, and ecosystem control to shape the future of AI infrastructure.
As AI inference becomes a dominant driver of compute demand, Nvidia is positioning itself not just as a supplier of chips, but as the foundational platform on which real-time AI applications are built. The Groq deal accelerates that vision and raises the bar for competitors across the semiconductor and AI hardware industry.