We noticed a big disparity in the AI semis world this week. The big news was Nvidia’s massive results, with probably the biggest jump in revenue we have ever seen for a company of its size. At the same time we also attended a briefing covering Qualcomm’s AI capabilities. Qualcomm actually has a fairly impressive strategy for AI. They have designed neural network blocks for their Snapdragon processors, and these are quite capable, but it is hard to see AI driving the same kind of revenue growth for them that it did for Nvidia. We have contrasted the two companies in the past, but here we think the difference in outcomes speaks to some difficult realities of the semis industry today.
The case for Nvidia is clear – they are in the dominant position for AI compute at just the moment when AI has become incredibly important to seemingly everyone. They have locked up the market for training semis and have a clear lead in data center inference semis. The right product at the right time, all protected by a software barrier in the form of CUDA.
By contrast, Qualcomm has developed some very performant AI IP which they include in their Snapdragon processors. They ran demos of this at Mobile World Congress in March, and we suspect they have a very capable solution for the latest generative models like ChatGPT. Their chief competitor, MediaTek, has not demonstrated much on the AI front. Only Apple has anything comparable in the form of the AI blocks on the A-series processors in the iPhone, but we suspect Qualcomm’s solution outperforms even that on many metrics. The trouble is Qualcomm has no easy way to monetize this advantage.
This is not a failure of Qualcomm’s strategy; rather, it reflects their position in the market. As we noted in our piece on the AI semis landscape, the market for edge AI inference is big, much bigger than the market for training semis, but these functions are unlikely to require standalone chips. Most of the market for edge inference semis is likely to come in the form of Systems on a Chip (SoCs), which incorporate a number of functions, sometimes including AI math.
And there are many reasons to think that AI capabilities on the edge are going to be important. The economics of AI inference are proving to be punishing. The hyperscalers are having to invest massively to support these calculations. The market really needs these workloads to be pushed to the edge, and offloading GPT-type requests to on-device smartphone AI blocks is going to be critical as these systems get more widely deployed. There are also major security and privacy considerations pushing the importance of edge inference. Still, all of that is going to occur in SoCs, which is already a fairly mature market.
At this stage, we think it is unlikely that customers are going to switch to Qualcomm processors because of their AI capabilities, at least for smartphones. These customers already have clear-cut reasons for choosing, or not choosing, Qualcomm. Perhaps at the boundaries where Qualcomm faces off against MediaTek, say $400–$600 devices, the company can pick up some incremental share, but Qualcomm’s AI capabilities are not sufficient to push Apple to use Snapdragon. AI probably does help Qualcomm broaden the appeal of its PC CPU offerings. This is still tiny revenue for the company, but their AI capabilities are ahead of what competitors offer to PC makers. Probably the best hope is that AI encourages consumers to refresh their phones sooner, sparking an upgrade cycle, but even this will take a while to play out – consumers first have to understand what AI can do, and that case is not entirely clear.
Qualcomm is not alone in this dilemma. For the time being, the AI market for semis looks set to remain a winner-takes-all market for Nvidia. The data center market will remain highly contested, while all the other edge markets like phones, IoT and industrial systems offer far fewer gains for vendors, at least for the foreseeable future.