Gizcoupon reported on April 9th that Qualcomm beat Nvidia in power efficiency for AI chips, according to MLCommons’ latest test data. In image classification, Qualcomm’s Cloud AI 100 chip achieved 227.4 server queries per watt, while Nvidia’s H100 chip achieved only 108.4. Qualcomm’s chip was also more efficient at object detection, a technique used in surveillance video analysis. In natural language processing, however, Nvidia retained the lead in both performance and power efficiency. Both companies aim to gain a foothold in the growing data center market for AI chips.
Comparing Qualcomm’s Cloud AI 100 and Nvidia’s flagship H100 chip
AI chips are specialized hardware used for training and running AI models, with Qualcomm and Nvidia leading the industry. Training requires large amounts of data to improve accuracy, while inference, running those trained models in production, adds to enterprise costs because of high electricity consumption.
Qualcomm drew on its experience designing chips for low-power devices to address this challenge and launched the Cloud AI 100. This chip is designed to offer high-performance, low-power AI processing for the cloud and edge. In MLCommons’ testing, the Cloud AI 100 outperformed Nvidia’s flagship H100 chip in two power efficiency metrics.
In image classification, which identifies objects and scenes in pictures, Qualcomm’s chip delivered more than twice as many server queries per watt as Nvidia’s. The Cloud AI 100 also led in object detection, which can analyze retail surveillance video to understand customer behavior, with 3.8 queries per watt to Nvidia’s 2.4.
In natural language processing, however, a technology widely used in systems such as chatbots, Nvidia remained superior in both performance and power efficiency, achieving 10.8 queries per watt to Qualcomm’s 8.9.
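Putting the reported figures side by side makes the size of each efficiency gap clearer. The short sketch below (illustrative only, using the per-watt numbers from the MLCommons results cited above) computes the leader and the lead ratio for each workload:

```python
# Reported MLCommons power-efficiency results, in server queries per watt.
results = {
    "image classification":        {"Qualcomm Cloud AI 100": 227.4, "Nvidia H100": 108.4},
    "object detection":            {"Qualcomm Cloud AI 100": 3.8,   "Nvidia H100": 2.4},
    "natural language processing": {"Qualcomm Cloud AI 100": 8.9,   "Nvidia H100": 10.8},
}

for task, chips in results.items():
    # Chip with the higher queries-per-watt figure wins the efficiency test.
    leader = max(chips, key=chips.get)
    trailer = min(chips, key=chips.get)
    ratio = chips[leader] / chips[trailer]
    print(f"{task}: {leader} leads at {chips[leader]} queries/watt ({ratio:.2f}x)")
```

Run as-is, this shows Qualcomm ahead by roughly 2.1x in image classification and 1.6x in object detection, while Nvidia leads by about 1.2x in natural language processing.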
Qualcomm and Nvidia are both vying for a share of the data center AI chip market as more enterprises incorporate AI technology. The market is expected to grow rapidly, and efficient AI chips will be essential for managing costs and improving performance.