Nvidia Partners With Microsoft to Build Massive AI Computer on Azure Cloud

The US chip designer and computing firm Nvidia on Wednesday said it is teaming up with Microsoft to build a “massive” computer to handle intense artificial intelligence computing work in the cloud.

The AI computer will operate on Microsoft’s Azure cloud, using tens of thousands of graphics processing units (GPUs), Nvidia’s most powerful H100 and its A100 chips. Nvidia declined to say how much the deal is worth, but industry sources said each A100 chip is priced at about $10,000 (nearly Rs. 8,14,700) to $12,000 (nearly Rs. 9,77,600), and the H100 is far more expensive than that.

“We’re at that inflection point where AI is coming to the enterprise and getting those services out there that customers can use to deploy AI for business use cases is becoming real,” Ian Buck, Nvidia’s general manager for Hyperscale and HPC, told Reuters. “We’re seeing a broad groundswell of AI adoption… and the need for applying AI for enterprise use cases.”

In addition to selling Microsoft the chips, Nvidia said it will partner with the software and cloud giant to develop AI models. Buck said Nvidia would also be a customer of Microsoft’s AI cloud computer and develop AI applications on it to offer services to customers.

The rapid growth of AI models, such as those used for natural language processing, has sharply boosted demand for faster, more powerful computing infrastructure.

Nvidia said Azure would be the first public cloud to use its Quantum-2 InfiniBand networking technology, which has a speed of 400Gbps. That networking technology links servers at high speed, which is important because heavy AI computing work requires thousands of chips to work together across several servers.

© Thomson Reuters 2022

