At its Ignite conference earlier this week, Microsoft unveiled two new chips: the Azure Maia AI Accelerator, built for AI workloads, and the Azure Cobalt CPU, tailored for general-purpose computing tasks. The chips will roll out to the company’s data centers early next year, Microsoft said in a blog post.
Maia is designed to help AI systems process massive amounts of information more quickly. The chip is already being tested with the company’s new Bing and Office AI products as well as with OpenAI’s large language models, with OpenAI CEO Sam Altman providing feedback.
Tech giants want to meet the AI chip demand
The new chips are well-timed. Companies have been scrambling to secure Nvidia’s AI chips to power their own generative AI products ever since the AI chatbot ChatGPT broke out nearly a year ago. The steep demand for its high-performing chips has been a revenue boon for Nvidia, the current leader in the AI chip market, and other companies want in on that.
Tech giants have been building their own Nvidia-like chips tailored for AI applications. Amazon first introduced its AI chips, Inferentia and Trainium, in 2019; Google’s equivalents are its Tensor Processing Units, which the company has offered to customers since 2018.
“At the scale we operate, it’s important for us to optimize and integrate every layer of the infrastructure stack to maximize performance, diversify our supply chain and give customers infrastructure choice,” Scott Guthrie, executive vice president of Microsoft’s Cloud and AI, said in a statement.
The proliferation of AI-tailored chips will help companies make progress toward building more ambitious AI products: AI systems that take in different modes of data, including audio, video, and text, to generate multisensory content.
Microsoft said it is already working on second-generation versions of the Maia AI Accelerator and the Azure Cobalt CPU.