According to The Information, Microsoft has been working since 2019 on a chip, internally code-named Athena, that could be used to train large language models and reduce the company’s reliance on Nvidia.
The chips have already been made available to a small group of Microsoft and OpenAI employees, who are testing how well they handle the latest large language models, including GPT-4.
Currently, Nvidia dominates the market for AI server chips. Companies are scrambling to acquire them, and OpenAI is estimated to require over 30,000 of Nvidia’s A100 GPUs to commercialize ChatGPT. Demand for high-end chips capable of running AI software is so intense that Nvidia’s latest H100 GPUs are being sold for over $40,000 on eBay.
As Nvidia rushes to produce as many chips as possible to meet demand, Microsoft is reportedly turning to internal development to cut the cost of its AI efforts. The tech giant has allegedly ramped up work on Athena, its project to build its own AI chips.
While it’s uncertain whether Microsoft will offer these chips to its Azure cloud customers, the company reportedly plans to make them more widely available to employees across Microsoft and OpenAI as early as next year. Microsoft also has a roadmap outlining several future generations of the chip.
Although Microsoft’s AI chips are not intended to be direct substitutes for Nvidia’s, internal development efforts could significantly reduce costs as the company continues integrating AI-powered features into Bing, Office apps, GitHub, and other products.
Microsoft has been developing ARM-based chips for several years. Bloomberg reported in late 2020 that Microsoft was exploring the possibility of designing its own ARM-based processors for servers and a potential future Surface device. While those ARM chips have yet to materialize, Microsoft has collaborated with AMD and Qualcomm to produce custom chips for its Surface Laptop and Surface Pro X devices.
If Microsoft is indeed developing its own AI chips, it would join a growing group of tech giants. Amazon, Google, and Meta have already built in-house chips for AI, but Nvidia remains the go-to choice for many companies seeking to power the most advanced large language models.