Microsoft is reportedly developing its own AI server gear, including a new network card, to enhance the performance of its Maia AI server chip and reduce dependence on Nvidia’s hardware.
Microsoft CEO Satya Nadella has appointed Pradeep Sindhu, co-founder of Juniper Networks, to lead the network card effort. Sindhu’s expertise comes from his background in networking equipment and from his server chip startup, Fungible, which Microsoft acquired last year.
The new network card is reportedly comparable to Nvidia’s ConnectX-7 card, which handles data traffic in and out of servers. The report suggests development could take more than a year; if successful, the card could shorten the time OpenAI needs to train models on Microsoft’s servers and make the process more economical.
Microsoft has invested heavily in OpenAI, the creator of ChatGPT, and has integrated its technology across a range of products, a strategy that positions Microsoft favourably in the AI software market.
Microsoft introduced the Maia chip in November; designed to run large language models and support AI computing, it reinforces the company’s commitment to AI development.
The development signals Microsoft’s strategic shift towards greater autonomy in AI hardware, potentially shaping the competitive landscape of AI infrastructure and software offerings.
Sources include: Reuters