With the AI industry moving toward homegrown solutions, it is now being reported that Microsoft, together with OpenAI, will soon debut its own AI chip to compete with the likes of NVIDIA's H100.
Microsoft Aims To Reduce Dependency On NVIDIA & Others By Developing Its Own AI Chips, With OpenAI Among Its Major Partners
Everything was going smoothly for Team Green, with little to no competition and orders pouring in from across the industry, so what has suddenly prompted companies to shift toward an "in-house" approach? The answer is simple: since the beginning of the "AI frenzy," NVIDIA has held the spotlight in the industry, with its H100 AI GPU being the most in-demand piece of hardware.
With huge demand came problems such as order backlogs, inflated costs, and "exclusive" access that depended on a company's relationship with NVIDIA. This created an "unbalanced" playing field for other companies, which is why we may witness a shift in the landscape. Microsoft, a significant player in the AI industry, is rapidly developing generative AI resources to incorporate into mainstream applications. The company believes the current state of the AI chip market is hindering progress toward that goal, which is why a homegrown "AI chip" is a viable solution.
While details about the chip itself are slim, we know that Microsoft plans to name it "Athena" and intends to unveil it at the Ignite conference scheduled for November 2023. The company wants Athena to be on par with NVIDIA's H100 AI GPUs, but achieving that goal isn't as easy as it may sound. On a much broader scale, AI developers prefer NVIDIA for its mature CUDA platform and the company's renowned software technologies, which make the H100 far superior to its competitors.
While Athena may achieve computing power comparable to NVIDIA's hardware, building the accompanying software resources is something that will take time. However, Microsoft's primary aim with its own AI chips is to serve the needs of the company's divisions and partners, such as Microsoft Azure and OpenAI. It can be said that Microsoft has the necessary resources to "perfect" its AI chip; the company just needs ample time.
So far, OpenAI has been using NVIDIA's Tensor Core GPU hardware, such as the A100 and H100, to operate its ChatGPT software, reportedly running thousands of H100 GPUs to power the service, but given this new development, we might see a drastic change in the AI field.
As I have often pointed out in such coverage, the rapid development of an industry brings a wave of "innovation" aimed at breaking monopolies and reaching new heights. Today's development is no different, and NVIDIA's dominance over the AI industry can only extend so far, given that new competition will always emerge. The decisive factor, however, is how things pan out for Microsoft, given that the company currently leads AI development on the consumer side.