NVIDIA has bet its future on AI, and it is now reported that the company plans to ship between 1.5 million and 2 million H100 units in 2024.
NVIDIA Could Generate $80 Billion in AI Revenue in 2024, Breaking All Records
Just a few hours ago, NVIDIA posted record-breaking earnings, with a staggering 171% increase in Data Center revenue driven by sales of AI GPUs and AI platforms, mainly the Hopper H100, Ampere A100, and HGX systems. The green giant has already guided for roughly another 170% jump in revenue for the current quarter (Q3), but this is just a taste of things to come.
Our demand is tremendous. We are significantly expanding our production capacity. Supply will substantially increase for the rest of this year and next year. NVIDIA has been preparing for this for over two decades and has created a new computing platform that the world's industries can build upon.
Jensen Huang, NVIDIA CEO
The Financial Times reports that NVIDIA is working to significantly expand its production capacity so it can produce high volumes of AI GPUs next year. To put the increase in perspective, NVIDIA is set to ship around 550,000 H100 GPUs this year, and the company plans to nearly triple that figure next year. This will be a difficult goal to achieve, since Team Green is already facing prolonged order backlogs, with some deliveries pushed out as far as December.
Our supply over the next several quarters will continue to ramp as we lower cycle times and work with our supply partners to add capacity. Additionally, the new L40S GPU will help address the growing demand for many types of workloads from cloud to enterprise.
We do expect to continue ramping our supply over the next quarters as well as into next fiscal year. In terms of percentage, it's not something that we have here. It is a work across so many different suppliers, so many different parts of building an HGX and many of our other new products that are coming to market. But we are very pleased with both the support that we have with our suppliers and the long time that we have spent with them improving their supply.
Colette Kress, NVIDIA CFO
The AI boom has pulled nearly every company into a GenAI race, with NVIDIA capitalizing the most. AI chip orders are reportedly booked out until 2024, and NVIDIA has already pledged to deliver H100s in high volume. The big question is how NVIDIA will fulfill such vast orders given the supply issues it faces. There are two likely solutions: the first is for Team Green to expand its current production facilities.
TSMC, NVIDIA's leading manufacturing partner, is responsible for producing its AI GPUs, and it serves not only NVIDIA but also large orders from Apple and AMD. AI GPU output has been a bottleneck for the Taiwanese giant, especially on the packaging side. While TSMC plans a rapid expansion to meet industry needs, that added capacity won't come online until 2024. NVIDIA therefore needs another plan, and the most suitable one is a "dual-sourcing" approach.
We reported yesterday that Samsung is in talks with AMD to win orders for its MI300X AI accelerators. Samsung has gained the industry spotlight because it offers clients a "hybrid" approach, taking responsibility for all stages of development, unlike TSMC, which sources components such as HBM from third parties. Going the Samsung route would be a wise move for NVIDIA here, since it would spread out the order workflow and ensure a more streamlined supply, ultimately leading to more profit.