
AMD references in Nvidia article

AMD CEO Lisa Su believes the AI accelerator market can reach $400 billion by 2027, as demand continues to far outpace supply and cloud giants buy up GPUs as fast as they can be produced. On accelerators alone, Nvidia could surpass Apple in revenue, as the company is estimated to control at least 90% of the data center GPU market. Even if Nvidia's share slips to approximately 80% by 2027, that would still be $320 billion in revenue.
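The implied math is simple scenario arithmetic. A minimal sketch, where the $400 billion market size and the 90% and 80% share figures come from the paragraph above and the 70% case is an added illustrative assumption:

```python
# Scenario arithmetic for the figures quoted above: a $400B accelerator
# market by 2027 at various Nvidia share levels. The 90% and 80% cases
# come from the text; 70% is an extra illustrative assumption.
MARKET_2027_BILLIONS = 400

for share in (0.90, 0.80, 0.70):
    revenue = MARKET_2027_BILLIONS * share
    print(f"{share:.0%} share -> ${revenue:.0f}B in accelerator revenue")

# 80% of $400B -> $320B, matching the revenue figure in the text.
```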

In terms of hardware, Nvidia has an ambitious AI GPU roadmap and is expected to release the next-gen H200 and B100 GPUs later this year, just over a year after the H100 launched. The two GPUs are expected to offer another leap in performance for AI training and inference, and the H200 is already in demand among the leading cloud service providers (CSPs): AWS will be the first to deploy the new GPU, with Microsoft, Google and Oracle to follow.

It’s easy to see why the cloud giants are eager to upgrade quickly: Nvidia says the H200 will reduce energy usage and thus lower total cost of ownership (TCO), while the move to higher-bandwidth HBM3e memory should substantially boost the GPU’s performance. Nvidia expects the H200 to deliver 1.4x to 1.9x faster LLM inference than the H100 on leading GPT and Llama models, and up to an 18x performance improvement over the A100 on GPT-3 175B.
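Those multiples line up with a simple bandwidth argument: LLM token generation is typically memory-bandwidth bound, so a first-order estimate of inference speedup is just the ratio of memory bandwidth between GPUs. A rough sketch; the bandwidth figures below (SXM variants) are assumptions drawn from public spec sheets, not from this article:

```python
# First-order speedup estimate for memory-bound LLM inference:
# speedup ~ memory-bandwidth ratio. Bandwidths below are assumed
# spec-sheet values (SXM parts), not figures from the article.
BANDWIDTH_TBPS = {
    "A100 (HBM2e)": 2.0,
    "H100 (HBM3)": 3.35,
    "H200 (HBM3e)": 4.8,
}

h100 = BANDWIDTH_TBPS["H100 (HBM3)"]
for gpu, bw in BANDWIDTH_TBPS.items():
    print(f"{gpu}: {bw} TB/s -> {bw / h100:.2f}x vs H100 on bandwidth alone")

# H200/H100 ~ 1.43x, near the low end of the quoted 1.4x to 1.9x range;
# the larger gains (and the 18x A100 figure) also reflect batching,
# software, and architectural improvements, not bandwidth alone.
```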

While a Q1 guide will come too soon to gauge demand for the two new GPUs, a fiscal-year guide could provide insight into whether demand for the H200 and B100 can match the H100's, or whether Nvidia will face initial supply constraints while ramping production. Additionally, Nvidia will face competition this year from AMD's MI300 series.
