AMD launched a new artificial-intelligence chip on Thursday that takes direct aim at Nvidia's data center graphics processors, known as GPUs.
The Instinct MI325X, as the chip is called, will start production before the end of 2024, AMD said Thursday during an event announcing the new product. If AMD's AI chips are seen by developers and cloud giants as a close substitute for Nvidia's products, it could put pricing pressure on Nvidia, which has enjoyed roughly 75% gross margins while its GPUs have been in high demand over the past year.
Advanced generative AI such as OpenAI's ChatGPT requires massive data centers full of GPUs in order to do the necessary processing, which has created demand for more companies to provide AI chips.
In the past few years, Nvidia has dominated the majority of the data center GPU market, with AMD historically in second place. Now, AMD is aiming to take share from its Silicon Valley rival, or at least to capture a big chunk of the market, which it says will be worth $500 billion by 2028.
"AI demand has actually continued to take off and actually exceed expectations. It's clear that the rate of investment is continuing to grow everywhere," AMD CEO Lisa Su said at the event.
AMD did not reveal any new major cloud or internet customers for its Instinct GPUs at the event, but the company has previously disclosed that both Meta and Microsoft buy its AI GPUs and that OpenAI uses them for some applications. The company also did not disclose pricing for the Instinct MI325X, which is typically sold as part of a complete server.
With the launch of the MI325X, AMD is accelerating its product schedule, moving to release new chips annually to better compete with Nvidia and take advantage of the boom in AI chips. The new AI chip is the successor to the MI300X, which started shipping late last year. AMD's 2025 chip will be called MI350, and its 2026 chip will be called MI400, the company said.
The MI325X's rollout will pit it against Nvidia's upcoming Blackwell chips, which Nvidia has said will start shipping in significant quantities early next year.
A successful launch for AMD's newest data center GPU could draw interest from investors looking for additional companies positioned to benefit from the AI boom. AMD is up only 20% so far in 2024, while Nvidia's stock is up over 175%. Most industry estimates say Nvidia has over 90% of the market for data center AI chips.
AMD stock fell 3% during trading on Thursday.
AMD's biggest obstacle in taking market share is that its rival's chips use their own programming language, CUDA, which has become the standard among AI developers. That essentially locks developers into Nvidia's ecosystem.
In response, AMD said this week that it has been improving its competing software, called ROCm, so that AI developers can more easily move more of their AI models over to AMD's chips, which it calls accelerators.
AMD has positioned its AI accelerators as more competitive for use cases where AI models are creating content or making predictions, rather than for when an AI model is processing terabytes of data to improve. That's partly because of the advanced memory AMD is using on its chip, the company said, which allows it to serve Meta's Llama AI model faster than some Nvidia chips.
"What you see is that the MI325 platform delivers up to 40% more inference performance than the H200 on Llama 3.1," said Su, referring to Meta's large-language AI model.
Taking on Intel, too
While AI accelerators and GPUs have become the most intensely watched part of the semiconductor industry, AMD's core business has been central processors, or CPUs, which lie at the heart of nearly every server in the world.
AMD's data center sales during the June quarter more than doubled from a year earlier to $2.8 billion, with AI chips accounting for only about $1 billion of that, the company said in July.
AMD takes about 34% of total dollars spent on data center CPUs, the company said. That's still less than Intel, which remains the market leader with its Xeon line of chips. AMD is aiming to change that with a new line of CPUs, called EPYC 5th Gen, which it also announced on Thursday.
Those chips come in a number of different configurations, ranging from a low-cost, low-power 8-core chip that costs $527 to 192-core, 500-watt processors intended for supercomputers that cost $14,813 apiece.
The new CPUs are particularly good for feeding data into AI workloads, AMD said. Nearly all GPUs require a CPU on the same system in order to boot up the computer.
"Today's AI is really about CPU capability, and you see that in data analytics and a lot of those types of applications," Su said.
WATCH: Tech trends are meant to play out over years, and we're still learning with AI, says AMD CEO Lisa Su