AI Computing Chips May No Longer Be Nvidia's Monopoly as AMD Enters the Arena
-
AI computing chips may no longer be Nvidia's exclusive domain as AMD steps into the field.
On December 7, at the Advancing AI conference in San Jose, California, AMD officially unveiled two flagship AI chip products: the Instinct MI300X and MI300A. The launch of these products undoubtedly injects new vitality into the AI chip market, marking AMD's determination to challenge Nvidia.
Notably, the newly released MI300A is an APU designed specifically for supercomputing applications. The APU, a concept AMD introduced in 2011, integrates a CPU and a GPU on a single chip; in the MI300A, that design targets the high-performance computing demands of supercomputers.
The release of the MI300A thus provides more robust hardware support for the development of AI supercomputers.
The other newly released AI computing chip, the MI300X, is a data center GPU designed specifically for generative AI workloads, with Nvidia's H100 as its main competitor. It is built on AMD's latest CDNA 3 architecture and manufactured by TSMC.
The two flagship AI chips released at the event represent a significant strategic move by AMD. As a leading global semiconductor company, AMD has long been committed to delivering advanced computing experiences to users worldwide, and the launch of the MI300X and MI300A marks a major step forward for the company in the AI domain.
Crucially, global investment institutions are beginning to ask whether Nvidia's golden era in AI computing is coming to an end. With AMD's entry into the market, how much will the landscape of AI computing shift, including toward more distributed form factors such as AI PCs? And what will this mean for the prospects of AMD and for AI computing, server, and PC products?
From an analyst's perspective, this could be an epoch-making event. Consequently, following the product launch, AMD's stock price rose by nearly 10%.
AMD's two newly released chips include the MI300X AI computing chip, which packs a staggering 153 billion transistors, the highest count of any chip AMD has ever built and far more than the 80 billion in Nvidia's H100.
In the key AI chip performance metric of 8-bit floating-point (FP8) computing, an eight-GPU MI300X platform delivers roughly 42 petaFLOPS (quadrillions of floating-point operations per second), versus about 32 petaFLOPS for a comparable eight-GPU H100 system, underscoring the MI300X's strong AI computing capabilities.
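As a rough sanity check of those platform-level figures, here is a minimal sketch; the per-GPU peak FP8 numbers (with structured sparsity) are assumptions drawn from the vendors' published spec sheets rather than figures stated in this article:

```python
# Rough sanity check of the platform-level FP8 figures quoted above.
# Per-GPU peak FP8 throughput (petaFLOPS, with structured sparsity) is
# an assumption based on publicly listed specs; quoted numbers vary by
# configuration (SXM vs PCIe, dense vs sparse).
MI300X_FP8_PFLOPS = 5.2   # per MI300X GPU (assumed peak, with sparsity)
H100_FP8_PFLOPS = 4.0     # per H100 SXM GPU (assumed peak, with sparsity)

GPUS_PER_PLATFORM = 8     # both vendors ship 8-GPU baseboards

mi300x_platform = MI300X_FP8_PFLOPS * GPUS_PER_PLATFORM  # ~42 petaFLOPS
h100_platform = H100_FP8_PFLOPS * GPUS_PER_PLATFORM      # ~32 petaFLOPS

print(f"8x MI300X: ~{mi300x_platform:.0f} petaFLOPS FP8")
print(f"8x H100:   ~{h100_platform:.0f} petaFLOPS FP8")
```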
Another major advantage of the MI300X is its memory capacity. Designed specifically for generative AI large language models, it offers 192GB of HBM3 memory, well above the 80GB on Nvidia's standard H100. As AMD CEO Lisa Su previously stated, "More memory capacity means the chip can handle larger model sets."
As large models continue to grow in scale and complexity, trained models require more compute and memory at inference time, making memory expansion a crucial trend in AI chip evolution. Currently, an Instinct platform server equipped with eight MI300X GPUs can support the training and inference of large models such as Llama 2 (70 billion parameters) and BLOOM (176 billion parameters).
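To illustrate why memory capacity matters for these models, here is a back-of-the-envelope estimate; it assumes 16-bit weights and ignores activations, KV cache, and framework overhead, so it is only a lower bound on what a deployment actually needs:

```python
# Back-of-the-envelope memory estimate for holding model weights in GPU
# memory. Assumes 2 bytes per parameter (FP16/BF16) and ignores
# activations, KV cache, and framework overhead, so real deployments
# need additional headroom.
def weight_memory_gb(num_params: float, bytes_per_param: int = 2) -> float:
    return num_params * bytes_per_param / 1e9

llama2_70b = weight_memory_gb(70e9)    # ~140 GB of weights
bloom_176b = weight_memory_gb(176e9)   # ~352 GB of weights

print(f"Llama 2 70B weights: ~{llama2_70b:.0f} GB (vs 192 GB on one MI300X)")
print(f"BLOOM 176B weights:  ~{bloom_176b:.0f} GB "
      f"(vs {8 * 192} GB across an 8-GPU platform)")
```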
The launch of the MI300A delivers more powerful hardware support for the advancement of supercomputers. As the standard-bearer of high-performance computing, supercomputers have long played a vital role across many fields, and the MI300A injects new momentum into their development while offering more efficient computational capabilities for related research and applications.
With AMD's recent release of its flagship AI chip, the MI300, analysts state that the company is now standing shoulder-to-shoulder with Nvidia as the "preferred choice" for tech firms.
According to Jefferies analysts, AMD's new product will create waves in the tech industry. Meanwhile, Bank of America noted in its latest report that AMD's offerings hold strong appeal for industry giants, enterprises, OEMs, and AI startups alike.
AMD projects that, with the MI300 series as its flagship, its AI chip shipments next year will reach 10% of Nvidia's (based on CoWoS packaging capacity). If AMD's collaboration with Microsoft progresses smoothly and it secures orders from Meta and Google, its AI chip shipments could reach 30% of Nvidia's by 2025.
This forecast demonstrates AMD's ambitions in the AI chip market and the high expectations for its products.
According to forecasts from Taiwan's DigiTimes, AMD's MI300 is expected to ship between 300,000 and 400,000 units next year, with Microsoft and Google anticipated to be the largest customers. This projection reflects strong market demand for the MI300 and AMD's competitiveness in the AI chip market.
AMD CEO Lisa Su has previously stated that multiple hyperscale cloud service providers have committed to deploying MI300 chips. She revealed that data center GPU products will generate $400 million in revenue for AMD in the fourth quarter, with 2024 revenue expected to exceed $2 billion. The MI300 is set to become the fastest product in the company's history to reach $1 billion in sales. Su's announcement further underscores AMD's confidence in the MI300's market prospects.
From analyst predictions to AMD management's statements, all indications suggest that AMD's strategic positioning in the AI chip market is gradually yielding results.
On November 1st, AMD released its Q3 2023 financial report, showing robust profit and revenue growth for the quarter. However, disappointing Q4 guidance sent AMD's stock down more than 5% in after-hours trading. Company executives quickly reassured investors by highlighting AMD's progress and prospects in artificial intelligence, helping the stock recover its losses.
The financial data revealed that AMD's Q3 net profit reached $299 million, a significant increase compared to $66 million in the same period last year. Excluding specific costs like stock compensation, earnings per share were 70 cents, slightly above analysts' consensus estimate of 68 cents. Additionally, AMD's revenue grew by 4% to $5.8 billion, surpassing Wall Street's expectation of $5.7 billion.
Lisa Su, AMD's Chair and CEO, attributed the strong performance to high demand for Ryzen 7000 series PC chips and record sales of server processors. She stated, "The strength of our EPYC CPU portfolio and increased shipments of Instinct MI300 accelerators, supporting hyperscale, enterprise, and AI deployments, position our data center business for significant growth."
However, AMD's Q4 revenue guidance midpoint of $6.1 billion fell short of Wall Street's $6.4 billion expectation, disappointing investors. The company projected Q4 revenue between $5.8 billion and $6.4 billion. As AMD's stock declined, executives shifted focus to its AI advancements to ease investor concerns.
During an analyst conference call, CEO Lisa Su spoke about AMD's "rapid execution" of its artificial intelligence roadmap and revealed purchase commitments from multiple cloud providers. She stated that AMD expects GPU sales for AI workloads to reach approximately $400 million in the fourth quarter alone, growing to $2 billion in fiscal year 2024.
This optimistic projection stems from AMD seeing "strong interest in MI300," with most of that interest concentrated in cloud deployments at "several large hyperscalers." Su added that AMD is also very active across the enterprise sector. She explained: "Here's how we think about it: in Q4 our revenue is about $400 million, most of it from high-performance computing. As we enter Q1, we expect revenue to remain in the $400 million range, but with most of it coming from AI and only a small portion from high-performance computing."
According to Bloomberg, the AMD CEO believes the AI chip industry "could be worth over $400 billion in the next four years." This figure is more than double the projection AMD management released in August.
Rather than resisting the rise of AI processors, AMD and Lisa Su are choosing to compete directly with Nvidia and to push AI hardware technology forward. As she put it, "This is not a passing fad."
Additionally, according to Reuters, the CEO confidently claimed that AMD has a "substantial" supply of AI chips worth "significantly more" than $2 billion. Therefore, it appears that AMD and its leadership are fully prepared to challenge Nvidia's dominance in the U.S. AI processor market.
According to the financial report, AMD's data center revenue for the quarter was $1.6 billion, flat year-over-year and in line with market expectations. Sequential growth, however, exceeded 20%, pointing to improvement in the near term and consistent with the significant data center growth management has described.
Furthermore, thanks to strong interest in the latest Ryzen chips, AMD's Client segment revenue, which covers PC chips, rose 42% year-over-year to $1.5 billion, surpassing the expected $1.2 billion. This robust performance reflects AMD's growing competitiveness in the PC market. However, gaming revenue declined 8% year-over-year to $1.5 billion, in line with expectations, while embedded chip sales fell 5% to $1.2 billion, below analysts' expectations of $1.3 billion.
Wall Street observers were caught off guard by AMD's soft guidance for the current quarter, likely because of the growth potential they had already priced in.
AMD's future prospects appear promising, but there are also challenges and uncertainties.
First, AMD's performance in the server and PC processor markets has been strengthening. Products like AMD's EPYC CPUs and Ryzen series have gained widespread recognition for their performance and value, helping AMD capture more market share. Additionally, AMD is actively expanding its product line, including GPUs for artificial intelligence and high-performance computing, which will help maintain its competitiveness in the future.
However, AMD also faces some challenges. For example, compared to competitors like Intel, AMD's market share in the processor market remains relatively small. Moreover, although AMD has made progress in the data center business, competition in this field is fierce, requiring continuous innovation to stay competitive.
In the future, AMD needs to continue strengthening its R&D and innovation capabilities to address evolving market demands and technological trends. If AMD can maintain its technological and product leadership while expanding its market share, its future prospects will be positive.
Currently, investors should remember that Nvidia is not the only important AI chip maker in the U.S. AMD is also launching new AI-focused hardware, which could bring in substantial revenue.
Ultimately, while Nvidia may currently be the industry leader, AMD is a worthy competitor.
Accordingly, based on Wall Street analysts' views, with 24 buy ratings and 9 hold ratings over the past three months, AMD carries a consensus rating of "Moderate Buy." Its average price target is $128.77, implying roughly 0.8% upside for the stock.
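As a quick check of those quoted figures, and assuming the 0.8% upside is measured against the share price at the time of the rating snapshot, the implied price works out as follows:

```python
# Implied share price behind the quoted upside: if a $128.77 average
# target implies ~0.8% upside, the stock would have been trading near
# $127.75 when the ratings were compiled. These values are only what the
# quoted figures imply, not a live market quote.
target_price = 128.77
upside = 0.008

implied_price = target_price / (1 + upside)
print(f"Implied share price: ${implied_price:.2f}")  # ~$127.75
```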