Analysis of the Market Status and Industry Trends of China's AI Chip Industry
AI chips, also known as AI accelerators or computing cards, are modules specifically designed to handle large-scale computational tasks in AI applications (non-computational tasks are still managed by CPUs). Currently, AI chips are mainly categorized into GPU, FPGA, and ASIC. Much of AI's data processing involves matrix multiplication and addition. GPUs, with their massive parallel processing capabilities, offer a cost-effective solution but come with higher power consumption.
FPGAs, equipped with built-in DSP blocks and local memory, are more energy-efficient but generally more expensive. Technically, the first generation of AI chips on the market combined off-the-shelf CPUs, GPUs, FPGAs, and DSPs in various configurations. While companies such as Intel, Google, NVIDIA, Qualcomm, and IBM are developing new designs, at least one CPU is still required to control these systems; once streaming data is parallelized, however, various types of coprocessors become necessary.
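As the passage above notes, the bulk of AI data processing reduces to matrix multiplication and addition, the pattern GPUs parallelize across thousands of cores. A minimal NumPy sketch of that multiply-add at the heart of a dense layer (the shapes here are arbitrary illustrative choices, not from the article):

```python
import numpy as np

# A toy dense layer: y = x @ W + b. This single multiply-add pattern
# dominates neural-network workloads and is what GPU hardware
# parallelizes. Shapes below are illustrative assumptions only.
rng = np.random.default_rng(0)
x = rng.standard_normal((32, 128))   # batch of 32 inputs, 128 features each
W = rng.standard_normal((128, 64))   # weight matrix: 128 -> 64 features
b = rng.standard_normal(64)          # bias vector

y = x @ W + b                        # one matrix multiply plus one add
print(y.shape)                       # (32, 64)
```

On a GPU, each of the 32 × 64 output elements (and the 128 multiply-adds behind each one) can be computed concurrently, which is why this workload maps so well to massively parallel hardware.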
In recent years, as awareness of the importance of AI chips for computational power has grown, the number of players in the AI chip sector has increased significantly. The advancement of AI technology is ushering in a new era—the era where algorithms are chips.
It is worth noting that despite the abundance of players in this field and their continuous product updates, very few AI chips have so far fully lived up to their published specifications and benchmark results. YITU Technology, not originally a chip startup, chose two years ago to enter the highly competitive, high-barrier chip industry, focusing on a niche crowded with top players: self-developed cloud AI SoCs. Its products compete directly with NVIDIA's, a bold move to tackle the hardest problems head-on. This suggests that as global AI enters its third wave of growth, Chinese AI companies are, to some extent, on the same starting line as the global giants.

Traditionally, most neural-network training and inference has been conducted in the cloud or on servers. As terminal processors keep improving, many AI inference tasks, such as pattern matching, model-based detection, classification, and recognition, are gradually shifting from the cloud to the edge, for three main reasons. First, the migration of AI capabilities to the edge is an inevitable result of user scenarios: data itself is moving from the cloud to the edge. According to IDC, edge-side data will account for 50% of total data volume in the coming years; this data is collected and generated by terminals and requires edge-side AI chips for localized analysis and processing. Second, moving AI to the edge is an important way to improve the user experience: key advantages include instant response, stronger privacy protection, improved reliability, and AI experiences that work even without network connectivity. Finally, processing data at the edge is necessary for protecting data privacy in AI applications.
The successive rounds of chip export controls the U.S. government has imposed on China present both challenges and opportunities for the domestic AI chip industry. From a holistic perspective of industrial development, there is an urgent need to establish a neutral, objective, and authoritative evaluation system to provide a fair competitive platform and a scenario-matching channel for the various chips on offer.
At the 2023 World Artificial Intelligence Conference, the 'Zhiyue Plan,' jointly initiated by the People's Daily's National Key Laboratory of Communication Content Cognition (People.cn) and the China Electronics Standardization Institute, proposed to jointly promote the establishment of a comprehensive evaluation standard system for AI chips, including performance evaluation, scenario evaluation, and comprehensive evaluation. Scenario evaluation specifically tests the actual performance of different chips in various AI application scenarios.
The 'Zhiyue Plan' will ultimately produce comprehensive reports and product recommendation catalogs for specific application scenarios, thereby optimizing market supply-demand matching and providing important references and decision-making bases for governments, enterprises, and research institutions in selecting chips for intelligent computing centers. With the rapid advancement of AI technology, the market has higher demands for chip products' performance, stability, and applicability. A scientific and comprehensive evaluation system will effectively guide corporate R&D directions, promoting ecosystem prosperity and overall industry progress.
AI Becomes New Growth Driver for Baidu
On February 28, Baidu released its financial report for the fourth quarter and full year of 2023. The total revenue for 2023 reached 134.598 billion yuan, a year-on-year increase of 9%; Baidu's net profit (non-GAAP) was 28.7 billion yuan, with a year-on-year growth rate of 39%. In the fourth quarter, revenue was 34.951 billion yuan, a year-on-year increase of 6%, and Baidu's net profit (non-GAAP) was 7.755 billion yuan, a significant year-on-year increase of 44%. In 2023, Baidu's annual revenue and profit both exceeded market expectations.
Since the ERNIE large model's release, Baidu has continuously reduced its inference costs, which have now dropped to 1% of those of the March 2023 version. As inference costs fall, more and more users and enterprises have started using the ERNIE large model. Baidu stated that in 2024 its generative AI and foundation-model business will bring incremental revenue of several billion yuan, which will also have a positive impact on total revenue.
Robin Li stated that the chips currently on hand are sufficient to support upgrading ERNIE Bot 4.0 to a higher level. He added that Baidu hopes to continuously iterate ERNIE Bot through product applications, guided by user needs, improving and adjusting the model based on collected user feedback; iteration directions include multimodal capabilities, agents, reliability improvements, and more. AI has become a new growth driver for Baidu. Financial reports show that in the fourth quarter, Baidu's core revenue reached 27.5 billion yuan, a 7% year-on-year increase, surpassing market expectations of 27.31 billion yuan. Online marketing revenue was 19.2 billion yuan, up 6% year-on-year, while non-online marketing revenue reached 8.3 billion yuan, up 9%, primarily driven by intelligent cloud services.
"First AI Chip Stock" Cambricon (688256.SH) Reports Consecutive Annual Losses
On the evening of January 30, Cambricon released its 2023 annual performance forecast. The report indicates that Cambricon expects annual revenue between 680 million and 720 million yuan, slightly lower than the previous year. The projected net loss attributable to shareholders ranges from 756 million to 924 million yuan, representing a narrowed loss of 26.47% to 39.84% compared to the previous year. The adjusted net loss is expected to be between 945 million and 1.155 billion yuan, with losses narrowing by 26.87% to 40.17% year-on-year.
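As a quick consistency check on the figures above (assuming, as is standard, that the narrowing is computed as (prior-year loss − current loss) / prior-year loss), both ends of the forecast range imply the same prior-year loss:

```python
# Implied 2022 net loss from Cambricon's 2023 forecast range
# (values in millions of yuan, taken from the figures above).
# The smallest forecast loss pairs with the largest narrowing,
# and vice versa.
low_loss, low_narrowing = 756, 0.3984    # best case: smaller loss, more narrowing
high_loss, high_narrowing = 924, 0.2647  # worst case: larger loss, less narrowing

prior_from_low = low_loss / (1 - low_narrowing)
prior_from_high = high_loss / (1 - high_narrowing)

# Both ends point at a prior-year loss of roughly 1.257 billion yuan,
# so the forecast range and the narrowing percentages are consistent.
print(round(prior_from_low), round(prior_from_high))
```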
Overall, Cambricon's losses have narrowed compared with earlier periods, but its revenue still falls far short of covering them. This also marks the company's first revenue decline since its founding seven years ago, as well as its seventh consecutive year of losses.
A CITIC Securities report points out that currently, apart from Cambricon, the cloud and edge intelligent computing markets are predominantly occupied by companies like NVIDIA. In the intelligent computing cluster system market, clusters based on NVIDIA's GPU products hold a dominant position. Compared to industry giants like NVIDIA, Cambricon faces certain competitive disadvantages.
In recent years, with the rapid development of the AI industry, market competition has intensified. As the 'first AI chip stock', Cambricon still has a long way to go in terms of commercialization.