Guangdong to Build Integrated Computing Power Network Hub Node, Enhancing AI Industry's Foundational Infrastructure
Generative AI continues to advance, and the underlying computing power infrastructure supporting its development has drawn significant market attention. Currently, computing power demand is surging. According to Huawei's projections, by 2030, global general-purpose computing (FP32) capacity will reach 3.3 ZFLOPS, a 10-fold increase from 2020, while AI computing (FP16) capacity will reach 105 ZFLOPS, a 500-fold increase from 2020. Huatai Securities noted that computing power, as the foundational infrastructure of the AI industry, plays a critical role in its progress.
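As a rough back-of-the-envelope illustration of what those projections imply (a sketch based only on the figures quoted above, not taken from Huawei's report), the implied 2020 baselines and compound annual growth rates can be derived directly:

```python
# Back-of-the-envelope check of the growth figures quoted above.
# Uses only the 2030 projections and multipliers as stated in the article.

general_2030_zflops = 3.3    # projected global general-purpose (FP32) capacity in 2030
ai_2030_zflops = 105.0       # projected global AI (FP16) capacity in 2030
general_multiple = 10        # stated growth versus 2020
ai_multiple = 500            # stated growth versus 2020
years = 10                   # 2020 -> 2030

general_2020 = general_2030_zflops / general_multiple    # ~0.33 ZFLOPS
ai_2020 = ai_2030_zflops / ai_multiple                   # ~0.21 ZFLOPS

general_cagr = general_multiple ** (1 / years) - 1       # ~26% per year
ai_cagr = ai_multiple ** (1 / years) - 1                 # ~86% per year

print(f"Implied 2020 general-purpose capacity: {general_2020:.2f} ZFLOPS, CAGR ~{general_cagr:.0%}")
print(f"Implied 2020 AI capacity: {ai_2020:.2f} ZFLOPS, CAGR ~{ai_cagr:.0%}")
```

In other words, the projections imply AI computing capacity growing several times faster than general-purpose capacity over the decade.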
On November 21, the General Office of the Guangdong Provincial Government issued the Three-Year Action Plan for the Construction of the 'Digital Bay Area.' The plan proposes implementing the national 'East Data, West Computing' strategy and accelerating the construction of an integrated computing power network hub node in the Guangdong-Hong Kong-Macao Greater Bay Area. It also aims to establish the Guangdong-Hong Kong Innovation Center of the Next-Generation Internet National Engineering Center, creating a comprehensive and rationally distributed computing power infrastructure while promoting shared development across the Greater Bay Area.
On the same day, Huawei Chairman Liang Hua stated at the Huawei 2023 Sustainable Development Forum that in the era of the digital economy, computing power is a new form of productivity, playing an increasingly vital role in driving technological progress, enabling industry digital transformation, and fostering social development.
Guo Tao, Deputy Director of the China E-Commerce Expert Service Center, emphasized that addressing computing power shortages requires increased investment in infrastructure, including more data centers and improved AI chips and server performance. Additionally, distributed and edge computing technologies can distribute computing tasks across multiple nodes to enhance overall capacity. AI and machine learning can also optimize task scheduling to improve efficiency.
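As a minimal illustration of the kind of node-level task distribution Guo Tao describes, the sketch below places tasks on whichever node would be least utilized after accepting them. The node names, capacities, and scheduling rule are illustrative assumptions, not any specific vendor's system.

```python
# Hypothetical sketch of load-aware task distribution across compute nodes.
# Node names, capacities, and the placement rule are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class Node:
    name: str
    capacity_tflops: float          # total compute the node can offer
    assigned_tflops: float = 0.0    # work already scheduled on it


def schedule(tasks, nodes):
    """Place each task on the node whose utilization would be lowest after accepting it."""
    placement = {}
    for task_name, demand_tflops in tasks:
        target = min(nodes, key=lambda n: (n.assigned_tflops + demand_tflops) / n.capacity_tflops)
        target.assigned_tflops += demand_tflops
        placement[task_name] = target.name
    return placement


nodes = [Node("edge-cam-cluster", 50.0), Node("regional-dc", 400.0), Node("cloud-dc", 2000.0)]
tasks = [("video-analytics", 30.0), ("llm-inference", 600.0), ("speech-to-text", 20.0)]
print(schedule(tasks, nodes))
```

Real schedulers also weigh network latency, data locality, and energy cost, which is where the AI-assisted scheduling Guo Tao mentions comes in.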
AI Computing Power Demand Grows Exponentially
Demand for optical modules with faster transmission rates and broader coverage is the most direct and immediate barometer of the global AI industry's rapid development. Since March, North American manufacturers have placed four additional rounds of orders for 800G optical modules, bringing total annual demand to more than 1.2 million units.
As one of the leaders in this AI wave, NVIDIA's recent financial results have been impressive. Public data shows that in Q1 of fiscal year 2024, NVIDIA achieved revenue of $7.19 billion, down 13% year-over-year but up 19% quarter-over-quarter. Net profit reached $2.04 billion, up 26% year-over-year and 44% quarter-over-quarter. The company also forecasts Q2 revenue of $11 billion with a gross margin of 68.6%, far exceeding Wall Street's expectation of $7 billion.
NVIDIA CEO Jensen Huang previously stated, 'The computer industry is undergoing two simultaneous transitions—accelerated computing and generative AI. As companies race to apply generative AI to every product, service, and business process, the $1 trillion global data center infrastructure will shift from general-purpose computing to accelerated computing.'
Indeed, amid the AI frenzy triggered by ChatGPT, NVIDIA's GPUs for AI computing are in short supply, which has driven the company's market value to soar toward $1 trillion.
Building on this, JPMorgan updated its forecast in a recent report, predicting that NVIDIA will capture 60% of the 2023 AI product market, primarily through its graphics processing units (GPUs) and networking products. Broadcom ranks second, with its application-specific integrated circuits (ASICs) expected to account for 13% of revenue share. TSMC ranks 17th, with a 3% revenue share.
Since ChatGPT's explosive debut last year and its rapid growth in monthly active users, AI has not only captured global investors' attention but also become a hot topic on social networks worldwide. Unlike the remote, high-tech image AI once projected, this wave of innovation has made people realize its potential to enter everyday life. NVIDIA's earnings far exceeded expectations, confirming from the computing power source that the AI trend is accelerating. U.S. chip design giant Marvell also stated that its AI-related product revenue is expected to double annually over the next two years.
AI computing demand is enormous. AI is a critical productivity tool that empowers industries by integrating with various sectors. In emerging fields like autonomous driving, smart homes, security surveillance, robotics, medical devices, and smart classrooms, AI's technological innovation and application are driving industry intelligence. Additionally, AI interaction and AI creation scenarios are developing rapidly, such as the emergence of natural language processing tools like ChatGPT, which is expected to further enhance industry intelligence.
Over the past decade, the primary computing carriers in the AI field have been GPU devices provided by foreign chip manufacturers, widely used in cloud-based AI products. On the edge side, embedded AI computing carriers have evolved from CPUs and GPUs to DSPs and ASIC architectures, enabling the widespread application of deep learning-based technologies like speech recognition, facial recognition, image-text recognition, AIGC, object detection, super-resolution, and ADAS.
Computing power is the productive force of the digital economy era. "In a few years, using computing power will be like using electricity today: it won't matter where it comes from. When you need to compute something, you simply connect to the computing network, pay, and use it."
Building a national computing network like a power grid and operating it like the internet will let users tap computing power as easily as they use electricity. Currently, China's computing network has aggregated more than 3 EFLOPS of collaborative computing power, initially achieving large-scale, nationwide coordination and efficient use of computing resources and laying a solid foundation for the digital economy.
Targeting the market opportunity created by "East Data West Computing," major players such as the three telecom operators, Huawei, ZTE, Tencent, and GDS have established a presence in Shaoguan, becoming key forces in the initiative. State-owned enterprises in particular have shown strong interest in Shaoguan: so far, China Mobile's Guangdong-Hong Kong-Macao Greater Bay Area National Hub Node Shaoguan Data Center, China Telecom's Greater Bay Area Integrated Data Center, and China Unicom's Greater Bay Area Hub Shaoguan Data Center have all been established there.
Flexible computing resource allocation emerged as a key focus at the conference. "Our network enables fully connected computing power that's accessible on demand," said Wang Xiaoyun, Chief Scientist at China Mobile. Regarding infrastructure, China Mobile has established a "4+N+31+X" computing power layout, where "N" includes Shaoguan, ultimately meeting the computing demands of more services.
As a hub node, Shaoguan not only hosts a data center cluster but also radiates computing capacity to the surrounding region.
Tang Xiongyan, Chief Scientist at China Unicom, mentioned that the company has deployed a national-level computing center in Shaoguan, pioneering an integrated facility combining general, super, and intelligent computing. Through the "Greater Bay Area Computing Power Scheduling Platform" in Shenzhen, it achieves dual-center computing services ("Shenzhen Data, Shaoguan Computing"), establishing a computing-network integration capability rooted in the Greater Bay Area with nationwide reach. "We're exploring deep integration of computing and networks. Future 6G networks will feature built-in intelligence, where computing power will play a crucial role," Tang said.
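A highly simplified sketch of the "Shenzhen Data, Shaoguan Computing" routing idea described above is shown below. It is hypothetical: the thresholds, job fields, and decision rule are assumptions for illustration, not China Unicom's scheduling platform.

```python
# Hypothetical illustration of dual-center routing: latency-sensitive jobs stay in the
# eastern hub (Shenzhen), while delay-tolerant batch jobs are shipped west (Shaoguan).
# Thresholds and job fields are illustrative assumptions only.
from dataclasses import dataclass


@dataclass
class Job:
    name: str
    latency_budget_ms: float   # how quickly results must come back
    data_size_gb: float        # how much input data must move with the job


LATENCY_THRESHOLD_MS = 50.0    # assumed round-trip budget below which jobs stay east
DATA_THRESHOLD_GB = 500.0      # assumed size above which moving the data is not worthwhile


def route(job: Job) -> str:
    if job.latency_budget_ms < LATENCY_THRESHOLD_MS or job.data_size_gb > DATA_THRESHOLD_GB:
        return "shenzhen"      # interactive or data-heavy: compute near the data source
    return "shaoguan"          # batch workloads: move eastern data to western compute


jobs = [
    Job("online-recommendation", latency_budget_ms=20, data_size_gb=1),
    Job("model-training", latency_budget_ms=3_600_000, data_size_gb=200),
    Job("video-archive-transcode", latency_budget_ms=86_400_000, data_size_gb=80),
]
for job in jobs:
    print(job.name, "->", route(job))
```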
With soaring demand for computing power, collaboration becomes increasingly vital. Huang Hongbo, Deputy General Manager of China Telecom's Tianyi Cloud, noted that as the digital economy grows rapidly, society's computing power demand is expected to increase by 55% annually, with over 70% coming from first-tier cities where energy consumption quotas are scarce. High-speed data transmission enables cross-regional "East Data West Computing" and "East Data West Storage," effectively alleviating some supply-demand imbalances.
The "East Data West Computing" project sets higher requirements for green and intensive computing power development. Wu Jianbin, Chairman and Chief Scientist of Changxing Taihu Energy Valley Technology, highlighted Shaoguan's abundant clean energy resources like hydropower, wind, and solar power, which can meet the "green electricity" needs of computing centers. "We aim to provide green energy support for Shaoguan's supercomputing centers. The potential for collaboration here is enormous," Wu said.
As telecom operators ramp up investments in computing infrastructure, communication equipment vendors see significant market opportunities. "The 'East Data West Computing' project will build a high-quality, low-latency national computing network, becoming the main artery for China's computing power circulation. We'll work with operators to ensure computing power and data reach every corner and scenario of urban governance," said Hu Xuemei, Vice President of ZTE.
Over the past year, Shaoguan Data Center Cluster has attracted 12 large-scale data center projects, with plans for 361,000 server racks and a total investment of 39.5 billion yuan. Accelerated construction of supporting infrastructure and the establishment of upstream and downstream industries have laid a solid foundation for Guangdong's computing infrastructure and industry layout.