
AI Server Market Sees Frenzy of Order Grabbing and Price Hikes, Component Production Expands by About 200%

    baoshi.rao wrote:

    The massive computational demands created by the wave of large models are triggering a frenzy of order grabbing and price hikes in the AI server market. Industry insiders said on Wednesday that the surge in server demand has become an industry consensus, that order growth momentum remains strong, and that several of their AI server models have already begun shipping. Meanwhile, a Guosheng Securities research report dated October 29 cited data showing that, as GPUs become harder to acquire, the price of an H800 server has surged to 3 million yuan per unit.

    AI Server Market Sees Frenzy of Order Grabbing and Price Hikes

    Industry analysts noted that the "Hundred Models War" is essentially a battle for AI servers. Faced with massive market demand, server companies, internet giants, and large AI model firms have accelerated their deployments this year.

    Ming-Chi Kuo, an analyst at TF International Securities, released the latest supply chain survey this week, pointing out that due to higher-than-expected demand for AI servers from enterprises, Dell has requested a significant expansion of production capacity for bottlenecked AI server components by about 200%. At the same time, cross-industry players like Hengrun Shares, Lotus Health, and Zhenshi Technology have also entered the fray, investing hundreds of millions of yuan to purchase GPU servers.

    NVIDIA has now developed a new series of modified chips specifically for the Chinese market. The news spurred a strong late-session surge in shares of Inspur Information, a leading AI server company, which hit the daily limit.

    An AI server is a data server capable of providing artificial intelligence (AI) services. It can support local applications and web pages, and it can deliver complex AI models and services to both cloud and on-premises environments. AI servers provide the real-time computation that a wide range of AI applications depend on.

    With the emergence of new application scenarios like AI, the demand for data centers is rising. At the same time, as computational density increases, the thermal density of computing equipment and data center racks has also risen significantly. Cooling systems are an essential part of the data center industry chain, and market demand for air-cooled and liquid-cooled systems will continue to grow.

    Annual Growth Rate of China's Liquid Cooling Server Market

    Data shows that in 2022, China's liquid cooling server market reached $1.01 billion, a year-on-year increase of 189.9%. IDC predicts that from 2022 to 2027, the compound annual growth rate of China's liquid cooling server market will reach 56.6%, with the market size expected to hit $9.5 billion by 2027.
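
    As a quick sanity check, the $9.5 billion figure follows directly from compounding the 2022 base at the projected rate. The sketch below uses only the numbers quoted above; it is illustrative arithmetic, not IDC's model:

    ```python
    # Back-of-the-envelope check of the IDC projection cited above.
    # Assumes simple annual compounding from the 2022 base.
    base_2022 = 1.01  # China liquid cooling server market, USD billions (2022)
    cagr = 0.566      # projected compound annual growth rate, 2022-2027
    years = 5         # 2022 -> 2027

    projected_2027 = base_2022 * (1 + cagr) ** years
    print(f"Projected 2027 market size: ${projected_2027:.1f}B")  # ~$9.5B
    ```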

    On June 5th, at the "Computing Power Innovation Development Summit" during the 31st China International Information and Communication Exhibition, China Mobile, China Telecom, and China Unicom—the three major telecom operators—jointly released the "Telecom Operator Liquid Cooling Technology White Paper" (referred to as the "White Paper") in collaboration with representatives from the liquid cooling industry chain. The White Paper proposes a three-year development vision: in 2023, the three operators will conduct technical verification; in 2024, they will carry out large-scale testing, with 10% of new projects piloting liquid cooling technology; and in 2025, they will implement large-scale applications, with over 50% of projects adopting liquid cooling technology. The goal is to establish a high-quality development framework featuring unified standards, a mature ecosystem, optimal costs, and widespread adoption.

    According to statistics, dozens of large models have been introduced by domestic companies, institutions, and research organizations. Xiang Ligang, Chairman of the Information Consumption Alliance, told reporters that many regions and enterprises are currently building algorithm platforms to lay a solid foundation for large model applications. With the gradual improvement of computing power networks and accelerated collaboration between industry, academia, and research to promote the transformation of scientific and technological achievements, AI large models are being rapidly deployed across multiple scenarios, revealing related investment opportunities. The market's focus on large models and computing power is shifting from "heated discussion" to "enthusiastic embrace."

    Large-Scale Application of Liquid Cooling Technology by 2025

    With the rapid development of China's digital economy, the demand for AI and intelligent computing is growing rapidly, and new digital and intelligent applications are emerging continuously. High-density, high-computing-power infrastructure is evolving, while under the "dual carbon" (carbon peak and carbon neutrality) policy, government requirements for data center PUE (Power Usage Effectiveness) are becoming increasingly stringent.
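
    For readers unfamiliar with the metric, PUE is the ratio of a data center's total facility energy to the energy consumed by its IT equipment alone; a value of 1.0 would mean no overhead at all. The numbers below are hypothetical and purely illustrative:

    ```python
    # Illustrative PUE calculation (figures are hypothetical, for explanation only).
    # PUE = total facility energy / IT equipment energy; lower is better, 1.0 is ideal.
    it_equipment_kwh = 1_000_000        # servers, storage, network
    cooling_and_overhead_kwh = 250_000  # cooling, power distribution losses, lighting

    pue = (it_equipment_kwh + cooling_and_overhead_kwh) / it_equipment_kwh
    print(f"PUE = {pue:.2f}")  # 1.25 in this hypothetical case
    ```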

    In terms of energy saving and consumption reduction for computing infrastructure, liquid cooling fundamentally changes how the main equipment is cooled compared with traditional air cooling, better meeting the needs of high-density racks and precise chip-level cooling. It offers advantages such as higher energy efficiency, space savings, and lower noise. However, liquid cooling technology still faces challenges such as an immature ecosystem and high investment costs. The White Paper identifies three major challenges for the development and promotion of liquid cooling technology.

    First, the liquid cooling industry ecosystem is not yet mature. Currently, there are no unified interface standards for servers and racks in the industry. Racks and servers are deeply coupled, and products such as server equipment, coolant, cooling pipelines, and power supply vary widely across manufacturers. Incompatible interfaces between products from different vendors inevitably limit competition and hinder high-quality industry development.

    Second, the architecture of liquid cooling systems is still evolving. Different liquid cooling system architectures exist in the industry, with distributed and centralized approaches for cooling and power supply. Some manufacturers have developed high-temperature servers that can reduce the need for chillers, further simplifying cooling system architecture and lowering costs while improving efficiency.

    Third, the cost of liquid cooling systems remains high. Compared to traditional air cooling products, liquid cooling still faces issues such as high initial investment and high lifecycle costs, which hinder large-scale adoption and promotion.

    The liquid cooling industry ecosystem spans the entire supply chain, including upstream component suppliers, midstream liquid cooling server providers, and downstream computing power users.

    Upstream suppliers provide components and cooling equipment, represented by companies like Envicool, 3M, and Green Cloud Map. Midstream includes server and chip manufacturers such as Huawei, ZTE, Inspur, and Intel. Downstream users comprise telecom operators and internet companies like Baidu, Alibaba, Tencent, and JD.com.

    There are different technical approaches to liquid cooling. Currently, Alibaba focuses on single-phase immersion cooling, while other users have mainly piloted cold plate cooling. The White Paper suggests that both technologies have their pros and cons and will coexist as mainstream solutions for some time.

    Joint Promotion of Liquid Cooling Technology

    To address these challenges, China's three telecom operators (China Mobile, China Telecom, and China Unicom) are transforming from end users into ecosystem leaders. They plan to collaborate with industry, academia, and research institutions to develop core technologies and build an open ecosystem. Key goals include decoupling cooling cabinets from servers, establishing unified standards, reducing PUE, and optimizing total cost of ownership (TCO).

    The White Paper outlines a three-year development vision:

    • 2023: Technology verification and standard formation
    • 2024: Large-scale testing with 10% adoption in new projects
    • 2025: Widespread application in over 50% of projects

    Specific implementation plans include:

    • China Mobile's pilot projects in Hohhot Data Center
    • China Telecom's deployment of 30 liquid cooling cabinets in Beijing-Tianjin-Hebei region
    • Additional installations in Anhui and Guangzhou

    With advancements in liquid cooling and other key technologies, China's data center construction is accelerating. The country's computing power industry has grown nearly 30% annually, ranking second globally. Recently, China launched its first national computing power scheduling platform and implemented new green data center procurement standards.

    NVIDIA's earnings report reflects strong downstream demand in data centers, and other data center-related industrial chains, particularly in China, are also worth noting. According to data from the Information and Communication Development Department of the Ministry of Industry and Information Technology, the total number of data center racks in China was 1.66 million in 2017, with a projected increase to 6.7 million by 2022, representing a compound annual growth rate (CAGR) of 32.2% from 2017 to 2022E. As per the data center white paper released by the China Academy of Information and Communications Technology (CAICT), with the deepening digital transformation across various regions and industries in China, the market revenue of data centers is expected to maintain a growth trend.
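
    The 32.2% CAGR can be verified from the rack counts quoted above with the standard compounding formula; the snippet below is a minimal check using only those two data points:

    ```python
    # Sanity check of the rack-count CAGR cited above (2017 -> 2022E).
    racks_2017 = 1.66  # million standard racks
    racks_2022 = 6.7   # million standard racks (2022 estimate)
    years = 5

    cagr = (racks_2022 / racks_2017) ** (1 / years) - 1
    print(f"Implied CAGR: {cagr:.1%}")  # ~32.2%
    ```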
