What Will Be Hotter Than Large Models in AI's 2024?
Currently, the AI battle continues to rage.
With the emergence of ChatGPT, major internet players have entered the fray, focusing on large models. Domestically, hundreds of large models have been released, showing a clear trend toward commoditization.
By contrast, the real-world deployment of large models does not seem to be going smoothly.
In 2024, breaking the industry deadlock is imperative: whether it is the moves of leading internet players or the clash of views at the WAVE SUMMIT+ Deep Learning Developer Conference 2023, both reflect that the AI battle is quietly shifting. Applications are taking the baton from large models and will become the internet's new "main battlefield."
In a word, applications are the endgame of large models.
Betting on Applications: The New Consensus Among Tech Giants
The application layer is becoming the next 'gold mine' of the internet.
After the internet sector cooled, it became clear that while large models are important, demonstrating commercial value is even more crucial; otherwise, the goal of truly democratizing AI cannot be achieved.
Shifting from technology-driven to value-driven development, and building a prosperous ecosystem in the process, has gradually become an industry consensus.
This is evident in how internet leaders such as Robin Li, Zhou Hongyi, and Wang Xiaochuan have unanimously emphasized applications as the fertile ground for AI.
"We must focus on developing AI-native applications; only when such applications emerge will large models truly have value," said Robin Li. In his view, large models themselves are not an opportunity for the vast majority of people. "In the AI-native era, we need a million AI-native applications, not a hundred large models."
As a result, developing applications has become the common choice for tech giants.
ByteDance has launched products like "Doubao" and "Xiao Wukong," focusing on search and video editing; Tencent has introduced "Xiao Qin," "Weiban," and "AI Listen Together," emphasizing social and music features; JD.com has rolled out "Jingyan," while Alibaba has introduced "Taobao Wenwen," both prioritizing e-commerce shopping; Baidu has gone even further, not only revamping all its products but also launching AI-native applications like "Baidu GBI" and "Cloud Yiduo."
An internet observer told Zinc Scale: "Technologically, Chinese internet companies may be half a step behind, but in application scenarios they have a chance to overtake. Whether it is product reinvention or application innovation, development is heading in diverse directions."
The observer further pointed out that although no killer application has emerged yet, this has not deterred the giants from ramping up their investments. After all, seizing the high ground now will determine their ecological niche in the AI era.
Moreover, related AI startups are gearing up, with AI-native applications springing up like mushrooms, eager to replicate OpenAI's success. Meanwhile, tech giants are happy to bet on startups with unicorn potential, seeking win-win outcomes.
For instance, MiniMax is a well-known AI startup focusing on text-to-visual, text-to-speech, and text-to-text AI solutions. In its latest $250 million funding round, Tencent participated with a $40 million investment.
Clearly, players are moving forward at full speed.
Bet on the 'Track', Not the 'Horse'?
It's important to note that while the prospects for AI-native applications in 2024 are bright, the journey may not be smooth sailing.
First, the "AI-native" label is being overused.
There is currently no unified industry standard for what constitutes an AI-native application. Some simply replicate the underlying logic of existing apps, while others superficially layer on AI technology.
As a result, users often find the new experience underwhelming and may even come away thinking it is "nothing special."
In fact, AI-native applications are built on AI technology as their foundation, fully incorporating the robust capabilities of large models—such as understanding, generation, logic, and memory—thereby unlocking entirely new user interaction experiences.
Only by getting this right at the root can AI-native applications unlock greater replacement demand and lower users' migration costs.
Second, the development barrier is relatively high.
Large models are still a nascent technology, and applications based on them naturally come with high barriers, posing certain obstacles to attracting developer participation.
The issue is that without a continuous influx of developers, the prosperity of the ecosystem is unattainable.
Lowering the development barrier so that AI-native applications can spread from tech giants to the broader developer community is a test of the industry's ingenuity, and it will determine how fast and how far the industry can go.
Third, the flash-in-the-pan problem.
Previously, Miaoya Camera suddenly gained popularity, becoming the first AI application to break through in China's AIGC field. However, as time passed, its buzz gradually diminished, showing increasingly obvious signs of being just a flash in the pan.
An industry insider told Zinc Scale: "The technical barriers for tools like Miaoya Camera are not high, leading to a constant stream of imitators. Coupled with insufficient user retention, it is understandable that they start strong but fade quickly."
The insider further pointed out that the industry not only needs killer applications but also long-lasting ones. Only in this way can users be propelled from the mobile era into the AI era.
The industry is also wrestling with these three hurdles, and the "large model + plugin" approach has sparked interest among the tech giants.
Plugins are essentially variants of AI-native applications.
For example, Baidu has launched the Lingjing Matrix platform to attract third-party developers to create various plugins. By fostering a thriving plugin ecosystem, it indirectly achieves the prosperity of the AI ecosystem.
Put simply, plugins serve as both the value gateway and the traffic gateway for large models.
This way, the entry barrier for developers is lowered, while the platform can maintain the quality of applications, avoid homogeneous competition, and more importantly, engage in 'horse racing'.
After all, it is hard for any one application to stay hot for long, yet there will always be some application that is. The bet, then, is on the "track" rather than the "horse."
In summary, there will still be many large models in 2024, but what's even hotter are AI-native applications. Only by building a thriving application ecosystem can large models truly find their place.
With that, the giants have a new proposition to tackle.