AI Enters the Customer Service Field: Enhancing Service Efficiency and User Experience
-
"While large models are busy composing poetry and painting, we're stuck doing the hard work."
A widely circulated joke highlights the current implementation challenges faced by AI large models: as the cutting edge of technology, they urgently need real-world scenarios to demonstrate their value, justifying the human and financial investments in this arms race.
But jokes aside, implementation isn't as distant as it seems. In e-commerce, a ubiquitous part of modern life, large models are already making strides and reshaping the industry. The most prominent of these applications is AI-generated content (AIGC), including but not limited to text-to-image, text-to-video, and human-computer interaction.
A simple overview reveals how AI is transforming the e-commerce landscape: B2B applications like smart customer service and digital livestreaming improve efficiency, while consumers enjoy 24/7 responsive support; AIGC generates low-cost, omnichannel content, and intelligent search and product selection streamline distribution, shortening transaction cycles and boosting ROI...
Yet, as a saying in deep learning goes, we can make machines talk like humans, but making them think like humans remains a challenge. In e-commerce, where interactions are frequent, decisions are critical, and connections are weak, mere "human-like" behavior isn't enough to form a robust product logic.
Thus, for AIGC in e-commerce, players tend to seek "closure within openness": carving closed, well-bounded scenarios out of an otherwise open domain and building upward from them in a bottom-up fashion.
According to the latest "2023 China Intelligent Customer Service Market Report" by Frost & Sullivan, China's intelligent customer service market reached 6.68 billion yuan in 2022 and is expected to grow to 18.13 billion yuan by 2027, with a projected five-year compound annual growth rate exceeding 20%.
We are witnessing this niche sector advance toward a market worth tens of billions of yuan, with the widespread application of intelligent customer service in e-commerce being the primary driver of sustained high growth.
The foremost challenge is the unavoidable traffic peaks and high-concurrency pre-sales inquiries in e-commerce scenarios. Beyond major shopping festivals like Double 11 and 618, merchants face multiple concurrent inquiries daily. In such cases, both customer attrition due to slow response times and the high costs of human customer service are burdens the already saturated e-commerce market can ill afford.
Simply put, the widespread adoption of intelligent customer service by e-commerce platforms is an inevitable trend. Moreover, its prevalence predates the era of large language models. If large models represent the second leap for intelligent customer service, the first leap was NLP (Natural Language Processing) technology during the AI 1.0 era.
"Before the emergence of large model-driven AIGC, the industry already had mature NLP-based intelligent customer service with broad applications," Chen Zhe, VP of Products at Zhichi Technology, told Photon Planet. "Customer service scenarios typically handle closed-ended inquiries, making it easier to improve efficiency compared to open-ended scenarios."
Before NLP technology, online customer service relied on simple QA systems, providing mechanical responses based on pre-recorded keywords, phrases, or sentences. To draw an imperfect analogy: pre-NLP intelligent customer service was like an NPC in traditional RPGs giving scripted replies, while post-NLP, it resembles the dynamic, context-aware NPCs in modern AAA games.
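To make the contrast concrete, a pre-NLP customer service bot was, at its core, little more than a keyword lookup table. The following sketch is purely illustrative (the keywords and canned replies are invented), but it captures the "scripted NPC" behavior described above:

```python
# Toy pre-NLP customer service bot: canned replies keyed on keywords, with a
# fallback to a human agent when nothing matches. Keywords and replies are invented.
CANNED_REPLIES = {
    "refund": "Refunds are processed within 7 business days after we receive the return.",
    "shipping": "Standard shipping takes 3-5 business days.",
    "invoice": "You can download invoices from the Orders page.",
}

def keyword_reply(message: str) -> str:
    text = message.lower()
    for keyword, reply in CANNED_REPLIES.items():
        if keyword in text:  # literal keyword match, no semantic understanding
            return reply
    return "Sorry, I didn't understand that. Transferring you to a human agent."

if __name__ == "__main__":
    print(keyword_reply("When will my refund arrive?"))         # matches "refund"
    print(keyword_reply("Can I change the delivery address?"))  # falls through to a human
```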
In other words, NLP marked the beginning of intelligent online customer service and coincided with its market maturity. Large models now represent its next leap, enhancing efficiency, personalization, and overall intelligence.
Chen Zhe offered a rough analogy: if NLP enabled intelligent customer service to answer 50 out of 100 queries accurately, integrating large models has raised that to 75, with the added ability to switch application contexts simply by swapping the underlying knowledge base.
"The absolute efficiency gain is about 20%~30%, or a 50% relative improvement," Chen noted. In other words, going from 50 correct answers per 100 queries to 75 adds roughly 25 percentage points in absolute terms, which is a 50% improvement relative to the original baseline of 50.
The efficiency improvements brought by large models to intelligent customer service extend not only to the demand side but also to the supply side. The current paradigm of secondary development and external database integration with large models has drastically shortened the time required to build intelligent customer service products from scratch, reducing human and time costs by orders of magnitude. Meanwhile, the flexibility of switching databases and knowledge bases ensures product uniqueness.
While large models are still seeking practical applications, their 50% efficiency boost has already provided substantial certainty for the industry—whether integrated into existing intelligent customer service products or deployed directly in SaaS solutions as customer service tools.
The more pressing questions for the industry are: What technology stack is needed to build an intelligent customer service product, and how can it be commercialized?
Intelligent customer service is a pioneer in the application of AIGC in e-commerce, but integrating high-cost large model capabilities is not something that can be rushed. For major tech companies, customer service is often seen as a persistent cost center in e-commerce platforms, rarely warranting significant resource allocation. Meanwhile, small and medium-sized enterprises lack the capacity to build foundational models from scratch. For instance, Chen Zhe of Zhichi Technology openly stated that they do not develop proprietary large models but instead leverage leading models and internet data to build products at the application layer.
In other words, resource constraints are common in the intelligent customer service sector. Without foundational models, most solutions follow the workflow of "model selection and invocation → data collection and cleaning → fine-tuning → deployment." However, this approach introduces challenges, primarily centered around data quality.
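In code terms, that workflow can be sketched as a thin pipeline. Every stage below is a stub with invented names; the point is the shape of the process described above (select and invoke a base model, collect and clean client data, fine-tune, deploy), not any vendor's actual implementation:

```python
"""Skeleton of the workflow described above: select/invoke a base model ->
collect and clean client data -> fine-tune -> deploy. All names are invented."""
from dataclasses import dataclass


@dataclass
class QAPair:
    question: str
    answer: str


def collect_and_clean(raw_records: list[dict]) -> list[QAPair]:
    """Drop empty and duplicate Q&A pairs; real pipelines also de-identify and normalize."""
    seen, cleaned = set(), []
    for rec in raw_records:
        q, a = rec.get("question", "").strip(), rec.get("answer", "").strip()
        if q and a and q not in seen:
            seen.add(q)
            cleaned.append(QAPair(q, a))
    return cleaned


def fine_tune(base_model: str, corpus: list[QAPair]) -> str:
    """Placeholder for a fine-tuning job against the selected base model."""
    print(f"Fine-tuning {base_model} on {len(corpus)} cleaned Q&A pairs...")
    return f"{base_model}-customer-service-v1"  # identifier of the tuned model


def deploy(model_id: str) -> None:
    """Placeholder for putting the tuned model behind the online customer-service endpoint."""
    print(f"Deploying {model_id} to the customer-service gateway.")


if __name__ == "__main__":
    raw = [
        {"question": "How do I return an item?", "answer": "Request a return within 7 days."},
        {"question": "How do I return an item?", "answer": "Duplicate entry, dropped."},
        {"question": "", "answer": "Empty question, dropped."},
    ]
    tuned = fine_tune("generic-base-llm", collect_and_clean(raw))
    deploy(tuned)
```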
Generally, intelligent customer service products exist to meet clients' cost-reduction needs, which makes the products' own cost structure all the more conspicuous. The industry's common practice of building on mature, off-the-shelf databases can significantly accelerate prototype development, but it often compromises the end-user experience, whether through accuracy problems caused by data bias or through delays in data synchronization.
Although vendors collect and clean the data in a structured way, aligning it perfectly with a specific industry or domain remains difficult, because unavoidable data bias induces hallucinations. Chen Zhe told Photon Planet: "The improvement in answer rate is accompanied by a slight decrease in accuracy, which many clients in fields like law, education, and finance find unacceptable."
Data synchronization, meanwhile, concerns both the supply and demand sides of intelligent customer service: clients need to upload fresh data promptly for fine-tuning, while vendors need to fine-tune and update the product frequently.
Chen Zhe stated that Zhichi Technology currently updates weekly. With open data interfaces, clients must transmit the latest data promptly, and the value of "latest data" only becomes apparent after a period of corpus learning.
"Your requirements can be at the second, minute, or hour level - data pushed to us one second becomes training corpus for our product the next second."
This is an effective way to synchronize, but it relies heavily on the learning capability of the invoked models, which cannot immediately "digest" the value of newly pushed data.
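A toy version of that synchronization loop might look like the sketch below. The push interface and the in-memory buffer are invented for illustration; a real open data interface would be an authenticated API backed by durable storage. The weekly flush mirrors the update cadence Chen describes, and the gap between pushing data and the next fine-tuning run is exactly where the value of the "latest data" waits to be digested:

```python
"""Toy sketch of the data-synchronization loop: the client pushes records through an
open interface "at the second, minute, or hour level"; the vendor buffers them, and
they only influence the product at the next scheduled fine-tuning run."""
from datetime import datetime, timedelta


class SyncBuffer:
    def __init__(self, update_interval: timedelta = timedelta(days=7)):
        self.update_interval = update_interval  # roughly weekly product updates
        self.pending: list[dict] = []           # pushed but not yet trained on
        self.last_update = datetime.now()

    def push(self, record: dict) -> None:
        """Client-facing interface: accept a record immediately, learn from it later."""
        record["received_at"] = datetime.now().isoformat()
        self.pending.append(record)

    def maybe_update(self, now: datetime | None = None) -> list[dict]:
        """If the update window has elapsed, hand the buffered corpus to fine-tuning."""
        now = now or datetime.now()
        if now - self.last_update < self.update_interval or not self.pending:
            return []  # data is synced, but its value is not yet "digested"
        batch, self.pending = self.pending, []
        self.last_update = now
        return batch


if __name__ == "__main__":
    buf = SyncBuffer()
    buf.push({"question": "Is the red variant back in stock?", "answer": "Restocks Friday."})
    print(buf.maybe_update())                                    # [] -> too soon
    print(buf.maybe_update(datetime.now() + timedelta(days=8)))  # next weekly run picks it up
```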
As for upfront costs, they matter comparatively little. The closed nature of intelligent customer service scenarios inherently limits data volume. From a non-leading vendor's perspective, intelligent customer service currently requires neither "hoarding chips" nor connecting vector databases to guarantee retrieval efficiency, and token costs incurred when calling models need not be agonized over: pricing can simply pass those costs through. In any case, the labor savings from deploying intelligent customer service far outweigh current prices.
Certainly, building a demo of intelligent customer service is easy, but closing the gap to full deployment takes more than standing up an invoked or self-developed model. These hard-to-quantify costs may well become the moat for players in the intelligent customer service field going forward.
While discussing the possibilities of AIGC combined with intelligent customer service, we must consider that intelligent customer service isn't a new track pioneered by AI, but rather an established one with over a decade of history that's being reconstructed by large models.
For the intelligent customer service sector, industrial restructuring includes fundamental changes from NLP to large language models and functional leaps from semantic understanding to multimodal capabilities. However, the business model from a non-technical perspective has remained unchanged.
Simply put, intelligent customer service is a SaaS business with cost reduction as its core purpose. This is evident from the "2023 China Intelligent Customer Service Market Report," which shows that software accounted for 79.94% of China's intelligent customer service market in 2022. In other words, vendors' room to survive lies in the gap between what customers need from intelligent customer service and what they can build themselves, a point that hasn't changed even at critical junctures of technological transformation.
"If tech giants could have eliminated us in intelligent customer service, we would have been dead during the NLP era," said Chen Zhe.
Furthermore, since intelligent customer service is a type of SaaS business, its growth paradigm follows similar logic. For example, telecom operators like China Mobile and China Unicom, along with Ronglian Cloud, which have launched large models in the customer service field, primarily adopt a Product-Led Growth (PLG) model. In contrast, non-leading vendors without such capabilities lean more toward an eXperience-Led Growth (XLG) model.
It's not that mid-tier vendors and their clients don't care about product performance; rather, these vendors, facing technological and resource pressure from the tech giants, need to build a second growth curve to expand their room to survive. Typical strategies include preemptively handling the issues customers are likely to run into when using the product, and expanding into business lines beyond the main offering as far as possible.
Take one mid-tier vendor as an example: it set up a dedicated operations department for its product, going to great lengths to support customers and stay close to them. The department's work includes writing prompts for clients, assisting with private-domain operations, and even embedding with clients as de facto team members, acting as an "intermediary" between them and other vendors to pull together comprehensive digital solutions.
Admittedly, what small companies can do, large corporations can also achieve, albeit with some investment of time and manpower. However, the differing perceptions of AI customer service and divergent business strategies between the two have carved out significant survival space for mid-tier players.
"Big tech companies have abundant resources and high investments, naturally aiming for big clients and lucrative deals. They also engage in some questionable practices, like having clients test models so they can 'learn' from the clients' data. We, on the other hand, are more down-to-earth, ensuring clients clearly perceive the cost-saving benefits even during pre-sales," said a product manager at a mid-tier firm.
Moreover, as one of many projects in corporate digital transformation, the AI customer service market isn't particularly large. Major clients often opt for bundled purchases from multiple vendors to mitigate risks, creating opportunities for non-leading players.
Currently, the AI customer service sector remains relatively open, with 'all creatures thriving under the same sky.' But as AI customer service becomes more integrated with AIGC, the competitive landscape may shift dramatically.
The most fundamental issue—hallucinations leading to unstable content quality—remains unresolved industry-wide. Meanwhile, as AI customer service combined with AIGC matures, the trend from cost reduction to value creation is pushing vendors to accelerate technological iterations. For instance, in e-commerce, AI customer service can evolve from mere support to shopping guidance.
Additionally, sources from a leading tech company reveal that AIGC applications in e-commerce customer service face latency issues. Pure semantic retrieval struggles to ensure user satisfaction, making vector databases seem like an inevitable future solution.
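To illustrate what a move toward vector retrieval would mean in practice, here is a minimal, self-contained sketch: knowledge-base entries and incoming questions are mapped to vectors, and the answer is fetched by nearest-neighbor search rather than keyword overlap. The character-frequency "embedding" is a deliberately crude stand-in for a real embedding model, and the brute-force loop stands in for the vector-database index that would keep this step fast, and low-latency, at scale:

```python
"""Minimal sketch of embedding-based retrieval. The character-frequency "embedding" is a
stand-in for a real embedding model; the brute-force search is what a vector database
index would accelerate. Knowledge-base entries are invented."""
import math
from collections import Counter


def toy_embed(text: str) -> dict[str, float]:
    """Stand-in embedding: L2-normalized character counts."""
    counts = Counter(text.lower())
    norm = math.sqrt(sum(v * v for v in counts.values()))
    return {ch: v / norm for ch, v in counts.items()}


def cosine(a: dict[str, float], b: dict[str, float]) -> float:
    return sum(weight * b.get(ch, 0.0) for ch, weight in a.items())


KNOWLEDGE_BASE = {
    "How long does delivery take?": "Standard delivery takes 3-5 business days.",
    "How do I request a refund?": "Refunds can be requested from the Orders page within 7 days.",
}
KB_VECTORS = {q: toy_embed(q) for q in KNOWLEDGE_BASE}


def retrieve(query: str) -> str:
    """Brute-force nearest neighbor; a vector database replaces this loop with an ANN index."""
    query_vec = toy_embed(query)
    best = max(KB_VECTORS, key=lambda q: cosine(query_vec, KB_VECTORS[q]))
    return KNOWLEDGE_BASE[best]


if __name__ == "__main__":
    print(retrieve("when will my parcel arrive"))
```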
With its inherent cost-saving value and compatibility with large models, AI customer service has become one of the most promising areas for large model implementation. Yet, its development in the era of large models is just beginning. Having barely transitioned from 'dumb' to 'smart,' AI customer service still requires significant paradigm shifts to meet demands like repeat purchases and cross-selling.