Behind the Glory: Tech Giants Struggle with AI Profitability Challenges
Although large AI models have many advantages, relying on them for profitability is not easy at this stage. According to The Wall Street Journal, major tech companies like Microsoft and Google are struggling with the challenge of turning AI products like ChatGPT into profitable businesses. Despite heavy investments in AI technologies that can generate business memos or code, the cost of running advanced AI models has proven to be a significant obstacle. Some services, such as Microsoft's GitHub Copilot, are incurring substantial operational losses.
The operating costs for generative AI models that create text are not cheap. Large language models like those powering ChatGPT require powerful servers equipped with high-end, energy-hungry chips; a Reuters report estimated that each ChatGPT query may cost around 4 cents to run. Echoing this, AWS CEO Adam Selipsky told The Wall Street Journal that many enterprise customers are dissatisfied with the high operational costs of these AI models.
The current cost challenges are tied to the nature of AI computing. Unlike standard software that benefits from economies of scale, AI computing often requires new calculations for each query. This makes fixed-fee models for AI services risky, as increased customer usage can drive up operational costs and potentially lead to losses for companies.
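The risk described above can be made concrete with some back-of-the-envelope arithmetic. The sketch below uses the roughly 4-cent-per-query estimate from the Reuters report cited earlier; the flat subscription price and the usage levels are hypothetical, chosen only to illustrate how heavier use erodes a fixed-fee margin.

```python
# Illustrative sketch: why flat-fee pricing is risky when every AI query
# carries its own compute cost. The $0.04/query figure is the Reuters
# estimate cited in the article; the fee and usage numbers are hypothetical.

COST_PER_QUERY = 0.04   # estimated compute cost per query (USD)
FLAT_FEE = 20.00        # hypothetical monthly subscription price (USD)

def monthly_margin(queries_per_month: int) -> float:
    """Revenue minus compute cost for one subscriber in one month."""
    return FLAT_FEE - COST_PER_QUERY * queries_per_month

# Margin shrinks linearly as usage grows, and flips negative past break-even.
break_even = FLAT_FEE / COST_PER_QUERY

for usage in (100, 500, 1000):
    print(f"{usage} queries/month -> margin ${monthly_margin(usage):.2f}")
print(f"Break-even usage: {break_even:.0f} queries/month")
```

At these assumed numbers, a subscriber becomes unprofitable past 500 queries a month, which is exactly the dynamic that makes heavy users a liability under fixed fees.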
Some companies are working to reduce costs, while others continue to invest heavily in the technology. Microsoft and Google have introduced pricier AI-powered upgrades to their existing software services, while Zoom has reportedly tried to cut costs by using less sophisticated in-house AI models for certain tasks. Adobe is addressing the issue with usage caps and pay-as-you-go pricing, whereas Microsoft and Google generally stick to fixed fees.
Chris Young, Microsoft's corporate strategy chief, believes it will take more time to see a return on AI investments as people figure out the best ways to use the technology. "Clearly, we now need to turn user interest into real adoption," he told the outlet.
Notably, a report from The Wall Street Journal reveals that Microsoft's GitHub Copilot, which assists app developers by generating code, has been operating at a loss despite attracting over 1.5 million users and being integrated into nearly half of all coding projects. According to an insider, while users pay a flat fee of $10 per month for the service, Microsoft incurs an average cost of over $20 per user monthly. In some cases, individual power users cost the company as much as $80 per month.
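The Copilot figures reported above imply a straightforward per-user loss, sketched below. The $10 fee, $20 average cost, and $80 power-user cost all come from the report; the function itself is purely illustrative.

```python
# Back-of-the-envelope loss per Copilot user, using the figures from
# The Wall Street Journal report: a $10/month flat fee against an
# average serving cost above $20/user and up to $80 for power users.

PRICE = 10.0  # flat monthly subscription fee (USD)

def monthly_loss(cost_per_user: float) -> float:
    """Loss absorbed for one user at the given monthly serving cost."""
    return cost_per_user - PRICE

print(monthly_loss(20.0))  # average user: $10 lost per month
print(monthly_loss(80.0))  # power user: up to $70 lost per month
```

Put differently, at the reported average the service loses more per user than it charges, and a single power user can cost seven times the subscription price.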
One reason for the high cost of AI services is that some companies insist on using the most powerful AI models available. For instance, Microsoft employs OpenAI's most advanced GPT-4 model for many of its AI features. GPT-4 is one of the largest and most expensive AI models, requiring significant computational resources. The Wall Street Journal humorously likened using this model for basic tasks like summarizing emails to "using a Lamborghini to deliver pizza," suggesting that deploying such powerful AI for simple tasks might be overkill.
In light of this, Microsoft has been exploring more cost-effective alternatives for its Bing Chat search assistant, including Meta's Llama 2 language model. Over time, advancements in AI-accelerated hardware may reduce the cost of running these complex models. However, it remains uncertain how long this transition will take.