Competition in China, more monetisation in AI, and the AI startup funding cycle
27 May 2024 | Issue #22 - Mentions $BABA, $BIDU, $META, $MSFT, xAI
Welcome to the twenty-second edition of Tech takes from the cheap seats. This is my public journal, where I aim to write weekly on tech and consumer news and trends that I find interesting.
Let’s dig in.
Competition is ruthless in China, act IV
Following on from last week’s article on different approaches to monetising AI, this week started with announcements from China’s tech giants significantly reducing the prices they charge customers for use of their LLMs.
Chinese tech giants Alibaba (9988.HK) and Baidu (9888.HK) slashed prices on Tuesday of large-language models (LLMs) used to power generative artificial intelligence products, as a price war in the cloud computing sector heats up in China.
Alibaba's cloud unit announced price cuts of up to 97% on a range of its Tongyi Qwen LLMs. Its Qwen-Long model, for instance, will cost only 0.0005 yuan per 1,000 tokens - or units of data processed by the LLM - after the price cut, down from 0.02 yuan per 1,000 tokens.
It was quickly followed by Baidu, which hours later announced that its Ernie Speed and Ernie Lite models would be free for all business users.
A price war in China's cloud computing space has been ongoing for the past few months, with Alibaba and Tencent (0700.HK) recently lowering prices of their cloud computing services.
Many Chinese cloud vendors have relied on AI chatbot services to boost sales, after China saw a wave of investment in large language models in response to the hit debut of U.S.-based OpenAI's ChatGPT in late 2022.
The price war in China's cloud computing space has now hit the large-language models that power these chatbots, threatening to lower companies' profit margins.
Baidu's Ernie Lite and Ernie Speed were released in March and until Tuesday corporate customers paid to use them.
Bytedance announced last week that the main model of its Doubao LLMs would be priced 99.3% lower than the industry average for business users.
Chinese LLM developers have focused on charging businesses as a way to monetize their investments in LLMs.
For context, Alibaba’s Qwen-Long model is claimed to outperform GPT-4 and was previously priced at US$2.8 per 1M tokens. It will now cost US$0.07 per 1M tokens, roughly 60% cheaper than the cheapest LLM offered by the major Western players (Llama 3-8B).
Source: Artificial Analysis
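The arithmetic behind those figures is easy to verify; a quick sketch, assuming an exchange rate of roughly 7.2 yuan per US dollar:

```python
CNY_PER_USD = 7.2  # assumed exchange rate for illustration

def cny_per_1k_to_usd_per_1m(price_cny_per_1k: float) -> float:
    """Convert a yuan-per-1,000-token price to US dollars per 1M tokens."""
    return price_cny_per_1k * 1000 / CNY_PER_USD

old = cny_per_1k_to_usd_per_1m(0.02)    # ~$2.78 per 1M tokens before the cut
new = cny_per_1k_to_usd_per_1m(0.0005)  # ~$0.07 per 1M tokens after the cut
cut = 1 - 0.0005 / 0.02                 # a 97.5% price reduction
```

At the old price, Qwen-Long sat in the same ballpark as Western frontier models; at the new price it undercuts even the cheapest of them.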
Currently, we're witnessing a land grab similar to what occurred in China over the past two decades across sectors such as e-commerce, ride-hailing, food delivery, and electric vehicles. Many players are competing for dominance of a large total addressable market (TAM). The key difference is that there isn't a clear monetary prize yet. Chinese tech giants seem to be offering their large language models (LLMs) at a loss to (1) gain share in cloud computing services, and (2) collect data and reach a critical mass for reinforcement learning from human feedback (RLHF).
This aggressive competition makes more sense when considering these goals. OpenAI in the West benefited from a first-mover advantage, which allowed it to grow virally and achieve brand recognition. This distribution, in turn, allowed it to continuously improve its models. Chinese LLMs are all launching around the same time, making it challenging to differentiate and gather valuable user feedback.
They also face competition from private cloud "all-in-one machines" offered by companies like Huawei. Cloud-hosted LLMs must be priced attractively enough that data privacy becomes a secondary concern for these customers. It will be intriguing to see how this situation evolves.
Given these dynamics, artificial intelligence (AI) and cloud computing might not be as profitable for Chinese players. They may even devise different monetization strategies since we're still in the early stages of the cycle. For example, Alibaba has historically charged commission rates much lower than its Western counterparts, opting instead to monetize through ads.
While there are currently no standout applications for AI, these companies might choose a similar route with their chatbots, incorporating ads.
Related: Alibaba Sparks China AI Price War With Spate of Steep Discounts
Premium AI agents
Meta’s stock took a 10% hit last month after the company reported its Q1 earnings, as investors suffered PTSD on hearing that Meta was heading into another investment cycle while being warned that revenue from it would come at a delayed pace.
From the Q1 results conference call
Overall, I view the results our teams have achieved here as another key milestone in showing that we have the talent, data and ability to scale infrastructure to build the world's leading AI models and services. And this leads me to believe that we should invest significantly more over the coming years to build even more advanced models and the largest scale AI services in the world. As we're scaling CapEx and energy expenses for AI, we'll continue focusing on operating the rest of our company efficiently. But realistically, even with shifting many of our existing resources to focus on AI, we'll still grow our investment envelope meaningfully before we make much revenue from some of these new products.

I think it's worth calling out that we've historically seen a lot of volatility in our stock during this phase of our product playbook, where we're investing in scaling a new product but aren't yet monetizing it. We saw this with Reels, Stories, as News Feed transitioned to mobile and more. And I also expect to see a multiyear investment cycle before we fully scale Meta AI, business AIs and more into the profitable services I expect as well. Historically, investing to build these new scaled experiences in our apps has been a very good long-term investment for us and for investors who have stuck with us. And the initial signs are quite positive here, too. But building the leading AI will also be a larger undertaking than the other experiences we've added to our apps, and this is likely going to take several years.

On the upside, once our new AI services reach scale, we have a strong track record of monetizing them effectively. There are several ways to build a massive business here, including scaling business messaging, introducing ads or paid content into AI interactions and enabling people to pay to use bigger AI models and access more compute.
And on top of those, AI is already helping us improve app engagement, which naturally leads to seeing more ads and improving ads directly to deliver more value. So if the technology and products evolve in the way that we hope, each of those will unlock massive amounts of value for people and businesses, and for us over time. - Mark Zuckerberg
This week we got a glimpse of what some of this monetisation could look like.
From The Information
Meta Platforms is considering charging users for a more advanced version of its artificial intelligence-powered assistant, called Meta AI, according to an internal post reviewed by The Information.
Google, Microsoft, OpenAI and Anthropic each offer $20-per-month subscriptions to their chatbots. The subscriptions let people use those companies’ chatbots inside workplace apps such as Microsoft Word and give people priority access when usage is high, among other things. The features Meta might offer in a premium tier, and how much Meta might charge, couldn’t be learned. Its plans may change.
Meta is also developing AI agents that can complete tasks without human supervision, according to the internal post. They include an “engineering agent” to assist with coding and software development, similar to GitHub Copilot, according to the internal post and two current employees. The post also cites “monetization agents” that one current employee said would help businesses advertise on Meta’s apps. These agents could be for both internal use and for customers, the employees said. Meta’s competitors—including Google, Microsoft and OpenAI—are also working on AI agents.
…
In the internal post, Ahmad Al-Dahle, vice president of Meta’s generative AI group, called the premium tier of Meta AI “consumer cloud.” That suggests Meta will charge those using a more sophisticated version of its assistant, which would likely require more computing power. The term “consumer cloud” typically refers to services such as Dropbox that sell cloud-based software to consumers, rather than other businesses.
There are numerous instances within Meta's suite of apps where AI agents would be useful, the most obvious being customer service agents for businesses on WhatsApp. A "consumer cloud" service could be beneficial for Meta, considering the vast data it has about its users' interests and preferences. It's conceivable that Meta could use retrieval-augmented generation (RAG) to tailor each AI agent to an individual consumer, providing personalized recommendations and task completion. Such a service would set Meta's agent apart from those offered by OpenAI, Google, and Anthropic, potentially increasing what people are willing to pay. If successful, this could boost average revenue per user (ARPU) worldwide. Considering that OpenAI charges $20 a month regardless of location, a non-discriminatory pricing approach could increase Meta’s regional ARPU; in Europe, its second-largest market, ARPU is currently only around $100. This makes current free cash flow and capital expenditure figures less useful for evaluating return on invested capital (ROIC). It's only one example, but our job is to look to the future.
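As an illustration of how per-user RAG might work, here is a toy sketch. Everything in it is hypothetical (the profile snippets, the keyword-overlap retrieval); a real system would use an embedding model and a vector store rather than word matching:

```python
# Toy sketch: retrieval-augmented generation (RAG) for per-user personalisation.
# Profile snippets stand in for the interest data Meta might hold about a user.

USER_PROFILE = [
    "Follows several hiking and trail-running pages",
    "Recently messaged a travel agency about Japan",
    "Saved posts about budget airlines",
]

def retrieve(query: str, docs: list, k: int = 2) -> list:
    """Rank profile snippets by naive word overlap with the query."""
    q = set(query.lower().split())
    scored = sorted(docs, key=lambda d: -len(q & set(d.lower().split())))
    return scored[:k]

def build_prompt(query: str) -> str:
    """Prepend the most relevant profile snippets to the user's request."""
    context = "\n".join(retrieve(query, USER_PROFILE))
    return f"Known about this user:\n{context}\n\nUser asks: {query}"

print(build_prompt("plan me a cheap trip to Japan"))
```

The personalised prompt then goes to the underlying LLM, so the same base model produces different, user-specific answers without per-user fine-tuning.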
Microsoft’s AI positioning and framework
Ben Thompson at Stratechery had a good interview with Microsoft CEO Satya Nadella and CTO Kevin Scott this week that answered some questions investors are debating on AI at the moment. I would recommend reading/listening to the whole thing, but Modest Proposal had a good thread on Twitter summarizing some of the key topics.
It was particularly interesting to hear how Satya thinks about capex given that the material step-up in capex did in fact coincide with Microsoft’s investments into AI.
SN: I think the laws of economics, I think you rightfully pointed out, we are a CapEx-heavy entity. Most people are focused on our CapEx just because of AI. But come on, just take out even AI, we are a knowledge-intensive and a capital-intensive business, that’s what it takes to be in hyperscale. You can’t just show up and say, “Hey, I want to enter the hyperscale”, if you can’t now at this point put $50, 60 bill a year into CapEx, so that’s what it takes to be in the market.
But then also, it’s always going to be governed by what’s happening in the marketplace. You can’t far outstrip your revenue growth. And so therefore, there is an absolute governor, which is yes, the training chunks go where there is step function changes to allocation of training compute, but ultimately inference is demand-driven. So if you take that combination, I feel like if there is something that happens cyclically even, adjusting for it is not that hard. As a pure business management thing, I’m not managing it for a quarter, but it doesn’t scare me.
I find it hard to believe that Microsoft would still be spending $50-60bn on capex without AI. Taking the mid-point of this spend and applying it to Microsoft’s consensus 2025 revenue would put its capex/sales ratio at almost 20%, about 70% higher than before ChatGPT. I’m not saying the company won’t be able to monetise these investments; it clearly has avenues that will start to show in the coming years. It’s just not accurate to say that AI hasn’t made a meaningful impact on capex spend.
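A rough sketch of that capex/sales maths (the FY2025 consensus revenue figure here is my assumption, purely for illustration):

```python
capex_mid = 55.0       # $bn, mid-point of Nadella's $50-60bn range
revenue_2025e = 280.0  # $bn, assumed consensus FY2025 revenue (illustrative)

capex_to_sales = capex_mid / revenue_2025e  # ~0.196, i.e. almost 20% of sales
# "about 70% higher than pre-ChatGPT" implies a prior ratio of roughly:
pre_chatgpt_ratio = capex_to_sales / 1.7    # ~0.115, or about 11.5% of sales
```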
AI startup funding cycle
This week, we saw divergent outcomes in the AI startup sphere. AI Agent startup Adept was in talks with potential buyers while Elon Musk's AI startup xAI closed a $6bn funding round.
The leaders of Adept, a 2-year-old artificial intelligence startup run by former OpenAI and Google AI developers, held talks in recent months about a possible sale or strategic partnership with large tech companies, according to a person with direct knowledge of the conversations. One company Adept spoke to was Facebook owner Meta Platforms, this person said, though an acquisition is unlikely.
The effort by Adept, which investors valued at more than $1 billion last year, points to the pressure on a growing number of startups in the field. Even those that quickly raised hundreds of millions of dollars, such as Adept, face questions due to the heavy costs of training and running AI models, as well as rising competition from companies such as Google, Meta, Microsoft and OpenAI.
Adept this summer plans to launch an AI “agent” that can automate personal computing tasks, a type of service some of the bigger incumbents are also developing.
This would put it alongside a growing list of failed AI unicorns that were unable to find product-market fit. It was only a week earlier that The Information reported Stability AI was looking for a buyer. These startups are struggling to find distribution before the incumbents find innovation. Unlike past innovation cycles, AI is much more capital intensive, and new features can be deployed across existing platforms (Windows, Meta’s family of apps, Google’s apps). This raises the barriers to success for start-ups materially.
Which leads me to xAI’s latest funding round.
Elon Musk’s xAI has secured new backing from Silicon Valley venture capital giants Lightspeed Venture Partners, Andreessen Horowitz, Sequoia Capital and Tribe Capital, as the tech billionaire closes in on a funding round valuing the artificial intelligence start-up at $18bn.
The investors have committed to joining xAI’s latest financing in which Musk is seeking to raise close to $6bn, according to people familiar with the negotiations.
However, one investor involved in the round said the Tesla and X chief remains “a few hundred million dollars” short of that target.
Lightspeed, Tribe and Sequoia declined to comment. Andreessen Horowitz and Musk did not respond to a request for comment.
The funding deal comes as Musk seeks to secure the financial firepower to catch up with market leaders OpenAI, Anthropic and Google, all of which have released more powerful generative AI models than xAI.
His pitch to investors is that xAI can gain ground thanks to its connection to the other companies he leads — which could provide technology, data and early revenue as customers of the start-up.
The funding round would give xAI a so-called post-money valuation of $24bn once the new investment is taken into account, and help the start-up develop new versions of its Grok chatbot.
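The valuation maths in the FT's report is simply the pre-money valuation plus the new capital:

```python
pre_money = 18.0    # $bn, valuation ahead of the round
new_capital = 6.0   # $bn, amount Musk is seeking to raise
post_money = pre_money + new_capital  # $bn, so-called post-money valuation
```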
Musk's pitch is that he will pursue artificial general intelligence (AGI) more transparently and develop a "maximum truth-seeking AI", believing current models are trained to be politically correct. While this approach is unique, I'm skeptical that xAI will produce significant returns for its investors given its current position. Its existing product, Grok, is available only to premium Twitter subscribers, who as of April 2023 numbered around 640k people paying $8 a month. This doesn't offer the same RLHF flywheel as other platforms. The new funding may enable xAI to make its model more widely available. It might have been more economically sensible for Twitter to license its content to the current providers and earn 100%-margin revenue, but billionaires doing billionaire things means they want to control their own LLM company, much like owning their own space exploration company.
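A back-of-envelope sketch of what that subscriber base implies in revenue terms (this assumes, generously, that all premium subscription revenue counts towards Grok), set against the $6bn being raised:

```python
subscribers = 640_000  # premium Twitter/X subscribers as of April 2023
monthly_fee = 8        # $ per month

annual_revenue = subscribers * monthly_fee * 12   # ~$61m a year
raise_multiple = 6_000_000_000 / annual_revenue   # raise is ~98x that revenue
```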
PS: there was this interesting snippet from the Information article above.
Meta is working towards AI systems capable of reasoning and performing actions such as planning a vacation or booking accommodations using online resources, the company’s chief AI scientist, Yann LeCun, said at an event in London in April.
Other headlines that caught my eye
TBD on whether this becomes an ongoing section, but there were a couple more articles I noted down and couldn’t write about; otherwise this post would become too long.
Nvidia’s rivals take aim at its software dominance - FT
MercadoLibre Kicks Off Talks to Apply for Mexico Banking License - Bloomberg
Young women fall out of love with dating apps - FT
and lol: in response to Google’s AI overview drama this past week.
That’s all for this week. If you’ve made it this far, thanks for reading. If you’ve enjoyed this newsletter, consider subscribing or sharing with a friend.
I welcome any thoughts or feedback, feel free to shoot me an email at portseacapital@gmail.com. None of this is investment advice, do your own due diligence.
Tickers: $BABA, $BIDU, $META, $MSFT