Organisational structures, more AI costs, and Meta's 4D chess
5 November 2023 | Issue #14 - Mentions $SQ, $MSFT, $AMZN, $META
Welcome to the fourteenth edition of Tech takes from the cheap seats. This is my public journal, where I aim to write weekly on tech and consumer news and trends I find interesting.
Let’s dig in.
Organisational structures matter
At the start of the week The Information published a timely article discussing the ongoing internal conflicts within Block, the parent company of Square and Cash App (I say timely because Block went on to address some of them, indirectly, in its quarterly earnings a few days later).
As CEO of Twitter, Jack Dorsey was widely panned for his hands-off style, seen as contributing to the company’s uneven growth and slow-to-evolve culture, which paved the way for last year’s takeover by Elon Musk. Dorsey’s other public company, Block, originally known as Square, has a whole different problem: Its divisions feud so intensely they can’t agree on even minor points of cooperation.
For example, last year staffers from Square, a payments processing service popular among small and medium-size businesses, and its sibling division, Cash App, which is similar to Venmo, got bogged down in a negotiation over sharing technology to integrate Apple Pay’s Tap to Pay on iPhone feature. No one from Block, including CEO Dorsey, intervened to resolve the fighting. Today, the dispute still hasn’t been resolved: While Square offers the Tap to Pay on iPhone feature, Cash App doesn’t.
Jack Dorsey, Block’s co-founder and Chair, wants to turn the company into a one-stop shop for payments technology by creating synergies between its seller ecosystem and payments processing service, Square, and its consumer fintech division, Cash App. Its acquisition of Afterpay, a buy now, pay later provider, in January 2022 was meant to help drive these synergies.
Here’s the vision from Block’s Q2 2021 earnings call:
JD: And from our side, I think as I said in my last answer, we get the question all the time on this call, like what are some ways that you are all thinking about connecting the Seller and the Cash App ecosystem. And this one is massive and also obvious. We think, from a Seller perspective, the most obvious point is this is yet another tool to help drive more sales through a seller and also help us reach sellers that we have not been able to serve in the past, and that includes larger more enterprise global retail sellers. And for us to be able to scale from the smallest of shops in your neighborhood up to the largest of retailers in the world with one solution that brings people to the rest of our ecosystem is exactly our ecosystem strategy, and this gives us a lot of fuel to continue to extend that.
On the Cash App side, this is a new payment capability. So, adding more capabilities to Cash App customers that they can have a choice on how they're interacting with the economy is pretty incredible. But also given that a lot of the consumer side is going to be focused on discovery, first and foremost, it gives the Cash App a way to provide more daily value, something that people want to open up every single day to check out what's new and to see our entire system of services within the Cash App as well.
So, you combine these two together, you get a strong connection between the ecosystems. So even if they weren't connected, you have a lot of strength for each ecosystem. But, again, I think the power and the true value of our company over the long-term is how we connect all these ecosystems together, starting with Seller and Cash App, but obviously grows bigger than that years ahead.
As the article from The Information highlights, though, putting this strategy into practice is harder than it seems.
First, though, Dorsey will have to change Block’s culture of independence. The company’s units have often struggled to cooperate, acting at times as if they were competitors, according to former employees.
“They have not been able to really show, just yet, how all these things are interconnected,” said Dominick Gabriele, an equity analyst at Oppenheimer who covers the fintech industry.
Rivalry extends to teams within units: Members of one Square team last year became frustrated that they couldn’t simplify the Square app’s checkout process because other teams controlled parts of the process. So they built a prototype of a simpler app that would compete with the main Square app, according to a former executive. The team ultimately decided not to launch the competing app because it didn’t have enough staff to keep it running smoothly after its release.
….
The most obvious rift within Block is between Square and Cash App. Cash App is run by longtime Block manager Brian Grassadonia. Until earlier this month, Square had been run by former Amazonian Alyssa Henry, although she has since departed. Dorsey didn’t replace Henry but took the reins at Square himself.
Rivalry has been an issue for years. In 2019, when Square was adding Cash App as a payment option to its point-of-sale tablets, Cash App’s product team wouldn’t agree to share some of the data Square’s product team wanted, such as details about the customers who used Cash App to pay at Square terminals. The Square team thought that data would help them more easily detect fraud on Square and allow Square merchants to create more effective marketing campaigns, according to two former managers familiar with the talks.
Cash App staffers wanted to protect the privacy of the app’s users, relying on technology that can limit the risk of hackers stealing credit card information. Instead of sharing the user data, they said Cash App would assume the risk of fraudulent transactions, one of the former managers said. In addition to squabbling over data, the teams took months to decide on a revenue-sharing agreement for the fees the Cash App transactions would generate from Square merchants.
Eventually the quarreling teams agreed on an even split.
I think the root of this conflict can be traced back to the company’s organisational structure. Cash App was originally conceived in 2013 as part of Block (then known as Square), during a hackathon led by Jack Dorsey and a team of engineers. From the start, the company organised itself around products rather than functions, with Cash App set up as its own product unit and each team concentrating on the product within its purview.
While this structure offers benefits such as agility and clear lines of accountability, it also carries drawbacks, as the article highlights: role duplication, silos within the broader organisation, and reduced overall coordination.
From the point of view of Block, the parent company, the allocation of revenue between the divisions has little significance. The consolidated profit statement that Block reports publicly excludes fees paid by Cash App to Square, as well as the other way around. But those fees do affect divisional revenues, which the company does report publicly. And the profits of each division determine the future budgets they receive, which gives them an incentive to fight with each other, even at the expense of the company’s overall well-being.
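To make the incentive problem concrete, here’s a toy sketch (all numbers are made up and purely illustrative, not Block’s actual economics): however the divisions split the fee on an intercompany transaction, Block’s consolidated revenue is unchanged, but each division’s own top line, and therefore its future budget, moves.

```python
# Toy illustration with made-up numbers: a Cash App payment at a Square terminal
# generates a 2.75% processing fee on a $100 sale. How that fee is split between
# the divisions changes each division's reported revenue, but not Block's
# consolidated figure, because intercompany fees are eliminated on consolidation.

SALE = 100.00
FEE = round(SALE * 0.0275, 2)  # $2.75 collected from the merchant (illustrative rate)

def divisional_revenue(square_share: float) -> dict:
    """Split the fee between Square and Cash App at a given ratio."""
    square_rev = round(FEE * square_share, 2)
    cash_app_rev = round(FEE - square_rev, 2)
    return {
        "Square": square_rev,
        "Cash App": cash_app_rev,
        "Block (consolidated)": FEE,  # intercompany transfers net out
    }

for share in (0.75, 0.50, 0.25):
    print(f"Square keeps {share:.0%}:", divisional_revenue(share))

# Every split leaves Block's consolidated revenue at $2.75, yet each division's
# own P&L (and therefore its future budget) moves, which is why the teams
# reportedly spent months arguing before settling on an even split.
```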
[Figure: product structure vs. functional structure. Source: Lumen Learning]
The misalignment of incentives and the lack of coordination within the company pose significant challenges to Dorsey’s ultimate goal: establishing a closed-loop payment ecosystem. Indeed, as I mentioned at the outset, the timing of the article’s release was highly relevant. Days after its publication, the company disclosed its Q3 results, and it appears management has acknowledged the need to move toward greater centralisation, though this shift may not entail a complete transition to a functional structure.
From Block’s Q3 2023 Shareholder Letter
I’ll start with the last, costs, noting that as we’ve built out our operating model for our four business units, we’ve created structural duplication and redundancy that would better serve us by being more centralized. We’ve spent a lot of time looking at actual needs of each business and for opportunities to recentralize resources, both on Amrita’s teams and within the business units. We’ve already begun this work.
…
Over the past few months we’ve reset the relationship between Square and Cash App and restructured Afterpay to ensure a stronger connection between each, and most importantly, create an innovative customer experience. We finally have line of sight to seeing more of Square within Cash App, and vice versa. We believe combining the two ecosystems enables us to provide consumer experiences others can’t, specifically for commerce. You’ll be able to see this for yourself early next year.
"Better late than never" indeed. There are moments when companies must take a step back and acknowledge the necessity for restructuring. Investors seemed happy with the company's renewed focus on costs and efficiency, as evidenced by the 23% increase in the share price over the next two trading sessions. This shift in strategy has provided the company with a more optimistic outlook, and drawing parallels to Airbnb's organizational transition a couple of years ago, it's possible that it may start to see some margin expansion.
Related: Block’s Stock Price Is Down 80%. Enter CEO Jack Dorsey.
Regulatory capture in AI
If you haven’t had the chance to watch Bill Gurley’s presentation at the All-In Summit from a couple of months ago, I highly recommend it. It provides valuable context on the playbook incumbents use to "pull the ladder up behind them" and hinder competition. I mention this in light of the executive order the Biden administration signed this week.
From the AP
WASHINGTON (AP) — President Joe Biden on Monday signed an ambitious executive order on artificial intelligence that seeks to balance the needs of cutting-edge technology companies with national security and consumer rights, creating an early set of guardrails that could be fortified by legislation and global agreements.
Before signing the order, Biden said AI is driving change at “warp speed” and carries tremendous potential as well as perils.
“AI is all around us,” Biden said. “To realize the promise of AI and avoid the risk, we need to govern this technology.”
The order is an initial step that is meant to ensure that AI is trustworthy and helpful, rather than deceptive and destructive. The order — which will likely need to be augmented by congressional action — seeks to steer how AI is developed so that companies can profit without putting public safety in jeopardy.
Rather than attempting to delve into this topic myself, I’ll highlight this thoughtful piece by Ben Thompson, which gets to the core of the issue and offers additional historical context. The point is reinforced by Andrew Ng, the Stanford professor who taught Sam Altman machine learning, co-founded Google Brain, and served as chief scientist at Baidu’s Artificial Intelligence Group.
the “bad idea that AI could make us go extinct” was merging with the “bad idea that a good way to make AI safer is to impose burdensome licensing requirements” on the AI industry.
“When you put those two bad ideas together, you get the massively, colossally dumb idea [of] policy proposals that try to require licensing of AI,” Professor Ng told The Australian Financial Review in an interview.
“It would crush innovation,” he said.
“There are definitely large tech companies that would rather not have to try to compete with open source [AI], so they’re creating fear of AI leading to human extinction.
“It’s been a weapon for lobbyists to argue for legislation that would be very damaging to the open-source community,” he said.
“Sam [Altman] was one of my students at Stanford. He interned with me. I don’t want to talk about him specifically because I can’t read his mind, but …I feel like there are many large companies that would find it convenient to not have to compete with open-sourced large language models,” he said.
“There’s a standard regulatory capture playbook that has played out in other industries, and I would hate to see that executed successfully in AI.”
In reaction to this development, a group of venture capitalists and AI company executives penned a letter to the President, warning about the potential ramifications of the Executive Order and voicing their support for open-source AI.
It’s going to be interesting to see how it all unfolds.
GenAI is expensive… part III
Speaking of open source models, The Information recently published more insights into the costs associated with operating various models.
Baseten, a startup that helps developers use open-source LLMs, says its customers report that using Llama 2 out of the box costs 50% to 100% more than for OpenAI’s GPT-3.5 Turbo. The open-source option is cheaper only for companies that want to customize an LLM by training it on their data; in that case, a customized Llama 2 model costs about one-fourth as much as a customized GPT-3.5 Turbo model, Baseten found. Baseten also found that OpenAI’s most advanced model, GPT-4, is about 15 times more expensive than Llama 2, but typically it’s only needed for the most advanced generative AI tasks like code generation rather than the ones most large enterprises want to incorporate.
The price differential between the base versions of Llama 2 and GPT-3.5 Turbo largely stems from how companies use the specialized servers that power the models. OpenAI can bundle the millions of requests it receives from customers and send those batched queries to the chips in its servers to process simultaneously rather than one at a time. By contrast, companies like Cypher that try using open-source AI while renting specialized servers from cloud providers to power the AI may not generate enough customer requests to batch them. That means the companies aren’t taking full advantage of the server chips’ capabilities the way OpenAI can, said Naveen Rao, an executive at Databricks, an AI software provider.
The true cost of using open-source LLMs has big implications. A growing number of businesses are using them to develop new products that automate customer service, advertising or software coding, among other tasks. Some of these businesses don’t want to rely on makers of proprietary LLMs such as OpenAI for fear they might leak sensitive data to those providers. Some also feel open-source software is easier to customize. Others worry OpenAI and other AI providers will build products that compete with those of their most successful customers—a scenario that has already bitten customers such as Jasper and Deepgram. Such concerns aren’t likely to go away, given OpenAI’s growing ambitions for ChatGPT.
But AI buyers that are wary of OpenAI also have found that the so-called free alternatives are anything but. That reality check could hurt open-source AI developers such as Meta Platforms, Databricks and Mistral AI that have touted their open-source software as a cheap alternative to proprietary tech from OpenAI, Anthropic and others.
Users of open-source LLMs say the cost can vary widely, depending on what the software is used for, how many requests it serves and how much it needs to be tweaked for a certain product. For simple tasks like summarization or translation, open-source models can get the job done relatively cheaply. More-complex applications such as those that automatically generate code or answer hard math problems may require the reasoning capabilities only bleeding-edge closed-source LLMs can handle, said Ion Stoica, a Berkeley computer science professor and a co-founder of Databricks and Anyscale.
Some users of open-source AI say the potential extra cost is worth it. Avalara, a provider of tax compliance software, has used Llama 2 as well as an open-source model from MosaicML, owned by Databricks, and customized them to improve its products, said Vsu Subramanian, senior vice president of content engineering.
He didn’t disclose the expenses but said the customization is worth the trouble of added engineering and computing costs.
Another problem for large enterprises that use open-source software is getting access to the specialized servers in the first place—something they don’t have to worry about if they are buying software from OpenAI or Microsoft, which provide access to models through an easy-to-use application programming interface. The open-source software users may need to sign long-term contracts with AI chipmakers such as Nvidia or cloud providers to ensure they get enough supply, and if demand for the chips is too high—which it has been—they may have to wait.
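To make the batching point a bit more concrete, here’s a minimal self-hosting sketch assuming Hugging Face transformers and the Llama 2 chat weights (the model name, prompts, and parameters are placeholders, and production inference servers do this far better with continuous batching). The key idea is that one forward pass over a padded batch serves many requests at once, so the per-request cost depends on how full you can keep that batch.

```python
# Minimal sketch, not production code: batching several requests into a single
# generate() call with Hugging Face transformers. Assumes a GPU, the accelerate
# package (for device_map), and access to the gated Llama 2 weights on the Hub.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL = "meta-llama/Llama-2-7b-chat-hf"  # placeholder; any causal LM works

tokenizer = AutoTokenizer.from_pretrained(MODEL)
tokenizer.pad_token = tokenizer.eos_token   # Llama has no pad token by default
tokenizer.padding_side = "left"             # pad on the left for decoder-only generation
model = AutoModelForCausalLM.from_pretrained(
    MODEL, torch_dtype=torch.float16, device_map="auto"
)

# A "batch" of user requests. OpenAI can fill batches like this from millions of
# customers' traffic; a single company renting its own GPUs often can't.
prompts = [
    "Summarise this refund policy in one sentence: ...",
    "Translate 'where is the train station?' into French.",
    "Draft a polite reply to a customer asking about delivery times.",
]

inputs = tokenizer(prompts, return_tensors="pt", padding=True).to(model.device)
outputs = model.generate(
    **inputs, max_new_tokens=64, pad_token_id=tokenizer.eos_token_id
)

for output in outputs:
    print(tokenizer.decode(output, skip_special_tokens=True))

# At small batch sizes generation is memory-bound, so adding more prompts to the
# batch is close to free; per-request cost falls as the batch fills up. Too little
# traffic means idle capacity, which is the gap Baseten and Databricks describe above.
```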
The right choice of large language model (LLM), and what it costs, will depend on the specific end-use case. OpenAI presents an appealing proposition that’s easy to implement through its API, effectively "LLM as a service." However, it’s plausible that the industry will eventually lean toward more customised models (as I’ve written in the past), which are cheaper with open source. Applications that aren’t built upon fine-tuned models should get commoditised over time, and could potentially be vertically integrated by OpenAI, as we’ve seen with Jasper and Deepgram. Furthermore, it’s possible that OpenAI and Microsoft are currently benefitting from their privileged access to GPUs. The path of least resistance at the moment for companies wanting to build apps with GenAI is to go with an LLMaaS provider. As GPU capacity starts to free up, does that demand shift towards open-source models as the infrastructure to build on becomes easier to access? It also depends on whether the app developer has enough scale, in terms of both data and users, to utilise the GPUs efficiently (a rough back-of-envelope below), though it seems AWS is looking to alleviate part of this.
Amazon Web Services announced a new service aimed at helping developers access Nvidia’s hard-to-get graphics processing units—a potentially threatening move for the flurry of GPU rental startups that are attempting to compete with the cloud provider.
In a press release, AWS said the demand for Nvidia’s GPUs has outpaced supply, leading some customers to resort to purchasing large amounts of capacity “only to have it sit idle when they aren’t actively using it.” David Brown, vice president of compute and networking at AWS, said the new service, dubbed “Capacity Blocks,” is a way for companies to “predictably acquire” GPUs, “without making long-term capital commitments.” It allows customers to reserve up to 512 Nvidia H100 chips for up to two weeks at a time.
While the article may underscore that, in certain cases, open-source models aren’t immediately cheaper to operate, it’s worth taking the long-term view. As a developer, opting for open-source models could still prove to be the right decision over time, given the flexibility, control, and cost advantages that come with scale and customisation. That is, as long as executive orders don’t dampen their popularity…
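Here’s the rough back-of-envelope I keep in mind when weighing the two paths. Every number below is a made-up placeholder rather than a quote from any provider; the point is simply that a rented GPU costs the same per hour whether it’s busy or idle, while an API bills per token, so utilisation decides which path is cheaper.

```python
# Back-of-envelope sketch with illustrative numbers: when does self-hosting an
# open-source model beat paying a hosted API per token?

GPU_COST_PER_HOUR = 2.00        # assumed cloud rental price for one GPU, $/hour
TOKENS_PER_SECOND = 1_000       # assumed throughput of that GPU at full batch
API_PRICE_PER_1M_TOKENS = 1.50  # assumed blended API price, $/1M tokens

def self_host_cost_per_1m_tokens(utilisation: float) -> float:
    """Effective $/1M tokens when the GPU is busy `utilisation` of the time."""
    tokens_per_hour = TOKENS_PER_SECOND * 3600 * utilisation
    return GPU_COST_PER_HOUR / tokens_per_hour * 1_000_000

for u in (0.05, 0.25, 0.75):
    cost = self_host_cost_per_1m_tokens(u)
    verdict = "cheaper than the API" if cost < API_PRICE_PER_1M_TOKENS else "more expensive"
    print(f"{u:.0%} utilisation: ${cost:.2f} per 1M tokens ({verdict})")

# With these placeholder numbers, a lightly used GPU works out far more expensive
# per token than the API, and the comparison only flips at high utilisation. That
# is the dynamic behind the Baseten figures above, and why scale in data, users,
# and traffic matters so much.
```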
Related: Microsoft pushes the boundaries of small AI models with big breakthrough, CIOs Assess Whether Microsoft’s AI Copilot Justifies Premium Price, Elon Musk’s first AI product is a chatbot named Grok
This tweet from @GavinSBaker was also good.
Meta playing 4D chess
At the start of the week, Meta announced that it will give users in the EU, EEA, and Switzerland the option to subscribe to its services for an ad-free experience. The move is a response to EU privacy regulations that require the company to obtain user consent before using their personal information for targeted advertising.
To comply with evolving European regulations, we are introducing a new subscription option in the EU, EEA and Switzerland. In November, we will be offering people who use Facebook or Instagram and reside in these regions the choice to continue using these personalised services for free with ads, or subscribe to stop seeing ads. While people are subscribed, their information will not be used for ads.
People in these countries will be able to subscribe for a fee to use our products without ads. Depending on where you purchase it will cost €9.99/month on the web or €12.99/month on iOS and Android. Regardless of where you purchase, the subscription will apply to all linked Facebook and Instagram accounts in a user’s Accounts Center. As is the case for many online subscriptions, the iOS and Android pricing take into account the fees that Apple and Google charge through respective purchasing policies. Until March 1, 2024, the initial subscription covers all linked accounts in a user’s Accounts Center. However, beginning March 1, 2024, an additional fee of €6/month on the web and €8/month on iOS and Android will apply for each additional account listed in a user’s Account Center.
The subscription price comes in much higher than the company’s ARPU in the EU (which I estimate at roughly €3.2 per month; rough maths at the end of this section), which I assume will mean low uptake. Why is Meta playing 4D chess, you ask? This all began when the company was told by the EU regulator to reassess the legal basis for personalised ads.
DUBLIN, Jan 4 (Reuters) - Meta (META.O) must reassess the legal basis on how Facebook and Instagram use personal data to target advertising in the European Union, its lead privacy regulator in the bloc said on Wednesday when it fined the social media giant 390 million euros ($414 million) for the breaches.
Meta said it intended to appeal both the substance of the rulings and the fines imposed, and that the decisions do not prevent personalised advertising on its platforms.
The order on personalised advertising was made in December by the EU's privacy watchdog, according to a decision seen by Reuters, in which it overruled a draft ruling by Ireland's Data Privacy Commissioner (DPC), Meta's lead EU privacy regulator.
It related to a 2018 change in the terms of service at Facebook and Instagram following the introduction of new EU privacy laws where Meta sought to rely on the so-called "contract" legal basis for most of its processing operations. Having previously relied on the consent of users to the processing of their personal data for targeted advertising, the DPC said Meta instead considered that a contract was entered into upon acceptance of the updated 2018 terms and that this made such advertising lawful.
The DPC, which is the lead privacy regulator for many of the world's largest technology companies within the EU, directed Meta to bring its data processing operations into compliance within three months.
There were concerns that the company might have to either deliver untargeted ads, resulting in a lower return on ad spend (ROAS), or transition to a subscription-based model, risking the loss of users. By giving users the choice between an ad-free subscription and a free plan supported by personalised ads, Meta has effectively addressed both concerns: it obtains the consent regulators require while keeping targeted ads for the vast majority of users, who are unlikely to pay, essentially having its cake and eating it, too.
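As promised, the rough maths behind the low-uptake call. The ARPU figure is my own estimate rather than a disclosed number; the prices are the ones Meta announced.

```python
# Back-of-envelope: my estimated EU ad ARPU vs. the new subscription prices.
ESTIMATED_EU_ARPU_PER_MONTH = 3.2   # EUR, my rough estimate from Meta's reported figures
WEB_PRICE_PER_MONTH = 9.99          # EUR, announced web price
MOBILE_PRICE_PER_MONTH = 12.99      # EUR, announced iOS/Android price

print(f"Web subscription: {WEB_PRICE_PER_MONTH / ESTIMATED_EU_ARPU_PER_MONTH:.1f}x my ARPU estimate")
print(f"Mobile subscription: {MOBILE_PRICE_PER_MONTH / ESTIMATED_EU_ARPU_PER_MONTH:.1f}x my ARPU estimate")

# Roughly 3x and 4x what the average EU user generates in ad revenue, so very few
# users should subscribe. Meta keeps personalised ads for everyone else, now with
# the consent mechanism regulators have been asking for.
```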
That’s all for this week. If you’ve made it this far, thanks for reading. If you’ve enjoyed this newsletter, consider subscribing or sharing it with a friend.
I welcome any thoughts or feedback; feel free to shoot me an email at portseacapital@gmail.com. None of this is investment advice; do your own due diligence.
Tickers: $SQ, $MSFT, $AMZN, $META