Is AI in a Bubble?
TheTechOasis
In AI, learning is winning.
While Thursday's newsletter discusses the present, the Leaders segment on Sundays will inform you of future trends in AI and provide actionable insights to set you apart from the rest.
Big Launch
Finally, my community and news feed, TheWhiteBox, launched on Friday!
TheWhiteBox is the place for high-quality, highly curated AI content without unnecessary hype or ads across research, models, investing & markets, and AI products and companies. With TheWhiteBox, we guarantee you won't need anything else.
If you signed up for the waitlist, you will have already received the invitation, or will receive it no later than today. However, you must accept the invitation in your inbox for it to take effect.
If you were already a Premium subscriber, you have been added automatically and enjoy full access to all spaces and content. You, too, must accept the email invite you received.
Is AI in a Bubble?
Last week, we saw how all the hype surrounding Generative AI has been largely unmatched by tangible value creation.
Investors know this and are becoming wary of the industry's current state. "NVIDIA is the new Cisco" and "Dot-Com Bubble 2.0 is here" are some of the claims I've read in the last few weeks.
But are we really in a bubble?
To answer this, I've dug deep into the question, gathering as much relevant data as I could on historical patterns, public and private valuations, and the factual state of the technology, to give you a no-hype, data-driven answer. At the very least, it will make you much more aware of what's going on in AI these days and, importantly, how to react if necessary.
Readers beware: you are about to be blown away by some of the numbers. Let's go!
Oh dear…
$7 trillion.
That's how much Microsoft, Apple, NVIDIA, Google, Amazon, and Meta have grown since ChatGPT was launched.
The HyperScaler Dream
For reference, that's the equivalent of the entire British, German, and Spanish stock markets… combined. In layman's terms, those six companies together are worth more than all public companies in those three countries put together.
Of those $7 trillion, $3.2 trillion correspond to Microsoft, Google, and Amazon, also known as the hyperscalers.
However, according to Scott Galloway, their combined net new revenue from AI amounts to a mere $20 billion. Simply put, investors value their AI efforts at a whopping 160 times revenue.
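To make the arithmetic behind that multiple explicit, here is a quick sketch using the two figures quoted above (and nothing else):

```python
# Implied revenue multiple on the hyperscalers' AI story,
# using the numbers cited in the text.
ai_market_cap_gain = 3.2e12  # Microsoft, Google, Amazon combined gain, USD
ai_net_revenue = 20e9        # Galloway's estimate of net new AI revenue, USD

multiple = ai_market_cap_gain / ai_net_revenue
print(f"Implied revenue multiple: {multiple:.0f}x")  # -> 160x
```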
But if you think those numbers are crazy, we are just getting started.
The Greatest Story Ever Told
Here are some facts about the biggest star in AI today, NVIDIA, since the start of the AI boom:
Its current valuation would rank seventh among countries by GDP (2022 values), roughly on par with France. When the boom started, it would have ranked 39th (Denmark).
In 2024 alone, it has added $1.7 trillion in market value, the combined market capitalizations of Tesla and Meta. That's also Australia's entire 2022 GDP.
In six months, it has added Coca-Cola's entire market cap… seven times over.
And it's already 21 times larger than its main competitor, Intel.
Yet, truth be told, its last two quarters have been mesmerizing. In the fourth quarter of fiscal year 2024, NVIDIA reported global revenue of $22.1 billion, exceeding analyst expectations of $20.6 billion.
The data center segment (AI) contributed significantly, with $18.4 billion in revenue, a 27% increase from the previous quarter and a 409% increase year over year.
Moving into the first quarter of fiscal year 2025, NVIDIA continued its robust performance with global revenues of $26.0 billion, again surpassing expectations, with data center revenue growing to $22.6 billion.
However, this strong dependence on the AI data center business, which accounts for most of its earnings, has alarmed investors. The value accrual is so intense that, according to Aswath Damodaran, professor of finance at NYU, NVIDIA would have to find another market as big as AI to justify its current value.
This statement was based on the assumption that NVIDIA's growth in the data center business implies a trillion-dollar market, which, considering the entire AI ecosystem is a $200 billion market today, doesn't seem likely.
But if you thought public markets were crazy, private markets are way crazier.
Unfathomable investments
As we saw last week, the price-to-sales ratios in the private markets are out of this world.
Remember, these valuations are against projected sales. In other words, Cohere will only be worth 227 times its revenue if it fulfills those projections. But the frenzy isn't slowing down:
The French startup "The H Company" has emerged from stealth mode with a $220 million funding round. With no product, let alone revenue, it is already potentially a unicorn (a $1B+ valuation).
xAI just raised a staggering $6 billion Series B at an $18 billion valuation, for a company less than a year old.
And the worst thing of all?
All these insanely valued companies are competing in the same market, LLM software, with huge commoditization risk: everyone is essentially building the same thing, Transformers, with almost no differentiation or sustainable moat beyond having deeper pockets than the next guy.
But is the AI boom comparable to the situation that ended with the catastrophic âdot-comâ crash?
An Objective Comparison
Given the multiples some of these companies trade at, comparisons to the dot-com bubble, which began to burst in March 2000, are unavoidable.
At that time, the frenzy for anything resembling an Internet company created a flood of ".com" companies, many of which IPOed without any revenue.
After the burst, most of those companies disappeared (pets.com being a famous example), and the Nasdaq lost almost 80% of its value from its peak. But here are clearer numbers on the utter craziness of the time:
Out of the 500 IPOs held at the height of the bubble, 77% had no profits and no clear path to monetization.
The Nasdaq 100, the index tracking the top 100 tech companies, peaked at a 60x forward P/E. In layman's terms, the average tech company traded at a higher multiple of projected earnings (profits) than investors attribute to NVIDIA today (48x).
In particular, Cisco traded at 140x at its peak, and Yahoo at an incredible 623x. In layman's terms, investors paid $623 for every dollar Yahoo was projected to earn.
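To restate these multiples the other way around, here is a small sketch that converts each forward P/E cited above into its implied earnings yield, i.e., how much each company was projected to earn per dollar paid (figures are the ones in the text, not fresh market data):

```python
# Forward P/E ratios cited above, restated as implied earnings yields.
forward_pe = {
    "Nasdaq 100, 2000 peak": 60,
    "Cisco, 2000 peak": 140,
    "Yahoo, 2000 peak": 623,
    "NVIDIA, today": 48,
}
for name, pe in forward_pe.items():
    # Earnings yield = 1 / P/E: projected earnings per dollar invested.
    print(f"{name}: {pe}x -> {1 / pe:.2%} projected earnings yield")
```

Yahoo's 623x works out to roughly a 0.16% projected earnings yield, which is what "sheer madness" looks like in numbers.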
Sheer madness. So, is the comparison between the two situations fair? No.
The reality is that, despite the craziness of the AI boom, most of the value being accrued is going to companies with real and growing revenues. By contrast, the Nasdaq 100, which reached a 100x forward P/E in November 2001, currently trades at an average of "just" 25x as of May.
Long story short, today the comparison is pointless. That said, we have problems of our own, and they aren't small.
The Consolidation Problem
Even if the comparison with the dot-com bubble doesn't hold, that doesn't mean the bubble scenario can be discarded. The reason isn't the multiples, but the nature of the earnings.
Even though the market-cap gains are backed by real revenues, those revenues come primarily from exchanges among just a few companies.
For instance, 40% of NVIDIA's sales come from just four companies: Microsoft, Meta, Alphabet, and Amazon.
Analysts forecast NVIDIA's revenue to reach $111 billion this year. If the $200 billion estimate for the entire AI market holds, more than half of the industry's total revenue will come from NVIDIA alone.
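The arithmetic behind that claim, using only the two figures cited above:

```python
# NVIDIA's implied share of total AI industry revenue.
nvidia_revenue_forecast = 111e9  # analyst forecast for this year, USD
ai_market_size = 200e9           # estimated size of the entire AI market, USD

share = nvidia_revenue_forecast / ai_market_size
print(f"NVIDIA's share of total AI revenue: {share:.1%}")  # -> 55.5%
```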
In other words, much of the revenue attributed to AI isn't value creation for end customers, but a handful of tremendously rich companies buying up all of NVIDIA's GPU stock.
As Josh Brown of Ritholtz Wealth Management put it in an interview, the AI market seems "incestuous" and is "working as a perpetual motion machine," meaning it isn't drawing on any external stimulus.
Of course, such machines don't exist, making the current trend unsustainable and making it very clear that revenues must transition into actual value deployment.
Quickly.
Bottom line, revenues aren't meeting current demand but subsidizing predicted future demand. And that, my friend, is a problem waiting to explode.
Nonetheless, while the Mag Seven's AI expenditure is booming, everyone else is lagging. According to a Ramp report on its mainly SMB customer base, AI spending is increasing, but the numbers remain very modest: just $1,500 on average for Q1, and just $2,500 for mid-market companies (revenues over $10 million).
What's more, according to BCG, as late as January 2024, 90% of companies were still on experimental GenAI budgets. There's no way around it: AI is not living up to its promise.
But if you look at the state of the technology, things become even more concerning.
Bets, Bets, & Bets
Look, the entire industry's valuation and spending are based on one word:
Inference, aka running the models.
Sadly, large models require huge GPU clusters because LLM inference is memory-bound, meaning GPUs exhaust their memory long before they saturate their processors.
GPUs have enormous compute power but comparatively little memory. Because LLMs are huge, we need many more GPUs than compute alone would require, ending up with huge clusters running at, at best, 50% of their processing power.
In other words, in the best-case scenario, you need twice the number of GPUs that a pure compute estimate would suggest.
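As a rough illustration of why memory, not compute, sets cluster size, here is a back-of-envelope sketch. The parameter count, precision, and GPU memory figures are assumptions for illustration, not the specs of any particular model or chip:

```python
import math

# Back-of-envelope GPU sizing for serving one large model.
# Every number below is an assumption, chosen only for illustration.
params = 1.8e12          # hypothetical frontier-model size (parameters)
bytes_per_param = 2      # fp16/bf16 weights
gpu_memory = 80e9        # one high-end datacenter GPU (~80 GB)

weight_bytes = params * bytes_per_param  # 3.6 TB for the weights alone
gpus_for_memory = math.ceil(weight_bytes / gpu_memory)

print(f"GPUs needed just to hold the weights: {gpus_for_memory}")  # -> 45
# KV caches (which grow with batch size and context length) push the real
# number higher still, while each GPU's compute sits partially idle,
# hence clusters roughly twice the size a compute-only estimate implies.
```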
However, training is also expected to grow costly. Indeed, prominent figures in the space, such as Sam Altman and Dario Amodei, the CEOs of OpenAI and Anthropic, are convinced we will soon see models costing above $1 billion to train. The latter even suggests that, by 2025, training an AI model could cost more than $10 billion.
But there are other reasons why things in AI are becoming so inflated:
It's the easiest way for founders to justify insane valuations. If you need to justify why you're valued at $5 billion, argue that you need $4 billion in GPUs.
Emergent capabilities. As deep learning is almost entirely an inductive science, researchers stumble upon breakthroughs by making their models larger. You would be surprised how little the underlying architecture has changed between 2019's models and today's.
More data gathering. Data is the real differentiator. Gathering real-life data and generating synthetic data for training is very expensive.
However, this approach has risks galore.
Just as frontier labs train ever-larger models, open source keeps making smaller models much better. For instance, Microsoft's Phi-3 model family achieves ChatGPT-3.5-level performance while being orders of magnitude smaller.
Therefore, what if these small language models (SLMs) become good enough for enterprises to pick them as the "great value" option instead of using ChatGPT "just because"?
This threat is real, as SLMs are orders of magnitude cheaper for only a few percentage points less accuracy. Big tech companies know this, so why do they keep buying as much compute as possible?
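To see why this threat matters commercially, here is a toy cost comparison. All prices and volumes below are made-up placeholders, not real API rates; only the order-of-magnitude gap is the point:

```python
# Hypothetical per-million-token serving costs (placeholder numbers only).
llm_cost_per_mtok = 10.0  # large frontier model via API (assumed)
slm_cost_per_mtok = 0.2   # small open model, self-hosted (assumed)
monthly_tokens = 500e6    # an illustrative enterprise workload (assumed)

llm_bill = monthly_tokens / 1e6 * llm_cost_per_mtok  # $5,000 / month
slm_bill = monthly_tokens / 1e6 * slm_cost_per_mtok  # $100 / month

print(f"LLM: ${llm_bill:,.0f}/mo, SLM: ${slm_bill:,.0f}/mo "
      f"({llm_bill / slm_bill:.0f}x cheaper)")
```

If the SLM is "good enough" for the task, a gap like this is very hard for a procurement team to ignore, even if the frontier model is a few points more accurate.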
Well, the real reason might be the greatest bet of all: "long-inference" models.
Absurd costs across the board
Based on the System 1/System 2 model of thinking popularized by the late Daniel Kahneman, frontier labs seem to be betting on a new type of AI model that emulates human System 2 thinking (the slow, conscious, deliberate thinking required to solve complex problems) and that, if realized, will require millions of GPUs to run at scale.
And, importantly, it could open a huge gap with open-source, which could be impossible to close due to the enormous costs.
As we have discussed many times, these models are poised to combine LLMs with search algorithms, something Google (through AlphaCode 2 and Gemini 1.5, page 43) and OpenAI are openly fiddling with.
This capacity to self-reflect is thought to boost model intelligence, as current LLMs seem to be hitting an "intelligence" wall.
But if this bet goes south and performance and demand do not explode, big tech companies will be left holding the biggest bag of unnecessary hardware in history.
Not If, but When
Long story short: although we have discarded a dot-com-bubble scenario, at least for now, the valuations of some of these companies (especially the private ones) and the unquestionably incestuous nature of most current AI profits, with seven companies doing all the earning and all the spending, make the current environment as extraordinary as it is unsustainable.
Disturbingly, some incumbents (not all, however) have inexorably tied their futures to AI, making the situation even scarier.
Therefore, in my humble opinion, and although I assume this is no surprise by now, the question is not whether AI is in a bubble, but:
When will it pop? And what can you do about it?
Subscribe to Full Premium package to read the rest.
Become a paying subscriber of Full Premium package to get access to this post and other subscriber-only content.
Already a paying subscriber? Sign In.
A subscription gets you:
- No ads
- An additional insights email on Tuesdays
- Access to TheWhiteBox's knowledge base, with four times more content than the free version on markets, cutting-edge research, company deep dives, AI engineering tips, and more