Is AI in a Bubble?

🏝 TheTechOasis 🏝


In AI, learning is winning.

While Thursday’s newsletter discusses the present, the Leaders segment on Sundays will inform you of future trends in AI and provide actionable insights to set you apart from the rest.

💎 Big Launch 💎

Finally, my community and news feed, TheWhiteBox, launched on Friday!

TheWhiteBox is the place for high-quality, highly curated AI content without unnecessary hype or ads across research, models, investing & markets, and AI products and companies. With TheWhiteBox, we guarantee you won’t need anything else.

  • If you signed up for the waitlist, you have already received the invitation or will receive it by today at the latest. Note that you must accept the invitation in your inbox for it to take effect.

  • If you were already a Premium subscriber, you have been added automatically and enjoy full access to all spaces and content. You must still accept the email invite you received.

If you haven’t joined yet, click below for a 14-day free trial on the monthly subscription.

😟 Is AI in a Bubble? 😟

Last week, we saw how the hype surrounding Generative AI has largely gone unmatched by tangible value creation.

Investors know this and are becoming wary of the industry’s current state. ‘NVIDIA is the new Cisco’ and ‘Dot-Com Bubble 2.0 is here’ are some of the claims I’ve read in the last few weeks.

But are we really in a bubble?

To answer this, I’ve dug deep into the question, gathering as much relevant data as I could on historical patterns, public and private valuations, and the factual state of the technology, to give you a no-hype, data-driven answer. At the very least, it will make you much more aware of what’s going on in AI these days and, importantly, how to react if necessary.

Readers beware, you are about to be blown away by some of the numbers. Let’s go!

Oh dear…

$7 trillion.

That’s how much the combined market capitalization of Microsoft, Apple, NVIDIA, Google, Amazon, and Meta has grown since ChatGPT launched.

The HyperScaler Dream

For reference, that’s the equivalent of the entire British, German, and Spanish stock markets… combined. In layman’s terms, the aforementioned companies together are more valuable than all public companies in those countries combined.

Of those $7 trillion, $3.2 trillion corresponds to Microsoft, Google, and Amazon, also known as the hyperscalers.

However, according to Scott Galloway, their combined net new revenue from AI adds up to a mere $20 billion. Simply put, investors value their AI efforts at a whopping 160 times revenue ($3.2 trillion divided by $20 billion).

But if you think those numbers are crazy, we are just getting started.

The Greatest Story Ever Told

Here are some facts about the biggest star in AI today, NVIDIA, since the start of the AI boom:

  • Its current valuation would make it the seventh-largest country in the world by GDP (2022 values), with a similar value to France. When the boom started, it was the 39th (Denmark).

  • In 2024 alone, it has added the equivalent of Tesla’s and Meta’s entire market capitalizations combined, $1.7 trillion. That’s also Australia’s entire 2022 GDP.

  • In 6 months, it has added Coca-Cola’s entire market cap… seven times.

  • And it’s already 21 times larger than its main competitor, Intel.

Yet, truth be told, its last two quarters have been mesmerizing. In the fourth quarter of fiscal year 2024, NVIDIA reported global revenue of $22.1 billion, exceeding analyst expectations of $20.6 billion.

The data center segment (AI) contributed significantly, with $18.4 billion in revenue, a 27% increase from the previous quarter and a 409% increase year over year.

Moving into the first quarter of fiscal year 2025, NVIDIA continued its robust performance with global revenues of $26.0 billion, again surpassing expectations, with data center revenue growing to $22.6 billion.

However, this strong dependence on their AI data center business, accounting for most of the earnings, has alarmed investors. The value accrual is so intense that, according to Aswath Damodaran, NYU professor of Finance, NVIDIA has to find another market as big as AI to justify its current value.

This statement was based on the observation that NVIDIA’s valuation implies its data center business will grow into a trillion-dollar market, which, considering the entire AI ecosystem is roughly a $200 billion market today, doesn’t seem likely.

But if you thought public markets were crazy, private markets are way crazier.

Unfathomable investments

As we saw last week, the price-to-sales ratios in the private markets are out of this world.

Remember, these valuations are against projected sales. In other words, Cohere will only be valued at 227 times its revenue if it fulfills those projections. But the frenzy isn’t slowing down:

  • xAI just raised a staggering $6 billion Series B round at an $18 billion valuation for a company less than a year old.

And the worst thing of all?

These insanely valued companies are all competing in the same market, LLM software, with huge commoditization risk: everyone is essentially building the same thing, Transformers, with almost no differentiation or sustainable moat beyond ‘outriching’ the next guy.

But is the AI boom comparable to the situation that ended with the catastrophic ‘dot-com’ crash?

An Objective Comparison

With the multiples some of these companies command, comparisons to the ‘dot-com’ bubble, which began to burst in March 2000, are unavoidable.

At that time, the frenzy for anything resembling an Internet company spawned a wave of ‘.com’ companies, some of which even IPOed without any revenue.

After the burst, most of those companies disappeared, and the Nasdaq lost nearly 80% of its value. But here are clearer numbers on the utter craziness of the time:

  • Out of the 500 IPOs held at the height of the bubble, 77% had no profits and no clear path to monetization.

  • The Nasdaq 100, the index of the 100 largest non-financial companies on the Nasdaq, peaked at a 60x forward P/E. In layman’s terms, the average tech company traded at a higher multiple of projected earnings (profits) than what investors attribute to NVIDIA today (48x).

  • In particular, Cisco traded at a 140x ratio at its peak, and Yahoo at an incredible 623x. In layman’s terms, investors paid $623 for every dollar Yahoo was projected to earn.

Sheer madness. So, is a comparison between both situations fair? No.

The reality is that, despite the craziness of the AI boom, most of the value is accruing to companies with clear and growing revenues. Moreover, the Nasdaq 100, which reached a 100x forward P/E in November 2001, currently sits at an average forward P/E of ‘just’ 25 as of May.

Long story short, today, the comparison is pointless. That said, we have problems of our own, and they aren’t small.

The Consolidation Problem

Even if the comparison with the ‘dot-com’ bubble doesn’t hold, that doesn’t mean the bubble scenario can be discarded. The reason isn’t the multiples but the nature of the earnings.

Even though the value accruals in the market cap are based on real revenues, these revenues are primarily based on exchanges between just a few companies.

For instance, 40% of NVIDIA’s sales come from just four companies: Microsoft, Meta, Alphabet, and Amazon.

Analysts forecast NVIDIA’s revenue to reach $111 billion this year. If the size estimation of $200 billion for the entire AI market is true, that means that more than half of the total revenues in the industry will come from NVIDIA alone.

In other words, much of the revenue attributed to AI isn’t value creation for end customers but a handful of tremendously rich companies buying all of NVIDIA’s GPU stock.

As Josh Brown of Ritholtz Wealth Management put it in an interview, the AI market seems ‘incestuous’ and ‘working as a perpetual motion machine,’ meaning it runs without external demand feeding it.

Of course, such machines don’t exist, making the current trend unsustainable and making it very clear that these revenues must eventually translate into actual value delivered to end customers.


Bottom line, revenues aren’t meeting current demand but subsidizing future predicted demand. And that, my friend, is a problem waiting to explode.

Nonetheless, while the Mag Seven’s AI expenditure is booming, everyone else is lagging. According to a Ramp report on its mainly SMB customer base, AI spending is increasing, but the numbers are still very modest:

Just $1.5 on average for Q1, and just $2.5 for mid-market companies (revenues over $10 million).

What’s more, according to BCG, as late as January 2024, 90% of companies were on experimental GenAI budgets. Therefore, there’s no way around it: AI is not living up to its promise.

But if you look at the state of the technology, things become even more concerning.


Bets, Bets, & Bets

Look, the entire industry’s valuation and spending are based on one word:

Inference, aka running the models.

Sadly, large models require huge GPU clusters because LLM inference is memory-bound, meaning GPUs saturate their memory well before they saturate their processors.

GPUs pack huge compute power but comparatively little memory. Because LLMs are so large, we need far more GPUs than compute alone would require, ending up with huge clusters running at, at best, 50% of their processing capacity.

In other words, in the best-case scenario, you need twice the number of GPUs that processing power alone would dictate.
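Since this memory math drives everything, here is a minimal back-of-the-envelope sketch of it in Python. All figures (fp16 weights, an 80 GB GPU, a fixed KV-cache budget) are illustrative assumptions, not vendor specs:

```python
import math

def gpus_needed(params_billions: float,
                bytes_per_param: int = 2,     # assumed fp16/bf16 weights
                kv_cache_gb: float = 40.0,    # assumed KV-cache budget
                gpu_memory_gb: float = 80.0) -> int:
    """GPUs required just to HOLD the model in memory, ignoring compute entirely."""
    weights_gb = params_billions * bytes_per_param  # 1B params ≈ 2 GB in fp16
    return math.ceil((weights_gb + kv_cache_gb) / gpu_memory_gb)

# A hypothetical 1-trillion-parameter model served in fp16:
print(gpus_needed(1000))  # 26 GPUs, dictated by memory, not FLOPs
```

Each of those GPUs brings its full compute along whether the workload needs it or not, which is why memory-bound clusters can sit idle on half their processing power.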

Training, too, is expected to grow ever more costly. Prominent figures in the space, such as Sam Altman and Dario Amodei, CEOs of OpenAI and Anthropic, are convinced we will soon see models costing above $1 billion to train. The latter even suggests that, by 2025, training an AI model could cost more than $10 billion.

But there are other reasons why things in AI are becoming so inflated:

  1. It’s the easiest way for founders to justify insane valuations. If you need to justify why you’re valued at $5 billion, argue that you need $4 billion in GPUs.

  2. Emerging capabilities. As deep learning is almost entirely an inductive science, researchers stumble upon breakthroughs by making their models larger. You would be surprised how little the underlying architecture has changed between 2019’s models and today’s.

  3. More data gathering. Data is the real differentiator. Gathering real-life data and generating synthetic data for training is very expensive.

However, this approach has risks galore.

While frontier labs train ever-larger models, open source keeps making smaller models much better. For instance, Microsoft’s Phi-3 model family reaches GPT-3.5-level performance while being orders of magnitude smaller.

Therefore, what if SLMs become good enough that enterprises choose them as the ‘great value’ option instead of using ChatGPT ‘just because’?

This threat is real, as SLMs are orders of magnitude cheaper for only a few percentage points less accuracy. Big tech companies know this, so why do they keep buying as much compute as possible?

Well, the real reason might be the greatest bet of all: ‘long-inference’ models.

Absurd costs across the board

Based on the System 1/System 2 thinking modes popularized by the late Daniel Kahneman, frontier labs seem to be betting on a new type of AI model that emulates human System 2 thinking (the slow, conscious, deliberate thinking required to solve complex problems) and that, if realized, will require millions of GPUs to run at scale.

And, importantly, it could open a huge gap with open-source, which could be impossible to close due to the enormous costs.

As we have discussed many times, these models are poised to combine LLMs and search algorithms, something Google (through AlphaCode 2 or Gemini 1.5 (page 43)) and OpenAI are openly fiddling with.

This capacity to self-reflect is thought to boost model intelligence, as current LLMs seem to be hitting an ‘intelligence’ wall.

But if this bet goes south and performance and demand do not explode, big tech companies would be left holding the biggest bag of unnecessary hardware in history.

Not If, but When

Long story short, although we have discarded a ‘dot-com bubble’ scenario, at least for now, the valuations of some of these companies (especially private ones) and the unquestionably incestuous nature of most current AI profits, with seven companies doing all the earning and spending, make the current environment as extraordinary as it is unsustainable.

Disturbingly, some incumbents (though not all) have inexorably tied their futures to AI, making the situation even scarier.

Therefore, in my humble opinion, although I assume this is not a surprise by now, the question is not whether AI is in a bubble, but:

When will it pop? And what can you do against it?

Subscribe to Leaders to read the rest.
