Volume 179: Pfffffffftttttttt.
Pfffffffftttttttt.
tl;dr: The sound of the GenAI bubble deflating.
Way back in 1999/2000, when I was only ten years old, I took an MBA course at the rather lovely University of Lancaster in England. While there, my classmates and I spent much time discussing this incredible new thing called ‘the web,’ where the possibilities were endless, and there were fortunes to be made!
By the time I graduated a year later, it was already over. The bubble had burst spectacularly, and it would be several years before the idea of ‘Web 2.0’ became part of the cultural lexicon and excitement surrounding the Internet ramped back up.
As an aside, many years later, I worked with someone who’d joined one of the most infamous of the imploding startups, Boo.com, straight out of school. Because it was her first job, she had no idea it wasn’t normal to fly transatlantic on Concorde, expense your cocktails, and get paid a straight-out-of-school salary it would take another decade to match. That’s the insane psychological reality of being in a bubble, folks.
Since then, many more bubbles have come and gone, none more destructive than the housing bubble that burst in 2008 and nearly destroyed the global economy. (Frankly, I still find it ridiculous that investment bankers were allowed to bundle shockingly risky junk-rated mortgages into a derivative financial product that, somehow, automagically became triple-A rated. Rather than the banks being bailed out, the investment bankers should’ve gone to jail, because that was nothing less than white-collar crime on a spectacular scale.)
Anyway, it doesn’t take a rocket scientist to see that generative AI has been hyped to massive bubble levels. So, it’s no surprise to see Barclays, Goldman Sachs, and Sequoia Capital running the numbers and all coming to the exact same conclusion: Unless something changes massively, the infrastructure costs of generative AI development will never be paid back.
To put this in context, Sequoia examined the infrastructure costs of AI data centers and concluded that payback would require GenAI to deliver circa $600bn in yearly revenue. That’s a massive lift given that they also estimate current revenues to be in the low single-digit billions, with zero killer-app use cases on the table. (Speaking of killer apps, Google researchers reached the unnerving conclusion that the thing GenAI is best suited for right now is further poisoning the body politic with dis- and misinformation. As if society needed that to get any worse.)
Goldman, concurring, looked at it through the lens of the jobs most likely to be replaced and concluded that swapping low-wage customer-service workers for eye-wateringly expensive technology makes little to no sense. That’s likely another reason McDonald’s pulled it from its drive-through windows, beyond its habit of putting bacon on ice cream.
The thing with bubbles, though, is that while there’s almost always a bunch of people calling it, they always get drowned out by the hype, propaganda, and greed of those with skin in the game until, eventually, it all falls apart. Sometimes, the bubble deflates so slowly that we don’t even notice; sometimes, it explodes spectacularly. The problem is that it’s impossible to predict when this will happen or which kind of deflation we’ll get, so it’s easy to point at the bubble-heads and write them off as doom-mongers.
However, while the coming deflation will be painful for the handful of massive tech and VC firms most exposed to generative AI, having the bubble deflate or even burst will benefit everyone else.
Why, you might ask?
Well, right now, generative AI is an unmitigated gold rush that is nowhere near ready for prime time. Everyone is racing to be in it without really having much of a clue what kind of race they’re in or what it will be good for. For all of you out there feeling the pain of projects slowing to a trickle, some of this is due to the inevitable hangover of the pandemic pulling demand forward, but a lot is due to corporate FOMO diverting discretionary budgets to generative AI based on, well, bullshit basically. Oh, and McKinsey revenue targets, which, as we all know by now, are self-fulfilling prophecies no matter how distasteful that particular advisor may be.
In addition to cost, environmental destruction, complexity, copyright theft on an unprecedented scale, wild inaccuracy, and its security-sieve status, AI products absolutely suck from a user experience perspective - having to become a ‘prompt engineer’ is an utter failure of UX rather than the new job title it’s so breathlessly been framed as. And while many of the video effects we see on the Internet look spectacular as five-second clips, I’ve yet to see anyone meaningfully harness them in the context of genuine storytelling. Not without a load of human interventions, hacks, splices, and workarounds anyway, which suggests that it can’t be done yet, maybe not ever, and that what we are seeing is probably (and in this case, definitely) faked. If all this were really possible, we’d be doing it ourselves already rather than being forced to look at what a few paid ‘influencers’ have been up to.
Considering how many people try tools like ChatGPT and then never use them again, we can see that, irrespective of infrastructure costs, we’re nowhere near there yet at a product level, either.
So, it’s good that the bubble is deflating. Here’s why.
When technology bubbles deflate or burst, they force discipline on those who are left. Suddenly, hype isn’t enough; you have to build a product that has a real value proposition, is fairly priced for the value it provides, is scalable in an economically viable fashion, and that people actually want to use.
But you also have a big advantage over those who came before. Because so much speculative money has already been spent on infrastructure, it doesn’t need to be spent again. In essence, one speculator’s loss becomes another entrepreneur’s gain.
Tim O’Reilly put this extremely well when using the example of Tesla to discuss productive versus non-productive bubbles. Because Tesla was so massively overvalued, it had access to the billions necessary to build a charging infrastructure that couldn’t have happened otherwise. And with that infrastructure now being opened up to other manufacturers via government subsidies/bribes, electrification will accelerate across society. In other words, while tech bubbles tend to be deeply painful at a micro level, they often bring major macro benefits that we don’t see playing out until years after the bubble has burst.
Now, I’m sure a few of you are out there saying that I’m a bubble-head, that GenAI is amazing, that it’s already transforming industries and all that jazz. And sure. When I say it’s a bubble, what I’m not saying is that it has no value. I started this story talking about the web because it’s a salutary tale. It absolutely was a bubble in 2000, yet it had also transformed how we live, work, and play by 2024. But not how people were predicting, and nowhere near as fast.
This is likely playing out again as we speak. The predictions of the power of GenAI are likely wildly incorrect, yet it will be a transformative technology, just in ways we haven’t yet considered. And instead of transforming industries in 18 months, it’s more likely to take a decade, maybe two, before we see exactly where, how, and why it’s been transformative.
And don’t forget that value and revenue must also come into line eventually. This isn’t just about being able to handle a task or two or about being brilliant in narrow use cases. If Sequoia is correct, GenAI must generate upwards of $600bn in yearly revenue to be viable.
Until then, folks, the ‘fire everybody and replace them with AI’ crowd will be left crying into their beer. There just isn’t enough there there. Not yet. Perhaps not ever, depending on the use case.
Except in marketing. Marketers have been useful idiots for tech salespeople for so long, and are under such pressure to reduce recurring operational costs, that there will undoubtedly be years of pain ahead, just as there were with Mar-tech and Ad-tech. Then, eventually, we’ll come full circle right back to where we already are - way too much money invested in tech that we don’t use, but with sunk costs so high that the idea of ripping it out can’t even be considered. Except now, we’ll face all the added pain of hallucinatory inaccuracy, misplaced confidence, security vulnerabilities, exposure to copyright infringement, and reputational damage that comes with generative AI.
Ah well, you’d think we’d learn.