AI bubble, unlike dotcom’s, carries systematic credit risk
Today’s massive and still-growing investments in AI and its accompanying infrastructure could well pay off like the internet did, following the investment boom of the late 1990s. But, for now, the gains from AI look more muted, and the macro downsides larger, than in the case of the dotcom bubble
Carl Benedikt Frey   20 Nov 2025

When OpenAI recently committed US$1.4 trillion to securing future computing capacity, it was merely the latest indication of irrational exuberance in 2025. By some estimates, US GDP growth in the first half of this year came almost entirely from data centres, prompting a flood of commentary about when the bubble will burst and what it may leave behind. While the late 1990s dotcom party ended with a hangover for Wall Street, Main Street kept what mattered: the infrastructure. Productivity rose, and the fibre laid during the boom years still works today. US President Bill Clinton’s vow to build a “bridge to the 21st century” was one of those rare campaign promises that was actually fulfilled.

Today’s artificial intelligence ( AI ) investments could well pay off like the internet did. For now, though, the gains look more muted, and the macro downsides larger, than in the case of the dotcom bubble. Consider the potential benefits. In the late 1990s, the internet’s payoff showed up while the bubble was still inflating: US labour productivity growth averaged about 2.8% from 1995 to 2004, roughly double the previous two decades’ pace, before fading in the mid-2000s. You could see the gains in the national accounts even as Pets.com was buying up its ill-fated Super Bowl ads.

This time, US labour productivity growth has picked up after two sluggish decades – reaching around 2.7% last year – but it’s too soon to say that AI is the reason. In fact, AI adoption is slipping, with a recent US Census Bureau survey showing lower use among large firms. If the recent uptick in productivity was mostly an AI story, it could be expected to fade as adoption ebbs – another reminder of how fleeting these waves can be. As visible as the 1990s information technology boom was in real time, it petered out within a decade or so.

It is tempting to think that large language models ( LLMs ) will speed up innovation and discovery itself, such as by surfacing hidden links in the academic literature, writing code, and drafting protocols. New tools – from Robert Hooke’s microscope to Galileo’s telescope – have sparked such leaps before. This time, however, we have already had the ultimate research tool in the form of the internet-connected PC. Yet even with instant access to the world’s accumulated knowledge and top talent, measures of research productivity and breakthrough innovation have declined. Keeping alive Moore’s Law – the observation that computer processing power doubles every two years – now takes orders of magnitude more researchers than it did in the early 1970s.

Nor is it clear that the current capital expenditures boom will leave much in the way of durable digital infrastructure. Like the railroads in the 19th century, the dotcom era poured money into long-lived assets – especially fibre-optic cable and backbone networks – that could be “lit” and relit as electronics improved. Much of that glass still carries traffic today. One tranche of capex supported multiple generations of technology and business models.

By contrast, AI isn’t laying track; it’s running a treadmill. Chips and memory degrade or become obsolete in the space of years, not decades. Each server rack used to train an LLM now requires 120 kilowatts ( kW ) of power, up from about 5kW to 10kW a decade ago. And because each new generation of GPUs ( graphics processing units ) slashes cost per watt, hyperscalers churn through fleets faster as older gear becomes economically obsolete. Whereas fibre endures while you swap endpoints, the AI technology stack depreciates fast, requiring constant reinvestment.
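The fibre-versus-chips contrast comes down to asset lives. A minimal back-of-the-envelope sketch, using straight-line depreciation and purely hypothetical asset lives ( 25 years for fibre, four years for GPU fleets ) chosen for illustration, not drawn from the article:

```python
def annual_reinvestment(capex: float, useful_life_years: float) -> float:
    """Straight-line reinvestment needed each year to keep an asset base intact.

    Hypothetical illustration: same upfront capex, different asset lives.
    """
    return capex / useful_life_years


# Same 100 units of capex, very different ongoing burdens:
fibre_burden = annual_reinvestment(100.0, 25.0)  # long-lived glass: 4 per year
gpu_burden = annual_reinvestment(100.0, 4.0)     # fast-obsolescing chips: 25 per year
```

On these stylised numbers, the short-lived stack demands roughly six times the annual reinvestment for the same initial outlay, which is the sense in which the build-out resembles a treadmill rather than track-laying.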

This treadmill could be manageable if the macroeconomic picture looked like the one in 1999. But it doesn’t. Although real interest rates were higher then, Clinton-era budget surpluses and a falling debt-to-GDP ratio eased pressure on capital markets and kept the government’s interest bill smaller, limiting a crowding-out effect.

This time, the situation has been reversed. Not only have persistent US government deficits near 6% of GDP ( about US$1.8 trillion ) and net interest payments approaching US$1 trillion reduced fiscal space, but the same pool of savings is now expected to finance clean energy build-outs, rising defence budgets and a power-hungry data centre boom. In practice, this demand shows up as higher borrowing costs, which slow new housing construction and push long-gestation infrastructure to the back of the line.

Public finances also feel the pinch. A larger debt stock means positive real rates feed quickly into a higher interest bill, crowding out programmes that households rely on. During the late-1990s surplus, debt fell and the Treasury even bought back bonds, which meant that the state could invest alongside a private boom without elbowing it aside. Today, more borrowing and a heavier interest tab leave less room for manoeuvre when growth slows. If AI’s payoff does arrive, but only slowly, the arithmetic will be even more difficult. We would see more dollars going to bondholders, and fewer to Social Security, health care and core services; and, if the business cycle goes south, the trade-offs would be even sharper.
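The fiscal mechanism here is simple multiplication: the interest bill scales with both the debt stock and the real rate. A rough sketch, with hypothetical round figures ( a US$30 trillion economy, a 2% real rate, debt ratios of 60% versus 100% of GDP ) that are illustrative only, not official fiscal data:

```python
def interest_bill(debt_to_gdp: float, real_rate: float, gdp_trillions: float) -> float:
    """Annual real interest bill, in trillions of dollars.

    Hypothetical illustration of how a larger debt stock amplifies
    the effect of positive real rates on the government's interest tab.
    """
    return debt_to_gdp * real_rate * gdp_trillions


# Late-1990s-style position: lower debt ratio, same real rate
low_debt_bill = interest_bill(0.60, 0.02, 30.0)   # ~0.36tn per year
# Today-style position: heavier debt stock
high_debt_bill = interest_bill(1.00, 0.02, 30.0)  # ~0.60tn per year
```

The point is that the same real rate costs far more against a bigger debt stock, which is why positive real rates now feed quickly into the trade-offs the article describes.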

Financing has changed, too. The early-2000s downturn was mostly an equity story: stock prices collapsed and venture capital investors targeting long-term returns took a beating; but as brutal and highly visible as it was, the pain subsided relatively quickly. As Carmen Reinhart and Kenneth Rogoff emphasize in their 2009 history of financial crises, This Time Is Different, asset bubbles tend to threaten the macroeconomy mainly when they are credit-fuelled and hit banks’ balance sheets. Because the dotcom bust was largely an equity repricing ( telecoms aside ), not a banking crisis, there was no systemic failure despite the big investor losses.

This time, risk is building through credit. As investor Paul Kedrosky notes, funding is shifting from equity to bonds, special-purpose vehicles and leases, and private credit – all forms of borrowing that ultimately link back to banks and insurers. If AI and data centre revenues fall short, the trouble will likely show up first in credit markets, not stock prices. Keep an eye out for missed coverage targets, tighter loan terms and refinancing squeezes hitting lenders and insurers’ balance sheets through long leases and chip-backed loans.

That’s the systemic risk. Unlike the dotcom era, today’s build-out pushes exposure into the financial plumbing, so stress is likelier to spread through lenders and structured vehicles. You can already see market watchers growing concerned, with Moody’s warning that a meaningful share of Oracle’s data-centre growth depends on OpenAI, which has yet to establish a path to profitability.

Of course, if AI delivers broad and sustained productivity gains quickly, the maths improves. Faster growth would ease fiscal pressure, lower debt ratios, and buttress these financing structures. But if gains arrive late or fall short of expectations, the payoff might not compensate for the massive front-loaded costs.

Carl Benedikt Frey is an associate professor of AI & Work at the Oxford Internet Institute and the director of the Future of Work Programme at the Oxford Martin School.

Copyright: Project Syndicate