The markets on Monday started pricing in an AI future that’s going to be cheaper and more accessible than they had previously assumed.
Why it matters: The less money that companies need to spend on the AI equivalent of picks and shovels — Nvidia chips and the electricity needed to power them — the more profitable they will be.
Follow the money: What looked early on Monday like it might be a broad-based rout turned out by market close to be much more selective.
- Indices generally fell in proportion to Nvidia’s weighting within them — and the Dow, which doesn’t include Nvidia, went up.
- Nvidia lost some $600 billion of market value in a single day, for instance, while Apple gained more than $100 billion.
What they’re saying: “If you still believe that AI is going to be big, this news out of China should only make you feel better,” wrote Siebert chief investment officer Mark Malek on Monday.
- His argument: DeepSeek’s technological breakthrough will only serve to multiply the amount of performance that companies can get per dollar invested in AI.
The big picture: The market’s theory of AI, at least up until the end of last week, was that, broadly, bigger is always better.
- Companies would see their share prices rise just on the announcement that they had bought a large number of Nvidia chips, even if they were extremely vague as to what they intended to do with them.
- Similarly, energy companies have been soaring on the grounds that there’s no such thing as too much electricity when it comes to powering the AI revolution.
Reality check: DeepSeek has now shown that it’s possible to produce a state-of-the-art AI that needs fewer and less-powerful chips, less energy — and much less up-front investment.
- That seems bad for Nvidia, which has an effective monopoly on AI chips, and it’s also bad for power companies that were counting on surging demand from data centers.
Where it stands: Last week, the markets believed that without billions of dollars in funding, it was impossible to compete with OpenAI. This week, they’re not so sure.
- They were also pricing in massive compute costs for the biggest consumers of AI — Google, Meta, Amazon and Microsoft are expected to spend more than $300 billion among them on capital expenditures (capex) this year.
- You can be sure that all four of them are revisiting those assumptions this week, asking if they can get the same bang for many fewer bucks.
- If some of the $500 billion earmarked for Stargate, for instance, can be redirected to other purposes, that could fund a lot of very profitable opportunities.
Between the lines: The biggest market trend of recent years has been linked to the concept of “positive returns to scale” — the idea that the bigger you get, the harder it becomes for anybody to compete with you, and the more your margins grow.
- That helps explain the enormous sums that investors have poured into AI companies in recent years.
- If AI is the future of business, however, and if powerful AI tools are available at low cost to any company on the planet, then, as Axios’ Dan Primack writes, Silicon Valley’s enormous investment in foundational AI models might never see any returns at all.
The bottom line: What’s bad for the companies looking to sell AI products is likely to be good for the companies looking to buy them.
- And there are many more of the latter than there are of the former.