Google, Meta, Amazon and three other Big Tech firms have announced hundreds of billions of dollars in capital expenditure (CapEx) this year.
The latest earnings season reveals that mega-cap tech firms are looking to pour about $655 billion into AI in 2026, led by Amazon’s eye-popping $200 billion CapEx guidance.
Google follows with around $180 billion in planned CapEx, and Meta is next in line at $122 billion. Microsoft projects $105 billion to $120 billion, Tesla forecasts CapEx exceeding $20 billion, and Apple sees the figure at $13 billion.
Nvidia CEO Jensen Huang says the hyperscalers' massive CapEx allocations this year are justified, arguing the investments will pay off.
In a new CNBC interview, Huang pushes back on concerns that Big Tech is overinvesting following reports that the companies plan to spend hundreds of billions of dollars on AI this year.
According to the Nvidia executive, the spending wave is a response to what he describes as the largest transformation in software history, in which software powered by AI uses tools itself, driving productivity gains and boosting cash flow.
“It is appropriate and sustainable. And the reason for that is because all of these companies’ cash flows are going to start rising. People are comparing it to cash flows. One of those numbers is wrong. It’s just the cash flow is wrong.
We are addressing the largest software opportunity in history. For the very first time, software is not just a tool; a tool is like Excel. Now, software uses tools. So these AIs use Excel. And so I think the opportunity for this new era of software is incredible.”
Disclaimer: Opinions expressed at CapitalAI Daily are not investment advice. Investors should do their own due diligence before making any decisions involving securities, cryptocurrencies, or digital assets. Your transfers and trades are at your own risk, and any losses you may incur are your responsibility. CapitalAI Daily does not recommend the buying or selling of any assets, nor is CapitalAI Daily an investment advisor. See our Editorial Standards and Terms of Use.

