OpenAI CEO Sam Altman says the company believes it can grow revenue fast enough to absorb its massive long-term AI spending commitments, highlighting that exponential growth dynamics are being widely underestimated.
In a new interview with journalist Alex Kantrowitz, Altman addresses concerns about OpenAI’s roughly $1.4 trillion in projected capital commitments, saying that the challenge is one of scale, timing and compute availability rather than demand.
Altman says people often struggle to intuitively grasp exponential growth, even when it is playing out in real time.
“You have good intuition for a lot of mathematical things in your head, but exponential growth is usually very hard for people to do a good, quick mental framework on… Modeling exponential growth doesn’t seem to be one of them.”
He says OpenAI believes it can remain on a steep revenue trajectory for an extended period, based on everything the company is currently seeing across its businesses.
“So the thing we believe is that we can stay on a very steep growth curve of revenue for quite a while, and everything we see right now continues to indicate that.”
Altman ties the company’s revenue potential directly to access to computing power, describing compute as the primary constraint on monetization and the rationale behind its spending spree on data center projects.
“We cannot do it if we don’t have the compute. We’re so compute constrained, and it hits the revenue line so hard that I think if we get to a point where we have like a lot of compute sitting around that we can’t monetize on a profitable per unit of compute basis, it would be very reasonable to say, ‘Okay… how’s this all going to work?’”
He says OpenAI has modeled multiple scenarios internally and sees flexibility if timelines or assumptions shift, while emphasizing that compute shortages have been a constant rather than a temporary phase.
“We’ve penciled this out a bunch of ways. There are checkpoints along the way. And if we’re a little bit wrong about our timing or math, we have some flexibility, but we have always been in a compute deficit.”
Altman adds that improvements in efficiency will play a role, but revenue growth is expected to come from multiple expanding fronts.
“We will, of course, also get more efficient on a FLOPS (floating-point operations per second) per dollar basis as all of the work we’ve been doing to make compute cheaper comes to pass. But we see this consumer growth. We see this enterprise growth. There’s a whole bunch of new kinds of businesses that we haven’t even launched yet, but will. But compute is really the lifeblood that enables all of this.”
Recent data from Cloudflare shows that ChatGPT is still the number one chatbot this year, but Google’s Gemini and Anthropic’s Claude are closing the gap. Earlier this month, reports emerged that Altman issued a companywide “Code Red” directive following the launch of Gemini 3.