OpenAI has committed to spending about $1.4 trillion over the next eight years on data centers and cloud services, CEO Sam Altman said.
While some of those costs are being offset by revenue-sharing or equity deals, the scale of the compute spend has raised questions about how OpenAI will afford its commitments.
This week, OpenAI CFO Sarah Friar said at the Wall Street Journal’s Tech Live event that the company is looking to create an ecosystem of banks, private equity, and a federal “backstop” or “guarantee” to help finance its deals.
The comments were widely interpreted as a call for government backing of loans and deals, which Friar walked back in a later LinkedIn post.
Altman then published a lengthy post on X responding to the comments and adding further details about the company’s financial approach.
“We do not have or want government guarantees for OpenAI data centers,” he said.
“We believe that governments should not pick winners or losers, and that taxpayers should not bail out companies that make bad business decisions or otherwise lose in the market. If one company fails, other companies will do good work.”
However, he added that the government could fund a strategic national reserve of computing power, and that it should support the build-out of semiconductor fabs in the US.
As for OpenAI’s revenues, Altman said that the company expects to end the year above $20 billion in annualized revenue run rate “and grow to hundreds of billions by 2030.”
The company had to line up more than $1 trillion in compute commitments – across Stargate, Oracle, CoreWeave, Google, Microsoft, and Amazon Web Services – because “we believe the risk to OpenAI of not having enough computing power is more significant and more likely than the risk of having too much,” he said.
“We are trying to build the infrastructure for a future economy powered by AI, and given everything we see on the horizon in our research program, this is the time to invest to be really scaling up our technology. Massive infrastructure projects take quite [a while] to build, so we have to start now.”
He added: “Even today, we and others have to rate limit our products and not offer new features and models because we face such a severe compute constraint.”
However, in the very same post, Altman appeared to undercut that point, saying that the business is “looking at ways to more directly sell compute capacity to other companies (and people); we are pretty sure the world is going to need a lot of ‘AI cloud,’ and we are excited to offer this.”
Such a move would potentially put OpenAI in competition with its customers and cloud providers, and would presumably see it reselling capacity that it has contracted from those same providers.
https://www.datacenterdynamics.com/en/news/sam-altman-openai-isnt-seeking-a-government-bailout-has-14tn-in-data-center-commitments-may-launch-an-ai-cloud-service/

