September 26, 2025
OpenAI’s historic week has redefined the AI arms race for investors: ‘I don’t see this as crazy’
OpenAI cemented its role as the central force in AI infrastructure this week, unveiling nearly a trillion dollars in planned spending.

OpenAI CEO Sam Altman listens to questions at a Q&A following a tour of the OpenAI data center in Abilene, Texas, U.S., Sept. 23, 2025.

Shelby Tauber | Reuters

This week, OpenAI redefined what momentum — and risk — look like in the artificial intelligence arms race.

Now comes the hard part: Executing on CEO Sam Altman’s multitrillion-dollar vision.

In a rapid-fire series of announcements, the company unveiled partnerships involving mind-bending sums of money and cemented its place at the center of the next wave of machine learning infrastructure.

It began Monday with news that Nvidia plans to invest up to $100 billion to help OpenAI build data center capacity with millions of graphics processing units (GPUs). A day later, OpenAI revealed an expanded deal with Oracle and SoftBank, scaling its “Stargate” project to a $400 billion commitment across multiple phases and sites. Then on Thursday, OpenAI deepened its enterprise reach with a formal integration into Databricks — signaling a new phase in its push for commercial adoption.

“In all, this is the biggest tale yet of Silicon Valley’s signature fake it ’til you make it, and so far it seems to be working,” said Gil Luria, managing director at D.A. Davidson.

The startup, known mostly for its ChatGPT chatbot and GPT family of large language models, is trying to become something much bigger: the next hyperscaler. Never mind that it’s burning billions of dollars in cash and is fully reliant on outside capital to grow, or that its buildout plans require enough energy to power more than 13 million U.S. homes.

Altman has long said that delivering the next era of AI will require exponentially more infrastructure.

“You should expect OpenAI to spend trillions of dollars on data center construction in the not very distant future,” he told CNBC and a small group of reporters over dinner in San Francisco last month. “And you should expect a bunch of economists wringing their hands, saying, ‘This is so crazy, it’s so reckless,’ and we’ll just be like, ‘You know what? Let us do our thing.'”

The story OpenAI is selling is that it’s responding to market demand, which shows no signs of stopping. And eventually, the thinking goes, this will all be profitable.

Current financial projections show OpenAI is on track to generate $125 billion in revenue by 2029, according to a source familiar with the company’s internal forecasts.

It’s a bold bet – and one full of execution risk.

Building out 17 gigawatts of capacity would require the equivalent of about 17 nuclear power plants, each of which takes at least a decade to build. The OpenAI team says talks are underway with hundreds of infrastructure providers across North America, but there are no firm answers yet.
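A rough sanity check of those figures, using assumed ballpark values not stated in the article (roughly 1 gigawatt of output per typical U.S. nuclear reactor, and an average continuous draw of about 1.2 kilowatts per U.S. home, i.e. ~10,500 kWh per year):

```python
# Back-of-envelope check of the capacity figures above.
# Both constants below are assumptions for illustration, not OpenAI's numbers:
# - REACTOR_GW: a typical U.S. nuclear reactor produces roughly 1 GW
# - HOME_KW: an average U.S. home draws about 1.2 kW continuously

CAPACITY_GW = 17          # OpenAI's planned buildout, per the article
REACTOR_GW = 1.0          # assumed output of one typical reactor
HOME_KW = 1.2             # assumed average continuous draw per home

reactors_needed = CAPACITY_GW / REACTOR_GW
homes_powered = CAPACITY_GW * 1e6 / HOME_KW   # convert GW to kW, divide by per-home draw

print(f"{reactors_needed:.0f} reactor-equivalents")
print(f"{homes_powered / 1e6:.1f} million homes")
```

Under those assumptions the math lands where the article does: about 17 reactor-equivalents, and comfortably more than 13 million homes.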

The U.S. grid is already strained, gas turbines are sold out through 2028, nuclear is slow to deploy and renewables are tied up in political roadblocks.

“I am extremely bullish about nuclear, advanced fission, fusion,” Altman said. “We should build more … a lot more of the current generation of fission plants, given the needs for dense, dense energy.”

What did crystallize this week, however, was the scale of Altman’s ambition as the OpenAI CEO began to put hard numbers behind his vision – some of them staggering. 

“Unlike previous technological revolutions or previous versions of the internet, there’s so much infrastructure that’s required, and this is a small sample of it,” Altman said Tuesday at OpenAI’s first Stargate site in Abilene, Texas.

That mentality – blunt, ambitious, and dismissive of convention – has defined Altman’s leadership in this new phase.

Deedy Das, a partner at Menlo Ventures, said the scale of OpenAI’s infrastructure partnerships with companies like Oracle may seem extreme to some, but he views it differently.

“I don’t see this as crazy. I see it as existential for the race to superintelligence,” he said.

Das argued that data and compute are the two biggest levers for scaling AI, and praised Altman for recognizing early on just how steep the ramp in infrastructure would need to be.

“One of his gifts is reading the exponential and planning for it,” he added.

Breakthroughs in AI, he argued, have historically been driven less by smarter algorithms than by access to massive computing power. That’s why companies like OpenAI, Google, and Anthropic are all chasing scale.

OpenAI’s $850 billion buildout contends with grid limits

Alibaba, OpenAI, and Anthropic have all pointed to insatiable demand for their models from consumers and businesses alike. As these companies push to embed AI into everyday workflows, the infrastructure stakes keep rising.

Ubiquitous, always-on intelligence requires more than just code — it takes power, land, chips, and years of planning.

“I think people who use ChatGPT every day have no idea that this is what it takes,” Altman said, gesturing to the site in Abilene. “This is 10% of what the site is going to be. We’re doing ten of these.”

He added, “This requires such an insane amount of physical infrastructure to deliver.”

The cost of staying ahead

Though the buildout is flashy, the funding behind it remains hazy.

Nvidia’s $100 billion investment will arrive in $10 billion tranches over the next several years. OpenAI’s buildout commitment with Oracle and SoftBank could eventually reach $400 billion.

Microsoft, OpenAI’s largest partner and shareholder, which holds a right of first refusal on cloud deals, “is not willing to write them an unlimited check for compute,” Luria said. “So they’ve turned to Oracle with a commitment considerably bigger than they can live up to.”

As a non-investment-grade startup without positive cash flow, OpenAI still faces a major financing challenge.

Executives have called equity “the most expensive” way to fund infrastructure, and the company is preparing to take on debt to cover the rest of its buildout. Nvidia’s long-term lease structure could help OpenAI secure better terms from banks, but it still needs to raise multiples of that capital in the private markets.

OpenAI CFO Sarah Friar said the company plans to build some of its own first-party infrastructure — not to replace partners like Oracle, but to become a savvier operator. Doing some of the work internally, she said, makes OpenAI “a better partner” by allowing it to challenge vendor assumptions and gain a clearer view into actual costs versus padded estimates.

That, in turn, strengthens its position in rate negotiations.

“The other tool at their disposal to reduce burn rate is to start selling ads within ChatGPT, which may also help with the fundraising,” Luria said.

Altman said earlier this year in an interview with Ben Thompson’s Stratechery that he’d rather test affiliate-style fees than traditional ads, floating a 2% cut when users buy something they discovered through the tool. He stressed rankings wouldn’t be for sale, and while ads aren’t ruled out, other monetization models come first.

That question of how to monetize becomes even more urgent amid OpenAI’s breakneck growth.

“We are growing faster than any business I’ve ever heard of before,” Altman said, adding that demand is accelerating so quickly that even this buildout pace will “look slow” in hindsight. Usage of ChatGPT, he noted, has surged roughly tenfold over the past 18 months, particularly on the enterprise side.

And that demand isn’t slowing.

Accenture CEO Julie Sweet told CNBC’s Sara Eisen on “Money Movers” Thursday that she’s seeing an inflection point in enterprise adoption. 

“Every CEO, board and C-suite recognizes that advanced AI is critical to the future,” she said. “The challenge they’re facing right now is that they’re really excited about the technology, but most companies are not yet AI-ready.”

Her firm signed 37 clients this quarter with bookings over $100 million.

“We’re still in the thick of it,” she added. “There’s a ton of work to do.”


Ali Ghodsi, CEO of Databricks, said Thursday that concerns about overbuilding miss the bigger picture.

“There’s going to be much more AI usage in the future than we have today. There’s no doubt about that,” he said. “Not every person on the planet is using at the fullest capacity these AI models. So more capacity will be needed.” 

That optimism is one reason Ghodsi struck a formal integration deal with OpenAI this week — a partnership that brings GPT-5 directly into Databricks’ data tooling and reflects growing enterprise demand for OpenAI’s models inside business software.

Still, Ghodsi said it’s important to maintain flexibility.

Databricks now hosts foundation models from all three major providers — OpenAI, Anthropic, and Alphabet (Gemini) — so customers aren’t locked into a single vendor.

But even as infrastructure ramps up, the scale and speed of OpenAI’s spending spree have raised questions about execution.

Nvidia is supplying capital and chips. Oracle is building the sites. OpenAI is anchoring the demand. It’s a circular economy that could come under pressure if any one player falters.

And while the headlines came fast this week, the physical buildout will take years to deliver — with much of it dependent on energy and grid upgrades that remain uncertain. 

Friar acknowledged that challenge.

“There’s not enough compute to do all the things that AI can do, and so we need to get it started,” she said. “And we need to do it as a full ecosystem.”
