Dark cloud over ChatGPT revolution: the cost

WASHINGTON: The explosion of generative AI has taken the world by storm, but one question all too rarely comes up: Who can afford it?

OpenAI bled around $540 million last year as it developed ChatGPT and says it needs $100 billion to meet its ambitions, according to tech industry outlet The Information.

“We’re going to be the most capital-intensive startup in Silicon Valley history,” OpenAI co-founder Sam Altman told a panel recently.

And when Microsoft, which has poured billions of dollars into OpenAI, is asked how much its AI adventure will cost, the company answers with assurances that it is keeping an eye on its bottom line.

Building something even near the scale of what OpenAI, Microsoft or Google have on offer would require an eye-watering investment in state-of-the-art chips and the recruitment of prize-winning researchers.

“People don’t realize that to do a significant amount of AI things like ChatGPT takes huge amounts of processing power. And training those models can cost tens of millions of dollars,” said Jack Gold, an independent analyst.

“How many companies can actually afford to go out and buy 10,000 Nvidia H100 systems that go for tens of thousands of dollars a piece?” asked Gold.
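To put Gold’s numbers in context, here is a rough back-of-envelope sketch in Python. The per-unit price is an illustrative assumption (the article says only “tens of thousands of dollars a piece”), not a reported figure.

    # Illustrative only: the unit price is an assumed placeholder, not a figure from the article.
    units = 10_000                   # the fleet size Gold cites
    assumed_price_per_unit = 30_000  # USD, hypothetical mid-point of "tens of thousands"

    hardware_cost = units * assumed_price_per_unit
    print(f"Estimated hardware outlay: ${hardware_cost:,}")  # -> $300,000,000

Even on that assumption, the chips alone run to hundreds of millions of dollars, before any power, networking or staffing costs.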

The answer is pretty much no one. In tech, if you can’t build the infrastructure, you rent it, and that is what companies already do on a massive scale by outsourcing their computing needs to Microsoft, Google and Amazon’s AWS.

And with the advent of generative AI, this dependency on cloud computing and tech giants deepens, leaving the same players in the driver’s seat, experts warned.

‘Heavily underestimated’

The unpredictable cost of cloud computing “is a heavily underestimated problem for many companies,” said Stefan Sigg, Chief Product Officer at Software AG, which develops software for businesses.

Sigg compares cloud costs to electricity bills and says companies that don’t know better are in for “a big surprise” if they let their engineers run up bills in the mad rush to build tech, including AI.

Microsoft’s signature cloud offering is Azure, and some observers believe the giant’s all-in bet on AI is really about protecting Azure’s success and guaranteeing the cash cow’s future.

Azure has been the giant’s unsexy breadwinner for years, bringing in huge profits without attracting the headlines of consumer-facing hits like the iPhone or social media.

For Microsoft, “the golden goose is a monetising cloud with Azure because we’re talking about what could be a $20, $30, $40 billion opportunity annually down the road if the AI bet is successful,” said Dan Ives of Wedbush Securities.

Microsoft CEO Satya Nadella insists that generative AI is “moving fast in the right direction.”

Deeply respected on Wall Street, Nadella will have a six- to nine-month grace period to show his bet is a winner, Ives predicted.

Microsoft acknowledges the risk, but insists that on AI, it must “lead this wave,” CFO Amy Hood told analysts this month.

“We will charge for those AI capabilities, and then ultimately, we’ll deliver operating profit,” she said.

‘Squashed out’

Piling up profit at the company co-founded by Bill Gates can only mean passing on the cost of AI to customers.

From Main Street to the Fortune 500, dependency on the AI-amped cloud will be an expensive one, and companies and investors are drumming up alternatives to at least reduce the bill.

“AI training, and GPT training will become a very important cloud service going forward,” said Spectro Cloud CEO Tenry Fu.

His company, like many others in the sector, helps companies optimize cloud technology to reduce expenses.

“But after training, a company will be able to get their model back for real AI application” and the dependence on the cloud giants will hopefully be reduced, he added.

Regulators are hoping they can keep up and avoid leaving the giants in charge to impose their terms on smaller companies.

“Law enforcers (must) ensure that… opportunities and openings for competition… are not getting squashed out by the incumbents,” FTC chairwoman Lina Khan told CNBC.

But it might be too late, at least when it comes to which companies have the means to provide the groundwork of generative AI.

“It is absolutely true that the number of companies that can train the true frontier models is going to be small just because of the resources required,” OpenAI’s Altman told a US Senate panel on Tuesday.

“And so I think there needs to be incredible scrutiny on us and our competitors,” he added.
