High costs cast shadow over ChatGPT revolution

The explosion of generative AI has taken the world by storm, but one question all too rarely comes up: Who can afford it? 

OpenAI bled around $540 million last year as it developed ChatGPT and says it needs $100 billion to meet its ambitions. 

"We're going to be the most capital-intensive startup in Silicon Valley history," OpenAI's founder Sam Altman told a U.S Senate panel recently.

And when Microsoft, which has invested billions of dollars in OpenAI, is asked how much its AI adventure will cost, the company answers with assurances that it is keeping an eye on its bottom line.

Building something even near the scale of what OpenAI, Microsoft or Google have on offer would require an eye-watering investment in state-of-the-art chips and prize-winning researchers.

"People don't realize that to do a significant amount of AI things like ChatGPT takes huge amounts of processing power. And training those models can cost tens of millions of dollars," said Jack Gold, an independent analyst.

"How many companies can actually afford to go out and buy 10,000 Nvidia H100 systems that go for tens of thousands of dollars a piece?" asked Gold.

The answer is pretty much no one. And in tech, if you can't build the infrastructure, you rent it, which is what companies already do on a massive scale by outsourcing their computing needs to Microsoft, Google and Amazon's AWS.

And with the advent of generative AI, this dependence on cloud computing and tech giants deepens, leaving the same players in the driver's seat, experts warn.

From Main Street to the Fortune 500, this dependence on AI-amped cloud services will be an expensive one, and companies and investors are drumming up alternatives to at least reduce the bill.