ChatGPT has received both good and bad press since its inception, and it even got banned in some places. But have you ever wondered how much money it takes to run an advanced AI-powered bot of this scale? Well, a research firm named SemiAnalysis did the math.
The firm estimated that running ChatGPT costs approximately $700,000 per day, which works out to roughly 0.36 cents per query. Most of that expense goes toward the hardware infrastructure required to run the AI systems. SemiAnalysis writes in a blog post:
"Estimating ChatGPT costs is a tricky proposition due to several unknown variables. We built a cost model indicating that ChatGPT costs $694,444 per day to operate in compute hardware costs. OpenAI requires ~3,617 HGX A100 servers (28,936 GPUs) to serve ChatGPT."
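The server and GPU counts in the quote are internally consistent, and they imply a per-GPU rate close to typical A100 cloud pricing. A quick sanity check (the 8-GPUs-per-server figure is the standard HGX A100 configuration, not stated in the article; the "about $1 per GPU-hour" result is derived, not SemiAnalysis's own wording):

```python
# Back-of-envelope check on SemiAnalysis's ChatGPT cost figures.
DAILY_COST_USD = 694_444   # estimated daily compute-hardware cost (from the quote)
SERVERS = 3_617            # HGX A100 servers (from the quote)
GPUS_PER_SERVER = 8        # an HGX A100 chassis holds 8 GPUs (assumed standard config)

gpus = SERVERS * GPUS_PER_SERVER
cost_per_gpu_hour = DAILY_COST_USD / gpus / 24

print(f"Total GPUs: {gpus:,}")                            # matches the quoted 28,936
print(f"Implied rate: ${cost_per_gpu_hour:.2f}/GPU-hour") # works out to about $1
```

That the figures reduce to roughly $1 per A100-hour suggests the model is built on a simple per-GPU rental rate scaled up to the full fleet.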
ChatGPT is one of the fastest-growing technologies ever released, amassing over 100 million active users in January, just two months after its launch. For reference, it took nine months for TikTok and 2.5 years for Instagram to reach the same milestone. SemiAnalysis chief analyst Dylan Patel told Insider that this initial estimate is based on GPT-3, and that the newer GPT-4 model might be even more expensive to operate.
The Information (paywalled) recently reported that OpenAI's prominent backer Microsoft has been working on a dedicated AI chip that is expected to reduce the cost of operations. OpenAI also launched its $20 premium subscription, ChatGPT Plus, earlier this year to start generating revenue.
SemiAnalysis says that if the ChatGPT model were used to power Google's existing search business, it would cut $36 billion from the company's profits in "LLM inference costs" alone.
"Deploying current ChatGPT into every search done by Google would require 512,820.51 A100 HGX servers with a total of 4,102,568 A100 GPUs. The total cost of these servers and networking exceeds $100 billion of Capex alone, of which Nvidia would receive a large portion," the firm writes.
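The quoted capex figure can also be decomposed to see what SemiAnalysis is pricing each machine at. A minimal sketch, using only the numbers in the quote (the per-server price is derived here, not stated in the article):

```python
# Derive the implied price per 8-GPU server from the quoted Google-scale estimate.
SERVERS = 512_820.51   # A100 HGX servers (from the quote)
CAPEX_USD = 100e9      # quoted total for servers and networking

per_server = CAPEX_USD / SERVERS
print(f"Implied cost per 8-GPU server: ${per_server:,.0f}")  # about $195,000
```

The near-round $195,000-per-server result hints that the firm worked forward from a fixed server price, which would also explain the fractional server count in the quote.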