
You won’t believe how much ChatGPT costs to operate

A new report claims to reveal how much ChatGPT costs to run per day, including approximately how much each query costs. The popular AI chatbot may have set off an AI revolution when it launched in November 2022, but it has proven extremely expensive to maintain.

The new report comes from Dylan Patel, chief analyst at the research firm SemiAnalysis, who says it costs approximately $700,000 per day, or 36 cents per query, to keep the chatbot up and running.
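As a rough sketch, and assuming both figures are averages over the same period, the two numbers together imply a daily query volume on the order of:

$$\frac{\$700{,}000\ \text{per day}}{\$0.36\ \text{per query}} \approx 1.9\ \text{million queries per day}$$

This is only back-of-the-envelope math based on the report's two headline figures, not an estimate from the report itself.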


It’s so expensive, in fact, that Microsoft might be developing its own proprietary AI chips to help OpenAI keep ChatGPT running, according to Windows Central.


In addition to quickly hitting 100 million active users in January, a feat that previously took tech brands years to achieve, ChatGPT has struggled with high traffic and capacity issues that have slowed down and crashed its servers. The company attempted to remedy this by introducing a paid ChatGPT Plus tier at $20 per month, but there is no word on how many users subscribe to the paid option.

OpenAI currently uses Nvidia GPUs to maintain not only its own ChatGPT processes, but also those of the brands it partners with. Industry analysts expect the company to require an additional 30,000 Nvidia GPUs to maintain its commercial performance for the remainder of 2023 alone.

With Microsoft as one of its primary collaborators and investors, OpenAI might be looking to the tech giant to help develop hardware that brings down the cost of operating ChatGPT. According to Windows Central, Microsoft already has such an AI chip in the works. Code-named Athena, it is currently being tested internally with the company’s own teams and is expected to be introduced next year for Microsoft’s Azure AI services.

There is no word on how or when the chip will trickle down to OpenAI and ChatGPT, but the assumption is that it will, since ChatGPT runs on Microsoft’s Azure services. The chip might not fully replace Nvidia GPUs, but it could reduce demand for that hardware and, in turn, the cost of running ChatGPT, Windows Central added.
