ChatGPT’s resource demands are getting out of control


It’s no secret that the growth of generative AI demands ever-increasing amounts of water and electricity, but a new study from The Washington Post and researchers at the University of California, Riverside shows just how many resources OpenAI’s chatbot needs to perform even its most basic functions.


In terms of water usage, the amount needed for ChatGPT to write a 100-word email depends on the state and the user’s proximity to OpenAI’s nearest data center. Where water is scarce and electricity is cheap, a data center is more likely to rely on electrically powered air-conditioning units instead of water cooling. In Texas, for example, the chatbot consumes an estimated 235 milliliters of water to generate a single 100-word email. That same email drafted in Washington, on the other hand, would require 1,408 milliliters — nearly a liter and a half.

Data centers have grown larger and more densely packed with the rise of generative AI technology, to the point that air-based cooling systems struggle to keep up. This is why many AI data centers have switched over to liquid-cooling schemes that pump huge amounts of water past the server stacks, to draw off thermal energy, and then out to a cooling tower where the collected heat dissipates.

ChatGPT’s electrical requirements are nothing to sneeze at either. According to The Washington Post, using ChatGPT to write that 100-word email consumes enough electricity to run more than a dozen LED lightbulbs for an hour. If even one-tenth of Americans used ChatGPT to write that email once a week for a year, the process would consume as much electricity as every Washington, D.C., household does in 20 days. D.C. is home to roughly 670,000 people.
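As a rough sanity check of those figures, the arithmetic can be sketched as follows. Note that the specific values here (14 bulbs at 10 watts each, a U.S. population of 332 million) are illustrative assumptions, not numbers reported in the study:

```python
# Back-of-envelope estimate of ChatGPT's email-writing energy use,
# based on the article's "more than a dozen LED bulbs for an hour" figure.

BULBS = 14              # assumed: "more than a dozen"
WATTS_PER_BULB = 10     # assumed: typical LED bulb wattage

# Energy for one 100-word email, in kilowatt-hours
kwh_per_email = BULBS * WATTS_PER_BULB / 1000  # 0.14 kWh

# One-tenth of Americans, one email per week, for a year
people = 332_000_000 // 10   # assumed U.S. population of ~332 million
emails_per_year = 52
total_gwh = people * emails_per_year * kwh_per_email / 1_000_000

print(f"~{kwh_per_email:.2f} kWh per email, ~{total_gwh:.0f} GWh per year")
```

Under these assumptions the total lands in the low hundreds of gigawatt-hours per year, which is consistent with the article's comparison to weeks of household electricity use across an entire city.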

This is not an issue that will be resolved any time soon, and it will likely get much worse before it gets better. Meta, for example, needed 22 million liters of water to train its latest Llama 3.1 models. Google’s data centers in The Dalles, Oregon, were found to consume nearly a quarter of all the water available in the town, according to court records, while xAI’s new Memphis supercluster is already demanding 150 megawatts of electricity — enough to power as many as 30,000 homes — from the local utility, Memphis Light, Gas and Water.

Andrew Tarantola
Former Computing Writer