GPT-3 training consumed 700k liters of water, 'enough for producing 370 BMWs'

The data centers that help train ChatGPT-like AI are very 'thirsty,' finds a new study.
Christopher McFadden
The problem involves the way data centers are cooled.


A new study estimates how much water is consumed when training large AI models like OpenAI's ChatGPT and Google's Bard. The estimates were presented by researchers from the University of California, Riverside and the University of Texas at Arlington in a pre-print article titled "Making AI Less 'Thirsty.'"

Of course, the water used to cool these data centers doesn't simply disappear into the ether; it is usually drawn from sources such as rivers and lakes. The researchers distinguish between water "withdrawal" and water "consumption" when estimating AI's usage.

Withdrawal means physically taking water from a river, lake, or other source; consumption refers to the portion that is lost, mainly through evaporation, while cooling the data center. The study concentrates on the consumption side of the equation, which it notes is where "water cannot be recycled."

It is worth noting, however, that "consumed" water is released back into the atmosphere through cooling towers rather than lost from the water cycle entirely; it simply takes time to return to Earth in liquid form again.

The study reveals that an average data center uses about a gallon of water for every kilowatt-hour of energy it consumes. And not just any water will do.
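To make that rate concrete, here is a minimal sketch of the arithmetic it implies. The workload size below is hypothetical, chosen only for illustration; the one-gallon-per-kWh figure is the article's stated average, not a measured value for any specific facility.

```python
# Illustrative sketch: water implied by the article's average of ~1 gallon/kWh.
GALLONS_PER_KWH = 1.0      # article's stated average for a typical data center
energy_kwh = 10_000        # hypothetical 10 MWh workload, for illustration only
water_gallons = energy_kwh * GALLONS_PER_KWH
print(water_gallons)       # 10000.0 gallons for the hypothetical workload
```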

To avoid the corrosion and bacterial growth that seawater can cause, data centers draw on clean freshwater sources. Fresh water is also needed to control the humidity in the server rooms.

The researchers also hold data centers responsible for "off-site indirect water consumption," the water required to generate the large amounts of electricity they use.

But how much water are we talking about here?

According to the study, the large data centers used to train these models require so much cooling water that, in ChatGPT's case, it could fill a nuclear power plant's cooling tower.

GPT-3's training alone consumed an estimated 185,000 gallons (700,000 liters) of water. According to the study, a typical user's conversation with ChatGPT is equivalent to pouring a large bottle of fresh water onto the ground.
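As a back-of-the-envelope check, the two training figures are consistent with each other under a standard liters-to-US-gallons conversion (a sketch of the arithmetic only, using the article's reported numbers):

```python
# Check that 700,000 liters matches the cited ~185,000 US gallons.
LITERS_PER_US_GALLON = 3.785    # approximate conversion factor
training_liters = 700_000       # reported water use for GPT-3's training
training_gallons = training_liters / LITERS_PER_US_GALLON
print(round(training_gallons))  # 184941, close to the cited ~185,000
```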

Given the chatbot's enormous popularity, the researchers worry that all this consumption could strain water supplies, especially in light of recent droughts and growing environmental instability in the US.

Microsoft, which has partnered with OpenAI for several years and invested billions of dollars in the company, has built supercomputers for AI training. Its most recent supercomputer, which would require an extensive cooling apparatus, contains 10,000 graphics cards and over 285,000 processor cores, offering a glimpse of the enormous scale of the operation behind artificial intelligence.

In other words, ChatGPT, which followed GPT-3, would need to "drink" a 500-milliliter bottle of water to complete a basic conversation with a user of around 25–50 questions.
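Dividing the study's figures out gives the implied water use per question (a sketch of the arithmetic, not a value measured by the study):

```python
# Implied per-question water use from the study's 500 ml per 25-50 questions.
bottle_ml = 500
questions_short, questions_long = 25, 50
per_question_high = bottle_ml / questions_short  # 20.0 ml for a short chat
per_question_low = bottle_ml / questions_long    # 10.0 ml for a long chat
print(per_question_low, per_question_high)       # 10.0 20.0
```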

The enormous volume of water estimated for training the AI model also presupposes that the training occurs at Microsoft's cutting-edge US data center, which may have cost tens of millions of dollars to build just for OpenAI.

According to the research, water use could have been three times higher had the training been done in the company's less energy-efficient Asian data centers. The researchers anticipate that these water requirements will continue to rise with newer models, like the just-released GPT-4, which rely on a broader range of data parameters than their predecessors.

“AI models’ water footprint can no longer stay under the radar,” the researchers said. “Water footprint must be addressed as a priority as part of the collective efforts to combat global water challenges,” they added.

Study abstract:

"The growing carbon footprint of artificial intelligence (AI) models, especially large ones such as GPT-3 and GPT-4, has been undergoing public scrutiny. Unfortunately, however, the equally important and enormous water footprint of AI models has remained under the radar. For example, training GPT-3 in Microsoft’s state-of-the-art U.S. data centers can directly consume 700,000 liters of clean freshwater (enough for producing 370 BMW cars or 320 Tesla electric vehicles) and the water consumption would have been tripled if training were done in Microsoft’s Asian data centers, but such information has been kept as a secret. This is extremely concerning, as freshwater scarcity has become one of the most pressing challenges shared by all of us in the wake of the rapidly growing population, depleting water resources, and aging water infrastructures. To respond to the global water challenges, AI models can, and also should, take social responsibility and lead by example by addressing their own water footprint. In this paper, we provide a principled methodology to estimate fine-grained water footprint of AI models, and also discuss the unique spatial-temporal diversities of AI models’ runtime water efficiency. Finally, we highlight the necessity of holistically addressing water footprint along with carbon footprint to enable truly sustainable AI."
