‘3 raccoons in a trench coat’: AI boom could actually warm our planet

Training GPT-3, a single general-purpose AI program, required 1.287 gigawatt-hours of electricity, roughly the amount 120 US homes use in a year, according to a 2021 study.
Baba Tamim
Conceptual image: Two sides of AI.

Interesting Engineering/marvinh/iStock 

Concerns that excessive carbon emissions are warming our planet have grown with the advent of artificial intelligence (AI), the industry's latest technological revolution.

Every new chatbot and image generator requires a significant amount of electricity to build, adding to the carbon emissions that drive global warming, according to a Bloomberg report published Thursday.

"We're talking about ChatGPT, and we know nothing about it," Sasha Luccioni, a researcher at AI company Hugging Face Inc., told Bloomberg

"It could be three raccoons in a trench coat," said Luccioni, who wrote a paper quantifying the carbon impact of her company's BLOOM, a rival of OpenAI's GPT-3.

Working from a limited set of publicly available data, Luccioni attempted a similar estimate for OpenAI's viral hit ChatGPT.

AI giants don't disclose carbon emission figures

Microsoft Corporation, Google parent Alphabet Inc., and OpenAI's ChatGPT all rely on cloud computing, which depends on thousands of servers in massive data centers worldwide.

Those servers are used to train AI algorithms, known as models, by analyzing data that helps them "learn" to perform tasks.

In response to ChatGPT's success, several businesses are rushing to develop products that leverage massive AI models to deliver features to everyone from Instacart customers to Snap users to CFOs.

"Obviously, these companies don't like to disclose what model they are using and how much carbon it emits," said Roy Schwartz, professor at the Hebrew University of Jerusalem, who partnered with a group at Microsoft to measure the carbon footprint of a large AI model.

AI consumes more energy

AI consumes more energy than traditional forms of computing, and training a single model can use more electricity than 100 US homes consume in an entire year.

The industry is expanding so quickly, and is so opaque, that no one is certain how much of overall electricity use and carbon emissions is attributable to AI.

Emissions also vary greatly depending on the type of power plants supplying the electricity: a data center drawing on a coal- or natural gas-fired plant will produce far more emissions than one powered by solar or wind farms.
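To illustrate how much the grid mix matters, here is a minimal back-of-envelope sketch that multiplies a fixed training-energy figure by per-source carbon intensities. The intensity values are rough assumptions chosen for this example, not figures from the Bloomberg report or the study cited below:

```python
# Back-of-envelope sketch: emissions = energy consumed x grid carbon intensity.
# The intensity values below are illustrative assumptions, not figures from the report.

TRAINING_ENERGY_KWH = 1_287_000  # ~1.287 GWh, the GPT-3 figure cited later in this article

# Approximate carbon intensity by power source (kg CO2 per kWh) -- assumed ballpark values.
GRID_INTENSITY = {
    "coal": 0.9,
    "natural gas": 0.4,
    "solar": 0.05,
    "wind": 0.01,
}

for source, kg_per_kwh in GRID_INTENSITY.items():
    tons = TRAINING_ENERGY_KWH * kg_per_kwh / 1000  # convert kg to metric tons
    print(f"{source:>11}: ~{tons:,.0f} metric tons of CO2")
```

Under these assumed intensities, the same training run spans roughly 13 tons of CO2 on wind power to well over 1,000 tons on coal, a difference of two orders of magnitude.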

Training GPT-3, a single general-purpose AI program that can generate language and has numerous applications, required 1.287 gigawatt-hours of electricity, roughly the amount 120 US homes use in a year, the Bloomberg report said, citing a research paper published in 2021.

According to the study, that training produced 502 metric tons of carbon emissions, roughly what 110 US automobiles emit in a year.
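Both comparisons are simple unit conversions. A minimal sketch, assuming roughly 10.7 MWh of annual electricity use per US home and about 4.6 metric tons of CO2 per US car per year (typical published averages, assumed here rather than taken from the study):

```python
# Reproducing the article's two comparisons as unit conversions.
# The per-home and per-car averages are assumed ballpark figures, not from the study.

training_energy_mwh = 1_287       # GPT-3 training energy from the 2021 study (1.287 GWh)
training_emissions_tons = 502     # metric tons of CO2 from the same study

MWH_PER_US_HOME_PER_YEAR = 10.7       # assumed average annual US household consumption
TONS_CO2_PER_US_CAR_PER_YEAR = 4.6    # assumed average annual US passenger-car emissions

homes = training_energy_mwh / MWH_PER_US_HOME_PER_YEAR
cars = training_emissions_tons / TONS_CO2_PER_US_CAR_PER_YEAR

print(f"~{homes:.0f} US homes' annual electricity use")  # ~120
print(f"~{cars:.0f} US cars' annual emissions")          # ~109
```

With those assumed averages, the arithmetic lands on the article's ~120 homes and ~110 cars.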

Researchers have also found that in some cases, the power consumed while a model is in use, processing billions of requests for popular programs, is only about 40% of the power consumed during training.
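Taken at face value, that 40% figure implies a crude lifecycle estimate: total energy is training energy plus another 40% on top. Combining it with the GPT-3 training figure above is this article's own arithmetic, not a result from the researchers:

```python
# If inference (model use) consumes ~40% of the training energy, as the article notes
# for some cases, a crude lifecycle total is training + 0.4 * training.
training_mwh = 1_287     # GPT-3 training energy (from the 2021 study)
inference_share = 0.40   # the article's "about 40%" figure

lifecycle_mwh = training_mwh * (1 + inference_share)
print(f"~{lifecycle_mwh:,.0f} MWh across training plus deployment")  # ~1,802 MWh
```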

The models are also getting bigger. OpenAI's GPT-3 uses 175 billion parameters, the variables the AI system learns through training and retraining; its predecessor, GPT-2, used just 1.5 billion, a more than hundredfold increase.

According to Microsoft Germany CTO Andreas Braun, GPT-4 will be released "next week." While that may have sparked excitement in the AI community, it has raised concerns among climate change activists.
