Training AI Is Shockingly Costly to the Environment
We have seen new artificial-intelligence-powered algorithms that can do everything from detecting cancer to helping drive cars.
AI is the driving force behind so many new technologies, but at what cost?
A new report shows that a common AI training model can emit more than 626,000 pounds of carbon dioxide equivalent. That’s about five times the lifetime emissions of the average American car - including the manufacture of the car itself.
While some AI researchers have noted the potential environmental impact of their work, the actual figure has shocked the sector.
Making machines understand language is bad for the environment
The paper looked at natural-language processing (NLP), the subfield of AI focused on teaching machines to work with human language. The NLP development community has made some fascinating breakthroughs in recent years, and work by researchers in this area is responsible for technologies such as machine translation and sentence completion.
These advances are hugely helpful across a range of applications, but getting there requires training on enormous datasets scraped from the Internet. Turning these sprawling datasets into a trained model takes a huge amount of computing power and, as a result, a huge amount of energy.
Power costs need to be addressed
The researchers examined four models identified as responsible for the biggest leaps forward in performance: the Transformer, ELMo, BERT, and GPT-2.
To figure out how much CO2 each model was responsible for, the researchers first trained each model on a single GPU for up to a day to measure its power draw. They then used the number of training hours stated in each model's original paper to calculate the total energy the complete training process would consume. That figure was then converted into pounds of carbon dioxide equivalent. The results show that the computational and environmental costs of training grew in proportion to model size.
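The conversion described above can be sketched in a few lines. The power-usage-effectiveness multiplier and the grid emissions factor below are illustrative assumptions for the sake of the example, not figures quoted from the report:

```python
# Back-of-envelope estimate: measured GPU power draw x total training
# hours -> energy in kWh -> pounds of CO2-equivalent.

PUE = 1.58               # assumed data-center power usage effectiveness
LBS_CO2_PER_KWH = 0.954  # assumed average grid emissions (lbs CO2e/kWh)

def co2_lbs(gpu_watts: float, num_gpus: int, training_hours: float) -> float:
    """Convert per-GPU power draw and total training time into
    pounds of CO2-equivalent emitted."""
    kwh = gpu_watts * num_gpus * training_hours / 1000.0
    return kwh * PUE * LBS_CO2_PER_KWH

# Hypothetical run: 250 W per GPU, 64 GPUs, 72 hours of training
print(round(co2_lbs(250, 64, 72), 1))
```

Scaling the measured single-GPU draw by the published training schedule is what lets the researchers estimate runs they could never afford to reproduce in full.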
Final steps are the most costly
The costs ballooned when extra training was added to increase the model's accuracy.
For example, a tuning process known as neural architecture search, which uses trial and error to optimize a model by tweaking the network design, has huge costs for little overall gain.
By removing this final step, the most costly model, BERT, had a much more modest carbon footprint of about 1,400 pounds of carbon dioxide equivalent. This is roughly the same as a round-trip flight across America for one person.
What’s worse, the researchers say these are conservative numbers based on training a model only to the most minimal level. Most big training efforts will have a much larger footprint, since researchers typically develop new models from scratch or tune existing ones through many rounds of training.
The impact of these numbers is massive.
AI is driving everything from medical research to defense. The paper's authors express concern about the move of AI development away from academia and into the private sector. As this paper shows, the costs of AI development are so high that academic institutions can't keep up.
Tracking and monitoring the development of AI may become harder and harder.