
What’s the carbon footprint of A.I.? This clever tool breaks it down

Deep-learning A.I. is the machine learning technology that powers everything from cutting-edge natural language processing to machine vision tools. It may also be powering climate change, thanks to the massive energy consumption and CO2 emissions associated with training these deep-learning models. As the use of deep learning has exploded, so has the compute power required to train these models, yet this effect is rarely studied.


Researchers at the University of Copenhagen’s Department of Computer Science are working to change that, however. They’ve developed a tool called Carbontracker, which works out the energy consumption associated with deep-learning algorithms and then converts this into a prediction about CO2 emissions.

“[Carbontracker] is implemented as a package, or extension, for the popular programming language Python, where the majority of machine learning takes place,” Benjamin Kanding, one of the researchers who worked on the project, told Digital Trends. “The way [it] works is that, during model training, it periodically measures the energy consumption of the hardware on which the model is trained and queries the live local carbon intensity — the CO2 emitted by electricity consumption — in the training region. These numbers are then combined to give an estimate of the total carbon footprint of model training and development.”
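In practice, using a tracker of this kind amounts to wrapping an existing training loop. The sketch below follows the usage pattern shown in the project’s own documentation; the model and training step are placeholders, and the exact method names should be checked against the current release.

```python
from carbontracker.tracker import CarbonTracker

max_epochs = 10
tracker = CarbonTracker(epochs=max_epochs)

for epoch in range(max_epochs):
    tracker.epoch_start()  # start sampling the hardware's energy draw for this epoch

    # ... your usual training step goes here (placeholder) ...

    tracker.epoch_end()  # stop sampling and log the epoch's energy and CO2 estimates

tracker.stop()  # ensure a final report is written if training ends early
```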

The A.I. tools we rely on

The amount of energy used by some of the tools we rely on every day is pretty terrifying. For instance, a 2019 study by researchers at the U.K.’s University of Bristol suggested that YouTube videos carry a carbon footprint of around 10 million tons of CO2 equivalent each year. It also estimated that some relatively minor code tweaks could save 100,000 to 500,000 tons of CO2 equivalent every year.

In the case of Carbontracker, Kanding said that the aim is not to point to specific models and claim they are “ruining the environment.” Instead, it’s an attempt to raise awareness about the impact of compute-intensive research and to promote the development of energy-efficient deep neural networks and “responsible computing.” The hope is that this will reduce the carbon footprint associated with training and developing deep-learning models. (One possible near-term solution would be to make sure training is carried out at data centers powered by green energy.)

However, the researchers do give some indication of just how significant the environmental impact of certain A.I. tools can be. For example, a single training session for the ultra-advanced deep-learning language model GPT-3 reportedly consumes the equivalent energy of 126 homes in the researchers’ native Denmark. It also emits the same quantity of CO2 as almost 44,000 miles of driving in a car.
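Equivalences like these boil down to two conversion factors: the grid’s carbon intensity (grams of CO2 emitted per kilowatt-hour) and a car’s emissions per mile. The snippet below walks through the arithmetic with purely hypothetical numbers, not the researchers’ actual figures.

```python
# Purely illustrative numbers; not the figures from the Carbontracker study.
energy_kwh = 10_000                # assumed energy used by one training run (kWh)
grid_intensity_g_per_kwh = 450     # assumed grid carbon intensity (g CO2 per kWh)
car_g_per_mile = 404               # roughly the average passenger-car emissions (g CO2 per mile)

co2_grams = energy_kwh * grid_intensity_g_per_kwh
equivalent_miles = co2_grams / car_g_per_mile

print(f"{co2_grams / 1_000_000:.1f} metric tons of CO2, "
      f"about {equivalent_miles:,.0f} miles of driving")
```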

Lasse F. Wolff Anthony, another researcher on the project, said that there are no current plans to make Carbontracker available in the form of a plug-in for coders. “The current goals for Carbontracker [are] to improve the tool in Python by making it more lightweight [and] easier to use, and to extend its capabilities by supporting a larger variety of hardware and more regions for querying the live carbon intensity,” Anthony said.

The project is open-source, and the researchers say they “actively encourage” contributions from anyone who wants to get involved.

Luke Dormehl
Former Digital Trends Contributor
Kid-mounted cameras help A.I. learn to view the world through eyes of a child

Talk to any artificial intelligence researcher and they’ll tell you that, while A.I. may be capable of complex acts like driving cars and spotting tiny details on X-ray scans, machines are still way behind when it comes to the generalized abilities of even a 3-year-old kid. This is sometimes called Moravec’s paradox: the seemingly hard stuff is easy for an A.I., while the seemingly easy stuff is hard.

But what if you could teach an A.I. to learn like a kid? And what kind of training data would you need to feed into a neural network to carry out the experiment? Researchers from New York University recently set out to explore these questions by using a dataset of video footage taken from head-mounted cameras worn regularly by kids during their first three years of life.

Read more
A.I. can tell if you’re a good surgeon just by scanning your brain

Could a brain scan be the best way to tell a top-notch surgeon? Well, kind of. Researchers at Rensselaer Polytechnic Institute and the University at Buffalo have developed Brain-NET, a deep learning A.I. tool that can accurately predict a surgeon’s certification scores based on their neuroimaging data.

This certification score, part of the Fundamentals of Laparoscopic Surgery (FLS) program, is currently calculated manually using a formula that is extremely time- and labor-intensive. The idea behind it is to give an objective assessment of surgical skills, thereby demonstrating effective training.

Read more
A.I.’s next big challenge? Playing a quantum version of Go

When Google DeepMind’s AlphaGo program defeated the world’s greatest Go player in March 2016, it represented a major tech breakthrough. Go, a Chinese board game in which the goal is to surround more territory than your opponent, is notoriously easy to learn but next to impossible to master. The total number of allowable board positions exceeds the total number of atoms in the observable universe. Yet an A.I. still learned to defeat one of humanity’s best players.

But while cutting-edge technology made this possible, cutting-edge technology could also make mastering Go even more difficult for future machines -- thanks to the insertion of quantum computing concepts like entanglement to add a new element of randomness to the game.

Read more