Artificial intelligence (AI) has become the tech industry’s bright new toy, slated to revolutionize trillion-dollar industries from retail to healthcare. But building new chatbots and image generators requires enormous amounts of power, which means the technology could be responsible for significant emissions of the carbon dioxide that causes global warming.
Microsoft, Alphabet’s Google, and ChatGPT maker OpenAI all rely on cloud computing: thousands of chips inside servers in massive data centers around the world, which train AI algorithms, called models, to analyze data and “learn” to perform tasks.
AI uses more energy than any other form of computing, and training a single model could consume more electricity than 100 US households use in a year. Emissions can also vary significantly depending on the type of power plant that supplies the electricity. Data centers powered by coal- or natural gas-fired power plants contribute significantly more emissions than data centers powered by solar or wind farms.
Researchers have tallied the emissions from creating individual models, and some companies provide data on their energy use, but there is no overall estimate of how much electricity the technology consumes. Sasha Luccioni, a researcher at AI company Hugging Face, wrote a paper quantifying the carbon impact of BLOOM, her company’s rival to OpenAI’s GPT-3. She also tried to estimate the same for OpenAI’s viral hit ChatGPT, based on a limited set of publicly available data.
Increased transparency
Researchers like Luccioni say there needs to be transparency about the power usage and emissions of AI models. More transparency can also lead to more scrutiny. The crypto industry may offer a cautionary tale.
According to a research paper published in 2021, training GPT-3, a single general-purpose AI program that can generate language and be put to many uses, took 1.287 gigawatt hours, roughly as much electricity as 120 US households consume in a year. The same paper found that the training emitted 502 tons of carbon dioxide. And that is for a single program, or model.
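The household comparison above can be checked with back-of-envelope arithmetic. A minimal sketch, assuming an average US household uses roughly 10,700 kilowatt-hours of electricity per year (that figure is an assumption for illustration, not from the article):

```python
# Back-of-envelope check of the GPT-3 training figure cited above.
# Assumption (not from the article): an average US household uses
# roughly 10,700 kWh of electricity per year.
TRAINING_GWH = 1.287
KWH_PER_HOUSEHOLD_PER_YEAR = 10_700

training_kwh = TRAINING_GWH * 1_000_000  # 1 GWh = 1,000,000 kWh
households = training_kwh / KWH_PER_HOUSEHOLD_PER_YEAR
print(f"about {households:.0f} households")  # about 120 households
```

Under that assumption, the 1.287 GWh figure works out to roughly 120 households’ annual usage, matching the paper’s comparison.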
Training a model carries a huge upfront power cost, but researchers have found that in some cases it accounts for only about 40% of the power a model ultimately consumes; the rest is burned by actual use, as popular programs are inundated with billions of requests. And models keep getting bigger: OpenAI’s GPT-3 uses 175 billion parameters, the variables an AI system learns through training and retraining. Its predecessor used just 1.5 billion.
Another relative measure comes from Google, where researchers found that AI accounted for 10-15% of the company’s total electricity consumption, which was 18.3 terawatt hours in 2021. That means Google’s AI consumes roughly 2.3 terawatt hours per year.
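The Google estimate follows directly from the reported figures. A quick sketch of the arithmetic, taking the midpoint of the researchers’ 10-15% range:

```python
# Sketch of the arithmetic behind the Google AI estimate above.
TOTAL_TWH_2021 = 18.3   # Google's reported 2021 electricity use, in TWh
low, high = 0.10, 0.15  # AI's share, per the researchers' 10-15% range

ai_twh_low = TOTAL_TWH_2021 * low    # lower bound of AI's consumption
ai_twh_high = TOTAL_TWH_2021 * high  # upper bound
midpoint = (ai_twh_low + ai_twh_high) / 2
print(f"about {midpoint:.1f} TWh per year")  # about 2.3 TWh per year
```

The midpoint of the range lands at about 2.3 terawatt hours, the figure cited above.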
Net zero pledges
While models are getting bigger, AI companies are also constantly working to make them run more efficiently. Microsoft, Google, and Amazon have all pledged to become carbon negative or carbon neutral. In a statement, Google said it is pursuing net zero emissions across its operations by 2030, with the goal of running its offices and data centers entirely on carbon-free energy.
OpenAI cites the work it has done to make the application programming interface for ChatGPT more efficient, cutting both power usage and prices for customers. “We take our responsibility to halt and reverse climate change very seriously, and we think hard about how to make the best use of our computing power,” an OpenAI spokesperson said in a statement.
Microsoft said it is buying renewable energy and taking other steps to meet its previously announced goal of becoming carbon negative by 2030. “We are working on ways to make large-scale systems more efficient, in both training and application,” the company said in a statement.
There are ways to make AI run more efficiently. Because AI training can take place at any time, developers and data centers could schedule sessions for when power is cheaper or in surplus, making their operations greener, said Ben Hertz-Shargel of energy consultant Wood Mackenzie.
One of the great unknowns of AI is the total carbon footprint of the chips it uses. Nvidia, the largest maker of graphics processing units, says its chips can complete AI tasks more quickly, making them more efficient overall.
Nvidia discloses its direct emissions and energy-related indirect emissions, but not all of its indirect emissions, said Luccioni, who asked the company for that data for her research.
Luccioni believes that if Nvidia shared that information, it would become clear that GPUs consume as much power as a small country. “It’s going to be bananas,” she said.
