- The environmental impact of AI is largely unknown, but a new paper offers some insight.
- Training GPT-3 required water to dissipate the heat generated during computation.
- For every 20 to 50 questions it answers, ChatGPT's servers have to "drink" the equivalent of a 16.9 oz water bottle.
As the public flocks to generative AI tools like ChatGPT, the environmental impact of these new technologies is beginning to become apparent.
Data on AI and sustainability are still scarce, but a recent study by researchers at the University of California, Riverside, and the University of Texas at Arlington estimated the water footprint of AI models like OpenAI's GPT-3 and GPT-4.
The researchers estimated that Microsoft used 700,000 liters (approximately 185,000 gallons) of fresh water while training GPT-3 in its data centers. That's enough to fill a nuclear reactor's cooling tower, according to Gizmodo, and the same amount of water used to produce 370 BMW cars or 320 Tesla vehicles.
Using these numbers, the researchers determined that ChatGPT requires 500 ml of water, a standard 16.9 oz bottle, for every 20 to 50 questions it answers.
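As a back-of-the-envelope check using only the figures above, the per-question cost works out to roughly 10 to 25 ml. A minimal sketch (the function name is illustrative, not from the study):

```python
BOTTLE_ML = 500  # one standard 16.9 oz water bottle, in milliliters

def water_per_question_ml(questions_per_bottle: int) -> float:
    """Average milliliters of cooling water 'drunk' per question answered."""
    return BOTTLE_ML / questions_per_bottle

low = water_per_question_ml(50)   # 500 ml / 50 questions = 10 ml per question
high = water_per_question_ml(20)  # 500 ml / 20 questions = 25 ml per question
print(f"{low:.0f}-{high:.0f} ml of water per question")  # prints "10-25 ml of water per question"
```

Small per-question numbers like these add up quickly at ChatGPT's scale, which is the study's point.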
"500ml of bottled water may not seem like too much, but due to ChatGPT's large user base, the combined footprint of water required for inference is still very large," the study's authors wrote.
Microsoft is "investing in research to measure the energy use and carbon impact of AI while working on ways to make large-scale systems more efficient in both training and application," a company spokesperson told Insider in a statement.
"We also continue to invest in renewable energy purchases and other initiatives to reach our sustainability goals of being carbon negative, water positive and zero waste by 2030," they added.
OpenAI did not respond to Insider’s request for comment.
AI models such as GPT-3 and GPT-4 are hosted in data centers, physical warehouses that house large numbers of computational servers. These servers identify patterns and links across massive datasets, consuming electricity that is often generated from sources such as coal, nuclear power, and natural gas.
The training process consumes considerable energy, which is converted into heat, and water is then used on-site to keep temperatures down throughout the infrastructure. This cooling requires fresh water rather than salt water, as studies show that salt water can lead to "corrosion, clogged water pipes, and bacterial growth."
Going forward, these numbers “could increase several-fold with the newly launched GPT-4, which has a significantly larger model size,” the researchers said.
Using a proprietary methodology to calculate on-site and off-site water use efficiency (WUE) in addition to energy usage, the research team also estimated the water footprint of Google's large language model, LaMDA.
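The article doesn't reproduce the paper's exact methodology, but a WUE-style estimate generally multiplies the energy a training run consumes by the water intensity of that energy, both on-site (cooling) and off-site (electricity generation). The sketch below is a simplified assumption of that structure; the function name and every number in it are illustrative, not figures from the study:

```python
def water_footprint_liters(energy_kwh: float,
                           wue_onsite: float,
                           wue_offsite: float) -> float:
    """Simplified WUE-style estimate (not the paper's exact formula).

    energy_kwh:  total server energy consumed by training, in kWh
    wue_onsite:  liters of cooling water evaporated per kWh on-site
    wue_offsite: liters of water embedded per kWh of electricity generated
    """
    return energy_kwh * (wue_onsite + wue_offsite)

# Hypothetical inputs for illustration only:
# 1,000,000 kWh of training energy, 0.5 L/kWh on-site, 2.0 L/kWh off-site
print(water_footprint_liters(1_000_000, 0.5, 2.0))  # prints 2500000.0
```

Even with rough inputs like these, the estimate lands in the millions of liters, which is consistent with the order of magnitude the researchers report for LaMDA below.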
Ultimately, however, the lack of transparency regarding water consumption associated with AI training makes it difficult to determine the actual footprint. When asked about LaMDA's water usage, Google pointed to a November 2022 report in which it published 2021 data on water consumption across its data centers.
"While it is impossible to know the actual water footprint without detailed information from Google, our estimate shows that the total water footprint of LaMDA training is on the order of millions of liters," the researchers wrote.
While the carbon footprint associated with generative AI is beginning to sound alarm bells, the researchers argued that their paper aims to "emphasize the need for a holistic approach" that addresses water consumption alongside carbon emissions to achieve truly sustainable AI.
