Recently, I have been hearing the same concerned question from university staff over and over again: “Does using AI contribute to climate change?”
Tools such as ChatGPT, Copilot, and Gemini are now widely used to ease mounting workloads. Academics use AI to prepare teaching materials. Professional services staff use it to manage their inboxes. Researchers use it to summarize journal articles. But a wave of headlines about AI's rising energy demand and the intensive use of water to cool data centers has left staff wondering whether those labor savings come at too high an environmental cost.
I work on digital sustainability at JISC, a nonprofit organization that supports further and higher education in the UK. My job is to help universities and colleges understand the environmental impact of digital tools: the good, the bad, and the inconveniently complex.
Let's start with the good. Used wisely, digital technology can reduce environmental impact. Cloud-based systems, when deployed thoughtfully, are often more energy efficient than aging local servers. Online collaboration tools can cut unnecessary travel. Smart systems for heating, cooling and lighting have helped institutions measurably reduce energy waste.
But here is the uncomfortable reality. Our digital lives have a growing environmental footprint. Every streamed video, email, or notification consumes electricity. The recent boom in AI-powered tools adds a new layer, embedding additional energy-hungry processes into everyday routines and increasing pressure on the infrastructure behind them.
But this is where the guilt starts to become misplaced.
When an academic uses ChatGPT to build a lecture outline or refine coursework feedback, that does consume energy. A commonly cited estimate is roughly 3 watt-hours per query, though experts warn that such figures rest on rough assumptions, largely because AI companies publish so little data. Even so, the energy use appears modest in the context of everyday digital activities: it is comparable to a short video call or a few minutes of video streaming. For reference, boiling a full kettle takes more than 100 watt-hours.
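For a rough sense of scale, those figures can be put side by side. The sketch below uses only the loose, publicly cited estimates quoted above; they are assumptions for illustration, not measurements:

```python
# Back-of-envelope comparison of rough, publicly cited energy estimates.
# All figures are approximate assumptions, not measured values.

CHATGPT_QUERY_WH = 3.0   # commonly cited per-query estimate (watt-hours)
KETTLE_BOIL_WH = 100.0   # boiling a full kettle (watt-hours, conservative)

queries_per_kettle = KETTLE_BOIL_WH / CHATGPT_QUERY_WH
print(f"One kettle boil is roughly {queries_per_kettle:.0f} chatbot queries")
```

On these assumptions, a single kettle boil covers dozens of queries, which is the sense in which per-query use is modest even though the uncertainty in the underlying numbers is large.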
Let's be clear: this is not whataboutery. The argument is not that “AI is fine because Netflix and Nespresso exist,” nor that individual use of generative AI tools is environmentally irrelevant. But the impact of these tools has to be measured in proportion. If you don't routinely question the footprint of watching YouTube videos, scrolling social media, or leaving your webcam on in online meetings, remember that all of it relies on the same energy-intensive infrastructure.
And the pressure on people in higher education is very real. A recent sector-wide survey found that less than half of academic staff felt comfortable managing their workloads, and only 44% felt well supported in the workplace. In that context, turning to AI tools to save time is not self-indulgence. It is a practical response to relentless pressure.
This is where the concept of greenshifting comes in. It is greenwashing's cousin. Instead of overstating an organization's environmental credentials, greenshifting subtly redirects responsibility downwards: from businesses to consumers, from systems to individuals. When the climate burden lands on the shoulders of individuals using AI rather than on the companies building the infrastructure or the governments regulating it, our focus is in the wrong place.
You should absolutely be concerned about AI's energy and water use, but let's direct that concern where it can make the biggest difference. That means demanding accountability from AI developers: real investment in transparent energy reporting, clean infrastructure and renewable electricity. It means governments stepping in with regulations, standards and incentives for greener data centres.
It is also worth recognizing that not all AI is created equal. Generative chatbots like ChatGPT represent only a small fraction of AI-related energy use. The truly large energy consumption is happening elsewhere: in video generation and analysis, targeted advertising, recommendation engines, the training of new AI models, and the large general-purpose AI systems on the horizon.
The real environmental threat, then, is not academics relying on ChatGPT to summarise meeting notes or using Copilot to draft emails. It is the unchecked expansion of opaque infrastructure, the systemic absence of emissions data, and the lack of guardrails to ensure sustainable deployment.
Individual awareness is good. It helps drive informed choices, build pressure for change and hold institutions to account. But individual guilt? That misses the point entirely. Guilt doesn't solve the problem; regulation, transparency, and system-level accountability just might.
Cal Innes is a digital sustainability specialist at JISC.
