‘Environmentally-positive AI is achievable’: Practical tips to mitigate AI’s climate impacts

At a time when the environmental footprint of artificial intelligence is coming to the fore, experts at a recent CSO Futures webinar shared advice to help corporate AI users mitigate its negative climate impacts.
Gathered on April 15 to discuss how AI and sustainability can co-exist, Lisa Wee, Chief Sustainability Officer at industrial software firm AVEVA, Joseph Cook, Head of R&D at the Green Software Foundation, and Sandeep Chandna, Chief Sustainability Officer at Tech Mahindra, agreed that measuring AI’s GHG footprint remains a complex exercise.
“I think the right way to think about AI sustainability is that it's like a set of scales that has to be balanced. On one side we've got all the negative externalities, including for the environment, and on the other side, we've got the positive externalities, including societal benefits. And the challenge that we've got today is that it's very, very hard to accurately estimate how much weight is sitting on either side of the scale,” Cook explained.
The Green Software Foundation is currently developing a carbon intensity measurement framework for artificial intelligence models, but Cook warned that basic questions on what and how to measure exactly remain open. “What's an appropriate unit to represent AI model emissions in? Is it carbon per request? Is it carbon per task completed? Is it carbon per unit of time? Is it something related to token generation?” he asked, adding that each option comes with its own trade-offs.
But even if detailed data remains unavailable, it’s clear that something needs to be done to ensure AI is a net climate positive: according to the latest New Energy Outlook by BloombergNEF, an additional 362 gigawatts of power plant capacity will be required by 2035 to meet data centre energy demand – and 64% of the new electricity generated to meet this demand is likely to come from fossil fuels.
So what can Chief Sustainability Officers do right now to mitigate the potential negative climate impacts of AI – all the while leveraging it to meet their company’s sustainability and business goals? Experts at the webinar had a few ideas.
What kind of AI do you need (and do you really need it?)
The first thing to recognise is that “AI is not a monolith”, said Wee. She gave examples of AI applications used by AVEVA clients, including Australian energy company AGL, which uses it to maximise the efficiency of wind turbines, and Spanish firm Acciona, which reduced the energy consumption of its water desalination pumps by 4.6% with AI-based predictive data analysis.
“That's not what people immediately think of in the age of AI. We've been so much exposed to the generative AI tools, and have been playing around with things like ChatGPT, but our industries are still on this journey towards digital transformation at scale, and being able to reap the benefits over the top of well thought through AI applications,” Wee added.
“I think it's just really important to remember that AI is not one thing: it’s different types of technology, some of them more compute-intensive than others.”
She warned that at this point, it would be difficult for Chief Sustainability Officers to integrate AI sustainability goals into their work, considering the lack of a common, standardised framework for measuring and reducing AI emissions.
“We're all struggling with the fact that we can have high-level principles, but what would be the measurements and the KPIs that we would set in place to know and show that we're living up to them? That just doesn't exist yet,” Wee said.
And for companies whose employees have started using ChatGPT for everything and anything, one statistic might help staff adopt a more frugal approach: one ChatGPT query uses about 10 times the electricity of a basic Google search.
Green data centres – and greenwashing risks
The environmental impact of AI comes from the expanded data centre infrastructure needed to power it – so in the race to make AI more sustainable, some data centre providers have come up with a “green” offering.
“The intention of a green data center is that it has minimal negative environmental externalities. So you can expect them to have a range of systems in place that minimise their electricity consumption, use cleaner electricity, make them more efficient in the way they use water, etc,” explained Cook of the Green Software Foundation.
The most obvious thing is the type of energy used by the data centre, which depends strongly on its location and the grid it sources electricity from.
But Cook added that there are also systems in place to manage the energy inside the data centre: “PUE, or power usage effectiveness, is a metric that's used a lot. It's about how efficiently the energy being brought into a data centre is directed towards doing computational work – how much is used by the servers rather than the surrounding operations like lighting, heating, cooling, maintenance, etc.”
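The arithmetic behind the metric Cook describes is simple: PUE is the total energy a facility draws divided by the energy that actually reaches the IT equipment, so a value near 1.0 means almost nothing is lost to overheads. The sketch below is purely illustrative; the figures in it are invented, not taken from any data centre mentioned in this article.

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total facility energy divided by
    the energy delivered to IT equipment. 1.0 is the theoretical
    ideal; lower values mean less energy lost to overheads."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Hypothetical example: a facility drawing 1,300 kWh overall while
# its servers consume 1,000 kWh has a PUE of 1.3 - the remaining
# 300 kWh goes to cooling, lighting and other non-compute operations.
print(round(pue(1300, 1000), 2))  # 1.3
```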
Most data centres try to optimise these parameters through systems like waste heat recovery, with some even redistributing this heat to the local community. Some also have systems in place to minimise the number of idling servers – which can consume as much as 70% of the power of a server operating at full capacity. “Idling servers are very wasteful,” warned Cook.
But it’s important to know that there is no standard or certification scheme for green data centres, meaning Chief Sustainability Officers need to look carefully at claims. “The challenge people have in choosing a green data centre is to actually have to dig into the claims and determine what processes they actually have in place, and whether they're relying on just netting off energy certificates rather than reducing on-site emissions,” he added.
Green software and carbon-aware computing
Finally, companies can adopt carbon-aware AI practices, which can involve, for example, moving compute jobs to regions with lower-carbon electricity grids, or to times of day with more renewable power available. This is particularly useful when training AI models – a highly energy-intensive exercise.
At Tech Mahindra, Chandna suggested: “Companies should adopt green software engineering principles that reduce the carbon intensity of applications. This includes optimising code for energy efficiency, and reducing redundancy in AI training models. If you are curating the data even before it is put up for training, that itself will save around 25 to 30% on energy.”
Another aspect would be to choose frameworks that are less energy-intensive or that minimise data transfers within the system, he explained.
Tech Mahindra itself recently launched a ‘Green Code Refiner’ that uses AI-driven prompts to refactor existing code into code that is about 30 to 40% more energy-efficient.
Chandna also pointed to green data centres and lower PUE. “If PUE is in the range of 1.2 to 1.5 [1.0 being perfect data centre efficiency], that is where we should look,” he said.
Will AI be a net climate positive?
Ironically, AI is part of the answer to reducing its own impacts, helping to refactor code and identify ways to improve the energy efficiency of data centre infrastructure and operations.
“I had first-hand experience of how AI-driven automation or analytics can help operational efficiency, so I believe that AI’s positive impact can outweigh the negative. But I think there are a few steps to take: running AI on green compute infrastructure, embedding models or training pipelines in an optimised way, and prioritising use cases that are going to directly advance sustainability,” Chandna summarised.
At AVEVA, Wee sees this as “one of the key aspects” of her role as the CSO of a software firm, “to make sure that we shepherd it towards that direction”. This means getting involved in developing the right measurement frameworks and green software design principles, as well as asking the right questions of data centre teams. “You are seeing this idea among CSOs and their AI teams: there is a discussion that's happening to think through the roadmap of AI maturity and sustainability maturity, and how do we look to align them so that it's a better outcome for the business, but also for the environment,” she added.
For Cook, “environmentally-positive AI is achievable”, but “we're at a point in time with many possible futures ahead,” he warned.
“We need several things to be true: one, that data centers be powered with clean electricity and be power and water-efficient, and that there's some scalable solution for e-waste recycling as well. And then we also need some big wins for AI and making other industries less environmentally harmful, because we only need AI to nudge a few percent of the emissions out of heavy manufacturing industries, agriculture, transport, etc, to offset its own negative externalities,” Cook added.