
How smarter deployment can help AI remain a sustainability net positive

To avoid jeopardising their decarbonisation efforts, CSOs can promote more intentional AI use.
Melodie Michel
Photo by Igor Omilaev on Unsplash

Big Tech’s latest sustainability reports highlighted the significant emissions impact of implementing AI at scale. To avoid jeopardising their decarbonisation efforts, CSOs need to promote smarter AI policies within their companies.

Artificial intelligence can be a game changer for sustainability, but its downside is now obvious. In July, Google reported a 48% increase in emissions over the past four years, which it attributed to growing energy demand and data centre usage driven by the explosive growth of artificial intelligence.

Microsoft’s carbon footprint jumped by 30% in 2023 alone due to the construction of new data centres to meet growing AI expectations. 

As Big Tech’s emissions climb in the pursuit of ever more transformative AI applications, questions are emerging over these firms’ ability to achieve their climate targets. In fact, Google, Meta, Microsoft and Salesforce recently launched a coalition to develop 20 million tonnes of carbon credits from nature-based projects by 2030, a signal that they expect to rely heavily on offsets to compensate for their impact.

The details of generative AI’s power consumption and sustainability impact have been explored at length, but what’s less clear is how Chief Sustainability Officers can help their companies make smarter choices to limit the technology’s negative impacts – ensuring that AI remains a net positive for sustainability.

Do you really need AI?

Before even looking at data centres and the type of energy they use, there are many steps companies can take internally to make their software – AI-powered or not – more efficient, and therefore, less energy-intensive.

This is something that IBM’s new Chief Sustainability Officer Christina Shim alluded to in a recent interview: “I don't know that every company is thinking about it as intentionally as they should,” she said. “One is, how do you make sure that you're making the smart choice around your AI models? The biggest thing is, a bigger model is not necessarily a better model. Think about it from that vantage point: What is it that I need to achieve, and then what is the smallest model that I can use to get there? That helps with sustainability, but also costs and speed.”
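To make that reasoning concrete, it can be boiled down to a short selection loop: start with the smallest candidate model and only move up if it fails the quality bar for the task. The sketch below is purely illustrative – the model names, parameter counts, evaluation function and threshold are placeholders, not IBM’s or any vendor’s actual tooling.

```python
# Hypothetical sketch: pick the smallest candidate model that clears the task's quality bar.
# Model names, sizes and the evaluate() function are illustrative placeholders.

from typing import Callable, Optional

# Candidates ordered from smallest to largest (parameter counts are made up for illustration).
CANDIDATES = [
    ("small-classifier", 0.1),   # ~100M parameters
    ("medium-llm", 7.0),         # ~7B parameters
    ("large-llm", 70.0),         # ~70B parameters
]

def smallest_sufficient_model(evaluate: Callable[[str], float], quality_bar: float) -> Optional[str]:
    """Return the first (smallest) model whose evaluation score meets the quality bar."""
    for name, _size_billion_params in CANDIDATES:
        score = evaluate(name)  # e.g. accuracy on a held-out sample of the real task
        if score >= quality_bar:
            return name
    return None  # nothing passed: revisit the task definition before reaching for a bigger model
```

If a small classifier already answers the business question, there is no sustainability (or cost, or latency) case for defaulting to a model hundreds of times its size.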

A lot of the time, asking these questions to refine a company’s needs will result in a decision not to implement an AI solution at all.

Ian Ray, Head of Data, ML and AI at tech consultancy Daemon, says his company has often had to rein in clients’ excitement about AI in recent months. “We've seen a massive push from clients to say: ‘We really need machine learning. We really need artificial intelligence.’ But when you look into it, they just need to understand their data a bit better and create those foundations. There is this perceived need to move really, really fast. But actually, when you dive into the details, it's really understanding what the requirements are of the client and scaling it back,” he tells CSO Futures.

AVEVA is an industrial software company serving the energy, chemicals, mining, manufacturing, transportation and engineering sectors. Its Global Head of Sustainability, Lisa Wee, points out that while there is a lot of attention around AI at the moment, “70% of companies in the sectors that we operate in still have not successfully done digitalisation at scale”. 

“You're jumping ahead a little bit if you're immediately focused on your AI: a lot needs to be done to actually make sure that you have a good foundation in place, to have good data that you can then leverage and to really understand how you want to use AI,” she adds.

Smarter software practices for more sustainable AI

AVEVA is a member of the Green Software Foundation, and has developed a framework for measuring and reducing IT emissions.

Wee explains that best practices in green software design include writing more energy-efficient code, running applications when cleaner electricity is available on the grid, reducing the amount of data – and the distance it has to travel – across networks, and avoiding unnecessary lookups running in the background. “It’s about making your software as efficient as possible before the focus of AI comes into play,” she adds.
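The ‘run when the grid is cleaner’ idea, often called carbon-aware scheduling, can be sketched as a simple deferral loop. The snippet below is a generic Python sketch, not AVEVA’s implementation: the grid-intensity lookup is a placeholder for whatever data source a team actually uses, and the threshold is arbitrary.

```python
# Illustrative sketch of carbon-aware scheduling: run a deferrable batch job only when
# grid carbon intensity drops below a threshold. get_grid_carbon_intensity() stands in
# for a real data source (a grid operator or commercial API) and is not a specific product's API.

import time

INTENSITY_THRESHOLD_G_PER_KWH = 200   # illustrative cut-off, in gCO2e/kWh
CHECK_INTERVAL_SECONDS = 15 * 60      # re-check every 15 minutes

def get_grid_carbon_intensity() -> float:
    """Placeholder: return the current grid carbon intensity in gCO2e/kWh."""
    raise NotImplementedError("wire this up to your grid-intensity data source")

def run_when_grid_is_clean(job) -> None:
    """Delay a non-urgent job until the local grid is relatively clean."""
    while get_grid_carbon_intensity() > INTENSITY_THRESHOLD_G_PER_KWH:
        time.sleep(CHECK_INTERVAL_SECONDS)
    job()
```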

At Daemon, Ray says his team is leveraging recently released consumption dashboards made available by cloud hyperscalers to evaluate clients’ estates and look at how they are storing their information. “It just makes sense for businesses: we can tell them if they're not using instances and they can shut those servers down, or if they need to home in on their data retention policies… Ultimately it just saves that overhead of running unnecessary compute in the background that would cost them a lot of money,” he notes.
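As a rough illustration of that kind of audit – not Daemon’s actual tooling – the sketch below uses AWS’s boto3 SDK to flag running EC2 instances whose average CPU utilisation over the past week is low enough to question whether they should stay on. It assumes AWS credentials are already configured, omits pagination for brevity, and the 5% threshold is arbitrary.

```python
# Hedged sketch: flag running EC2 instances with very low average CPU over the past week,
# as candidates for shutdown or right-sizing. Assumes boto3 is installed and credentials configured.

from datetime import datetime, timedelta, timezone

import boto3

IDLE_CPU_THRESHOLD_PERCENT = 5.0
LOOKBACK = timedelta(days=7)

ec2 = boto3.client("ec2")
cloudwatch = boto3.client("cloudwatch")

end = datetime.now(timezone.utc)
start = end - LOOKBACK

reservations = ec2.describe_instances(
    Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
)["Reservations"]

for reservation in reservations:
    for instance in reservation["Instances"]:
        instance_id = instance["InstanceId"]
        stats = cloudwatch.get_metric_statistics(
            Namespace="AWS/EC2",
            MetricName="CPUUtilization",
            Dimensions=[{"Name": "InstanceId", "Value": instance_id}],
            StartTime=start,
            EndTime=end,
            Period=3600,          # hourly datapoints
            Statistics=["Average"],
        )
        datapoints = stats["Datapoints"]
        if not datapoints:
            continue
        avg_cpu = sum(p["Average"] for p in datapoints) / len(datapoints)
        if avg_cpu < IDLE_CPU_THRESHOLD_PERCENT:
            print(f"{instance_id}: avg CPU {avg_cpu:.1f}% over the past week - candidate for shutdown")
```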


When it comes to AI applications themselves, Wee reminds CSO Futures that “AI is not a monolithic term”: “We're an industrial software company, and of course, we do leverage the large language models and the generative AI that does have power consumption, but actually a lot of the other machine learning technologies don't necessarily have or require the same compute or GPU, so we have not seen [a rise] in our own emissions.”

It is worth noting that AVEVA develops AI applications on hyperscaler cloud infrastructure, and as such accounts for AI-driven emissions as part of its Scope 3. Those emissions increased by almost 3% in 2023, which Wee attributes to overall business growth.

Data centres and renewable energy

The majority of companies implementing AI solutions are not building the underlying data infrastructure themselves: like AVEVA, they rely on data centres operated by Big Tech firms such as Google, Microsoft and Amazon.

For them, it might be reassuring to see these tech giants state that they are increasing the proportion of renewable energy used to power this infrastructure – in fact, a recent survey suggests that a large majority of firms would be willing to pay a premium for data centres that run on clean or renewable energy. Some even believe that Chief Information Officer compensation could soon be tied to the environmental performance of the data centres their companies use.

Daemon, for example, runs most of its solutions on the AWS cloud – which Amazon says is powered by 100% renewable energy. In reality, much of Amazon’s energy consumption is only indirectly ‘matched’ with renewable energy produced far from its operations, through what are known as renewable energy certificates (RECs) – leading critics to accuse the company of giving a misleading impression of its climate footprint.

Beyond the reliability of renewable energy claims, the sheer amount of power required by fast-growing data centres is set to become a global problem: OpenAI CEO Sam Altman famously believes there is no way for AI to reach its full potential without new energy sources. What is clear is that Big Tech firms have to do more than rely on RECs and carbon offsets to make their data centres greener.

Some of the avenues firms are exploring include more efficient chipsets, smarter temperature regulation to reduce cooling needs and infrastructure upgrades – and many observers expect that regulators will soon provide more guidance on the environmental credentials of data centres.  

AI as a net positive for sustainability

Overall, most experts believe that despite the recently observed increase in emissions, AI is still a net positive for sustainability. AVEVA, for example, has seen machine learning help customers reduce energy consumption by up to 5%, simply by giving them better data insights and helping them spot anomalies in equipment or asset use.
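As an illustration of the kind of anomaly detection involved – the model choice and data below are assumptions for the sketch, not AVEVA’s actual approach – a simple isolation-forest model can flag hours of unusually high energy draw in a meter’s time series for an operator to investigate.

```python
# Illustrative sketch: flag unusual readings in an energy-consumption time series so an
# operator can investigate the underlying equipment. The data is synthetic and the model
# choice (IsolationForest) is an assumption, not a specific vendor's method.

import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(seed=42)

# Synthetic hourly energy readings (kWh) for one asset: a daily cycle plus noise...
hours = np.arange(24 * 30)
readings = 50 + 10 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 1, hours.size)
# ...with a handful of injected faults (e.g. equipment left running overnight).
readings[[100, 250, 600]] += 40

model = IsolationForest(contamination=0.01, random_state=0)
labels = model.fit_predict(readings.reshape(-1, 1))  # -1 marks an anomaly

anomalous_hours = hours[labels == -1]
print("Hours worth investigating:", anomalous_hours)
```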

For Wee, it’s impossible for us to fully grasp the potential of new technologies for sustainability. “There's been a lot of talk about guardrails, but I think there still needs to be a lot of focus on innovation: this is part of what we look to continue to cultivate,” she says. 

“As just one example, we're very excited about a new partnership we're exploring right now with the Oxford Quantum Circuits around quantum computing, which would significantly reduce the footprint from an energy standpoint. Those kinds of futures are really exciting: what's beyond AI, what's coming next? I think that's something that we have to keep our eye on and continue to understand how that can help solve problems around climate and sustainability as well.”