A data centre control room: companies must focus on reducing their carbon footprint and run their data centres on renewable energy. (Photo by Tom Lee/Construction Photography/Avalon/Getty Images)

Artificial intelligence (AI) is highly dependent on computing power, and the complexity of machine learning or deep learning models requires substantial computational resources. Given the significant energy requirements of modern hardware, this translates into extremely high power consumption.

Most AI research today focuses on achieving the highest levels of accuracy, with little attention to computational or energy efficiency. Leaderboards in the AI community track which system performs best on tasks such as image recognition or language comprehension, prioritising accuracy above all else.

Deep learning, based on neural networks with billions of parameters, is inherently compute-intensive. The more complex the network, the more high-performance compute it demands and the longer it takes to train.

Canadian researchers Victor Schmidt et al report that state-of-the-art neural architectures are often trained on multiple GPUs for weeks or months to surpass previous achievements.

The cost of AI

AI is costly; research by OpenAI researchers Dario Amodei and Danny Hernandez shows that since 2012, the computing power used for deep learning research has doubled every 3.4 months. This equates to a 300,000-fold increase from 2012 to 2018, far exceeding Moore’s Law, which states that processing power doubles every two years.
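The two growth rates above can be checked with simple arithmetic: a 300,000-fold increase implies roughly 18 doublings, which takes about five years at the 3.4-month rate but decades at Moore's Law pace. A minimal sketch (the figures come from the Amodei and Hernandez finding cited above):

```python
import math

MOORE_DOUBLING_MONTHS = 24   # Moore's Law: processing power doubles every two years
AI_DOUBLING_MONTHS = 3.4     # rate observed by Amodei and Hernandez since 2012

def growth_factor(months: float, doubling_period: float) -> float:
    """Multiplicative growth after `months`, given a doubling period."""
    return 2 ** (months / doubling_period)

# How many doublings does a 300,000-fold increase require?
doublings = math.log2(300_000)                        # ~18.2 doublings

# How long does that take at each rate?
years_at_ai_rate = doublings * AI_DOUBLING_MONTHS / 12     # ~5 years
years_at_moore_rate = doublings * MOORE_DOUBLING_MONTHS / 12  # ~36 years
```

The gap between roughly five years and roughly 36 years is what makes the 2012 to 2018 figure so striking: deep-learning compute demand grew about seven times faster than hardware efficiency under Moore's Law.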

As AI usage grows, especially with consumer applications like ChatGPT, energy consumption escalates further.

Encouragingly, as the world focuses on climate change, AI researchers are also beginning to recognise the field's carbon cost. A study by Roy Schwartz et al, of the Allen Institute for AI, asks whether efficiency should become a priority alongside accuracy. AI models require vast amounts of computational power for training, data processing and experimentation, which drives up carbon emissions.

Similarly, the University of Massachusetts (Strubell et al, 2019) highlighted the effect of AI on the environment, analysing the computational demands of neural architecture searches for machine translation.  

That study, already five years old, estimated the carbon cost of training such a model at 284,019.13kg of CO₂, equivalent to 125 round-trip flights between New York and Beijing. As AI’s energy demands continue to grow, it’s vital to consider sustainability alongside utility.
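The flight equivalence is easy to sanity-check: dividing the estimated training emissions by 125 round trips gives the implied per-passenger footprint of one New York to Beijing round trip. A quick sketch (the division is the only step; the per-flight figure it produces is implied, not taken from the cited study):

```python
# Back-of-envelope check of the flight equivalence above.
TRAINING_EMISSIONS_KG = 284_019.13   # estimated CO2 from one large training run
FLIGHT_EQUIVALENTS = 125             # round trips, New York to Beijing

# Implied footprint of a single passenger round trip, in kg of CO2
kg_per_round_trip = TRAINING_EMISSIONS_KG / FLIGHT_EQUIVALENTS  # ~2,272 kg
```

A figure of roughly 2.3 tonnes per passenger is in the usual range quoted for long-haul round trips, so the comparison holds together.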

The good news

Fortunately, AI can assist in our global quest to drive down greenhouse gas emissions. A 2019 study by Microsoft and PwC predicted that responsible use of AI could reduce global greenhouse gas emissions by 4% (2.4 gigatonnes) by 2030.

AI is already being used to optimise energy consumption in industrial and residential sectors, forecast supply and demand, manage autonomous transportation and reduce carbon footprints. For example, Google has improved the energy efficiency of its data centres by 35% using machine learning technology developed by DeepMind.

AI is also helping to minimise waste in green energy production, predicting the output of solar, wind and hydro energy, and optimising water usage in residential, agricultural and manufacturing areas.

Furthermore, algorithms have improved agricultural processes, such as precision farming, ensuring that crops are picked at the right time and water is used efficiently.

AI’s environmental responsibility

According to the Shift Project, the information and communications technology (ICT) sector accounts for about 4% of global carbon emissions, with its contribution to greenhouse gas emissions surpassing that of the aviation industry by 60%.

Furthermore, as more businesses adopt AI to drive innovation, the demand for cloud-optimised data centre facilities will rise. By 2025, data centres are projected to account for 33% of global ICT electricity consumption.

To minimise their carbon footprint, companies must ensure their data centres are equipped to handle high-density compute demands efficiently. Unfortunately, up to 61% of corporate data centre systems run at low efficiency, according to research published on ScienceDirect.

Additionally, it’s crucial that data centres are powered by renewable energy. If housed in fossil-fuel-powered facilities, AI’s energy efficiency efforts can be negated, which is why it’s important that companies verify their cloud provider’s green credentials.

Location is another factor in ensuring sustainable AI. Cooling data centres is expensive, especially in warmer climates, and for more than 80% of hardware, latency requirements do not demand proximity to the end user.

Tech giants such as Google, for example, are investing in data centres in Nordic countries for better energy efficiency. In countries like Iceland, natural cooling reduces energy usage, while renewable geothermal and hydroelectric power ensures cleaner operations.

The future

The future of AI must focus on sustainability. The World Economic Forum suggests a four-step process to balance AI’s benefits with its environmental impact:

  1. Select the right use case: Not all AI optimisations lead to significant carbon reductions. Organisations should prioritise processes that can be meaningfully optimised by AI, especially for sustainability use cases.
  2. Choose the right algorithm: The energy consumption of an AI system depends largely on the algorithm used. By selecting the most efficient algorithm, organisations can significantly reduce training time and energy usage.
  3. Predict and track carbon outcomes: Good intentions alone aren’t enough. AI implementers must include carbon footprint estimates in cost-benefit analyses and use sustainability as a key performance indicator for AI projects.
  4. Offset the footprint with renewable energy: Organisations must use green energy sources to power AI models.
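Step three, predicting carbon outcomes, can start as a back-of-envelope model: energy drawn (kW × hours) multiplied by the carbon intensity of the local grid. The sketch below illustrates why steps three and four interact; the grid intensity values and the 4kW server figure are illustrative assumptions, not official data:

```python
# Minimal sketch of a carbon footprint estimate for an AI training job.
# Grid carbon intensities below are rough illustrative values (kg CO2 per kWh).
GRID_INTENSITY_KG_PER_KWH = {
    "coal_heavy_grid": 0.9,
    "eu_average_grid": 0.25,
    "geothermal_hydro_grid": 0.03,   # e.g. an Iceland-style renewable mix
}

def training_emissions_kg(power_draw_kw: float, hours: float, grid: str) -> float:
    """Estimate emissions as energy consumed (kWh) times grid carbon intensity."""
    energy_kwh = power_draw_kw * hours
    return energy_kwh * GRID_INTENSITY_KG_PER_KWH[grid]

# Same job, different grids: a ~4kW GPU server training for two weeks
for grid in GRID_INTENSITY_KG_PER_KWH:
    print(grid, round(training_emissions_kg(4.0, 14 * 24, grid), 1))
```

The same two-week job differs by more than an order of magnitude between the coal-heavy and renewable grids, which is the arithmetic behind the article's argument for verifying a cloud provider's green credentials. Open-source tools such as CodeCarbon automate this kind of estimate from live power and grid data.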

Ben Selier is the vice-president of Secure Power, Anglophone Africa at Schneider Electric.




