This article has been adapted from the "Sustainable Computing for HPC and AI" eGuide.
Concerns about global warming and unsustainable levels of carbon dioxide (CO2) emissions loom large for organizations of all sizes. Manufacturers are under more pressure than ever to design energy-efficient products and move toward net-zero emissions to meet corporate environmental, social, and governance (ESG) goals. Virtually all organizations that operate large-scale data centers share these concerns, from financial services firms to government agencies to pharmaceutical companies to online service providers. While high-performance computing (HPC) and artificial intelligence (AI) play key roles in designing and delivering more energy-efficient products, ironically, the widespread adoption of AI - with its enormous appetite for electricity - is driving dramatic increases in energy demand across all industries.
According to research from the International Energy Agency (IEA), global energy consumption due to data centers, AI, and the cryptocurrency sector is poised to double from an estimated 460 terawatt-hours (TWh) in 2022 to more than 1,000 TWh by 2026 - an amount roughly equal to Japan's total annual electricity consumption.
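To put that projection in perspective, here's a minimal back-of-the-envelope sketch of the compound annual growth rate those figures imply. The four-year window and the use of 1,000 TWh as a flat lower bound are our simplifying assumptions, not IEA analysis.

# Implied compound annual growth rate from the IEA figures cited above.
# Assumes a 2022-2026 window and treats 1,000 TWh as a lower bound.
start_twh = 460       # estimated 2022 consumption (TWh)
end_twh = 1_000       # projected 2026 consumption (TWh, "more than")
years = 4
cagr = (end_twh / start_twh) ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.1%}")   # roughly 21% per year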
Fortunately, technologies pioneered in HPC can play a critical role in improving the efficiency of energy-intensive AI model training and inference (prediction). By leveraging these technologies and taking other measures, such as deploying more energy-efficient servers or improving data center power usage effectiveness (PUE), organizations can dramatically reduce their carbon footprint and help realize a greener, more sustainable future. In addition to providing environmental benefits, more energy-efficient data centers help organizations reduce costs and increase profitability.
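Since PUE comes up repeatedly in these discussions, a quick sketch of the metric may help: PUE is the ratio of total facility energy to the energy consumed by the IT equipment alone, so driving it toward 1.0 means less energy spent on cooling and power distribution per unit of useful compute. The figures below are illustrative assumptions, not measurements from any particular data center.

# PUE = total facility energy / IT equipment energy (1.0 is the theoretical ideal).
# All energy figures below are illustrative.
it_energy_mwh = 10_000          # servers, storage, and networking
facility_energy_mwh = 15_000    # IT load plus cooling, power distribution, lighting
pue = facility_energy_mwh / it_energy_mwh
print(f"PUE: {pue:.2f}")        # 1.50

# Improving PUE to 1.2 at the same IT load reduces total facility energy:
improved_total_mwh = it_energy_mwh * 1.2
savings_mwh = facility_energy_mwh - improved_total_mwh
print(f"Saved: {savings_mwh:,.0f} MWh ({savings_mwh / facility_energy_mwh:.0%})")  # 3,000 MWh (20%)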
Without further ado, let's discuss the challenge of climate change, the promise of HPC, and the difficulty of curbing AI's appetite for energy.
Sustainability has emerged as a key requirement in modern design and manufacturing environments. Faced with longer heatwaves, more wildfires, increasingly intense storms, and growing threats to agriculture and the global food supply, 193 countries and the European Union signed the Paris Agreement, pledging to pursue efforts to limit the global temperature rise to 1.5°C above pre-industrial levels. The goal is to avoid the catastrophic projected impacts of climate change.
Driven by consumer demand and government regulation, most organizations have made public commitments to meet carbon reduction goals as part of corporate ESG programs, and they're incorporating these requirements into their supply chains. Suppliers are increasingly required to disclose greenhouse gas emissions and climate risks in public and private procurements. Common strategies for meeting sustainability goals include reducing scope 1 and scope 2 carbon emissions from operations, investing in renewable energy, sourcing and recycling materials responsibly, and purchasing carbon credits to offset emissions.
Manufacturers play an outsized role in enabling a low-carbon future because, over a product's life cycle, the environmental impact of using the product typically dwarfs the impact of designing and manufacturing it. Automobiles are a good example. According to the IEA, manufacturing a typical internal combustion engine vehicle generates approximately six tons of CO2 equivalent (tCO2e). By contrast, operating the car over its lifetime results in approximately 35.9 tCO2e. In other words, roughly 86% of life cycle emissions come from the fuel cycle and tailpipe emissions.
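As a quick sanity check on that 86% figure, the arithmetic from the numbers above works out as follows:

# Use-phase share of life cycle emissions, using the IEA figures cited above.
manufacturing_tco2e = 6.0     # building the vehicle
use_phase_tco2e = 35.9        # fuel cycle and tailpipe emissions over the vehicle's life
use_share = use_phase_tco2e / (manufacturing_tco2e + use_phase_tco2e)
print(f"Use-phase share: {use_share:.0%}")   # roughly 86%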
To meet sustainability goals, manufacturers face two critical challenges:
Advances in AI are transforming a broad range of industries, from manufacturing to healthcare to retail to financial services. Even with powerful GPUs that deliver almost 6x the model training performance per socket compared to CPUs alone, the energy required to train large models is enormous.
What makes AI workloads particularly challenging is that, depending on the model, inference - using an AI model to analyze new data and make predictions - can be even more energy-intensive than model training. Google estimates that 60% of the cost and environmental impact of AI is from inference, due to the sheer number of times predictive models are invoked.
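To see why inference can dominate, consider a hypothetical deployed model: training is a large one-time cost, while each prediction costs very little but is served millions of times a day. The energy figures below are illustrative assumptions we chose to roughly mirror that ballpark split; they are not Google's numbers.

# Illustrative only: hypothetical figures showing how per-request inference energy,
# multiplied by request volume, can outweigh one-time training energy.
training_energy_mwh = 1_000        # one-time training cost (hypothetical)
energy_per_request_wh = 0.3        # energy per inference request (hypothetical)
requests_per_day = 14_000_000      # daily request volume (hypothetical)
days_in_service = 365

inference_energy_mwh = energy_per_request_wh * requests_per_day * days_in_service / 1e6  # Wh -> MWh
share = inference_energy_mwh / (training_energy_mwh + inference_energy_mwh)
print(f"Inference: {inference_energy_mwh:,.0f} MWh, {share:.0%} of first-year energy")  # ~1,533 MWh, ~61%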
Some examples that highlight the enormous energy requirements and environmental impact of AI model training and inference:
Though these energy demands are steep, HPC and simulation are essential in designing more sustainable, environmentally friendly products. By employing advanced computer simulation and optimization, organizations can:
In the next article, we'll examine how organizations can make their HPC operations more energy efficient and reduce HPC and simulation's carbon footprint with Altair technology.
To read the full eGuide, visit "Sustainable Computing for HPC and AI."