Last night, I was listening to the Decoder podcast by Nilay Patel, which featured an episode focused on AI’s impact on climate change. Just a few minutes in, I was fascinated by the significant effect that cloud computing and data centers in general have on energy consumption and climate change. Some of the stats that caught my attention include the fact that, as of this year, there are approximately 8,000 data centers worldwide, with about a third of these located in the United States; I had expected this number to be much lower, especially for the U.S.
These data centers, essential for modern technology—including the training of AI models and other cloud computing services—consume roughly 2% to 3% of global electricity. In the U.S., where the concentration of data centers is higher, they account for about 4% of the nation’s electricity usage. Experts predict that the energy consumption of data centers could exceed 9% by 2030, driven largely by increasing computing needs, especially from the ongoing AI boom. It’s important to remember that over 84% of the energy consumed in the U.S. comes from fossil fuels such as petroleum, natural gas, and coal, which significantly contribute to carbon emissions and climate change.
Data centers, from massive facilities operated by cloud computing giants like Amazon AWS, Microsoft Azure, and Google Cloud Platform to smaller setups for crypto mining, are at the heart of this growing energy challenge. The concern isn’t just about how much energy these data centers consume today, but how much more they will require as the world becomes increasingly dependent on AI and cloud computing. In today’s article, I want to explore the impact AI and cloud computing might have on global energy consumption and climate change. I’ll also discuss what different players, including you and me, as enthusiasts and users of cloud computing services, can do to reduce our carbon footprint while leveraging various cloud resources for our everyday work.
The Rising Energy Consumption of Data Centers
In January, the International Energy Agency (IEA) projected that global electricity demand from data centers will more than double between 2022 and 2026. One of the major factors contributing to this surge is the rapid expansion of artificial intelligence (AI), which requires substantial computing power and thus increases energy consumption.
To put this into perspective, in 2022, data centers globally consumed about 415 terawatt-hours (TWh) of electricity. This amount is greater than the entire electricity consumption of the United Kingdom. Despite advancements in making data centers more energy-efficient, the sheer volume of data and computing workloads managed by these facilities is growing at a remarkable rate, with data center energy usage increasing by 20–40% each year.
This rise in energy consumption is driven by several factors, including:
- Increased Data Processing Needs: Modern applications, especially AI and machine learning, require more processing power and data storage. For instance, the Netflix recommendations engine processes over 1 trillion data points every day, including viewing habits, search queries, and interactions with the platform.
- Growing Number of Data Centers: The global expansion of data centers to support various industries and cloud services adds to the total energy demand. Major players in the cloud computing space like Amazon, Google, and Microsoft are expanding their data center portfolios every year, a trend that will likely continue as these companies race to meet the growing demand for computing resources for AI applications. For instance, Amazon plans to invest over $150 billion over the next 15 years to build data centers to handle the AI boom.
- Increased Use of Cloud Services: The demand for data centers is increasing not only due to the AI boom but also because of several other online services such as gaming, streaming, and social media, which are driving up energy consumption.
The Role of AI and Cloud Computing
AI’s Energy Demand
As you might expect, AI is significantly influencing energy consumption in data centers. Currently, AI applications account for more than 10% of data center electricity use in the U.S. This percentage is expected to grow substantially in the coming years since the energy demand associated with training and using generative AI models such as OpenAI’s GPT or Meta’s Llama is particularly high.
For instance, processing a single query for a model like ChatGPT consumes around ten to twenty-five times more energy than a standard Google search. Future projections indicate that by 2028, AI will represent more than 19% of data center power demand. This shift is anticipated to drive a staggering 160% increase in overall data center power consumption towards the end of this decade.
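To make that gap concrete, here is a rough back-of-envelope sketch. The per-query figures below are illustrative public estimates, not measured values, and the real numbers vary with model size and hardware:

```python
# Back-of-envelope comparison of per-query energy use.
# Both figures are illustrative estimates, not measurements.
GOOGLE_SEARCH_WH = 0.3               # estimated Wh per standard Google search
CHATGPT_QUERY_WH = GOOGLE_SEARCH_WH * 10  # lower bound: ~10x a search

queries_per_day = 1_000_000
extra_wh = queries_per_day * (CHATGPT_QUERY_WH - GOOGLE_SEARCH_WH)
print(f"Extra energy for 1M AI queries: {extra_wh / 1000:.0f} kWh/day")
```

Even at the conservative 10x end of the range, a million AI queries add megawatt-hours of daily demand compared to the same number of conventional searches.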
Cloud Computing Growth
The expansion of cloud computing services is another major factor contributing to increased energy consumption by data centers. Data centers supporting cloud services are projected to account for 1.86% of global electricity consumption by 2030.
The surge in cloud computing is driven by the increasing use of cloud-based applications and services such as online gaming and streaming, which require extensive infrastructure and energy to operate efficiently. As more businesses and consumers rely on cloud solutions, the energy demands of these data centers continue to rise.
Corporate Emissions and Challenges
The rise in data center energy consumption is also reflected in the carbon emissions of leading tech companies. For instance, Alphabet, the parent company of Google, reported a 48% increase in CO2 emissions in 2023 compared to 2019. The company attributes this increase to the higher energy demands associated with more intensive AI computing and the expansion of its global computing infrastructure.
Similarly, Microsoft, which has partnered with OpenAI to integrate large language models into its suite of products, reported a 30% increase in CO2 emissions compared to 2020. Part of this increase stems from AI computing, as OpenAI relies on the Azure cloud platform for its deep learning and AI training operations.
What Can Be Done?
Cloud computing and AI are essential technologies that billions of people around the world use directly or indirectly every day. However, as we enjoy the benefits of these technologies, it’s important to also consider their potential negative impacts now and in the future. The rising energy demand driven by AI workloads presents a significant climate change challenge that should be concerning to anyone who cares about the future of humanity. The good news is that there are many ways we can address this challenge. Let’s look at some of the viable solutions:
More Efficient Computing
Advancements in computing hardware can play a crucial role in reducing energy consumption. Fortunately, companies like Amazon and Google are pioneering the use of Arm-based chips, such as Amazon’s Graviton and Google’s Axion, which are designed to deliver high performance with significantly lower energy usage compared to traditional x86 chips. These Arm chips can achieve up to 60% less energy consumption while maintaining similar levels of performance.
However, despite their efficiency, the supply of these chips remains limited. As a result, many computing workloads on these platforms are still handled by the more energy-intensive x86 processors, which are currently more widely available. The transition to these more efficient chips is a promising step, but broader adoption is needed to fully realize their potential in reducing overall energy consumption.
Whenever possible, we recommend that users of platforms like AWS choose Arm chips over x86. However, not all workloads can run on Arm. For example, workloads running on Windows Server cannot use Arm, since Windows Server for Arm is not yet available. The good news is that Microsoft is set to release Windows Server for Arm (currently in preview) in 2025. Read this article to check out our deep dive into Arm chips for servers and their potential impact in the next couple of years.
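As a minimal illustration of what "choosing Arm" looks like in practice, here is a hypothetical helper that suggests a Graviton equivalent for a few common x86 EC2 instance types. The mapping is a hand-picked sample based on AWS’s instance naming convention (a "g" in the family suffix denotes Graviton, e.g. m6i → m7g); always verify availability and compatibility in your region before switching:

```python
# Hypothetical helper: suggest an Arm (Graviton) equivalent for a few
# common x86 EC2 instance types. The mapping is a hand-picked sample,
# not an official or exhaustive AWS table.
ARM_EQUIVALENTS = {
    "m6i.large": "m7g.large",      # general purpose
    "c6i.xlarge": "c7g.xlarge",    # compute optimized
    "r6i.2xlarge": "r7g.2xlarge",  # memory optimized
}

def suggest_arm(instance_type: str) -> str:
    """Return a known Graviton equivalent, or the input type unchanged."""
    return ARM_EQUIVALENTS.get(instance_type, instance_type)

print(suggest_arm("m6i.large"))    # m7g.large
print(suggest_arm("t2.micro"))     # t2.micro (no mapping known here)
```

A real migration check would also confirm that the workload’s OS and binaries have arm64 builds before making the switch.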
More Efficient Software Development
Developers also have an important role in minimizing the energy impact of AI applications. As a developer, before integrating any form of AI, it is essential to evaluate whether it is the most appropriate solution for a given task or if a less energy-intensive alternative might be sufficient. When AI is necessary, developers should use existing, specialized AI tools that are optimized for energy efficiency.
Other software optimizations, such as runtime scheduling and resource-aware programming, can also help reduce energy consumption. Finally, monitoring the energy usage of AI models throughout their lifecycle, from training to deployment, can reveal opportunities for improvement.
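As a minimal sketch of that kind of lifecycle monitoring, the context manager below estimates energy from wall-clock time and an assumed average power draw. The 300 W figure is an assumption for illustration; a real setup would read hardware counters (e.g. Intel RAPL) or use a dedicated tracker library instead:

```python
import time

class EnergyEstimator:
    """Rough energy estimate: elapsed wall-clock time multiplied by an
    assumed average power draw. Illustrative only; real monitoring would
    read hardware power counters rather than assume a constant wattage."""

    def __init__(self, avg_power_watts: float = 300.0):  # 300 W is an assumption
        self.avg_power_watts = avg_power_watts

    def __enter__(self):
        self._start = time.monotonic()
        return self

    def __exit__(self, *exc):
        elapsed_s = time.monotonic() - self._start
        # watts x seconds = joules; 3.6e6 joules per kWh
        self.kwh = self.avg_power_watts * elapsed_s / 3.6e6
        return False

with EnergyEstimator() as est:
    sum(i * i for i in range(1_000_000))  # stand-in for a training step

print(f"Estimated energy: {est.kwh:.8f} kWh")
```

Logging an estimate like this per training run or deployment window makes energy regressions visible in the same way latency regressions are.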
Optimize Data Management
Efficient data management practices can substantially reduce the energy footprint of AI operations. For instance, distributed computing frameworks allow for the parallel processing of data across multiple machines, which can lead to more efficient data handling and reduced overall energy use.
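The core pattern behind those frameworks is a parallel map over chunks of data followed by a reduce over the partial results. The sketch below shows the pattern on a single machine with a thread pool; frameworks like Spark or Dask apply the same map-reduce shape across many machines:

```python
from concurrent.futures import ThreadPoolExecutor

def transform(chunk):
    """Placeholder per-chunk work (parsing, feature extraction, etc.)."""
    return sum(chunk)

data = list(range(1_000))
# Split the dataset into independent chunks...
chunks = [data[i:i + 250] for i in range(0, len(data), 250)]

# ...map the work over chunks in parallel, then reduce the partials.
with ThreadPoolExecutor(max_workers=4) as pool:
    partials = list(pool.map(transform, chunks))

total = sum(partials)
print(total)  # same result as processing the list serially
```

Because each chunk is independent, the same workload can finish sooner on busy hardware, spending less total machine time (and energy) per unit of data processed.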
Additionally, cloud resource management is crucial for minimizing waste. By properly managing cloud resources, implementing autoscaling, and using monitoring tools, organizations can avoid overprovisioning and ensure that resources are allocated based on actual demand. These practices, subtle as they might seem, can significantly help optimize energy consumption and make more efficient use of available computing resources.
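A minimal sketch of such a demand-driven scaling rule is below, following the same proportional formula Kubernetes’ Horizontal Pod Autoscaler uses; the target utilization and replica bounds here are illustrative:

```python
import math

def desired_replicas(current: int, cpu_utilization: float,
                     target: float = 0.6,
                     min_r: int = 1, max_r: int = 20) -> int:
    """Target-tracking scale rule (same idea as the Kubernetes HPA formula):
    scale the replica count proportionally to observed / target utilization,
    clamped to configured bounds. Thresholds here are illustrative."""
    desired = math.ceil(current * cpu_utilization / target)
    return max(min_r, min(max_r, desired))

print(desired_replicas(4, 0.9))   # 6: scale out under load
print(desired_replicas(4, 0.15))  # 1: scale in when idle
```

Scaling in when utilization drops is where the energy savings come from: idle replicas that would otherwise sit provisioned around the clock are released back to the pool.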
Using More Renewable Energy in Data Centers
Transitioning to renewable energy sources is a critical step in reducing the carbon footprint of data centers. Solar, wind, and hydroelectric power offer cleaner alternatives to fossil fuels and can significantly lower the environmental impact of data center operations. Many leading technology companies, including Amazon and Google, are already making substantial investments in renewable energy projects to power their data centers.
By shifting to green energy, these companies are setting a positive example and contributing to a more sustainable future. Most tech companies are also investing in various renewable energy projects around the world to offset their net carbon footprint. For example, Google became the first major company to match its energy use with 100% renewable energy back in 2017. However, more emphasis needs to be placed on the actual energy used by data centers, with the goal of using 100% renewable energy whenever possible.