
Can AI Solve Its Energy Woes? — The Massive Carbon Footprint of Artificial Intelligence


Artificial Intelligence (AI) is promising to transform the world’s relationship with computing.  ©Khanchit Khirisutchalual/iStock

The new technology of Artificial Intelligence (AI) is promising to transform the world’s relationship with computing. Its potential in multiple fields is astounding. However, like so many other innovations, this technology also threatens the environment due to its tremendous energy consumption.


Do the advantages of AI outweigh its potential environmental impact? Can the architects and developers of this technology find a way to mitigate this impact? Their response to this dilemma will significantly impact AI's future growth and role in society.


What is Artificial Intelligence?

Like a lot of overused, high-tech jargon—e.g., cloud computing, metaverse, and the internet of things (IoT)—there may not be one universally recognized definition of AI. The University of Illinois at Chicago defines AI as "a branch of computer science that aims to create machines capable of performing tasks that typically require human intelligence."


More specifically, AI uses algorithms that analyze data to “learn,” or perform tasks based on that information, in a way that looks a lot like human intelligence. This might include functions such as understanding speech, recognizing faces, writing sentences, creating images, and yes, even driving cars.


AI is not a new concept. John McCarthy, later a longtime Stanford University professor, is credited with having coined the term “artificial intelligence” in a proposal he co-authored in 1955 for the 1956 Dartmouth workshop. McCarthy gave credit to English mathematician Alan Turing, who gave a lecture on the subject in 1947 and later published a 1950 paper on it, “Computing Machinery and Intelligence.”


The field had yet to make any significant progress in the realm of the practical—until two years ago.


With the commercial availability in 2022 of sophisticated applications like ChatGPT, which are built on so-called large language models (LLMs), AI made the leap from the mostly hypothetical to the actual.

©UnitoneVector


The potential for AI in computing applications seems limitless. Software giant SAS says that features of AI, such as “automation, conversational platforms, bots, and smart machines, can be combined with large amounts of data to improve many technologies.”


These include home security, financial investment analysis, fraud detection in banking and accounting, cancer detection and other medical diagnoses, grid energy management, and much more.


The Downside of AI

Most breakthroughs come with a drawback. With AI, it is the enormous increase in energy consumption.


The development of the internet and, more recently, the surge in so-called cloud computing introduced the world to the new concept of data centers. These are large facilities filled with racks of servers, where most of the world's everyday online computing actually takes place (as opposed to on desktop machines or onsite company servers).


Understandably, these large concentrations of computing equipment under one roof consume vast amounts of electricity and generate an equally large amount of heat. Now that AI is going mainstream, the energy demands of these facilities are expected to balloon—data center traffic alone is expected to grow by a factor of 10 every two years, according to technology assurance company Spirent.



The reasons are rooted in the technology itself. Existing data center equipment runs on the same processing chips as the personal computers (PCs) that people use at home and in the office. These central processing units (CPUs) are constructed from billions of transistors and can have multiple processing cores.


Computing giant Intel explains that CPUs are the “brain” of a computer and are “essential to all modern computing.”


Unfortunately, CPUs are not sufficient to support the computing needs of AI applications.


Network Group Manager Andy Kowalski works inside the data center for the Thomas Jefferson National Accelerator Facility (Jefferson Lab), a U.S. Department of Energy Office of Science national laboratory.  ©Aileen Devlin/Jefferson Lab/Public Domain

Enter the graphics processing unit, or GPU. First commercially available in the 1990s, GPUs were designed primarily for graphics-intensive applications, like image rendering, video editing, and gaming.


GPU chip used in the Xbox.  ©A7N8X. CC BY-SA 4.0

Today, GPUs can “deliver massive performance,” as Intel puts it, but this also means greater demand for electricity, plus more heat generation, which in turn requires more energy for cooling. Market research firm Newmark notes that GPUs “require up to 15 times the energy of traditional CPUs.”


Given that massive data centers will be incorporating GPU-enabled equipment to meet the growing demands of AI, the energy footprint of these data centers is going to explode.


In its 2023 study on the U.S. data center market, Newmark projects the need for computing resources to “increase exponentially” as a result of AI. It explains that the typical stack (“rack”) of computer servers in a data center for one business customer currently requires 10 to 14 kilowatts. The study projects the demands of AI to more than quadruple that figure, pushing the requirement to between 40 and 60 kilowatts per rack.
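Newmark's per-rack figures translate directly into annual energy draw. A back-of-envelope sketch, using assumed midpoint values for illustration (the midpoints are not from the study):

```python
HOURS_PER_YEAR = 24 * 365  # 8,760 hours

def annual_mwh(rack_kw: float) -> float:
    """Energy used by one server rack drawing rack_kw continuously, in MWh per year."""
    return rack_kw * HOURS_PER_YEAR / 1000

today = annual_mwh(12)   # midpoint of today's 10-14 kW range
ai_era = annual_mwh(50)  # midpoint of the projected 40-60 kW range
print(f"today: {today:.1f} MWh/yr, AI era: {ai_era:.1f} MWh/yr")
```

At the assumed midpoints, a single AI-era rack would consume roughly as much electricity per year as four of today's racks combined.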


How Will Tech Companies Meet the Challenge?

This is where tech companies face a conundrum. The tech sector has embraced sustainable energy practices, and Apple, Google, and others have invested heavily in renewable energy resources and energy efficiency to reduce their carbon footprints.


But if these companies invest in ever-larger data centers that require greater amounts of energy and space to meet the demands of AI, how can they also meet their goals to reduce their energy footprint?


The Wall Street Journal (WSJ) reports that while Google and Microsoft have pledged to dramatically reduce carbon emissions, the opposite is occurring for the two tech giants—and AI is largely to blame.



According to WSJ, Google’s overall emissions increased by 13.5% from 2022 to 2023 and are up by nearly 50% since 2019. Google chief sustainability officer Kate Brandt and senior vice president Benedict Gomes responded to this, saying in a letter accompanying the company's annual sustainability report that “in spite of the progress we’re making, we face significant challenges that we’re actively working through.”


Microsoft's emissions have followed a similar trajectory as Google’s, increasing by 29% between 2020 and 2023.


This chart shows the rapid increase in the computing costs used to train large language models. The training cost of models like GPT-4 is not publicly known, so this is just an estimate. Data: Epoch, 2023; chart: Stanford University's 2024 AI Index (CC BY-SA 4.0).

Challenges Not Insurmountable

On the positive side, the challenges regarding data centers and AI are not insurmountable.


Companies have many remedies at their disposal to make AI more energy efficient. The Massachusetts Institute of Technology (MIT) reports that “new tools are available to help reduce the energy that AI models devour.”


For example, MIT's own Lincoln Laboratory Supercomputing Center (LLSC) has found that by limiting or “capping” the amount of power a GPU is able to draw, energy consumption by AI is reduced by 12% to 15%. The only drawback to this technique is an increase in task completion time by about 3%, which according to Vijay Gadepally, senior staff at the LLSC, is “barely noticeable.”
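The tradeoff LLSC describes can be expressed numerically. A minimal sketch using the reported percentages (the baseline job size here is hypothetical):

```python
def capped_run(energy_kwh: float, hours: float,
               energy_saving: float = 0.13, time_penalty: float = 0.03):
    """Estimate the energy use and runtime of a power-capped GPU job.

    energy_saving: 12-15% reduction reported by LLSC (0.13 used as a midpoint).
    time_penalty: ~3% longer completion time reported by LLSC.
    """
    return energy_kwh * (1 - energy_saving), hours * (1 + time_penalty)

# Hypothetical job: 100 kWh and 10 hours uncapped.
energy, hours = capped_run(energy_kwh=100.0, hours=10.0)
print(f"{energy:.1f} kWh, {hours:.2f} h")  # ~87 kWh for ~10.3 h
```

The asymmetry is the point: a double-digit energy saving costs only a sliver of extra runtime, which is why Gadepally calls the slowdown “barely noticeable.”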



Data center operators can also reduce the demand for computational power by optimizing the algorithms they employ to program their AI models. Less demand for computational power requires less energy consumption.


Other techniques can also increase the energy efficiency of AI computing and the data centers where it occurs, whether employed singly or in combination: improving hardware, using smaller and less complex models, training AI models more efficiently, optimizing the scheduling of AI computing, and moving computation closer to where data is stored and used (also known as “edge computing”).
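The “smaller models” lever is easy to quantify with a standard rule of thumb (not from the article): transformer inference costs roughly two floating-point operations per model parameter per generated token, so compute, and with it energy, scales roughly linearly with model size.

```python
def inference_flops(n_params: float, n_tokens: int) -> float:
    """Rough compute cost of generating n_tokens with an n_params-parameter
    transformer, using the common ~2 FLOPs per parameter per token approximation."""
    return 2.0 * n_params * n_tokens

# Hypothetical comparison: a 70-billion- vs. a 7-billion-parameter model.
large = inference_flops(70e9, 1_000)
small = inference_flops(7e9, 1_000)
print(f"compute ratio: {large / small:.0f}x")
```

Under this approximation, serving a task with a model one-tenth the size cuts the compute, and roughly the energy, by a factor of ten, which is why “right-sizing” models is a recurring efficiency recommendation.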


‘Green Data Centers’

Another technique for reducing the carbon footprint of AI is to conduct the computing in so-called “green data centers.” The Institute of Electrical and Electronics Engineers (IEEE) defines a green data center as a facility that performs the functions of a traditional data center “but in a more sustainable way.”


According to IEEE's definition, all the center's systems “are designed to consume less energy and minimize its environmental impact.” This includes computer, electrical, mechanical, and lighting systems.


They are designed and built with low-emission construction materials and furnishings to minimize their building footprints. Perhaps most importantly, they utilize alternative energy sources, such as heat pumps and solar photovoltaic technology. Their design also incorporates responsible practices such as e-waste recycling. The energy savings of these green data centers can be significant, with some reports of savings as high as 40%.



Additionally, many proponents of AI argue that AI will help drastically increase energy efficiency in society broadly, and that these improvements will far outweigh any increase in its energy footprint.


For example, the WSJ also reports that Google has worked on an AI-powered tool that would help airplanes avoid generating contrails, which account for 57% of aviation’s global-warming impact.


Google has worked on an AI-powered tool that would help airplanes avoid generating contrails, which account for 57% of aviation’s global-warming impact.  ©Gralo/Public domain

The ‘Greening’ Effect of AI Data Centers

The Caribbean Electric Utility Services Corporation (CARILEC), an association of electric energy solutions providers and other stakeholders operating in the Caribbean region, Central and South America, and globally, reports that AI can contribute to energy conservation in a number of ways.


For instance, AI can contribute to energy conservation by optimizing energy consumption in buildings, which account for nearly 40% of global energy consumption. CARILEC says AI can be used to adjust heating, ventilation, and air conditioning (HVAC) systems in real time, ensuring that energy is only used when and where it is needed, which “can result in significant energy savings and reduced carbon emissions.”
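A toy illustration of the idea (hypothetical control logic, not CARILEC's or any vendor's system): condition a zone only when it is occupied and outside its comfort band, instead of running HVAC on a fixed schedule.

```python
def hvac_command(occupied: bool, temp_c: float,
                 comfort_low: float = 20.0, comfort_high: float = 24.0) -> str:
    """Return 'heat', 'cool', or 'off' for one building zone."""
    if not occupied:
        return "off"   # spend no energy on empty rooms
    if temp_c > comfort_high:
        return "cool"
    if temp_c < comfort_low:
        return "heat"
    return "off"       # occupied but already comfortable

print(hvac_command(occupied=False, temp_c=30.0))  # off: room is empty
print(hvac_command(occupied=True, temp_c=26.0))   # cool: occupied and too warm
```

Real systems add forecasting and learned occupancy patterns on top of rules like these, but even this simple occupancy gate captures why demand-driven control saves energy over fixed schedules.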


There are other ways in which AI can help stop global warming, some of which may seem a little less obvious, but just as effective. A list compiled by the World Economic Forum (WEF) includes:

  • Tracking patterns in iceberg melting.

  • Mapping deforestation.

  • Helping vulnerable communities adapt to climate change.

  • Increasing waste recycling.

  • Detailed mapping of ocean waste.

  • Predicting climate disasters.

  • Improving weather forecasting and wind patterns for better wind energy generation.

  • Tracking platforms to help industries reduce their emissions.

  • Using drones to disperse seeds for Brazilian reforestation.


If some of WEF's solutions seem far-fetched, another proposal may seem totally out of this world, by comparison.


Thales Alenia Space is a joint venture between two European defense, security and transportation system providers, Thales and Leonardo. In June 2024, the partnership announced the results of its ASCEND (Advanced Space Cloud for European Net zero emission and Data sovereignty) feasibility study of data centers in outer space.


According to the study, orbiting platforms could save energy. For example, they would consume power generated by solar panels located outside the Earth's atmosphere, and they would be much easier to cool than data centers on the ground because space is much colder than Earth. (See the news brief “European Plan to Lower AI Energy Bill Is Out of This World” in this issue of The Earth & I.)


Technology Moves Quickly and in Surprising Ways

Barely two years ago, AI broke through the barrier of imagination to become a transformative force for technological innovation, and the hype is far from subsiding.


With that breakthrough, the carbon footprint of AI is already significant and something that cannot be overlooked as the world grapples with the challenge of climate change.


Just as other powerful innovations, like smartphones, electric cars, wind turbines, and rooftop solar panels, have grappled with their own unique environmental challenges to balance their benefits with their impact, AI needs to do the same.

 

*Rick Laezman is a freelance writer in Los Angeles, California. He has a passion for energy efficiency and innovation. He has been covering renewable power and other related subjects for more than ten years.


