In recent years, artificial intelligence (AI) has become increasingly prevalent in our daily lives.
As AI grows so does its demand for power. @NVIDIAAI's director of accelerated computing, Dion Harris, and @PennEngineers expert Benjamin Lee discuss ways to minimize AI energy demand and grid impact in the new episode of the Energy Policy Now podcast. https://t.co/3Iwp8fwDwo
— Kleinman Energy (@KleinmanEnergy) September 27, 2024
However, the rapid growth of AI technology has also raised concerns about its environmental impact, particularly its energy and water consumption. As tech giants continue to develop more powerful AI systems, the demand for electricity to support these technologies is surging.
“…generative #AI platform such as ChatGPT can use at least 10x the amount of energy as a google search. Data centers could use as much as 9% of U.S. electricity by 2030”
My POV – Find more ways for responsibly producing energy or risk falling behind. https://t.co/kKmc1XI7PA pic.twitter.com/pjnjYyFtPg
— Aaron Miri (@AaronMiri) September 29, 2024
Large data centers, which are critical for AI operations, consume immense amounts of energy and often rely on non-renewable sources. This increase in energy usage poses a challenge to the Biden administration’s goal of cutting greenhouse gas emissions in half by 2030, compared to 2005 levels. Researchers and policymakers are now examining the balance between the benefits of AI and its environmental impact.
‘Three New York Cities’ Worth of Power:
AI Is Stressing the Grid https://t.co/OHDymtyLGr @Jennifer_Hiller @wsj
— Spiros Margaris (@SpirosMargaris) September 29, 2024
Balancing AI growth with sustainability

Shaolei Ren, a researcher in the field, explains that the energy-intensive nature of AI is due to the large number of parameters and calculations required to generate even a small amount of output, such as a 100-word email. “Each model has several billions of parameters, or even hundreds of billions of parameters,” Ren says. “Let’s say you have 10 billion parameters to generate one token or one word: You’re going to go through 20 billion calculations. That’s a very energy-intensive process.”
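The arithmetic behind Ren's example follows a common rule of thumb: a dense model performs roughly two calculations (a multiply and an add) per parameter for each token it generates. A minimal sketch of that estimate, with the function name and the assumption of exactly two operations per parameter chosen here for illustration:

```python
def calculations_per_token(num_params: int) -> int:
    # Rough rule of thumb for dense models: ~1 multiply + 1 add
    # per parameter for each generated token, i.e. ~2 * params.
    return 2 * num_params

# Ren's example: 10 billion parameters per token.
print(calculations_per_token(10_000_000_000))  # 20000000000
```

This matches the figures in the quote: 10 billion parameters yields about 20 billion calculations for a single word of output, and a 100-word email multiplies that cost roughly a hundredfold.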
This energy is converted into heat, which requires cooling, often in the form of water evaporation. According to a Pew Research Center study, generating a 100-word email using AI could use up an entire bottle of water for cooling purposes.
The impact of AI data centers on local communities is also a concern. In the U.S., roughly 80 to 90 percent of the water consumption for data centers comes from public water sources. Ren estimates that data centers’ water consumption currently accounts for 2 to 3 percent of public water supplies in the U.S.
While tech companies have an incentive to reduce the energy and resource usage of AI computing, real-world constraints, such as strict service-level objectives and user request patterns, limit the effectiveness of optimization techniques.
Ren suggests that one potential solution is to use smaller AI models for tasks that do not require the full capabilities of larger models. “Instead of using larger and larger models, we could be using smaller and smaller models, because usually those smaller models are good enough to complete many of the tasks that we actually care about,” he says. As AI continues to evolve, finding a sustainable path forward is imperative to ensure that technological advancements do not come at the expense of environmental health.
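Ren's suggestion can be illustrated with the same per-token rule of thumb: because compute scales with parameter count, routing a task to a model a hundredth the size cuts the estimated calculations by roughly the same factor. The sketch below is purely illustrative; the model sizes, the routing rule, and the assumption of two operations per parameter per token are hypothetical, not figures from the article.

```python
# Assumed, illustrative model sizes (not from the article).
SMALL_MODEL_PARAMS = 1_000_000_000    # hypothetical 1B-parameter model
LARGE_MODEL_PARAMS = 100_000_000_000  # hypothetical 100B-parameter model

def estimated_calculations(num_params: int, tokens: int) -> int:
    # ~2 calculations per parameter per generated token (rule of thumb).
    return 2 * num_params * tokens

def pick_model(task_is_simple: bool) -> int:
    """Toy routing rule: use the small model when it is good enough."""
    return SMALL_MODEL_PARAMS if task_is_simple else LARGE_MODEL_PARAMS

# A 100-token reply costs ~100x less compute on the small model.
small = estimated_calculations(pick_model(True), tokens=100)
large = estimated_calculations(pick_model(False), tokens=100)
print(large // small)  # 100
```

The design point is the one Ren makes: when a smaller model is "good enough" for a task, the compute (and hence energy) saved scales directly with the reduction in parameter count.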
The next few years will be critical in determining whether the tech industry can align its growth with the nation’s climate goals.