When we think about artificial intelligence, we imagine futuristic technology, faster innovation, and convenience. What we rarely consider, though, is how much energy it takes to power that convenience, and the toll it takes on our environment.

Why Does AI Use So Much Energy?

Training large AI models (like ChatGPT or image generators) requires huge amounts of energy. GPT-3, for example, took weeks of nonstop GPU use to train, burning through an estimated 1,300 megawatt-hours (1.3 million kilowatt-hours) of electricity. What everyday users most often overlook, however, is how much energy it takes just to use these AI programs.

One ChatGPT query uses nearly 10 times more electricity than a Google search. A Google search typically uses about 0.3 watt-hours of electricity, while a single ChatGPT question can use around 2.9 watt-hours. When you multiply this by millions of daily users, the energy usage adds up very fast.
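To see how quickly those per-query numbers add up, here is a back-of-the-envelope sketch. The per-query figures come from the comparison above; the daily query volume of 10 million is an illustrative assumption, not a measured number:

```python
# Back-of-the-envelope arithmetic for the comparison above.
GOOGLE_WH = 0.3   # watt-hours per Google search (figure cited above)
CHATGPT_WH = 2.9  # watt-hours per ChatGPT query (figure cited above)
QUERIES_PER_DAY = 10_000_000  # hypothetical daily query volume (assumption)

# Convert total watt-hours to kilowatt-hours
google_kwh = GOOGLE_WH * QUERIES_PER_DAY / 1000
chatgpt_kwh = CHATGPT_WH * QUERIES_PER_DAY / 1000

print(f"Google searches: {google_kwh:,.0f} kWh/day")   # 3,000 kWh/day
print(f"ChatGPT queries: {chatgpt_kwh:,.0f} kWh/day")  # 29,000 kWh/day
print(f"Ratio: {chatgpt_kwh / google_kwh:.1f}x")       # 9.7x
```

Even at this modest assumed volume, the same questions asked through ChatGPT draw nearly ten times the electricity of ordinary searches, tens of thousands of kilowatt-hours every single day.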

AI’s Water Footprint

Besides energy, there’s also a hidden cost in water, especially in the cooling systems that keep the massive data centers from overheating. To process just 20-50 ChatGPT prompts, a data center can use up to 500 milliliters (a typical water bottle) of clean water, mostly for cooling. That might not sound like a lot, but again, the scale matters. Multiply that by millions of queries a day, and we’re talking about millions of gallons of water consumed daily just to keep the servers cool.
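The same scaling logic applies to water. This sketch uses the worst-case figure above (500 milliliters per 20 prompts) and an assumed volume of 100 million prompts per day, which is hypothetical rather than a reported number:

```python
# Rough scale check for the water figures above.
ML_PER_PROMPT = 500 / 20       # worst case above: 500 mL per 20 prompts = 25 mL
PROMPTS_PER_DAY = 100_000_000  # hypothetical daily prompt volume (assumption)

# Total cooling water per day, converted from milliliters
liters_per_day = ML_PER_PROMPT * PROMPTS_PER_DAY / 1000
gallons_per_day = liters_per_day / 3.785  # liters per US gallon

print(f"{liters_per_day:,.0f} liters (~{gallons_per_day:,.0f} gallons) per day")
```

A sip-sized amount per prompt becomes millions of liters of fresh water a day once you account for how many people are asking questions at once.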

Training large models like GPT-3 or GPT-4 also uses water before the model is even live. One study estimated that training GPT-3 used 700,000 liters (over 184,000 gallons) of clean water, the equivalent of watering an entire soccer field for a week straight.

The Push for Bigger Models Makes Things Worse

The tech industry is constantly trying to make bigger, more powerful AI models. More parameters mean more computing, which leads to more emissions. This cycle is not sustainable for our Earth. Just because we can make AI models bigger doesn’t mean we should, especially if it comes at the cost of the planet.

Although AI has powerful uses in fields like medicine, climate science, and accessibility, much of its everyday use is far less essential, and far more energy-intensive than many people realize. Many people use AI to create entertaining images, memes, or digital art through platforms such as Midjourney or DALL·E. According to a 2023 MIT Technology Review article, generating images is now “by far the most energy- and carbon-intensive AI-based task,” with some studies estimating that producing a single AI image can consume as much energy as fully charging a smartphone. The article also explains that because people often refine and regenerate images multiple times, the cumulative emissions from everyday use can far exceed the emissions from training the model in the first place.

If millions of people generate just one image a day, the combined energy use is comparable to charging millions of smartphones daily. The more casually we normalize using AI for unnecessary tasks, the harder it becomes to control its environmental effects.

What Can Be Done?

Companies rarely disclose how much energy or water their AI models use, so the public has little to no sense of how environmentally costly these tools are. These companies need to disclose that data and invest in green infrastructure. They also need to push for more energy-efficient AI, which might mean using smaller models and better training techniques, but it would be worth it given the toll AI is taking on our environment. Lastly, individuals can help by using AI more responsibly and avoiding unnecessary uses. AI has the power to solve some of our biggest global problems, but only if we’re honest about the ones it’s creating.

About the Author: Jaqueline Mancilla was the Media Specialist Intern at the Lesniak Institute. She graduated with a bachelor’s degree in Communications, Media and Film.