
It’s Not Biological, But AI’s Carbon Footprint and Thirst for Water Are Astonishing

[Image: WALL-E holding a plant, an allusion to AI’s carbon footprint.]

In recent months we’ve all been stunned by the rapid pace of AI development, particularly since ChatGPT became publicly available and quickly led to the even more capable GPT-4, Microsoft’s integration of GPT into Bing, and the AI race between tech giants that we’re seeing now. There’s lots of discussion about how this advancing technology can benefit us and how it will transform the workforce. There’s fear that it will cause mass joblessness and make us question human purpose. But what’s often ignored is the environmental impact: AI’s carbon footprint is massive, and it takes surprising amounts of water to keep it running. With the ever-growing problem of climate change and millions of people who lack access to adequate water, this is a major issue we need to face, as soon as possible.

How Much Energy Is Needed to Train and Deploy AI Models?

Even looking at the state of the technology several years ago, it was already clear that AI’s carbon footprint was alarming. The computational resources necessary to produce a best-in-class AI model had been doubling every few months, with a 300,000x increase between 2012 and 2018. Training a typical AI model at that time produced more CO2 emissions than an average American would in two years, and training a more energy-intensive model could create a carbon footprint equivalent to the lifetime emissions of five cars. By the time GPT-3 replaced its predecessor in 2020, its 175 billion parameters (compared to GPT-2’s 1.5 billion) required thousands of times more energy to train. Fast-forward to today, and GPT-4 reportedly has around a trillion parameters. (Some sources state an even higher number.)
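
For a sense of what those growth figures imply, here’s a quick back-of-the-envelope check using the round numbers quoted above. It’s illustrative arithmetic only, not new data.

```python
import math

# Rough arithmetic behind the figures above (round numbers, not measurements).
compute_growth = 300_000        # ~300,000x increase in training compute, 2012-2018
months = 6 * 12                 # six years

# How often compute must double to grow 300,000x in six years
doublings = math.log2(compute_growth)      # ~18.2 doublings
doubling_time_months = months / doublings  # ~4 months, i.e. "every few months"

# Parameter jump from GPT-2 to GPT-3
gpt2_params = 1.5e9
gpt3_params = 175e9
param_ratio = gpt3_params / gpt2_params    # ~117x more parameters

# Note: training energy also scales with dataset size and training length,
# not just parameter count, which is part of why the energy gap between the
# two models is reported as far larger than the parameter ratio alone.
print(f"Doubling time: ~{doubling_time_months:.1f} months")
print(f"GPT-3 has ~{param_ratio:.0f}x the parameters of GPT-2")
```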

And that’s just for the models’ training. Nvidia has estimated that 80-90% of the resource costs of neural networks go to their real-world use after training. It’s hard to even conceive of the massive amounts of energy consumed and CO2 produced by today’s large language models, but here’s a thoughtful calculation of the daily carbon footprint of ChatGPT.
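
That linked estimate follows a recipe anyone can reproduce: multiply daily query volume by the energy per query and by the carbon intensity of the electricity. The sketch below shows the shape of that calculation; every input is a placeholder assumption, not a figure from the linked article.

```python
# Minimal sketch of a daily-footprint estimate. All inputs are assumptions
# chosen for illustration, not reported measurements.

queries_per_day = 10_000_000        # assumed daily query volume
kwh_per_query = 0.003               # assumed energy per response, in kWh
grid_kg_co2_per_kwh = 0.4           # assumed grid carbon intensity, kg CO2/kWh

daily_energy_kwh = queries_per_day * kwh_per_query
daily_co2_tonnes = daily_energy_kwh * grid_kg_co2_per_kwh / 1000

print(f"Energy: {daily_energy_kwh:,.0f} kWh/day")
print(f"CO2:    {daily_co2_tonnes:,.1f} tonnes/day")
```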

Why do AI models need so much energy? To be trained well, these systems need very large datasets; generally, the more data they can ingest, the more accurate their predictions and outputs will be. For every piece of this vast training data, a model needs to perform complex mathematical operations. On top of that, AI developers will often run hundreds of different versions of a particular model in order to find the optimal configuration for its neural architecture. All of this adds up (and multiplies) very fast.
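
To see how quickly it compounds, a commonly cited rule of thumb puts transformer training compute at roughly 6 x parameters x training tokens, and that cost repeats for every experimental run. The numbers below are assumed round figures meant only to show the scaling.

```python
# Why it "adds up and multiplies": rule-of-thumb training compute is about
# 6 * parameters * training tokens (in FLOPs), repeated for every run.
# All values below are illustrative assumptions.

params = 175e9            # model parameters (GPT-3-scale, for illustration)
tokens = 300e9            # assumed number of training tokens
runs = 100                # assumed hyperparameter/architecture-search runs

flops_per_run = 6 * params * tokens
total_flops = flops_per_run * runs

print(f"Per run:  {flops_per_run:.2e} FLOPs")
print(f"All runs: {total_flops:.2e} FLOPs")
```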

AI Needs Water, Too

To provide all that computing power, massive server farms turn lots of energy into heat, so they need to be cooled to prevent equipment from malfunctioning. That’s typically done with cooling towers that evaporate water to maintain an ideal operating temperature. It turns out a lot of water goes into this process: roughly a gallon for every kilowatt-hour expended in a typical data center. Since AI processing requires so much more energy than other computing, it sucks up a lot more water, too.
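
With that rough gallon-per-kilowatt-hour ratio, the water cost of any workload can be estimated once you know (or assume) its energy use. A minimal sketch follows; the daily energy figure is an assumption for illustration.

```python
# Converting data-center energy into evaporated cooling water, using the rough
# ~1 gallon per kWh ratio quoted above. The energy figure is an assumption.

gallons_per_kwh = 1.0                 # rough ratio from the text
liters_per_gallon = 3.785

daily_energy_kwh = 30_000             # assumed daily energy for one AI workload
water_gallons = daily_energy_kwh * gallons_per_kwh
water_liters = water_gallons * liters_per_gallon

print(f"~{water_gallons:,.0f} gallons (~{water_liters:,.0f} liters) evaporated per day")
```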

Moreover, it can’t be just any water that’s used for this cooling; it needs to be clean freshwater, which helps prevent bacterial growth and corrosion and allows for better control of humidity. In other words, such a setup requires exactly the kind of water that many people in poor or drought-stricken regions could desperately use.

Training GPT-3 took about as much water as it would take to fill a nuclear reactor’s cooling tower, and a typical exchange with ChatGPT uses roughly the equivalent of pouring a large bottle of water out onto the pavement.
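
As a rough sanity check on that bottle-of-water comparison, the gallon-per-kilowatt-hour ratio above can be combined with an assumed per-query energy figure and conversation length. Every input below is an assumption; the point is the order of magnitude, not a precise measurement.

```python
# Sanity check on the "bottle of water per conversation" comparison.
# All inputs are illustrative assumptions.

kwh_per_query = 0.003          # assumed energy per ChatGPT response
queries_per_exchange = 30      # assumed length of a typical conversation
gallons_per_kwh = 1.0          # rough cooling-water ratio from earlier
ml_per_gallon = 3785

water_ml = kwh_per_query * queries_per_exchange * gallons_per_kwh * ml_per_gallon
print(f"~{water_ml:.0f} ml per conversation")  # on the order of a bottle of water
```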

Reducing AI’s Carbon Footprint

Two big factors in energy and climate crises generally are how much energy is consumed and what kind of fuel produces it. So even if something requires a lot of energy, a cleaner energy source means the resulting CO2 emissions won’t be nearly as bad. In terms of computing and AI, smart choices would be to power servers with clean, renewable energy and to place them in optimal locations: regions that are colder to begin with need less cooling. Even timing could be a factor, with AI teams training systems during the cooler hours of the night. As it stands, these options don’t seem to be pursued very vigorously: OpenAI’s succession of machine learning models, the most successful and best known among their competitors, is also by far the most polluting.
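
The leverage of choosing cleaner energy is easy to quantify: emissions are simply energy multiplied by the grid’s carbon intensity. The comparison below uses an assumed training-energy figure and assumed illustrative intensities; what matters is the ratio between the rows, not the absolute numbers.

```python
# Same workload, different grids: emissions scale directly with the carbon
# intensity of the electricity. All values below are assumed for illustration.

training_energy_kwh = 1_000_000       # assumed energy for one training run

grid_intensity_kg_per_kwh = {
    "coal-heavy grid": 0.9,           # assumed
    "average mixed grid": 0.4,        # assumed
    "hydro/wind-heavy grid": 0.05,    # assumed
}

for grid, intensity in grid_intensity_kg_per_kwh.items():
    tonnes = training_energy_kwh * intensity / 1000
    print(f"{grid:>22}: ~{tonnes:,.0f} tonnes CO2")
```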

Transparency about energy and water usage could hold companies more accountable. If those driving AI were required to track and disclose their resource usage and CO2 emissions, along the lines of Environmental, Social & Governance (ESG) disclosures, that could be a good step moving forward. Perhaps usage limits could also force them to make better choices.

Another path that some researchers are exploring is a whole new rethinking of the materials used for intelligent computing. This is a way-outside-the-box, moonshot approach that would revolutionize the way we create emergent intelligence, using quantum materials rather than silicon. If that sounds intriguing, read more about it here.

In the meantime, it might be a good idea to really take stock of the costs and benefits of using GPT and similar AI models, reserving them for cases that pack the most punch and carry their weight in emissions. In terms of energy consumption, the human brain is remarkably efficient. Despite all the hype and awe around these capable computer systems, on some occasions it might make sense to slow down, take a sip of water, and write that email ourselves.

Also see: Technologies for Tackling Ocean Plastic Pollution
