On average, a ChatGPT query requires nearly 10 times as much electricity to process as a Google search, according to a recent Goldman Sachs report. OpenAI CEO Sam Altman has said that artificial intelligence (AI) will consume vastly more energy than people expected, and that a breakthrough in clean energy will be necessary.
Generative AI is one of the most talked-about technologies of recent times, and we see it all around us, from social media apps to the workplace. While it is making life easier across the globe, until we have cleaner sources of energy, it is not good for the climate.
For instance, greater use of artificial intelligence means a greater need for data centres. The Goldman Sachs report says AI is expected to drive a 160% increase in data centre power demand.
The International Energy Agency estimates that energy use by the data centres that power AI will double in the next two years, with those centres consuming as much energy as Japan. The expected rise in data centre carbon dioxide emissions represents a social cost of $125 to $140 billion at present value.
Right now, much of this electricity comes from thermal generation, which means burning fossil fuels. Unless data centres shift to clean energy, their growing power consumption will continue to cause massive carbon emissions.
Training GPT-3 used as much energy as 120 American homes consume in a year, and training GPT-4 used approximately 40 times more energy than GPT-3. Additionally, a study by Hugging Face and Carnegie Mellon University found that generating a single image with a powerful AI model takes as much energy as fully charging a smartphone. Scaled up, generating 1,000 images produces the carbon output of driving a car for six and a half kilometres.
That is just the electricity. AI systems are also water guzzlers, and reports suggest they are often located in areas that already face water shortages.
AI servers' massive energy consumption generates a great deal of heat. To dissipate it and avoid server overheating, data centres commonly use cooling towers, which need a staggering amount of clean fresh water.
Training large language models such as GPT-3 can require millions of litres of fresh water for both cooling and electricity generation. This puts a strain on local freshwater resources.
The US Department of Energy estimated that US data centres consumed 1.7 billion litres of water per day in 2014, around 1.4% of the country's daily water use.
So yes, AI comes with its benefits, but also plenty of negatives for the environment. Efforts are being made to ensure AI contributes to sustainable solutions; that is where technology can help us. But until there is more investment in renewables and clean energy, AI could continue to be dangerous for the environment.