This article was written by Daniel Foreman, a member of the Web Professionals Global Advisory Board. You can read more about Daniel here.
The image above was generated with ChatGPT's image generator from the prompt "Environmental cost of AI." It depicts a robot holding cash, standing next to a factory emitting smoke.
In the past couple of years, we’ve all become painfully aware of the environmental cost of cryptocurrency. The energy consumption of these digital currencies, driven by the massive demand for GPUs to mine coins, is equivalent to that of an entire nation. This is a major issue that we’ve had to grapple with.
Now, as we begin to emerge from a hectic 2023, we find ourselves in the midst of what can only be described as an AI war. Companies are vying for dominance in this new digital frontier, diverting vast amounts of computational resources to power the training of these services. The environmental implications are startlingly similar to those of cryptocurrency.
Analysts predict that the carbon footprint of AI could match, or even surpass, that of bitcoin mining – a sector that already generates more greenhouse gas emissions than some entire countries. If the AI industry continues on its current trajectory, it’s projected to consume a staggering 3.5% of the world’s energy supplies by 2030.
Consider this: one of the leading AI companies spends an estimated $700,000 per day – yes, per day – delivering its online service to 100 million users worldwide. And with the recent integration of Windows Copilot into Windows 11 as a default feature, this usage number is set to soar even higher.
ChatGPT use increases during the school year: website traffic jumped nearly 5% from August 2023 to September 2023.
A survey by Savanta reveals that 47% of respondents have used ChatGPT for fun or learning purposes. Surprisingly, 42% of millennials use it for business, a higher share than Gen Z (29%), Gen X (26%), or Boomers (20%).
Both cryptocurrency mining and AI share a voracious appetite for high-end, powerful GPUs. This surge in AI activity has sent Nvidia's quarterly revenue skyrocketing 101% year over year, reaching a whopping $13.5 billion.
Before an AI model ever reaches the end user, it undergoes a training procedure that requires hundreds of hours of runtime. It's estimated that GPT-3, a relatively simple model and the predecessor of the current GPT-4, consumed 1,287 megawatt-hours of energy during training across 10,000 GPUs, producing as much CO2 as 123 gasoline-powered passenger vehicles driven for one year.
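That car-year comparison is easy to sanity-check with one assumed figure: the commonly cited US EPA average of about 4.6 metric tons of CO2 per gasoline passenger vehicle per year (an assumption on my part, not a number from the article):

```python
# Rough sanity check of the "123 cars for one year" comparison.
# ASSUMPTION: ~4.6 metric tons of CO2 per gasoline passenger vehicle
# per year (a commonly cited US EPA average; not stated in the article).
TONS_CO2_PER_CAR_YEAR = 4.6
CARS = 123

training_tons_co2 = CARS * TONS_CO2_PER_CAR_YEAR
print(f"~{training_tons_co2:.0f} metric tons CO2")  # ~566 tons for one training run
```

That lands in the mid-500s of tons of CO2 for a single training run, consistent with published estimates for GPT-3's training footprint.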
And let's not forget other language models like BLOOM and LLaMA, which also consumed significant amounts of power during training. These three models alone consumed enough energy to fly around the entire Earth 41.25 times.
Once these models have been trained and are up and running on a daily basis, every single query to GPT-4 on OpenAI is estimated to consume as much power as a 5-watt lightbulb running for 1 hour and 20 minutes, or about 6.7 watt-hours.
If we take OpenAI's 100 million users and assume each makes one query every day, the energy used is 6.7 Wh x 100,000,000 ≈ 667 megawatt-hours per day, or roughly 243 gigawatt-hours (about 0.24 terawatt-hours) a year. For comparison, the human body runs on about 10,000 kilojoules of food energy per day, a continuous draw of roughly 115 watts.
Currently, it's estimated that cryptocurrencies draw about 150 terawatt-hours annually. Under our one-query-a-day assumption (which is just a guess), OpenAI's roughly 0.24 terawatt-hours a year is still a small fraction of that – but real-world usage is many queries per user per day, the figure excludes training entirely, and it covers only one company, so actual consumption is far higher and climbing fast.
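The back-of-envelope arithmetic above can be checked in a few lines of Python; the per-query energy and user count are the article's estimates, and the one-query-per-day rate is the same deliberately conservative guess:

```python
# Sanity check of the article's energy estimates (all inputs are estimates).
WATTS = 5                      # "5 watt lightbulb"
HOURS = 80 / 60                # "1 hour and 20 minutes" per query
USERS = 100_000_000            # reported user base
QUERIES_PER_DAY = 1            # deliberately conservative guess

wh_per_query = WATTS * HOURS                                 # ~6.67 Wh per query
mwh_per_day = wh_per_query * USERS * QUERIES_PER_DAY / 1e6   # Wh -> MWh
twh_per_year = mwh_per_day * 365 / 1e6                       # MWh -> TWh

print(f"{wh_per_query:.2f} Wh per query")   # 6.67 Wh
print(f"{mwh_per_day:.0f} MWh per day")     # 667 MWh
print(f"{twh_per_year:.2f} TWh per year")   # 0.24 TWh (vs ~150 TWh for crypto)
```

Even scaling the query rate up tenfold only lands around 2.4 terawatt-hours a year, which shows how sensitive the comparison is to the per-user assumption.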
Additionally, major players in other countries, such as Baidu in China, have entered the AI realm.
This is a wake-up call for all of us. We need to be mindful of our digital consumption and strive for sustainable solutions in our pursuit of technological advancement.