
How much damage is GenAI really doing to the environment?

You ask ChatGPT for a quick answer to a burning question, or perhaps you get it to generate a stunning image in seconds. The response is immediate, precise, and often brilliant. But have you ever stopped to wonder: What’s the real cost behind these lightning-fast results?

Recently, OpenAI's CEO, Sam Altman, dropped a remark that caught everyone’s attention. In a tweet, he noted: "It's super fun seeing people love images in ChatGPT, but our GPUs are melting. We are going to temporarily introduce some rate limits while we work on making it more efficient. Hopefully won't be long!"

Now, Altman wasn’t suggesting that ChatGPT was literally melting anything — but his words reveal an important truth: the soaring demand for image generation and other AI-powered services is stretching the infrastructure to its limits. The underlying message? The massive energy consumption driving this technology is becoming harder to ignore.

ChatGPT has revolutionised everything from how we write to how we create. It seems like AI is the magic wand that can do it all. But as these tools become indispensable, there’s a darker question lurking: Is this AI marvel helping the planet, or is it quietly accelerating environmental degradation?

Let’s find out!

AI models like ChatGPT require immense computational power for training. Training a large-scale AI model is not a simple task; it’s like asking a computer to run the digital equivalent of a marathon for weeks. Training GPT-3, the predecessor of ChatGPT, reportedly produced an estimated 300 metric tons of carbon dioxide – roughly the same amount as 125 round-trip flights between New York and Beijing. And that’s just the beginning: the carbon toll doesn’t stop at training.

Once an AI model is trained, it moves into the deployment phase. When you interact with ChatGPT, it's powered by a massive cloud infrastructure with high energy demands. A single query to ChatGPT, depending on the model, uses around 2.9 watt-hours of electricity. For context, this is about 10 times the energy consumed by a standard Google search.

Now, imagine the scale: millions of interactions are happening globally every single day. In fact, recent data shows that over 300 million people have engaged with ChatGPT (as of early 2024). Every time a user inputs a request, the servers spin up, drawing more electricity and adding to carbon emissions.

A general estimate suggests that for every 1 billion queries, the carbon footprint could rise to approximately 6,000 metric tons of CO2 — the equivalent of 1,200 cars driving for an entire year. Let that sink in. With the growing popularity of AI-driven tools, these numbers will only continue to climb, unless drastic steps are taken toward sustainable infrastructure.
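To get a feel for those numbers, here’s a quick back-of-envelope calculation using the roughly 2.9 watt-hours per query cited above. The household comparison is an assumption added for scale (a typical US household uses around 10,500 kWh of electricity per year); treat the result as an illustration, not a precise audit.

```python
# Back-of-envelope: electricity for 1 billion ChatGPT queries,
# using the ~2.9 Wh/query estimate cited above.
WH_PER_QUERY = 2.9            # watt-hours per query (estimate)
QUERIES = 1_000_000_000

total_kwh = WH_PER_QUERY * QUERIES / 1000   # -> 2.9 million kWh
total_gwh = total_kwh / 1_000_000           # -> 2.9 GWh

# Assumed for scale only: ~10,500 kWh/year for a typical US household.
HOUSEHOLD_KWH_PER_YEAR = 10_500
households = total_kwh / HOUSEHOLD_KWH_PER_YEAR

print(f"{total_gwh:.1f} GWh, roughly a year of power for {households:.0f} households")
```

Even before counting the carbon intensity of the grid supplying that power, a billion queries adds up to the annual electricity use of a few hundred homes.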

Water usage

When we think of tech’s environmental footprint, water is rarely top of mind. But when you dig deeper, water usage is a key part of AI's footprint, especially in data centers. These data centers are essential for hosting AI models, but they’re also energy-guzzling behemoths that need cooling systems to keep them from overheating. The water footprint of AI might surprise you.

Take the training of GPT-3, which reportedly used around 700,000 litres of freshwater – that’s the water required to manufacture 370 BMW cars or 320 Tesla vehicles. And this is just for one model. Cooling for the ongoing operation of AI servers adds up quickly, and it often draws large volumes of water from local sources.

Now consider the ongoing operational needs: for every 20-50 queries answered by ChatGPT, it consumes about 500 milliliters of water. With millions of interactions each day, that amount multiplies rapidly. It's the hidden cost that most users don’t think about when they type their next question to ChatGPT.
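The per-query figure hidden in that estimate is easy to work out. The daily query volume below is an assumption added purely for illustration; the point is how quickly millilitres become cubic metres at scale.

```python
# Per-query water cost implied by "500 mL per 20-50 queries".
ML_PER_BATCH = 500
ml_per_query_low = ML_PER_BATCH / 50    # 10 mL/query (optimistic end)
ml_per_query_high = ML_PER_BATCH / 20   # 25 mL/query (pessimistic end)

# Assumed for scale only: 100 million queries in a day.
DAILY_QUERIES = 100_000_000
litres_low = DAILY_QUERIES * ml_per_query_low / 1000
litres_high = DAILY_QUERIES * ml_per_query_high / 1000

print(f"{litres_low:,.0f} to {litres_high:,.0f} litres of water per day")
```

At that assumed volume, a single day of queries would consume somewhere between one and two and a half million litres of water.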

E-waste

AI is driving technological advancements at lightning speed, but as anyone who’s tried to keep up with the latest gadgets knows, this rapid pace often leads to electronic waste (e-waste). And this isn’t just about your old smartphone or laptop. In the AI ecosystem, the need for powerful, cutting-edge GPUs (graphics processing units) has led to an explosion of e-waste. The hardware needed to run these systems is constantly being upgraded to keep pace with AI’s growing demand.

In 2023, generative AI applications accounted for about 2,600 tonnes of e-waste, and the trend isn’t slowing down. By 2030, experts estimate that the sector could generate as much as 2.5 million tonnes of e-waste globally. That’s an entire generation’s worth of hardware — just to power AI. And the problem with e-waste isn’t just that it clutters up landfills; it’s that the valuable resources in these chips and circuits aren’t being recycled efficiently. So, not only does the mining for these materials have its own environmental toll, but their disposal also creates long-term pollution issues.
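It’s worth pausing on the growth rate those two figures imply. Taking the estimates above at face value, here’s the compound annual growth needed to get from 2,600 tonnes in 2023 to 2.5 million tonnes by 2030:

```python
# Implied annual growth if generative AI e-waste goes from
# ~2,600 tonnes (2023) to ~2.5 million tonnes (2030),
# per the estimates cited above.
start_tonnes = 2_600
end_tonnes = 2_500_000
years = 2030 - 2023   # 7 years

growth = (end_tonnes / start_tonnes) ** (1 / years)
print(f"~{growth:.1f}x growth per year")
```

That works out to the e-waste pile nearly tripling every year for seven years straight.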

Who bears the brunt?

It’s crucial to note that the environmental costs of AI are not distributed equally across the globe. Data centers — where AI models like ChatGPT live — are often located in areas with cheaper energy sources, which tend to be more environmentally harmful. For instance, many data centers use coal-powered electricity, which significantly increases the carbon emissions associated with their operations.

In places like Memphis, Tennessee, where Elon Musk’s xAI operates a major data center, there have been concerns raised about the local environmental impact. These facilities require massive amounts of energy and cooling, leading to increased air and water pollution, a critical concern for the surrounding community.

Moreover, this phenomenon could exacerbate environmental inequalities, where poorer regions (often rural or developing) bear the brunt of the environmental degradation caused by these energy-intensive operations. This is environmental injustice, where the benefits of AI are enjoyed by the affluent, while the costs are offloaded to less-resourced regions.

How can AI become green?

With all of these environmental issues in mind, what’s being done to reduce AI’s impact? Tech companies are aware of the challenges, and many have taken important steps toward mitigating the environmental damage caused by their operations.

  1. Microsoft’s carbon negative pledge - Microsoft, a key partner of OpenAI, has committed to becoming carbon negative by 2030. This means that Microsoft will remove more carbon dioxide from the atmosphere than it emits, which could help offset the environmental costs of the AI models it supports.

  2. Renewable energy investments - Microsoft and other major tech players are investing heavily in renewable energy for their data centers. Many of these companies are moving towards wind and solar power, which would dramatically lower the carbon footprint associated with running AI models.

  3. AI model efficiency - There is a growing focus on creating more energy-efficient models. By optimising algorithms to use less power, AI companies can reduce the energy consumption of their models. OpenAI has already made strides in improving the efficiency of its models, and more companies are expected to follow suit.

  4. Cooling systems innovation - Cooling AI data centers traditionally involves using vast amounts of water and electricity. However, innovative cooling systems, including using AI to optimise cooling efficiency, are being researched and tested. For example, liquid cooling systems could drastically reduce the water needed for cooling.

What can we do?

If we want to harness the power of AI responsibly, it’s not just up to tech giants to change. We all have a role to play in minimising the environmental cost of these systems.

  1. Reducing unnecessary AI interactions - As consumers, we can limit our over-reliance on AI for tasks that don’t require it. Do we really need to ask ChatGPT for a haiku about toast every single time we’re bored? Every query counts. While this seems trivial, imagine the environmental impact if everyone cut down on frivolous AI requests.

  2. Support green AI initiatives - Supporting tech companies that are leading the charge on sustainability can drive the industry toward greener practices. If we demand accountability and transparency from AI companies, they may be more motivated to pursue eco-friendly practices.

  3. Policymaking - Governments should push for regulations that encourage energy efficiency and carbon reduction in AI operations. Just like any other heavy industry, AI must be held accountable for its environmental impact. Regulations can ensure that energy-intensive models meet environmental standards, reducing the sector’s carbon footprint over time.

So yes, AI is pushing the boundaries of what technology can achieve, but the environmental consequences are real and cannot be ignored. It’s up to the industry, governments, and consumers alike to ensure that the next wave of innovation doesn’t come at the cost of our planet’s health.

The challenge now is to balance technological advancement with sustainability, ensuring that the AI revolution doesn’t come at the expense of our future.
