
The Carbon Cost Of Talking To A.I.
How many “thank yous” does it take to boil a planet?
Let’s start with the bad news: even your politeness is unsustainable now.
Recently, OpenAI CEO Sam Altman offhandedly admitted that people saying “please” and “thank you” to the chatbot costs the company tens of millions of dollars in electricity.
Of all the things we imagined could nudge the planet further into crisis—SUVs, fossil fuels, plastic straws—“thanks” probably wasn’t high on the list. And yet here we are, in a world where your polite little exchange with ChatGPT might have just guzzled half a litre of water and left a trail of carbon in its wake.
Joke or not, each extra word you feed the machine sends ripples through vast server farms and energy-hungry data centres. Generative AI isn’t just intellectually expensive—it’s materially, environmentally, and ethically taxing too. Altman’s comment might’ve been tongue-in-cheek, but the implications aren’t.
Talking to AI Feels Human, Powering AI Is Anything But.
There’s something unnervingly charming about the way people interact with AI—like it’s a well-read but slightly aloof co-worker who doesn’t judge. We ask it questions, crack jokes, and apparently, go out of our way to be polite.
Here’s what doesn’t show up on your screen when you type into an AI: the server farms roaring to life in the background, powered by energy-intensive GPUs running models trained on terabytes of data. Each interaction, no matter how trivial, pulls electricity from a grid that—let’s be honest—isn’t running on fairy dust.
Training large language models like GPT-4 can cost millions of dollars and consume as much electricity as some small countries. According to the International Energy Agency, global data centres consumed around 460 terawatt-hours in 2022—roughly equal to the annual power usage of France. That figure has only gone up thanks to the AI boom.
And even after a model is trained, the cost doesn’t go away. Every query you send—yes, even asking what to make for dinner tonight—fires up those servers and racks up more environmental debt.
A 2019 study from the University of Massachusetts Amherst estimated that training a large language model can emit over 284 tonnes of CO₂. That’s the equivalent of flying from Delhi to New York more than 100 times. And that’s just the training. Every time we use it—from casual memes to full-on corporate deployments—we’re adding to that footprint.
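The flight comparison is easy to sanity-check with back-of-envelope arithmetic. The per-passenger emission figure below is an illustrative assumption (long-haul estimates vary widely by calculator), not a number from the article:

```python
# Rough check of the training-emissions-to-flights comparison above.
# Assumption: a one-way Delhi-New York flight emits roughly 2.8 tonnes
# of CO2 per passenger (typical long-haul estimates range ~2-3 tonnes).
training_emissions_t = 284      # tonnes of CO2, per the cited study
co2_per_flight_t = 2.8          # assumed tonnes per passenger, one way

flights_equivalent = training_emissions_t / co2_per_flight_t
print(f"Equivalent to roughly {flights_equivalent:.0f} flights")
```

With those assumptions the maths lands at just over 100 flights, consistent with the comparison above.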
And Then There’s the Water
Oh, and let’s not forget the water.
AI models generate heat—lots of it—and keeping data centres from turning into oversized microwaves requires water. Not a splash, but hundreds of thousands of litres. A 2023 study estimated that OpenAI used over 700,000 litres of water just to train GPT-3. That’s roughly the yearly consumption of a small village. And by the same study’s estimate, a single conversation of a few dozen prompts drinks about 500 ml.
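For a sense of how those figures hang together, here is a quick sketch of the arithmetic. The village size and per-person daily usage are illustrative assumptions, not measured values:

```python
# Rough arithmetic behind the water figures above.
training_water_l = 700_000       # litres to train GPT-3, per the 2023 estimate
per_conversation_l = 0.5         # ~500 ml per conversation, per the same study

# How many everyday conversations equal one training run?
conversations = training_water_l / per_conversation_l
print(f"Training ~= {conversations:,.0f} conversations' worth of water")

# "A small village": assume 40 people using ~50 litres/day each.
village_yearly_l = 40 * 50 * 365
print(f"Assumed village's yearly use: ~{village_yearly_l:,} litres")
```

Under those assumptions, one training run roughly matches a 40-person village’s annual water use, which is the order of magnitude the article describes.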
We’re not just making machines think—we’re making them sweat. And the pressure on local water systems, especially in drought-prone areas like parts of the US Midwest or South Asia, is already a slow-burning problem.
Rare Metals, Rarely Discussed
Then there’s the hardware.
Beneath all this AI infrastructure is a resource extraction story we’re barely acknowledging. The chips powering large-scale models depend on cobalt, lithium, and rare-earth metals—mined in conditions that are often environmentally devastating and ethically murky. There’s a cost here that’s harder to quantify but just as real: communities displaced, ecosystems gutted, supply chains that look less like innovation and more like old-school exploitation rebranded in shiny packaging.
We tend to talk about AI like it lives in the cloud. But that cloud is built on the ground—and it’s getting crowded. Oh, and the AI prompt that helped you draft your LinkedIn post probably has a dirtier supply chain than your fast-fashion buy.
But AI Can Save the Planet… Right?
Yes. And no. Generative AI can help fight climate change—optimising grids, forecasting extreme weather, analysing emissions. But in practice, much of that capacity goes into generating cat memes and Ghibli-style images. And the infrastructure needed to support it is growing faster than our ability to regulate or green it.
The real kicker? There are ways to reduce AI’s environmental burden. Compressing models, designing for efficiency, recycling hardware, sharing datasets—it’s all on the table. But efficiency often takes a back seat when speed, scale, and user acquisition are the key indicators of growth.
Is There A Greener Way To Think?
Here’s the nuance: this isn’t a call to cancel AI. It’s not even a call to stop saying “thank you” (although maybe just a thumbs up will do?). It’s a call to reckon with the cost. AI isn’t some weightless cloud—it’s metal and minerals, megawatts and water.
There are fixes in the pipeline. While some are working on algorithms that could slash energy consumption, others are exploring recycled rare earth materials, better cooling systems, and cooperative model training to reduce redundancy. But those solutions are years away from becoming mainstream—and meanwhile, the AI arms race rages on.
In the end, maybe it comes down to intent. We’re building machines to mimic humanity, but the more human they get, the more resources they devour. Being polite to AI may make us feel good—but it doesn’t make the machine any smarter, or the planet any cooler.