Politeness comes at a price, reveals the CEO of AI giant OpenAI
It turns out good manners aren’t just a social nicety — they’re an expensive habit in the digital age. OpenAI’s CEO Sam Altman recently confirmed that user politeness, such as saying “please” and “thank you” to ChatGPT, adds tens of millions of dollars to the company’s operating costs each year.
The revelation followed a light-hearted post on social media platform X, where a user posed a tongue-in-cheek question: “How much money has OpenAI lost in electricity costs from people saying ‘please’ and ‘thank you’ to their models?” Altman’s reply was short but telling: “Tens of millions of dollars well spent. You never know.”
Though the comment was made in jest, the underlying issue is quite real. Even seemingly trivial phrases like “thank you” and “please” require the AI model to interpret context, generate full responses, and maintain conversational flow. Each exchange, no matter how short or polite, consumes valuable computational power — and, by extension, energy.
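To see how a few extra polite turns could plausibly reach that scale, here is a minimal back-of-the-envelope sketch in Python. The query volume, the share of purely polite messages, and the per-query compute cost are all illustrative assumptions rather than OpenAI figures; only the order of magnitude matters.

```python
# Back-of-the-envelope sketch: how a handful of extra polite messages
# could add up to "tens of millions" of dollars a year.
# All numbers below are illustrative assumptions, not OpenAI figures.

DAILY_QUERIES = 1_000_000_000   # assumed ~1 billion ChatGPT queries per day
POLITE_SHARE = 0.05             # assume 5% of turns are standalone "please"/"thank you" messages
COST_PER_QUERY_USD = 0.002      # assumed average compute cost per handled query, in dollars

polite_queries_per_year = DAILY_QUERIES * POLITE_SHARE * 365
annual_cost_usd = polite_queries_per_year * COST_PER_QUERY_USD

print(f"Extra polite queries per year: {polite_queries_per_year:,.0f}")  # ~18.25 billion
print(f"Estimated annual cost: ${annual_cost_usd:,.0f}")                 # ~$36.5 million
```

Change the assumptions and the total shifts, but even a small fraction of a billion daily queries lands comfortably in the tens of millions of dollars a year.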
AI’s growing appetite for electricity
As AI tools like ChatGPT become increasingly integrated into daily life, the energy demands associated with their operation are soaring. According to a report by Goldman Sachs, each ChatGPT query consumes roughly 2.9 watt-hours of electricity — around ten times the energy of a standard Google search, which typically requires just 0.3 watt-hours.
With over a billion queries handled daily by OpenAI, the maths adds up quickly. ChatGPT is estimated to consume around 2.9 million kilowatt-hours of electricity every single day. That’s equivalent to the daily electricity consumption of nearly 100,000 US households.
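The arithmetic behind those figures is easy to check. The sketch below reuses the per-query and per-search numbers cited above; the figure of roughly 29 kilowatt-hours per day for an average US household is an outside assumption, not something from the article.

```python
# Sketch of the electricity arithmetic above. The per-query and per-search
# figures are the Goldman Sachs numbers cited in the article; the household
# figure assumes an average US home uses roughly 29 kWh of electricity a day.

WH_PER_CHATGPT_QUERY = 2.9      # watt-hours per ChatGPT query
WH_PER_GOOGLE_SEARCH = 0.3      # watt-hours per standard Google search
DAILY_QUERIES = 1_000_000_000   # ~1 billion queries per day
US_HOUSEHOLD_KWH_PER_DAY = 29   # assumed average daily household consumption

daily_kwh = WH_PER_CHATGPT_QUERY * DAILY_QUERIES / 1000   # Wh -> kWh
ratio_vs_google = WH_PER_CHATGPT_QUERY / WH_PER_GOOGLE_SEARCH
households = daily_kwh / US_HOUSEHOLD_KWH_PER_DAY

print(f"Daily consumption: {daily_kwh:,.0f} kWh")             # ~2,900,000 kWh
print(f"Energy vs a Google search: ~{ratio_vs_google:.0f}x")  # ~10x
print(f"Equivalent households: ~{households:,.0f}")           # ~100,000
```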
This growing energy demand is raising eyebrows among environmental experts and policymakers. The Electric Power Research Institute (EPRI) projects that data centres, driven in large part by AI workloads, could account for as much as 9.1 per cent of total electricity usage in the United States by 2030. The International Energy Agency (IEA) has issued similar warnings, forecasting that data centres will be responsible for more than 20 per cent of the rise in electricity demand across developed economies by the end of the decade.
Clean energy or clean conscience?
OpenAI, like many tech giants, is acutely aware of the environmental implications of its operations. In response, Sam Altman has taken proactive steps to invest in cleaner energy alternatives. He has backed nuclear fusion start-up Helion Energy and solar energy innovator Exowatt — both seen as potential game-changers in the race to power tomorrow’s AI in a more sustainable way.
The company is also investing heavily in its own infrastructure. Enhancements to OpenAI’s data centres are aimed at making them more energy-efficient and capable of handling the surging demand for AI-generated content and interaction.
Still, the tension between growth and sustainability remains. For all the charm of a digital assistant that never tires of being thanked, every extra word, phrase, or virtual courtesy requires electricity. And that electricity, more often than not, comes at an environmental cost.
A matter of manners — and megawatts
Altman’s light-hearted acknowledgement that “please” and “thank you” cost millions may seem amusing, but it shines a light on a deeper truth. As AI becomes more human-like, our interactions with it increasingly mirror our interactions with real people. That includes being polite — and expecting polite responses in return.
While we may not want to discourage kindness, users might one day need to weigh their words with more than just etiquette in mind. In a future where every phrase has a power footprint, even manners may need to be energy-efficient.