Each time you fire off a quick prompt or question to your AI assistant, whether ChatGPT, Gemini, or Claude, there's much more happening behind the scenes than most of us realize. These responses are powered by huge data centers that reportedly gulp millions of gallons of water to keep servers cool. Modern AI infrastructure is so thirsty that a single 100-megawatt data center can use up to 2 million liters of water per day for cooling alone. That roughly translates to the daily water needs of about 6,500 households. And it appears that demand isn't going to slow down anytime soon.
A report from Morgan Stanley forecasts that AI data centers could consume around 1,068 billion liters of water annually by 2028. In other words, that's an 11x increase over current estimates as computational demand and cooling needs grow (via Economic Times). These rising water demands have already begun to worry local communities, especially in regions where drought and limited supplies are already a problem. To give one example, Microsoft had to scrap its Wisconsin data center plan back in October 2025 due to pushback from local communities.
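For context, the figures quoted above can be sanity-checked with a few lines of arithmetic. This is a rough back-of-the-envelope sketch; the roughly 300-liters-per-day household baseline it implies is an assumption for comparison, not a figure from the report:

```python
# Sanity-check the article's quoted figures.
DAILY_COOLING_LITERS = 2_000_000   # per 100 MW data center, per the article
HOUSEHOLDS = 6_500                 # households the article equates this to

# Implied per-household daily use: about 308 liters, consistent with
# typical residential consumption of roughly 300 liters per day.
per_household = DAILY_COOLING_LITERS / HOUSEHOLDS
print(round(per_household))        # ~308 liters per household per day

FORECAST_2028_LITERS = 1_068e9     # Morgan Stanley's 2028 forecast
GROWTH_FACTOR = 11                 # the "11x increase over current estimates"

# Backing out the implied current annual consumption gives
# roughly 97 billion liters per year.
implied_current = FORECAST_2028_LITERS / GROWTH_FACTOR
print(round(implied_current / 1e9))  # ~97 billion liters per year
```

The two numbers are mutually consistent: the household comparison and the 11x growth claim both follow from the headline figures.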
OpenAI has its say on AI infrastructure's growing strain on local resources
OpenAI, which is actively investing in new data centers, is well aware of this and has decided to address the growing concerns across local communities. The company has pledged to minimize its water usage and cover energy infrastructure upgrades related to its data centers. "We're being good neighbors," the company said, directly addressing concerns about rising utility bills and resource strain near its facilities.
The company is reportedly committed to paying its own way on energy, so its operations don't raise local electricity prices. It further said that it will work with communities to reduce the footprint of its Stargate data centers. While OpenAI didn't outline exact water-saving methods, it said the impact could be limited by innovations in cooling water systems and AI design that reduce overall water consumption.
OpenAI also acknowledged how much water, including potable water, is often used to cool data centers, which has been a common point of concern among local communities. This comes amid similar pledges from other major tech companies, like Microsoft. Put simply, the AI infrastructure footprint is no longer just about electricity and carbon, but also about water, another resource communities rely on every day.
Only time will tell how soon the promised measures will be adopted
As AI continues to expand, and assistants like ChatGPT and Claude handle more health, work, and research questions, that environmental cost is moving to the forefront of public discussion. For now, OpenAI's pledge suggests that the industry is listening, but real change will depend on how quickly these water-saving innovations are put into practice.
