Morgan Stanley says the rapid expansion of artificial intelligence (AI) could run into a resource bottleneck far beyond chips.
In a research note circulating on X, the bank projects that AI data centers will drive annual water consumption for cooling and electricity generation to roughly eleven times 2024 levels within four years.
“We expect AI data centers to drive annual water consumption for cooling and electricity generation to approximately 1,068 billion liters by 2028 (our base case) – an 11x increase from 2024 estimates.”
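A quick back-of-envelope reading of those quoted figures, assuming the "11x increase" means the 2028 total is roughly eleven times the 2024 estimate, implies a baseline of just under 100 billion liters in 2024 and annual growth above 80 percent. The sketch below is illustrative arithmetic, not part of the bank's note:

```python
# Back-of-envelope check using only the figures quoted above.
# Assumption: "11x increase from 2024" is read as 2028 ≈ 11 × 2024.
water_2028_bn_liters = 1068      # Morgan Stanley base case for 2028
growth_multiple = 11             # quoted multiple vs. 2024 estimates

implied_2024_bn_liters = water_2028_bn_liters / growth_multiple
implied_annual_growth = growth_multiple ** (1 / 4) - 1   # 2024 -> 2028 is four years

print(f"Implied 2024 baseline: ~{implied_2024_bn_liters:.0f} billion liters")
print(f"Implied compound annual growth: ~{implied_annual_growth:.0%}")
# Roughly 97 billion liters in 2024, growing on the order of 82% per year
```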
The surge comes as demand for generative AI compute explodes. Morgan Stanley forecasts that generative AI power demand will expand to roughly 8.5 times current levels by 2028, straining not just electricity grids but also the cooling systems required to keep advanced GPUs running.
“Compared to GenAI power expanding to 8.5x current levels by 2028e, per MS estimates.”
Cooling relies heavily on water. And unlike chips, which can be ordered and fabricated, water is finite and politically sensitive. Morgan Stanley's analysts also point to semiconductor fabrication, which consumes vast amounts of ultrapure water.
“Separately, semiconductor manufacturing (AI’s scope 3 water footprint) is also water-intensive, as a typical facility requires up to five million gallons of ultrapure water each day.”
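For scale, converting that quoted figure (assuming US gallons of roughly 3.785 liters each, a conversion not stated in the note) puts a single fab's ultrapure-water draw in the same units as the data-center estimate above:

```python
# Rough conversion of the quoted fab figure, assuming US gallons.
GALLONS_TO_LITERS = 3.785
ultrapure_gallons_per_day = 5_000_000   # "up to five million gallons ... each day"

liters_per_day = ultrapure_gallons_per_day * GALLONS_TO_LITERS
liters_per_year_bn = liters_per_day * 365 / 1e9

print(f"~{liters_per_day / 1e6:.0f} million liters per day")
print(f"~{liters_per_year_bn:.1f} billion liters per year per facility")
# Roughly 19 million liters a day, or about 6.9 billion liters a year for one fab
```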
Sam Altman, chief executive of OpenAI, has called compute “the currency of the future.” But Morgan Stanley’s data suggests that currency may ultimately be denominated in water, not silicon.