The Water Footprint of AI Data Centres

The use of Artificial Intelligence (AI) is growing rapidly, and among the many concerns raised about the technology are its environmental impacts. There has been considerable public debate about the energy consumption of AI data centres, yet almost no discussion of their water consumption – until now.

Artificial Intelligence (AI) data centres are the physical facilities that house the enormous computing and storage resources that enable chatbots like ChatGPT to learn and remember information. In many parts of the world, data centres occupy vast warehouses that must be fed with electricity 24 hours a day, with internal climates tightly controlled so the technology can perform at an optimum level.

As such, data centres have attracted negative media attention for the vast amount of energy they consume. It is estimated that, collectively, AI data centres use 2% of all electricity generated globally.

However, less well known is the amount of water these facilities use. According to a new paper by researchers from the University of California, Riverside and the University of Texas at Arlington, ‘Making AI Less “Thirsty”: Uncovering and Addressing the Secret Water Footprint of AI Models’, a conversation of between 20 and 50 questions with an AI chatbot is estimated to consume around 500ml of water. That may not sound like much, but the paper also estimates that in 2021 Google’s US-based, self-owned AI data centres alone used 12.7 billion litres of water for on-site cooling, 90% of which was potable.
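To put those figures in perspective, here is a rough back-of-envelope calculation. The inputs (500ml per 20–50 question conversation; 12.7 billion litres in 2021, 90% potable) come from the paper as reported above; the per-question breakdown is an illustrative assumption, not a number the researchers publish.

```python
# Back-of-envelope sketch of the reported figures.
# Inputs are from the article; per-question maths is illustrative only.

ML_PER_CONVERSATION = 500           # ~500 ml per conversation (paper's estimate)
QUESTIONS_LOW, QUESTIONS_HIGH = 20, 50

# Implied water use per single question, in millilitres
per_question_high = ML_PER_CONVERSATION / QUESTIONS_LOW    # short conversations
per_question_low = ML_PER_CONVERSATION / QUESTIONS_HIGH    # long conversations

# Google's self-owned US data centres, 2021 (figures from the paper)
GOOGLE_LITRES_2021 = 12.7e9
POTABLE_SHARE = 0.90
potable_litres = GOOGLE_LITRES_2021 * POTABLE_SHARE

print(f"Per question: {per_question_low:.0f}-{per_question_high:.0f} ml")
print(f"Potable water used in 2021: {potable_litres / 1e9:.2f} billion litres")
```

On these assumptions, each question implies roughly 10–25ml of water, and the potable share of Google's 2021 figure comes to about 11.4 billion litres.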

AI data centres consume water in two ways – indirectly and directly – and consume the most during the ‘training’ phase: the period in which an AI model is fed data and taught how to respond.

Indirect consumption is the water used off-site in generating the power a data centre draws – for example, the water evaporated in the cooling towers of coal-fired power stations. Although AI data centres do not consume this water themselves, their round-the-clock demand for electricity means far more of it is used on their behalf than for other types of facility.

Direct consumption is the water a data centre uses on-site for its own cooling. Almost all of the power fed into the servers is converted into heat, and the sheer density of the units only exacerbates this; to maintain stable operating temperatures, water is passed through a heat-exchange system that cools the equipment and stops it overheating.

The water consumption of US-based facilities may be striking, but the research in ‘Making AI Less “Thirsty”’ also suggests that less technologically advanced data centres in Asia could use up to three times as much water as their Western equivalents. This is worrying given the prediction that by 2030 half of the world’s population will face severe water stress.

Looks like “Chatting” is a thirsty business.

AI should come up with a solution.

Dilute the water, perhaps!