September 27, 2023 | By Avneet Nehel
The whole world seems to be going gaga over what an amazing technology ChatGPT is, yet few pause to consider where this technology comes from. Use of technology is a necessity, especially for the younger generation. Artificial intelligence (AI) driven technology requires large, reliable data centers to process and store all the data. These data centers serve not only ChatGPT but every other large digital service provider, including Google, Microsoft, and Apple, and they leave a large carbon footprint from the emissions produced to power the computers and to heat and cool the facilities.
According to the data management company Veritas Technologies, the energy required to store these massive amounts of data resulted in the production of 5.8 million tonnes of carbon dioxide in 2020. The need for data storage grows every year, especially as AI-driven technologies like ChatGPT become more prevalent. Even as data mining becomes ever more important, the spread of energy-hungry data centers is rarely identified as a problem by climate action groups. More troubling still is the impact data centers have been having on natural resources for more than a decade.
This issue has been covered in the media for several years now, yet few seem concerned about the effects on water, the most valuable natural resource we have on Earth. We often assume that data has no physical form and therefore no effect on the environment. This is far from the truth. More than 50 percent of all enterprises store data that is inactive and underused, and the resources used to store it have a significant negative impact on the environment, particularly on water.
Data centers are estimated to have a water footprint ranging from 1,047 to 151,061 m³/TJ. According to a recent study, ChatGPT “drinks” half a liter of water to respond to every five to fifty inquiries. Training GPT-3 alone took 700,000 liters of freshwater, roughly the amount needed to manufacture 320 Tesla electric vehicles or nearly 370 BMW cars. These figures will only grow with GPT-4, whose model size is much larger.
Although ChatGPT conducts all its business online, the actual data is stored in large data centers. Because these facilities generate a great deal of heat, they need cooling systems to avoid equipment failure. Most rely on evaporative cooling towers, which carry heat away from the servers by evaporating enormous quantities of water.
The water used in this process must be freshwater with no impurities, not even minerals, in order to prevent corrosion and the growth of microbes. This means that while much of the world suffers from a lack of clean drinking water, data centers use “clean drinking water” to keep themselves cool.
In addition to using water for cooling, these data centers depend on large power generation plants, many of which have their own water requirements for operation. Not only is energy expended on cooling, but much of the water itself evaporates and is lost.
A 15 megawatt data center can consume over 164,000 liters of water per day. The estimated water footprint of Microsoft-backed ChatGPT alone is 1.7 billion gallons per year. Microsoft said in its environmental report that it consumed 34% more water globally in 2022 than it did in 2021. As demand for AI technology rises, data centers will be used ever more heavily, with real consequences for global water supplies.
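To put the figures above in perspective, here is a minimal back-of-envelope sketch using only the numbers cited in this article. The calculations are illustrative, not an authoritative accounting of any company's actual water use.

```python
# Back-of-envelope water estimates based on the figures cited in this article.
# All inputs come from the article; the results are illustrative only.

LITERS_PER_GALLON = 3.785


def liters_per_query(liters: float, queries_low: int, queries_high: int) -> tuple:
    """Return (low, high) range of water consumed per query, in liters."""
    return (liters / queries_high, liters / queries_low)


# "Half a liter of water to respond to every five to fifty inquiries"
low, high = liters_per_query(0.5, 5, 50)
print(f"Per-query water use: {low:.3f} to {high:.3f} L")

# A 15 MW data center at 164,000 L/day, scaled to a full year
annual_liters = 164_000 * 365
print(f"15 MW data center, annual: {annual_liters / 1e6:.1f} million L")

# ChatGPT's estimated 1.7 billion gallons per year, converted to liters
chatgpt_liters = 1.7e9 * LITERS_PER_GALLON
print(f"ChatGPT estimate: {chatgpt_liters / 1e9:.2f} billion L/year")
```

Even at the low end of the per-query range, one hundred million queries a day would translate into roughly a million liters of water daily.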
In 2022, approximately 300 million people worldwide were affected by drought across regions of Africa, Europe, North America, and Asia. Half of the United States experienced drought conditions, nations like France and Portugal recorded their worst droughts in years, and Africa was devastated by its worst drought in four decades. As AI systems running on large data centers proliferate, the companies behind them must assume social responsibility and work to reduce the risk of global water scarcity by shrinking their own water footprint.
Research by Shaolei Ren suggests numerous approaches to reducing water waste. To save water, AI models can be trained during cooler times of the day, when cooling demands are lower. Some businesses are even using AI models themselves to cut down on electricity usage.
One thing big tech corporations are doing is moving data centers to colder regions, where heat is dissipated naturally and the impact on local water resources is smaller. Microsoft is researching the viability of Project Natick, which would submerge data centers under the sea, and is separately seeking to lower cooling costs by immersing server racks in a specially designed fluid. Other operators have adopted different strategies: the Nordic data center company DigiPlex, for instance, heats 5,000 city flats with the waste heat from its facility in Ulven, Oslo.
Climate change will make these servers harder to keep cool: as ice melts and temperatures rise, natural cooling becomes less effective, meaning data centers will need still more electricity and water. What the future holds for the world's most valuable natural resource remains unclear. The Procido LLP Intellectual Property & Technology (IPT) Group intends to stay ahead of these developments as they unfold.
This publication is provided as an information service and may include items reported from other sources. We do not warrant its accuracy. This information is not meant as legal opinion or advice. Contact Procido LLP (www.procido.com) if you require legal advice on the topic discussed in this article.
Reference: Ren-Research, “Making AI Less ‘Thirsty’: Uncovering and Addressing the Secret Water Footprint of AI Models” (preprint), GitHub: Ren-Research/Making-AI-Less-Thirsty.