Energy is the most widely noted (and sometimes lamented) resource that data centers consume. Not all data centers use water directly, but many do, and the water supply is even more critical than energy to life in general. Following the Data Center Journal’s look at energy consumption, then, we turn to water.
U.S. Water Use
Per-capita total water usage in the U.S. has followed a pattern very similar to that of per-capita energy usage. According to U.S. Geological Survey (USGS) data on water consumption, which is reported at five-year intervals, along with Census Bureau data on population, per-capita water usage (like energy usage) shot up steeply in the 1950s and 1960s. (The USGS data is for total water withdrawals, defined as “water removed from the ground or diverted from a surface-water source for use.”) Both, however, apparently reached a peak in roughly the mid-1970s. At that point, water consumption began an equally dramatic falloff. Per-capita energy consumption seems to have followed a general downward trend, although that trend only became more apparent after about 2000. But both downward trends appear to have steepened—right about the time of the Great Recession. The chart below shows an overlay of per-capita consumption for water (in units of five gallons per day) and energy (in units of one million BTUs per year). Energy data is from the U.S. Energy Information Administration.
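The per-capita figures behind the chart are simply total withdrawals divided by population. A minimal sketch of that calculation follows; the numbers below are rough illustrations chosen to resemble the published USGS and Census figures, not exact values.

```python
# Sketch: deriving per-capita water use from total withdrawals and population.
# The figures below are rough illustrations, not exact USGS/Census values.
withdrawals_bgal_per_day = {1980: 430.0, 2005: 410.0, 2010: 355.0}  # billions of gallons per day
population_millions = {1980: 226.5, 2005: 295.5, 2010: 308.7}

def per_capita_gal_per_day(year):
    """Gallons per person per day for a given reporting year."""
    total_gallons = withdrawals_bgal_per_day[year] * 1e9  # gallons per day
    people = population_millions[year] * 1e6
    return total_gallons / people

for year in sorted(withdrawals_bgal_per_day):
    print(year, round(per_capita_gal_per_day(year)))
```

Even with approximate inputs, the shape of the trend survives: a large drop in per-capita use between 1980 and 2010 despite a growing population.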
Total water consumption likewise rose from the 1950s to about 1980, as energy did, but it shows a marked shift after that point: usage largely leveled off, only to see a major decline in 2010 relative to 2005. Again, the Great Recession may be at least partially to blame; total energy consumption saw a decline even more pronounced than the one that followed the dot-com bust. The chart below compares total water consumption in units of 250 million gallons per day with energy in quadrillion BTUs.
The declining per-capita water usage in the U.S. is very likely due to increased efficiency, such as through better appliances, water-conserving fixtures and a general growing awareness of the need to avoid wasting water.
Data Centers: Direct and Indirect Usage
Water-usage statistics for the data center industry are difficult to determine, and they wouldn’t necessarily represent the industry as a whole. Specifically, although all data centers use energy (some more efficiently than others), not all use water. And among those that do use water, consumption levels can vary wildly depending on the type of cooling system. Nevertheless, even for those data centers that don’t use water to cool their IT equipment, their electric utility provider likely does use water. Fossil-fuel and nuclear power both rely on water for their steam turbines, which is why these facilities are typically located near rivers or other bodies of water. Therefore, in most cases, every watt (or joule, to be a little more precise) that a data center consumes probably requires some corresponding water use. Certain electricity-generation methods, such as wind and solar photovoltaics, require little to no water, though. Hydroelectric power is obviously water intensive, but its main effect on water quality is increased temperature.
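The indirect water use described above can be roughed out from a facility’s electricity draw. The sketch below is purely illustrative: the water-intensity factor and the PUE value are assumptions (real figures vary widely with the generation mix and the power plant’s cooling technology), and the function name is hypothetical.

```python
# Sketch: estimating a data center's indirect (upstream) water footprint
# from its electricity use. GAL_PER_KWH is an assumed consumptive-use
# factor for thermoelectric generation; real values vary widely.
GAL_PER_KWH = 0.5

def indirect_water_gal(it_load_kw, pue=1.5, hours=24 * 365):
    """Estimated gallons of water consumed upstream per year.

    pue: power usage effectiveness (total facility power / IT power),
    assumed here for illustration.
    """
    facility_kwh = it_load_kw * pue * hours  # annual facility energy
    return facility_kwh * GAL_PER_KWH

# A hypothetical 1 MW IT load at PUE 1.5:
print(round(indirect_water_gal(1000)), "gallons per year")
```

The point is not the particular number but the coupling: any reduction in energy drawn from a water-cooled grid translates into water saved somewhere upstream.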
Not all types of water usage can be equated. Water usage doesn’t necessarily imply water contamination (through heat or waste products, for instance), but it may reduce available supply for other critical purposes—particularly in the case of a drought, such as the brutal dry spell that California continues to suffer. Furthermore, not all water is equal: fresh water, “grey” water, saltwater and so on all have different levels of purity relative to different contaminants, and each has its own range of uses. Fresh water may be the most useful, but it’s also relatively rare compared with, say, saltwater.
Some data center operators have made efforts to reduce their direct use of water by using grey water instead of fresh water, for instance, or even by using seawater. Seawater in particular poses a challenge owing to its corrosive salt content, but for operators that can overcome such challenges, the supply is virtually limitless. The danger here, however, is managing the temperature of the “waste” water, as warmer water reintroduced directly to the ocean (or whatever body of water is used) can affect the local ecosystem.
Reducing Dependence on Water
The problem with fresh water is that getting it from contaminated water is a bear. The atmosphere does a fairly good job of purifying water through evaporation and precipitation, but this process isn’t necessarily sufficient to meet demand. Getting fresh water from seawater in large quantities is an energy-intensive process—and, as mentioned above, generating electricity itself uses lots of water (depending, of course, on the technology).
One of the chief ways that data centers can reduce water consumption is greater energy efficiency. Furthermore, the benefit is multiplied for facilities that use water-based cooling systems: not only do their energy consumption (and bill) and cooling-related water consumption fall, but the utility provider also saves water thanks to lower demand, all things being equal. (Thanks to the Jevons paradox, however, all things may not be equal—but that’s another matter.) Other approaches, such as water-side economization and so forth, can also deliver benefits. Heavy water users may be able to arrange with their utility providers to use grey water rather than potable water.
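For operators who want to track their direct water use, The Green Grid defines a standard metric, water usage effectiveness (WUE): annual site water usage in liters divided by IT equipment energy in kilowatt-hours. A minimal sketch, with hypothetical facility numbers:

```python
# Sketch: The Green Grid's WUE metric relates on-site water use to IT energy:
#   WUE = annual site water usage (liters) / IT equipment energy (kWh)
# The facility figures used below are hypothetical.
def wue(annual_water_liters, annual_it_energy_kwh):
    """Water usage effectiveness in liters per kilowatt-hour."""
    return annual_water_liters / annual_it_energy_kwh

# A hypothetical facility: 100 million liters/year against 50 GWh of IT load.
print(round(wue(100e6, 50e6), 2), "L/kWh")
```

Like PUE for energy, WUE gives operators a single trackable number, so year-over-year improvements from economization or grey-water substitution show up directly.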
As with energy, however, it’s important to note that water usage in the U.S. is not on a runaway trajectory. That’s not to say we should be slack on efficiency efforts, nor to say that water use isn’t a perennial issue, but it’s important to recognize progress and to avoid becoming too shrill—particularly about data centers.
Hall of Shame
All that being said about water usage, the importance of efficiency and conservation, the limited supply of fresh water, and what data center operators are doing to reduce their reliance on water, we would be remiss not to identify at least one case of a data center operator on the other end of the spectrum. Perhaps the leading candidate for the badge of dishonor with regard to water use (or waste, as it were) is the NSA’s Bluffdale, Utah, facility.
Apart from serving a nefarious and (at best) Constitutionally dubious purpose, this million-gallon-per-day-guzzling data center sits in a desert—and one that is currently suffering a severe to extreme drought to boot. The same U.S. government that bloviates about water efficiency apparently saw insufficient value in picking a location with a plentiful supply, instead going to the opposite extreme. Then again, the same organization that can’t effectively manage its enormous tax revenues without running up a nearly $20 trillion debt probably can’t be expected to conserve precious resources, either.
Because data center water use is difficult to quantify and, furthermore, varies greatly among facilities (all data centers use energy, but not all use water directly), identifying a trend in this area is a tough task. Overall in the U.S., water usage appears to have leveled or begun a decline in absolute terms, and per-capita usage has seen an accelerating falloff. Efficiency has therefore delivered measurable benefits given the rising population. Some data center operators can take steps to limit their own water usage in those cases where the cooling system uses water, but all can help the situation through greater energy efficiency. Nevertheless, in calling for more-careful water use, it’s important to note that big strides have been made already. On the other hand, water is a more critical resource than energy, so it warrants attention from data center operators and others alike.
Leading article image courtesy of Sfivat