Keeping a Data Center Cool in the Summertime City

June 27, 2012

In pursuit of lower operating costs, companies are building data centers in remote locations where they can take advantage of free cooling. Yahoo put its first “chicken coop” data center in Lockport, New York—a few miles from Lake Erie, where the July high temperatures only average 80 degrees, allowing it to operate with 100% outside air and no chillers. Facebook is going even further for free cooling, building its first European data center in Lulea, Sweden, about sixty miles from the Arctic Circle with July highs averaging in the mid-sixties.

That approach works great when building a brand new facility and when you don’t need to have the data center close to a population center. But what happens when you must be located in urban areas where space is at a premium and summer temperatures soar?

“There is a cumulative effect because all day long the heat cooks the bricks and they cumulatively heat up deeper and deeper and don’t return to ambient temperatures at night,” says the assistant chief engineer for the data center of a multi-billion-dollar financial institution in New York City.

With outside temperatures over 100 degrees, and significantly hotter on the roof where the condensers were located, the data center was at risk of having to shut down when temperatures exceeded safe operating levels. By installing a MeeFog system to cool the air entering the rooftop condensers, the financial company was able to add equipment to the data center without the cooling system tripping offline during summer heat waves.

Keeping up With High-Frequency Trading

In most cases, latencies of a few milliseconds are fine; users never notice the difference. But for high-performance processing, shaving off micro- or nanoseconds is what it is all about. Every element of the system matters: the larger the on-die cache, for example, the less often the CPU has to wait on main memory, so a specially designed computing appliance can outperform an assemblage of discrete components.

Fiber optic networks are fast, operating at roughly two-thirds the 186,000 miles per second that light moves in a vacuum. Using a fiber optic cable running from Yahoo’s Lockport data center to Wall Street would get the data there in about three milliseconds. But with the growth of high-frequency trading (HFT), that latency is far too slow. Financial institutions don’t have the option of using a remote data center; the computers must be located as close to the market as physically possible to eliminate the milliseconds or microseconds it takes to transmit data.
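
A quick back-of-the-envelope calculation shows where that roughly three-millisecond figure comes from. The sketch below assumes a fiber route of about 370 miles from Lockport to lower Manhattan; the route length is an illustrative assumption, not a figure from the article.

    # Rough propagation-delay estimate for a Lockport-to-Wall-Street fiber run.
    # The 370-mile route length is an assumption for illustration.
    C_VACUUM_MI_S = 186_000                 # speed of light in a vacuum, miles/second
    FIBER_SPEED = C_VACUUM_MI_S * 2 / 3     # light in glass travels at roughly 2/3 c
    ROUTE_MILES = 370                       # assumed fiber route length

    one_way_ms = ROUTE_MILES / FIBER_SPEED * 1000
    print(f"One-way propagation delay: {one_way_ms:.1f} ms")   # ~3.0 ms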

HFT involves analyzing massive amounts of data to locate small opportunities for profit. For example, there may be a slight difference in the Swiss franc's exchange rate between the London and Frankfurt markets, or between the bid and ask prices on a share of stock. Such an opportunity may exist for less than a second, and in that window the HFT firm must spot it and execute both the buy and sell transactions. Although the profit margin on each trade is small, by executing thousands or even millions of such transactions daily, companies make billions of dollars in annual profit. HFT now accounts for more than 70% of trades on U.S. equity markets, and a growing percentage of trades in other countries.
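
The arithmetic behind those profits is simple volume multiplication, as the hypothetical numbers below illustrate; the per-share margin, trade size, and trade count are assumptions chosen for illustration, not figures from the article.

    # Hypothetical illustration of how tiny per-trade margins add up at HFT volume.
    margin_per_share = 0.002     # assumed profit per share, dollars
    shares_per_trade = 500       # assumed average trade size
    trades_per_day = 1_000_000   # assumed daily trade count
    trading_days = 252           # U.S. trading days per year

    annual_profit = margin_per_share * shares_per_trade * trades_per_day * trading_days
    print(f"Annual profit under these assumptions: ${annual_profit:,.0f}")   # $252,000,000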

To accommodate HFT, the markets have sped up their own systems and now offer average transaction latencies of around three milliseconds, and far less in some cases. For example, Millennium IT, which is owned by the London Stock Exchange, offers a platform that can execute more than 500,000 transactions per second with less than 100 microseconds of latency.

To operate successfully, therefore, a trading firm can’t afford to work from a remote data center, or even to use fiber optics. Instead, it uses microwave transmissions to connect the data center to the trading floor. This means that it is stuck with whatever environmental conditions exist in that city.

Heat Wave Hassles

The financial institution in question operates two mirrored data centers to service its New York City operations. One of these facilities occupies the second floor in a 14-story brick building, which was originally built in 1912 to house a department store but now is a multitenant structure hosting computing and telecom equipment. The 87,000-square-foot data center contains about 30,000 square feet of white space for the computing and storage equipment and a small amount of office space. The rest of the space is for battery rooms and other ancillary equipment.

Unlike rural data centers, which can pull in enough outside air to keep equipment cool, this data center sits in the middle of a dense urban area where the streets and brick buildings absorb heat day after day and never fully cool down at night, gradually raising the temperature inside the building. To make things worse, not only are urban ambient temperatures higher than those of the surrounding countryside, but rooftop temperatures can exceed ambient by another 10 degrees.

This NYC data center uses an indirect cooling system with computer room air conditioning (CRAC) units in the white space. Glycol runs through heat exchangers in the CRAC units, picking up the heat from the IT equipment, and is then pumped to rooftop units where that heat is rejected to the outside air.
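
A steady-state energy balance gives a feel for how much glycol such a loop has to move. The IT load, glycol properties, and loop temperature rise below are assumptions chosen for illustration; none of them come from the facility described here.

    # Rough energy balance for an indirect glycol cooling loop: Q = m_dot * cp * dT.
    IT_LOAD_W = 2_000_000      # assumed IT heat load absorbed by the CRAC units, watts
    CP_GLYCOL = 3500           # approx. specific heat of a ~35% glycol mix, J/(kg*K)
    DENSITY_GLYCOL = 1040      # approx. density of the glycol mix, kg/m^3
    LOOP_DT_K = 5.6            # assumed supply-to-return temperature rise (~10 F)

    mass_flow_kg_s = IT_LOAD_W / (CP_GLYCOL * LOOP_DT_K)
    flow_gpm = mass_flow_kg_s / DENSITY_GLYCOL * 15850     # m^3/s to U.S. gallons/minute
    print(f"Glycol flow: ~{mass_flow_kg_s:.0f} kg/s (~{flow_gpm:.0f} gpm)")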

When the data center went through a major upgrade that added to the heat load, there was enough room on the roof to install five additional condensers to supplement the eight already in place. This arrangement works for most of the year, but for a few weeks each summer it is insufficient. The cooling system operates properly as long as the glycol temperature stays below 100 degrees; when the glycol exceeds that temperature, the cooling system can trip offline.

“We have dry coolers on the roof, so we don’t have the benefits of evaporation,” says the data center’s assistant chief engineer. “If we have 1,000 tons of refrigeration at 100 degrees, when it goes to 110 degrees capacity drops and we are just shy of what we need. On very hot days when it gets above 110 degrees on the roof—which we reached about 10 days last year—the units would have tripped offline on high head pressure.”
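
A simplified derating model shows why a 10-degree rise on the roof hurts so much: a dry cooler's heat rejection scales roughly with the temperature difference between the glycol entering the coil and the air blowing across it. The 130-degree entering-glycol temperature below is an assumption for illustration, not a figure from the facility.

    # Simplified dry-cooler derating: capacity ~ (glycol inlet temp - air temp).
    RATED_TONS = 1000        # capacity quoted at a 100 F rooftop air temperature
    GLYCOL_IN_F = 130.0      # assumed glycol temperature entering the dry coolers
    RATED_AIR_F = 100.0

    def capacity_tons(air_f):
        """Scale rated capacity by the remaining glycol-to-air temperature difference."""
        return RATED_TONS * (GLYCOL_IN_F - air_f) / (GLYCOL_IN_F - RATED_AIR_F)

    for rooftop_f in (100, 105, 110, 115):
        print(f"{rooftop_f} F rooftop air -> ~{capacity_tons(rooftop_f):.0f} tons")

Under this simple model, every degree shaved off the inlet air temperature buys back a proportional slice of capacity, which is exactly what fogging the condenser inlets is meant to do.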

Cooling Down the Inlet

Since there was no room for additional rooftop condensers, keeping the data center operating meant getting more cooling out of the condensers it already had by bringing down the inlet air temperature. After experimenting with lawn sprinklers spraying water into the inlet air at the bottom of the condensers, the financial institution decided to install a more efficient and controllable MeeFog system to keep the glycol temperature and pressure within limits.

Mee Industries has been building fogging systems since former Cornell University researcher Thomas Mee, Jr., founded the company in 1969. These systems use high-pressure pumps and specialized impaction pin nozzles to break the water down into a fog of droplets. At 2,000 psi, these droplets average less than 10 microns in diameter, or about one-tenth the width of a human hair. The droplets rapidly evaporate, lowering the temperature of the air.
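
The cooling effect can be estimated from the latent heat the evaporating fog pulls out of the air stream. The moisture pickup assumed below is illustrative; in practice the temperature drop is limited by how closely the inlet air approaches its wet bulb temperature.

    # Sensible cooling of inlet air from evaporating fog: dT = w * h_fg / cp_air.
    H_FG = 2450                 # latent heat of vaporization of water, kJ/kg (near 30 C)
    CP_AIR = 1.006              # specific heat of air, kJ/(kg*K)
    WATER_PER_KG_AIR = 0.003    # assumed kg of fog evaporated per kg of inlet air

    delta_t_c = WATER_PER_KG_AIR * H_FG / CP_AIR
    print(f"Inlet air drop: ~{delta_t_c:.1f} C (~{delta_t_c * 1.8:.0f} F)")   # ~7 C / ~13 F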

Cooling nozzle for data center

Nozzles produce billions of tiny fog droplets that evaporatively cool the inlet air close to the wet bulb temperature.

The company designed a system for this application consisting of a single 10 HP, 480 V Grundfos CRI-5 pump with Allen-Bradley controllers, which pressurizes the water for all eight of the original condensers. Three-quarter-inch stainless steel feed lines carry the water from the pump to the 90-nozzle fogging arrays placed in the bottom inlet of each condenser. A Marlo, Inc., water-softening system keeps minerals from clogging the nozzles or building up on the condenser fins.
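
A rough sizing check suggests how much heat the full array can soak up when the fog evaporates. The per-nozzle flow rate and evaporation fraction below are assumptions for illustration; the article gives neither figure.

    # Rough water flow and evaporative-cooling estimate for the fog arrays.
    NOZZLES = 8 * 90            # eight original condensers, 90 nozzles each
    GPH_PER_NOZZLE = 1.5        # assumed flow per impaction-pin nozzle, gallons/hour
    EVAP_FRACTION = 0.7         # assumed share of the fog that actually evaporates
    KG_PER_GALLON = 3.785
    H_FG = 2450                 # latent heat of water, kJ/kg (i.e., kW per kg/s)

    water_kg_s = NOZZLES * GPH_PER_NOZZLE * KG_PER_GALLON / 3600
    cooling_kw = water_kg_s * EVAP_FRACTION * H_FG
    print(f"Water flow: {water_kg_s:.2f} kg/s")
    print(f"Evaporative cooling: ~{cooling_kw:.0f} kW (~{cooling_kw / 3.517:.0f} tons)")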

“The MeeFog system pressurizes the water to just under 1,000 psi, and when it comes out of an orifice in such a fine fog, it cools the outside air and drops the temperature of our glycol by about six to eight degrees,” says the engineer.

Putting It to the Test in the Data Center

Once it was installed, the fogging system was quickly put to the test in a heat wave last summer. By the third day of the heat wave, the rooftop temperature hit 115 degrees, even though the air temperature was just 103 degrees.

“Since we were above our 100 degree max, our cooling capacity was below what we needed,” says the engineer. “We didn’t wait until we had units failing, but fired up the MeeFog unit as soon as the temperature hit 100, and the glycol temperature dropped about six degrees. As the day progressed and the outside temperatures climbed, our glycol temperature didn’t get any higher, so the fog cooling effect was pretty substantial.”

Cooling data center

Nozzles produce billions of tiny fog droplets that evaporatively cool the inlet air close to the wet bulb temperature.


Currently the system cools only the bottom inlet air, which is enough to meet the original goal of keeping the data center online during heat waves. An engineering firm has since visited the site, however, and is evaluating fogging the condensers on three sides to provide additional cooling and reduce the energy required to cool the data center. If successfully implemented, this addition could considerably reduce overall cooling costs.

