At the top of the data center world are the mega data centers—facilities whose massive scale dwarfs the computer closets of most companies. But the trend toward larger data centers among some large companies contrasts with another trend elsewhere in the industry: many data centers are getting smaller. The means and motivations of the companies that are downsizing their IT operations vary, but a handful of driving factors play a major part in most cases.
Does Consolidation Make Data Centers Bigger or Smaller?
Many larger companies (and organizations like governments) are seeking to reduce costs and operating complexity by consolidating their data centers into fewer, larger facilities. In that sense, consolidation is driving increased data center size. But it is also a driver toward smaller data centers, since within a single facility it means reducing the amount of infrastructure—an attempt to do more with less. The outcome depends on which kind of consolidation is at work. A large company consolidating many data centers into a handful of facilities will most likely end up with larger data centers; a smaller company consolidating within a data center (eliminating excess server capacity, replacing older servers with fewer, more capable ones, and so on) is likely to end up with a smaller data center footprint (for example, it might convert freed space into office area).
Thus, consolidation seems to be driving two disparate trends, but a number of factors demonstrate clearly the reasons why many companies are aiming smaller instead of larger. Individually, “smaller” may simply mean designing a smaller data center than might otherwise be expected, moving to a smaller facility or consuming less colocation space, or actually reducing data center footprint within an existing facility.
Money (or, more specifically, a lack of it) is a major motivation in companies moving generally toward smaller data center designs. Here are a few of the driving factors.
- The economy isn’t recovering as much as you might think. Although unemployment numbers may be down slightly, they remain high (and they don’t fully capture the employment situation), and the overall economy is seeing little by way of recovery. Many companies are still struggling with tight budgets, and for those that view IT as a means rather than an end, data center expenses are often seen as necessary evils. IT budgets are thus unlikely to see much in the way of increases, requiring the data center manager to do more with less. Furthermore, the lack of funds makes construction costs even more burdensome, and with outsourcing offering a pay-as-you-go option, many companies see a way out of the capital costs of building a data center.
- Energy costs continue to rise. Name your reason—political tensions in the Middle East, inflation, regulations, a lack of new energy production infrastructure, growing demand in emerging economies or all of the above—energy costs continue to rise, making the prospect of running a data center a headache for companies concerned about expenses. For those that do operate or plan to build a data center, smaller sounds better from a cost perspective: less infrastructure generally means lower costs. Higher operating costs also put a pinch on available capital, reducing how much a company can spend on expansion.
- Data centers eat water, too. Energy is the most widely reported resource consumed in large quantities by data centers, but many consume water as well—some in amounts large enough to strain local utilities in extreme cases. Like electricity, water is unlikely to get cheaper anytime soon, further driving downsizing as a means of staying within budget.
Virtualization and Data Centers
If consolidation is peanut butter, then virtualization is chocolate—in many ways, they go together. Virtualization pools compute, storage and networking resources and reallocates them according to demand in a manner independent of the capacity of a given server or storage drive, for example. This approach thus enables greater utilization rates for equipment, meaning a facility can generally get by with less equipment (and that’s where consolidation comes in). As long as the combined resources can handle peak workloads, a company can minimize its IT infrastructure—and, concomitantly, supporting infrastructure like power distribution and UPS systems, cooling and floor space.
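As a rough illustration of the consolidation math behind virtualization (the utilization figures below are hypothetical, and real capacity planning must account for memory, I/O and failover headroom, not just aggregate load):

```python
import math

def hosts_needed(num_servers: int, avg_utilization: float,
                 target_utilization: float) -> int:
    """Estimate how many virtualized hosts can absorb a fleet of lightly
    loaded physical servers, assuming workloads pool cleanly and each
    host is sized comparably to the original servers."""
    total_load = num_servers * avg_utilization  # aggregate demand, in "server units"
    return math.ceil(total_load / target_utilization)

# Hypothetical example: 100 servers averaging 10% utilization,
# consolidated onto hosts run at a 60% target to preserve peak headroom.
print(hosts_needed(100, 0.10, 0.60))  # → 17
```

Even with generous headroom, the pooled fleet is a fraction of the original—and every host eliminated also eliminates its share of power distribution, UPS capacity, cooling and floor space.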
Free Cooling and Data Centers
ASHRAE’s recently expanded temperature and humidity guidelines enable wider use of free cooling—year round in many locations (for certain allowable ranges). Greater reliance on air-side or water-side economization means less need for mechanical cooling infrastructure, which in turn means a smaller data center. Likewise, even for traditional cooling infrastructure, best practices enable greater efficiency, reducing the amount of equipment needed to maintain a given operating temperature. In both cases, sound practices allow a smaller data center for a given capacity relative to a facility that doesn’t implement them.
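A simplified sketch of the economizer logic: count the hours in which outside air alone can meet the supply-air target. The 18–27 °C band below is the ASHRAE recommended envelope from the 2011 guidelines; actual allowable ranges vary by equipment class, and this sketch ignores humidity limits and the common practice of mixing in return air when outside air is below the low end.

```python
def economizer_hours(outdoor_temps_c, supply_min_c=18.0, supply_max_c=27.0):
    """Count hourly readings in which outside air alone falls within the
    supply-air envelope, so mechanical cooling can stay off."""
    return sum(1 for t in outdoor_temps_c
               if supply_min_c <= t <= supply_max_c)

# Hypothetical readings, one per hour, for part of a day:
temps = [12, 15, 17, 19, 21, 24, 26, 28, 25, 22, 18, 16]
print(economizer_hours(temps))  # → 7
```

The wider the allowable envelope, the more hours qualify—which is exactly why the expanded guidelines translate into less mechanical cooling plant.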
Modular Design and Data Centers
A modular approach to data center design and construction enables right-sizing of infrastructure, so that a company need not invest as much capital now to meet future needs—capital that is essentially wasted (and loses value) during the time the equipment goes unused or underused. By adding capacity—whether uninterruptible power supplies, cooling infrastructure, IT resources or even floor space—only when it is needed, companies can operate a data center that is as small as possible, saving both capital and operational expenses (unused servers, for example, still consume power as long as they are turned on). Modular expansion may require a little more planning at the start of a data center project, but it can pay significant dividends in savings. Thus, even in the face of increasing demand for data center services, companies can still aim small with the confidence that they can expand should growth require it.
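The capital-deferral argument can be made concrete with a back-of-the-envelope comparison (all figures below are hypothetical, and real build costs vary widely with tier level, location and design):

```python
def upfront_cost(total_kw: float, cost_per_kw: float) -> float:
    """Capital committed on day one when full capacity is built at once."""
    return total_kw * cost_per_kw

def modular_cost(total_kw: float, cost_per_kw: float,
                 module_kw: float, modules_deployed: int) -> float:
    """Capital spent so far under a modular build-out, where capacity is
    added one module at a time as demand actually materializes."""
    deployed_kw = min(modules_deployed * module_kw, total_kw)
    return deployed_kw * cost_per_kw

# Hypothetical figures: 1,000 kW eventual capacity at $10,000/kW,
# built out in 250 kW modules; only two modules needed so far.
full = upfront_cost(1000, 10_000)             # committed on day one
partial = modular_cost(1000, 10_000, 250, 2)  # spent so far
print(full - partial)  # capital deferred until growth requires it
```

Half the capital stays uncommitted until demand actually arrives—and if growth projections prove optimistic, the remaining modules are simply never built.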
Higher-Density Deployments
IT architectures like blade servers can vastly increase power density in a data center, reducing the floor space required for a given compute capacity. Although this may demand more cooling infrastructure (beyond certain power densities, even air cooling can be insufficient), the tradeoff can be worth it in some instances.
New Process Technologies
In accordance with Moore’s Law, semiconductor process technologies continue to shrink, delivering a given level of computing capability in a smaller area and a smaller power envelope. A server can thus pack more of a compute punch in a given volume and power budget. Unfortunately, demand is scaling right along with compute capability—so smaller, faster processors don’t enable as much of a data center shrink as one might hope. Nevertheless, Moore’s Law has helped companies avoid sprawling data centers.
Outsourcing and Data Centers
The ultimate data center downsizing involves outsourcing. Outsourcing options run the gamut from one or two services—such as backup in the cloud or colocation of servers—to complete elimination of a data center in favor of cloud computing. For some companies, the demands of constructing and operating a data center are just too much, particularly if IT is highly peripheral to the company’s business. To some extent, a company that outsources to shrink its data center is really just passing the buck: someone must provide the data center capacity. (Ah! Maybe that’s why mega data centers are becoming more common!) The following are some noteworthy options for IT outsourcing.
- Colocation. Colocation is the older brother of basic hosting services. Whereas a hosting customer counts on an Internet company to provide everything but the HTML, a colocation customer counts on the provider to supply essentially all of the infrastructure needed to run its servers in a data center. The company just installs the servers, purchasing floor space, bandwidth, cooling capacity, security and power distribution from the provider, which amortizes its infrastructure costs across a broad base of customers. The result for the customer is a smaller—or nonexistent—data center, at least in a sense: visitors to the company’s campus wouldn’t find a building or room dedicated to server hardware, just individual computers with network connections. Colocation is thus outsourcing of data center infrastructure, short of the IT equipment.
- Public cloud. The public cloud is probably what most industry observers think of first when they see the words IT outsourcing. It goes a step beyond colocation: it outsources the IT equipment as well as the data center infrastructure, leaving customers with nothing but a bill for services rendered. Capital expenses are removed entirely from the equation. (Again, this doesn’t mean data center capacity evaporates like, well, water—someone must carry the resource load; it just isn’t the customer.) Furthermore, the customer isn’t required to deal with software or hardware problems that might arise on the servers; those are the responsibility of the cloud service provider. This is the ultimate data center downsizing: shove the entire mess off on someone else and just pay for the end result: networking, compute and storage services. As with anything that sounds too good to be true, the cloud carries a number of concerns, including security and cost. (Whether cloud computing is truly less expensive in the long term is still a matter of debate; in all likelihood, it depends on the customer.) The public cloud is not necessarily an all-or-nothing proposition, however: companies can always opt for something in between. A company with an existing data center that needs greater capacity can outsource the extra capacity to the cloud, keeping some resources in house while completely sidestepping a capital-intensive data center expansion.
Outsourcing enables companies to reduce their data center sizes (either by eliminating expansions, by building facilities that only partially meet company demand and outsourcing the rest, or by not building a data center at all). But in a sense, this is a ghost effect: the net data center capacity of the industry at large is not decreasing (although larger data centers may offer greater efficiency compared with smaller facilities).
Mega data centers make big headlines, but for many companies, smaller is better when it comes to IT. A number of factors are driving companies to build smaller data centers compared with what they might have done several years ago. These factors include a stagnant (at best) economy, rising energy prices, improvements in cooling technology and a greater reliance on free cooling, virtualization and consolidation, modular data center design strategies, higher-density deployments, and improved semiconductor process technology. Colocation and the cloud—collectively, outsourcing—also play a major role in decreasing data center size for many companies, as they eliminate the need for these companies to maintain infrastructure on premises. But outsourcing is something of a ghost effect: the data center capacity and infrastructure doesn’t evaporate, it is simply moved to another company. And this shift is part of what’s driving the growth of mega data centers that provide cloud services. Nevertheless, for many companies, smaller facilities are indeed better.
To read more of the May issue of DCJ Magazine click here