In computing, size and performance were once inextricably linked: for many years, the most powerful computers took up huge spaces. And because most of these behemoths could be found only in universities and other academic institutions, government buildings, or the headquarters of the biggest businesses, they often had rooms built around them.
Fast-forward to the present day, and you may well be carrying a similar amount of processing power, once packed into one of those mega-machines, in your trouser pocket: your smartphone.
Not Just Computers
A range of other devices and technologies have followed a similar route, with developments in all of them driven by the shrinking of the most basic components, such as computer chips and engineered man-made materials. At its most extreme, this shrinking is known as nanotechnology: the engineering of components whose dimensions are measured in nanometers, or billionths of a meter. To put this scale into context, a nanometer is roughly one hundred-thousandth of the width of a human hair.
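To make that scale concrete, here is a quick back-of-the-envelope check (the 0.1 mm hair width is an assumed, typical upper-end figure):

```python
# Back-of-the-envelope scale check; the hair width is an assumed figure.
NANOMETER_M = 1e-9        # one nanometer is a billionth of a meter
HAIR_WIDTH_M = 100e-6     # assumed hair width: 0.1 mm (upper end of typical)

nm_per_hair = HAIR_WIDTH_M / NANOMETER_M
print(f"A hair is about {nm_per_hair:,.0f} nm wide")   # -> about 100,000 nm
```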
We and everything around us are made up of these tiny particles, and the specific way in which they are arranged is what makes every human being unique or, conversely, lets us manufacture goods that are consistently identical in large quantities. The most exciting aspect of substances at the nanoscale is that many do not behave as scientists would expect from studying them in their bulk form. As a result, researchers are kept busy reducing substances to the nanoscale so that they can study how their behavior changes, as well as finding ways to make them as stable as possible. One obvious difficulty with such research, however, is that special types of microscopes are needed to see these particles at all.
Reversing the Common Conceptions
The principal problem faced by developers assembling typical-sized data centers is coping with the heat generated by their operation. As the manager of one such large installation told Information Age magazine, such centers “are built for a different kind of specification—we can’t cope with miniaturization.”
But another, equally pressing, matter is the demand that running today’s data centers places on power-supply networks. This is why data center and infrastructure management is increasingly viewed as a specialist function; many businesses simply lack the space to accommodate the equipment needed to store and process all their data efficiently. At the same time, they are keen to minimize the risk of interruptions to their power supplies, a risk that comes with trying to maximize the capacity and capabilities of their computer systems while limiting the space required.
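To see why power draw dominates these conversations, consider power usage effectiveness (PUE), the industry’s standard ratio of total facility power to the power that actually reaches the IT equipment. A minimal sketch, with purely hypothetical loads:

```python
# Illustrative power usage effectiveness (PUE); all loads are hypothetical.
it_load_kw = 800.0    # servers, storage, and network gear
cooling_kw = 400.0    # chillers, air handlers, fans
other_kw = 100.0      # lighting, UPS losses, and so on

pue = (it_load_kw + cooling_kw + other_kw) / it_load_kw
print(f"PUE = {pue:.2f}")   # 1.62: each watt of compute needs ~0.6 W of overhead
```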
A major hope for advancement lies in modular construction of data centers. Whereas they were once built with capacity that was not expected to be needed for many years, the newest modular setups mean not only that less space is required in the initial stages but also that setup times can be considerably reduced. Such data centers can even be provided in remote locations that were once considered impractical for the purpose.
A Data Center Dichotomy
An outstanding technological challenge facing IT administrators and those in charge of configuring IT systems is that although an increasing number of functions must be integrated into a single device, the silicon that houses all this capability continues to shrink. As a result, the technical minds whose job is to integrate all these tasks onto a single chip must concentrate on maximizing the efficiency with which each individual task is performed, while minimizing any negative impact on the business as it carries out its myriad tasks.
As ComputerWeekly points out, centralizing ever more functions is no “silver bullet” for tackling this dilemma, no matter how tempting the prospect might be for those watching an organization’s bottom line. Although this approach might be a first-stop solution when a business is looking to cut costs, outsourcing must be approached with great care so that the right data- and infrastructure-management partners are chosen.
The Biggest Challenge
The main challenge is likely to be reconciling our demands for data centers that can handle our still-growing appetite for data with the global recognition that this must be done without putting undue pressure on power supplies. Miniaturization now means that data centers of as little as 2,000 square feet can house up to 20 petabytes of storage. This trend is driven by virtualization, which divides a single physical server into multiple virtual servers, so that fewer servers are used more efficiently, yielding both cost and resource savings.
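A minimal sketch of the consolidation arithmetic behind that efficiency claim, using assumed utilization figures rather than anything from the article:

```python
import math

# Hypothetical consolidation estimate; every figure here is an assumption.
servers_before = 200    # lightly loaded, one application per physical box
util_before = 0.10      # average utilization of those dedicated servers
util_target = 0.60      # utilization allowed on consolidated virtual hosts

work = servers_before * util_before             # total useful load
servers_after = math.ceil(work / util_target)   # hosts needed after consolidation
print(f"{servers_before} physical hosts -> {servers_after}")   # 200 -> 34
```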
This challenge is an essential consideration when choosing the location, layout, and design of any data center; otherwise, the setup and operating costs of such an installation could easily consume all the savings from centralizing data storage and handling in the first place. Shared data centers undoubtedly represent the future of how we will meet our infrastructure-management needs. But the fact remains that, in proportion to their physical size, such buildings are among the most power-hungry anywhere.
So, for the foreseeable future, we will continue to see data centers located in less crowded areas, owing to the continuing need to build in more capacity than is actually required and to provide a good degree of “future-proofing.” The central challenge for data center managers, therefore, is to offer all this capacity, including emergency backup systems, while controlling the costs of operating and maintaining their own buildings and infrastructure.
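As a rough illustration of why so much spare capacity gets built, here is a simple headroom calculation; the growth rate, planning horizon, and redundancy factor are all assumptions:

```python
# Illustrative capacity-planning headroom; every parameter is an assumption.
current_load_kw = 500.0    # today's IT load
annual_growth = 0.20       # assumed 20% load growth per year
horizon_years = 5          # how far ahead the build must stay adequate
redundancy = 2.0           # duplicated capacity for emergency backup (N+N)

future_load_kw = current_load_kw * (1 + annual_growth) ** horizon_years
provisioned_kw = future_load_kw * redundancy
print(f"Provision ~{provisioned_kw:,.0f} kW")   # ~2,488 kW, about 5x today's draw
```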
Even if nanotechnology does bring the promised reductions in physical space, miniaturization will remain a major consideration for those involved in data center management for the foreseeable future.
For more from Geist, visit http://www.geistglobal.com/.
About the Author
Steven Cox has worked as a professional writer for more than 20 years, including for regional newspapers and national specialist magazines in the U.K. He now specializes in online marketing, including all forms of web content such as guest blogs and in-depth feature articles. Outside work, he enjoys traveling, soccer, restoring old railway engines and relaxing in a sunny beer garden (with the right refreshment, of course!).