As one of a business’s major investments, a data center must be able to justify the value it brings to the organization. Whether building a data center from scratch or seeking to invest in and optimize an existing location, businesses must be able to calculate factors such as the total cost of ownership (TCO) to ensure that their decisions make sound financial sense. But they must avoid some common pitfalls that make these comparisons harder.
Measurement vs. Prediction
The first trap many businesses fall into is assuming that historical measurements alone will allow them to calculate TCO and manage their costs: for instance, using past figures for PUE, equipment-refresh rates and other factors. Such information, however, is of little value in a vacuum; without a target to compare it against, businesses will have little idea of how their data center is performing and whether they need to adapt their operations. As a comparison, imagine driving a car without knowing the local speed limit. You can tell your speed from the speedometer and even use other information, such as fuel efficiency, tire wear and engine temperature, to get a better idea of performance. Yet without knowing the speed limit, which is set according to an entirely different set of considerations, you would still have no idea whether your driving was likely to result in a fine, or worse.
Similarly, businesses should first predict their TCO using the appropriate tools, calculations and information, such as the expected energy use of each piece of equipment in the data center. They can then evaluate their historical measurements against these predictions to see whether they are meeting expectations, and act accordingly. But when making these calculations, businesses can fall victim to the second trap.
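To make the prediction-versus-measurement idea concrete, the following Python sketch builds a predicted annual energy cost from assumed per-equipment loads, a design-stage PUE and a flat tariff, and then compares a measured figure against it. Every number and name here is hypothetical, chosen purely for illustration; it is not a model of any particular tool or facility.

```python
# Illustrative sketch only: predicted vs. measured annual energy cost for a
# small data center, using made-up equipment figures and a flat tariff.

HOURS_PER_YEAR = 8760
TARIFF_PER_KWH = 0.12        # assumed flat rate, $/kWh
PREDICTED_PUE = 1.6          # design-stage assumption

# Expected average IT load per equipment class, in kW (hypothetical values)
it_equipment_kw = {
    "servers": 180.0,
    "storage": 45.0,
    "network": 25.0,
}

predicted_it_kwh = sum(it_equipment_kw.values()) * HOURS_PER_YEAR
predicted_facility_kwh = predicted_it_kwh * PREDICTED_PUE
predicted_energy_cost = predicted_facility_kwh * TARIFF_PER_KWH

# Later, the measured figure from utility metering can be judged against the
# prediction rather than floating in a vacuum.
measured_facility_kwh = 3_900_000          # example meter reading for the year
measured_energy_cost = measured_facility_kwh * TARIFF_PER_KWH

variance = (measured_energy_cost - predicted_energy_cost) / predicted_energy_cost
print(f"Predicted: ${predicted_energy_cost:,.0f}")
print(f"Measured:  ${measured_energy_cost:,.0f} ({variance:+.1%} vs. prediction)")
```

Even in this toy form, the point stands: the measured number only becomes meaningful once there is a prediction to judge it against.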
Hard Calculations
The second trap involves using tools that cannot cope with the required complexity. Nobody would use a notepad, pen and pocket calculator to predict their TCO, but a lack of specialized tools means that businesses still fall back on tried-and-tested methods such as spreadsheets. Spreadsheets are a fantastic multipurpose tool that the world of business still relies on, but they are inadequate for the task of predicting something as complex as data center TCO. This situation was borne out by recent research from the University of Hawaii, which found that 88% of spreadsheets, and at least 1% of all formula cells, contain errors. While 1% may seem small, the effects of these errors can compound rapidly. A startling example is the discovery that a Harvard economics paper, cited by many as justification for governments’ economic strategies, had omitted certain cells from its calculations. Together with other issues, those omissions left the study’s central conclusion shaky at best and completely wrong at worst.
The Cost of Complexity
Although data center managers’ spreadsheets are unlikely to damage national economies, issues of complexity, and often of compromise, can still greatly reduce the accuracy of TCO calculations. To begin with, the data that must be gathered to perform the calculations exist in various domains and are recorded in differing units and dimensions. Kilowatts, kilowatt-hours, power usage effectiveness, square feet or meters, BTUs, degrees Celsius or Fahrenheit, capex, opex, time and date are just some of the units and measurements that must be reconciled in order to understand the true cost of a data center. When performing such complex multidimensional calculations, any slight error can greatly skew the TCO estimate, to say nothing of how time-consuming the equations themselves can be.
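To give a concrete, if deliberately simplified, sense of that reconciliation work, the sketch below converts a handful of mixed-unit inputs onto a common basis before combining them. All of the figures are invented for illustration, and a real TCO model would cover far more inputs; the point is that every conversion is a place where a silent error can creep in.

```python
# Hedged illustration: reconciling mixed units before they feed a TCO figure.
# All numbers are hypothetical.

BTU_PER_KWH = 3412.14
SQFT_PER_SQM = 10.7639

cooling_load_btu_per_hr = 350_000           # vendor spec sheet, BTU/h
cooling_load_kw = cooling_load_btu_per_hr / BTU_PER_KWH

white_space_sqft = 5_000                    # facility drawing, square feet
white_space_sqm = white_space_sqft / SQFT_PER_SQM

capex_total = 4_000_000                     # build cost, amortized over 10 years
capex_per_year = capex_total / 10
opex_per_year = 650_000                     # staffing, maintenance, energy, etc.

annual_cost = capex_per_year + opex_per_year
cost_per_sqm = annual_cost / white_space_sqm

print(f"Cooling load: {cooling_load_kw:.1f} kW")
print(f"Annual cost:  ${annual_cost:,.0f} (${cost_per_sqm:,.0f} per square meter)")
```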
When facing such a complex task, many will naturally attempt to simplify it. The most common way of doing so is to simplify those areas that are too multifaceted to analyze or understand in detail, even with spreadsheets. This in turn results in less accurate TCO calculations, simply because the final figure is based on incomplete inputs. Since those creating and maintaining these spreadsheets are aware of this shortcoming, they include error margins in their calculations, essentially settling for a final figure that they deem to be “close enough.” But a single TCO calculation can involve a combination of calculations and inputs from a variety of teams and sources, each of which may have taken the same “close enough” approach. As a result, any final figure for TCO will be based on a series of estimates across a number of spreadsheets, each of which has a high chance of containing errors in what may be critical cells. Once data has been fed across these spreadsheets, any system-level behaviors or anomalies become nearly impossible to predict.
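A rough, hypothetical sketch of how those margins compound: if four chained estimates are each allowed to be “close enough” within the bands shown below, the spread on the final figure becomes far wider than any single margin suggests. The stages and percentages are invented for illustration only.

```python
# Rough sketch of how "close enough" margins compound when several estimates
# feed one another. The chain and margins below are invented for illustration.

# Each stage's uncertainty (0.10 means +/-10%), e.g. IT load, PUE, tariff, growth
stage_margins = [0.10, 0.05, 0.10, 0.08]

best_estimate = 5_000_000                   # nominal TCO figure, dollars

low = best_estimate
high = best_estimate
for margin in stage_margins:
    low *= (1 - margin)
    high *= (1 + margin)

print(f"Nominal TCO: ${best_estimate:,.0f}")
print(f"Spread once margins chain: ${low:,.0f} to ${high:,.0f}")
```

In this toy case, four individually modest margins leave the final figure roughly 30 percent uncertain in either direction, which is a long way from “close enough.”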
Businesses need to be sure that they are predicting, as well as measuring, their costs. When doing so, they must ensure that a combination of complexity, human nature and blunt tool sets is not dragging their calculations away from accuracy into the realm of “close enough.” Without this extra care, any data center can quickly turn from a major source of investment return into a financial black hole.
About the Author
Zahl Limbuwala is CEO and cofounder of Romonet and is passionate about the data center and ICT industry. Starting his career as a chartered electrical engineer, he then moved into IT systems engineering, network engineering and later software development. Zahl is a chartered engineer, chartered IT professional and Fellow of the BCS. He has 20 years of experience in companies such as Digital Island, Real Networks, Cable & Wireless and many engineering firms writing control and automation software for production-line automation and robotics.