For data center managers surveying today’s IT landscape, the view can be daunting. Across a vista dotted with cloud offerings, virtual servers, employee-owned devices, inherited and acquired assets, and disparate systems, IT leaders see a murky horizon, one in which it is increasingly difficult to determine exactly what applications the data center is running and how to manage them. It’s a complex backdrop for any organization, and it requires forward-looking technology professionals to adopt solutions that reduce that complexity.
The typical mid-size data center has hundreds to thousands of physical servers, storage devices, networking tools, IP-enabled uninterruptible power supplies (UPS), and heating, ventilation and air conditioning (HVAC) devices, to name a few. Below that physical layer, the same data center has thousands of virtual machines, software installations, versions, editions and releases. And beneath that software layer, the definitions of those versions, editions and releases are inconsistent. For example, the same Oracle database can be defined as “Oracle 11g,” or “Oracle version 11.2,” or “ORA_11_2_EE.” How can IT track and manage such inconsistencies? This is only one version of one product from one vendor. Oracle has 10,391 software product releases and 422 hardware models; IBM has 13,078 and 64,862; HP, 7,982 and 64,036. And these figures cover only three vendors. With more than 12,000 IT vendors in the market, you can see the scale of the challenge that is slowly bringing IT to its knees.
In the meantime, IT executives are spending millions of dollars attempting to address this problem with solutions such as software asset management and configuration management databases (CMDB). However, unless the fundamental issue of normalizing all the data to a common language is addressed, such solutions will continue to fail to deliver the value they promise. In the example above, the Oracle database should be normalized and referred to with the official vendor name: Oracle Database version 11.2 Enterprise Edition. This normalized representation should be provided regardless of whether it comes from a data center discovery system like HP-DDMI, IBM TADDM, BMC ADDM, BladeLogic Server Automation or from a purchasing system. Unless enterprises adopt a common language for IT, initiatives such as migrations, consolidations, application rationalizations, audit support and attestations will continue to be manual, expensive, cumbersome and error prone.
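To make the normalization idea concrete, here is a minimal sketch of how three inconsistent labels for the same Oracle database might be mapped to one canonical catalog name. The alias table, pattern and canonical string are illustrative; a real reference catalog would hold millions of such mappings.

```python
import re

# Hypothetical canonical name, following the official vendor naming.
CANONICAL = "Oracle Database 11.2 Enterprise Edition"

# Illustrative alias table of raw strings seen in discovery tools.
ALIASES = {
    "oracle 11g": CANONICAL,
    "ora_11_2_ee": CANONICAL,
}

# Catch version-string variants like "Oracle version 11.2.0.4".
VERSION_PATTERN = re.compile(r"oracle\s+version\s+11\.2(\.\d+)*")

def normalize(raw: str) -> str:
    """Map a raw product string from any source system to its canonical name."""
    key = raw.strip().lower()
    if key in ALIASES:
        return ALIASES[key]
    if VERSION_PATTERN.fullmatch(key):
        return CANONICAL
    return raw  # unknown string: pass through for manual triage

# Three different tool outputs resolve to a single catalog entry.
labels = ["Oracle 11g", "ORA_11_2_EE", "oracle version 11.2.0.4"]
print({normalize(s) for s in labels})  # one canonical name for all three
```

Once every source system runs through the same `normalize` step, inventories from discovery tools and purchasing systems can finally be joined on a common key.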
Four possible fixes
Before we embark on a discussion of how to attain such a normalized view of a data center, let’s examine what possible options exist to fix the data center complexity problem:
Option one: IT could try to simplify everything by buying all of its software from one vendor and all of its hardware from another. This would, in theory, be the easiest way to control the amount of data related to IT assets. But here’s the problem with this approach: there is no vendor capable of providing everything an organization needs. Even if there were such a magical provider, the data center would first have to dump all of its legacy infrastructure and applications, which would be cost-prohibitive and disruptive to the business. The reality is that most organizations need to have a variety of technologies and systems, and different vendors provide that variety.
Option two: IT could deal with complexity by adopting more management tools. There are certainly plenty of vendors eager to sell them, but these tools don’t integrate with existing systems. In many cases, they spit out more data – not actionable information. In truth, most management tools exacerbate the problem they purport to solve.
Option three: This is the do-nothing approach, which might seem harmless. It is not. The IT leader who ignores the problem of disparate systems and missing information risks spending 80 percent of the budget just keeping systems running, leaving IT unable to provide the business agility expected from data center leaders.
Option four: We should label this approach “the only option,” since all the others come with significant drawbacks and fail to solve the core problem. Data center leaders must refer to and manage their different systems in a consistent manner, and that requires the adoption of a common IT language and normalizing the data against it.
Introducing a common language to your data center
Normalizing your data center to a common language typically entails two or three steps, depending on your organization’s size and maturity:
Step one: Establish a reference catalog that includes a taxonomy of all IT vendors with their associated products, models, versions, and editions, as well as attributes such as software support levels, software compatibility (e.g., Windows 7 compatibility), end-of-life dates, hardware power consumption, etc.
Step two: Normalize the data from management tools such as HP-DDMI, IBM TCM, BMC ADDM or BladeLogic Server Automation: filter out unimportant data, correct vendor names, models, editions and versions, and organize them against your reference catalog taxonomy. At the same time, enrich that data with relevant attributes such as support levels, end-of-life information or compatibility.
Step three: Use the reference catalog and empower your enterprise architects and your sourcing department to define standards and drive purchasing.
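The three steps above can be sketched in code. This is a hypothetical illustration, not any vendor’s implementation: the catalog entries, attribute names and alias map are all invented for the example.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CatalogEntry:
    vendor: str
    product: str
    version: str
    edition: str
    end_of_life: str       # step one: lifecycle attribute
    win7_compatible: bool  # step one: compatibility attribute

# Step one: the reference catalog, keyed by canonical product name.
CATALOG = {
    "Oracle Database 11.2 Enterprise Edition": CatalogEntry(
        "Oracle", "Database", "11.2", "Enterprise", "2015-01-31", True),
}

# Alias map built from raw strings observed in discovery tools.
ALIASES = {
    "oracle 11g": "Oracle Database 11.2 Enterprise Edition",
    "ora_11_2_ee": "Oracle Database 11.2 Enterprise Edition",
}

def normalize_and_enrich(raw: str) -> Optional[CatalogEntry]:
    """Step two: map a raw discovery record to its enriched catalog entry."""
    canonical = ALIASES.get(raw.strip().lower())
    return CATALOG.get(canonical) if canonical else None

# Step three: architects and sourcing query the enriched view to set
# standards, e.g. flagging products that are already past end of life.
entry = normalize_and_enrich("ORA_11_2_EE")
print(entry.end_of_life)  # the catalog attribute, not the raw tool output
```

The point of the sketch is the separation of concerns: the catalog holds the truth about products, the alias map absorbs the mess in the source systems, and every downstream consumer sees only the enriched, canonical view.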
The common language of IT in action – use cases
Software asset management: There is financial incentive to make this a priority. One large financial services company expanded its software asset management (SAM) program to normalize the software inventories from its BMC ADDM and IBM TADDM discovery tools. The ability to gain a consolidated, single-pane view across both vendors’ products provided operational depth and a common representation of product lifecycles that helped the organization make effective asset succession decisions.
Application rationalization: An interesting rationalization use case comes from a consulting firm that specializes in helping Oracle customers optimize their software license spend. It performed a software normalization and discovery effort at a Midwest energy company to right-size server licenses.
The firm found that its client was under-licensed by more than $10 million. By normalizing the license data, the firm was able to consolidate its client from smaller systems onto larger ones, take greater advantage of virtualization and cut the true-up bill by half. A secondary benefit was a 30 percent reduction in operational costs through consolidation onto fewer physical servers with uniform software stacks for easier administration.
Data center consolidations and mergers and acquisitions: Even companies with award-winning data centers struggle with these enormous tasks. One such company using a reference catalog discovered as many as 3,000 unmanaged servers on the premises after bringing multiple entities together. With this information, the team achieved a 66 percent reduction in space and power footprint across its infrastructure consolidation projects and an approximately five-fold increase in computing capacity.
The reference catalog uncovered these gains by identifying older, lower-density technologies not compatible with virtualization and targeting them for upgrade or decommissioning. The same catalog tracked the physical dimensions, power consumption and heat dissipation of each server, helping identify the least efficient machines.
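A small sketch of this kind of analysis: joining discovered server models against catalog attributes to flag candidates for decommissioning. The model names, wattages and threshold below are invented for illustration.

```python
# Hypothetical catalog attributes for two server models (illustrative values).
CATALOG_ATTRS = {
    "OldCo Tower 9000": {"watts": 800, "rack_units": 4, "virt_capable": False},
    "NewCo Blade X":    {"watts": 350, "rack_units": 1, "virt_capable": True},
}

def decommission_candidates(discovered_models, max_watts_per_unit=400):
    """Flag models that cannot be virtualized or draw too much power per rack unit."""
    flagged = []
    for model in discovered_models:
        attrs = CATALOG_ATTRS.get(model)
        if attrs is None:
            continue  # unknown model: route to manual review instead
        too_hungry = attrs["watts"] / attrs["rack_units"] > max_watts_per_unit
        if not attrs["virt_capable"] or too_hungry:
            flagged.append(model)
    return flagged

print(decommission_candidates(["OldCo Tower 9000", "NewCo Blade X"]))
```

The discovery tool supplies only the model names; every attribute that drives the decision comes from the catalog, which is what makes the analysis repeatable across entities after a merger.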
The value of a common IT language
Today’s enterprises are dealing with a serious problem in their IT environments: they are drowning in IT-related data. The data is produced by IT management tools, purchasing systems and planning solutions, and it is disconnected, inconsistent and incomplete. This prevents teams from making timely, confident business decisions.
A comprehensive IT reference catalog is the first step in addressing the inconsistencies and gaps in vendor data. Normalization is the second step: filtering the data, correcting vendor names, versions, editions and models, and adding categorization, relevant external data and more. This common language enables a complete, accurate and consistent view of the data center.
Adopting a common IT language supports data center consolidation and allows data center professionals to more effectively harness information for decision-making. With a coherent method to survey the entire data center, teams can deliver and interpret technology information, support smarter business strategies and lower their costs.
Constantin Delivanis is CEO and co-founder of BDNA Corporation, creator of Technopedia, the world’s largest IT reference catalog. With more than 450,000 hardware and software products listed from over 11,000 vendors, Technopedia delivers information and technology that enables the common language of IT.