As data volumes grow, data centers must evolve. These facilities are known in some circles as the “digital core,” and as such they are having to adapt to an increasing array of technologies. Large technology vendors also tend to say they care about their customers, and perhaps they do, but they’re also keen to sell their wares, from servers to network infrastructure. Add the discussions about being at “the edge” and you’ll find yourself in a jungle that’s becoming harder to navigate.
Talk about edge data centers and the digital core is likely to confound most customers. The IT industry is not the only culprit; most industries coin terms that are meaningless to everyone except those working within them. Ironically, such terms are often created to market something that has existed for many years in another guise. So where does that leave you? You must either hire the right experts to help you cut through this jungle or figure out what you really need on your own.
What you’ll find, though, is that data volumes will increase, and research on the web suggests that digital transformation and the Internet of Things (IoT) are helping drive this trend. Richard Harris, executive editor of App Developer Magazine, went so far as to suggest in December 2016 that “in 2017, we will create more data than ever before, creating new challenges around consuming that data to make strategic and tactical decisions.”
He added, “More data was created [between 2014 and 2016] than the previous 5,000 years of humanity.” He forecast that the type of data that everyone will create is “expanding rapidly across a wide range of industries: biotech, energy, IoT, healthcare, automotive, space and deep sea explorations, cyber-security, social media, telecom, consumer electronics, manufacturing, gaming, and entertainment.” The list of potential data sources is constantly growing, so you need a data center that can cope with that growth. The data center of the future must also be able to manage the growing varieties of data.
Chris Skinner, cofounder of the Financial Services Club, writes in the Finanser that “banks without a digital core will fail.” But what does he mean? He argues that data is the key to disruption, a claim most people will accept. When he was asked how he defined the term, however, one reply was that there isn’t a single definition; the digital core seems to refer to a central point of systems, such as a mainframe.
Skinner writes that the markets don’t operate in this way anymore, so he thinks systems should be spread across server farms in the cloud to avoid a single point of failure. He finds, however, that many people are misinterpreting what the digital core actually means. That’s no surprise to me because new terms often mean different things to different people. Skinner therefore describes the digital core as being “the removal of all bank data into a single structured system in the cloud [where] the data is cleansed, integrated and provides a single, consistent view of the customer as a result.”
In 2015, technology giant SAP described the digital core as follows: “A digital core empowers companies with real-time visibility into all mission-critical business processes and processes around customers, suppliers, workforce, big data and the Internet of Things. This integrated system enables business leaders to predict, simulate, plan and even anticipate future business outcomes in the digital economy.”
Among other examples of where a digital core can serve, the company adds, “The same digital core can be used to optimize manufacturing, moving from batch orders to real-time manufacturing resource planning to always meet demand. Further, using information collected by assets and the Internet of Things, the assembly line and enterprise resource planning (ERP) can be synchronized for increased cost efficiency and asset utilization.”
SAP believes that “companies that lose the complexity that is weighing them down will be able to face the market disruption happening everywhere.” To be successful requires your firm to have real-time visibility and integration across all business processes. Visibility is about allowing your business to understand how information flows into and out of your company, or perhaps even into and out of your data center.
The digital core also digitizes mission-critical processes based on a single source of information that SAP says “interconnects all aspects of the value chain in real time.” It includes workforce engagement, assets, IoT, supplier collaboration, business networks and customer engagement in an omni-channel experience to empower better decision making and thus gain an advantage in today’s digital economy.
By contrast, TechTarget defines edge computing as “a distributed information technology (IT) architecture in which client data is processed at the periphery of the network, as close to the originating source as possible. The move toward edge computing is driven by mobile computing, the decreasing cost of computer components and the sheer number of networked devices in the internet of things (IoT).”
An argument for edge data centers, and edge computing generally, is that they can reduce the impact of network latency on the edge user or process. But they don’t solve the problem of getting data to and from the edge in the first place. And even when data is distributed across both the edge and the core, data-protection policies still apply, perhaps more than ever. Ideally, data should be stored and backed up in at least three different data centers or disaster-recovery sites. Many organizations, trying to mitigate the effects of throughput latency, make the mistake of storing data too close to a circle of disruption; should the worst happen, the result can be total disaster. But there’s no need to place all of your eggs in one basket. Data-acceleration solutions mitigate the effects of data and network latency, speed up data flows, and reduce packet loss over distance. The result is better data flow to and from the edge as well as between data cores.
Edge computing may not give you all the answers when you’re considering how to create the data center of the future. Vendors will also be happy to sell you WAN optimization, but unlike a data-acceleration tool, this technology can’t handle encrypted data without compromising network efficiency. How does data acceleration differ? It uses machine learning to increase throughput.
The time-honored alternative of increasing your bandwidth won’t necessarily improve your network’s throughput, but it could cost you the Earth without addressing the limitations created by physics and the speed of light. It’s therefore important to consider what’s motivating a vendor—your interests or only its own? A vendor with your interests at heart will be able to explain its claims in plain English and demonstrate their veracity.
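The physics point above can be made concrete with a back-of-the-envelope calculation. A single TCP flow’s throughput is bounded by its window size divided by the round-trip time, so once latency is the limit, buying a faster link changes nothing. The sketch below uses illustrative figures (a 64 KiB window and an 80 ms round trip, not numbers from this article) to show the effect:

```python
# Illustrative sketch: why more bandwidth alone may not raise throughput.
# A single TCP flow is capped at window_size / round_trip_time,
# regardless of how fast the underlying link is.

def max_tcp_throughput_mbps(window_bytes: float, rtt_seconds: float) -> float:
    """Upper bound on single-flow TCP throughput, in megabits per second."""
    return window_bytes * 8 / rtt_seconds / 1e6

window = 64 * 1024   # 64 KiB: a common default TCP window (hypothetical choice)
rtt = 0.08           # 80 ms round trip, e.g. a long-haul WAN link

cap = max_tcp_throughput_mbps(window, rtt)
print(f"Single-flow cap: {cap:.1f} Mbit/s")

# Upgrading the link from 1 Gbit/s to 10 Gbit/s does not help here:
# the flow is limited by latency and window size, not bandwidth.
for link_mbps in (1_000, 10_000):
    achieved = min(link_mbps, cap)
    print(f"{link_mbps} Mbit/s link -> {achieved:.1f} Mbit/s achieved")
```

With these figures the flow tops out at roughly 6.6 Mbit/s on either link, which is why tackling latency itself (via larger windows, parallelism, or acceleration techniques) matters more than raw bandwidth.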
Although the jargon may be flashy, it does no good if it puts customers on edge. Vendors should explain and demonstrate the benefits of a given technology in a language customers understand. Without proper communication, creating the data center of the future will be impossible, regardless of whether it includes a digital core, edge computing or another approach.
So the key to creating the future data center is for vendors to offer customers what they need today to meet tomorrow’s needs. You need not replace all of your legacy infrastructure with the latest technology, though. You can achieve much with what you already have, particularly when combined with data acceleration, which can help you create the data center of tomorrow, today.
About the Author
David Trossell is CEO of Bridgeworks, which he joined in 2000. David is a recognized visionary in the storage-technology industry and has been a major influence in developing Bridgeworks’ intellectual property and leading-edge technology. Alongside his work at Bridgeworks, David has authored or coauthored 18 international patents, reflecting his drive toward and passion for transformational IT.