If you look at the world’s top five companies by capitalization from 10 years ago, the majority are businesses built using traditional methods of innovating and manufacturing products. They are solid companies with physical assets that contribute to their wealth. Fast-forward to today, and most of those top five have been displaced by companies that focus on data. Whether it’s data control, exploitation or analysis, these companies are increasingly data driven.
Thanks to automation, big data and artificial intelligence (AI), the value of data has never been higher. Many organizations have come to the uncomfortable realization that the data they hold, or mine, is fast becoming a more valuable asset than traditional assets such as personnel.
One example is Shazam. According to the Wall Street Journal, Shazam is now valued at $1 billion thanks to the data it mines—yet it’s a mere startup! Because of its ability to “identify the media playing around you, explore the music you love, discover song lyrics from your favorite artists,” its business model, at a high level, is very simple and highly effective.
Shazam takes information from consumer requests for songs and music and generates live up-to-date intelligence about the market, while still providing a service to the consumer. It can then sell this valuable information to the music industry. Who needs music charts or opinion polls when you can base your business decisions on data from over 100 million music lovers? It’s no surprise that Shazam is now taking this model into theaters, television and advertising.
With any data-driven organization, however, there are some important fundamentals to consider. First, you need the infrastructure to capture and analyze this data quickly. But data volumes can be colossal, and that creates problems. Shazam’s more than 100 million users generate a huge amount of data that, combined with AI, can be used to predict trends in consumer music choices. This capability becomes valuable intelligence to the music industry, allowing it to focus its advertising and promotional dollars to best effect. The value of this data increases exponentially if it can be analyzed and employed at speed.
In many Western economies based on service, financial and knowledge industries, data is changing from a burden into a valuable asset. Like any asset, these highly valuable, ever-increasing data volumes must be protected and stored with the same care as any other asset. For data-driven organizations, this requirement matters even more.
The value of this data should raise questions. One of the most important should be, “Do we look after the data well?” It also means organizations that rely on data for their sales revenue need to ask themselves, more seriously and more frequently, how they secure that data. Indeed, as part of their due diligence, they must ask a number of searching questions to prepare for either manmade or natural disasters.
If organizations plan for such possibilities now, and test their assumptions regularly, they will be more prepared to prevent and minimize a disaster’s impact on the business. Prevention—or business continuity, as it’s often called—should be the foremost priority, because it’s far less expensive than finding a cure after disaster has struck.
Yet organizations cannot be complacent; they should still invest in disaster recovery, and they should avoid locating their data centers in the same circles of disruption. To place data centers and disaster-recovery sites in close proximity is just asking for trouble.
So to avoid putting the business at risk, regular reviews of the current situation are necessary, and testing should cover a number of scenarios, such as distributed denial of service (DDoS) and hacking attacks. Companies must also test how well they can respond if one of their data centers goes offline and understand how that would affect operations. If they have the right data backup and restore solutions in place, the business should still be able to continue. That is, as long as it’s monitored and managed, just like any insurance policy.
In the short term, the idea is to augment and to automate. As the use of AI grows, this technology will take over certain tasks by analyzing data and reacting accordingly. With experience, it can improve and predict. In the case of Shazam for example, AI can predict the next big hits. That’s great, but what are the business risks of such a company, and how will they change over the next year? To maintain a viable digital business, firms must back up and secure data, enabling them to do real-time analysis so as to ensure they make the correct predictions as well as the right decisions.
Companies are extremely data driven today, but do they realize the extent and importance of that characteristic? To all appearances, they acknowledge they are data driven and certainly use the data constantly, but when it comes to protecting that data, they are less driven. The problem is that they treat protection as a cost rather than as a business risk. The value of data seems to be overtaking the value of people, yet both are equally critical, and both are crucial for success. Data may be like gold, but where gold is hard to mine and transport, data delivers its maximum value only when it is easy, and secure, to transport.
There is a balance in the workplace between employees, automation and data. Yes, data is an important asset that requires insurance and protection at all times. Nevertheless, if you lack the right people to carry out such functions, and you can’t hire them—especially in the cloud world—then you have a problem. You need the right skills to secure data; the companies that have the right people, and the right data strategies, are therefore more likely to succeed.
So, do enterprises look after their data well? I suspect not. In my experience, many enterprises are actually compromising their data. Protecting and storing data, as well as restoring it when necessary, is essential. Available technologies include Zerto for disaster recovery and Rubrik for archiving, as well as backup providers such as Asigra. These are newish companies looking to change the way we handle our data. What I have found, though, is that businesses are still not looking at the way they move that data. They continue to do what they’ve done for the last 10 years, even though the world has moved on, and fast.
These are the challenges for CIOs: incompressible data in the form of video, images and rich media. Nobody has handled data at these volumes before, not to mention large data sets and the growing demand to move encrypted data for security purposes. Compared with 10 years ago, much larger bandwidths are available, so the challenge for companies is completely different from that of a decade ago. As Gartner notes, although companies do have DR plans, they don’t test them sufficiently. So although companies may have a plan, what’s missing is the discipline of testing it regularly.
In the case of Shazam and Uber, for example, success is based on data. If they don’t protect their data, their business model collapses. If an organization depends on data for its business, then insuring and protecting that data is even more critical, as is having ready access to it despite any disaster.
Shazam’s data is an integral part of its business model and success. Yet the company is looking at other business models. According to the Telegraph, it has garnered one billion downloads of its mobile app while achieving profitability for the first time since it launched 15 years ago. “[We were] growing a business in a collapsing market,” company cofounder Dhiraj Mukherjee told the Guardian in December 2016. He added, “The Internet bubble burst in 2000 [and] there were startups going bust everywhere.”
The importance of data means all companies must analyze their business-risk profiles regularly. They must look at their recovery-time objectives (RTOs) and recovery-point objectives (RPOs) to understand which data truly needs to be accessible in or near real time. This isn’t only about data traded in real time; it covers every other type of data that is near real time or asynchronous. And with those RTOs and RPOs in mind, organizations must know how quickly they can recover.
Yet different companies will have different RTOs and RPOs, as some data functions are more critical than others. So, it’s important to analyze which data is critical and needs to be backed up without delay and which data can be restored at a more leisurely pace. Speed carries a cost, but given the right acceleration technology, you may find it to be cheaper than you think. The right solution may be your best insurance policy because it can enable you to reduce the time to back up and restore data.
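As an illustration only, with hypothetical tier names, thresholds and dataset names, the kind of triage described above might be sketched like this:

```python
# Illustrative sketch: classify datasets into backup tiers by RTO/RPO.
# Tier names, thresholds and dataset names are hypothetical examples,
# not taken from any specific product or company.

def backup_tier(rto_minutes: float, rpo_minutes: float) -> str:
    """Return a backup tier for a dataset given its recovery objectives.

    RTO: how quickly the data must be usable again after a failure.
    RPO: how much recent data the business can afford to lose.
    """
    if rto_minutes <= 15 and rpo_minutes <= 15:
        return "continuous-replication"  # near-zero loss, immediate failover
    if rto_minutes <= 240:
        return "frequent-backup"         # several backups per day
    return "daily-backup"                # overnight batch is acceptable

# Example datasets mapped to (RTO minutes, RPO minutes).
datasets = {
    "payment-transactions": (5, 1),
    "user-listen-history": (120, 60),
    "marketing-archive": (1440, 1440),
}

for name, (rto, rpo) in datasets.items():
    print(f"{name}: {backup_tier(rto, rpo)}")
```

The point of such a triage is simply to avoid paying continuous-replication prices for data that could safely wait for an overnight batch.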
A backup is only complete when you have that last byte of data. The quicker you can back it up across a network, the quicker you minimize the risks. Increasing the number of backups you can perform in a day will also minimize the risk. Imagine you are only able to back up or replicate your data once a day, and then something happens—maybe an electrical issue or a hack. You lose your data and you’re in severe trouble. But if you back up four times a day, you can get back up and running very quickly, and the speed with which you restore your data minimizes your business risk.
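The arithmetic behind this is simple. As a minimal sketch, assuming backups are evenly spaced and ignoring the time each backup takes to run, the worst-case window of lost data is just the interval between backups:

```python
def worst_case_loss_hours(backups_per_day: int) -> float:
    """Worst-case hours of data lost if a failure strikes just
    before the next backup, assuming evenly spaced backups."""
    return 24 / backups_per_day

print(worst_case_loss_hours(1))  # one backup a day: up to 24 hours lost
print(worst_case_loss_hours(4))  # four backups a day: up to 6 hours lost
```

Quadrupling the backup frequency cuts the maximum exposure from a full day to six hours, which is exactly why faster backups translate directly into lower business risk.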
This is how data acceleration fits with business risk. Without it, you could be putting your data-driven business at risk at a time when WAN optimization is simply no longer the solution. After all, WAN optimization doesn’t accelerate anything; it just condenses the amount of data to be sent. And it doesn’t work on encrypted data at all. Now that’s worth thinking about!
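The point about encryption is easy to demonstrate. WAN optimizers rely on compressing or deduplicating the byte stream, and well-encrypted data is statistically indistinguishable from random bytes, so there is almost nothing left to condense. A quick sketch, using random bytes as a stand-in for encrypted traffic:

```python
import os
import zlib

n = 100_000
repetitive = b"backup " * (n // 7)  # highly redundant plain data
random_like = os.urandom(n)         # stand-in for encrypted data

comp_plain = zlib.compress(repetitive, level=9)
comp_random = zlib.compress(random_like, level=9)

print(f"plain:  {len(repetitive)} -> {len(comp_plain)} bytes")
print(f"random: {len(random_like)} -> {len(comp_random)} bytes")
# The redundant stream shrinks dramatically; the random one barely changes
# (and may even grow slightly, due to format overhead).
```

This is why an approach that condenses data stops paying off once the traffic is encrypted, whereas an approach that accelerates the transfer itself is indifferent to what the bytes contain.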
About the Author
Jamie Eykyn is chairman of data-acceleration company Bridgeworks. A serial entrepreneur, he founded Shuttle Technology Limited, which was sold to SCM Microsystems in 1999. Since the sale, Jamie has concentrated on building a portfolio of technology companies in which he is involved. He is an investor in Bridgeworks, which has developed patented transformational technology that uses machine intelligence to accelerate transfer speeds and reduce packet loss when moving large data volumes across the WAN.