As we consider the future of data storage and transport, it’s important to take a look back at where it all started. The first “data centers” in the 1960s housed huge mainframe computers that needed their own rooms owing to size, noise and heat considerations. These computers cost millions of dollars and were rented and shared by multiple organizations, as building mainframes of their own was cost-prohibitive.
Then came the 80s, when personal computers created a boom in the microcomputer (server) industry. These servers ended up in mainframe rooms, replacing old equipment with a much smaller footprint and creating a leaner data center. And since computers no longer cost millions of dollars, organizations could assemble banks of servers into rooms, which grew progressively larger. What started as single-room environments became dedicated buildings with thousands of servers.
From there, the advent of the Internet and innovations in software delivery led to the cloud as a way of meeting the quickly expanding need to easily share software and services not only with remote offices but also with a growing mobile user base. This approach worked for a few years, until hackers became smarter and the protection of private and customer data became a board-level concern.
In the past five years, hybrid cloud offerings have evolved to solve these challenges, promising protection and management of critical data in private, on-premises infrastructure and giving organizations the ability to host customer-facing applications in the cloud. Unfortunately, application-layer, IoT and DDoS attacks continue to plague today’s corporate networks.
In light of the current data-breach culture, cybersecurity has now become an industry so big that the network stack seems to add a new security solution every day.
Regulations Abound, But Are They Solving the Problem?
Fear of data breaches and the corresponding desire for data privacy, whether personal or organizational, have led to worldwide jurisdictional restrictions and stringent laws regarding how data is moved between countries. What’s worse is that nations have the legal right to monitor, copy, save and try to decrypt any data as it passes through their jurisdictions.
The dirty little secret of today’s Internet is that any data passing through it, whether public or private, must carry a plaintext public address header so that even encrypted packets can be routed to the proper network. This exposed routing metadata provides ample opportunity for surreptitious targeting and decryption of sensitive data. It seems that no matter what new restrictions are enforced, data remains unsafe.
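To make the point concrete, here is a minimal sketch in Python of why encryption alone doesn’t hide routing metadata. It builds a bare-bones IPv4 header per RFC 791 and “encrypts” the payload with a toy XOR keystream (a stand-in for TLS; the addresses are documentation examples, not real hosts). Any hop on the path can still read the source and destination from the first 20 bytes:

```python
import struct
from hashlib import sha256

def build_ipv4_header(src: str, dst: str, payload_len: int) -> bytes:
    """Build a minimal 20-byte IPv4 header (no options), illustrative only."""
    version_ihl = (4 << 4) | 5                      # IPv4, header = 5 x 32-bit words
    total_len = 20 + payload_len
    src_bytes = bytes(int(octet) for octet in src.split("."))
    dst_bytes = bytes(int(octet) for octet in dst.split("."))
    return struct.pack("!BBHHHBBH4s4s",
                       version_ihl, 0, total_len,   # version/IHL, TOS, total length
                       0, 0,                        # identification, flags/fragment
                       64, 6, 0,                    # TTL, protocol=TCP, checksum (omitted)
                       src_bytes, dst_bytes)

# "Encrypt" the payload -- the header is untouched by any payload encryption.
payload = b"account=12345; balance=9000"
keystream = sha256(b"demo-key").digest()
ciphertext = bytes(p ^ k for p, k in zip(payload, keystream))

packet = build_ipv4_header("203.0.113.7", "198.51.100.9", len(ciphertext)) + ciphertext

# Any intermediate router can recover both endpoints from the plaintext header.
header = packet[:20]
src = ".".join(str(b) for b in header[12:16])
dst = ".".join(str(b) for b in header[16:20])
print(src, "->", dst)   # addresses readable on the wire; payload is opaque
```

The payload is unreadable without the key, but the who-talks-to-whom information is not protected, which is exactly the surveillance surface the article describes.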
The fact remains that organizations of all sizes are subject to leaky Internet and leased lines. Today’s cloud environments run across hybrid public and private networks using IT controls that aren’t protective enough to stay ahead of real-time cybersecurity threats. Sensitive data can be exposed to acts of industrial or political espionage through unauthorized access to enterprise computers, passwords and cloud storage on public and private networks.
This is the great irony: a system designed to help people freely communicate around the world is being surreptitiously exploited in a way that prevents exactly that. The Internet was intended as a sustainable tool for bringing the world closer together, but it has rapidly become divided by a quagmire of protectionism—the reverse of promoting global information sharing. Clearly, a change is in order.
Unsustainable Energy Consumption
According to the Department of Energy’s Lawrence Berkeley National Laboratory, the number of data centers in the United States continues to increase; the total server installed base is projected to grow by 40 percent from 2010 to 2020. And though they are becoming much more energy efficient, they still account for almost two percent of total U.S. electricity consumption.
That’s in the United States alone. Data centers of all sizes are multiplying across the globe at an alarming rate, consuming a disproportionate amount of energy and resulting in a huge carbon footprint. The negative impact on the planet is significant. Ian Bitterlin, Britain’s foremost data center expert and a visiting professor at the University of Leeds, recently commented, “If we carry on the way we have been, it would become unsustainable—this level of data center growth is not sustainable beyond the next 10 to 15 years. The question is, what are we going to do about it?”
A New Vision: The Space-Based Hybrid Cloud
Perhaps instead of looking to the Earth for places to store and move our data, we should be looking to the sky. Imagine a world without borders, where data flows freely without limitation. Where there are no jurisdictional barriers interfering with the exchange of information or ideas. A world where information can travel across the globe in less than a second. This is a world where information is secure, safely traveling above and beyond the Internet and all leased lines. It’s a new way of conceptualizing data transport and storage—and it is possible.
New technologies have been conceived that now provide an independent space-based network infrastructure for cloud-service providers and their enterprise and government customers to experience secure storage and provisioning of sensitive data around the world. By placing data on satellites that are accessible from everywhere via ultra-secure dedicated terminals, many of today’s data-transport challenges will be solved. This approach will provide a safe haven for mission-critical data—a place without interruption or exposure to any surreptitious elements or unintended network jurisdictions.
In this way, providers, large enterprises and government entities can take advantage of a new way to store and transport data. Even better, this model saves money and reduces carbon emissions. Because they will not have to add capex and opex for expansion, cloud-service providers will be able to offer better services at a third of the cost of doing business today. Major corporations that deal with mission-critical data, whether in health care, pharmaceuticals, defense or finance, will achieve major market differentiation while reducing their carbon footprint globally. CSPs and their customers won’t have to keep investing in more infrastructure and paying huge electricity bills.
Uniting the Globe
Ultimately, new technologies in space are about the global unification of communications. Creating a network where data can flow freely around the world without restriction and without fear of interception or theft will enable CIOs to provision virtually any remote office in under one-third of a second, free of jurisdictional or cybersecurity issues. The Internet, with all its good intentions to bring the world together, has driven communication wedges between parties owing to security concerns. A change is on the horizon that will ameliorate this issue, however, and enable those parties to communicate across the globe seamlessly.
Leading article image courtesy of USAF
About the Author
Scott Sobhani, CEO and cofounder of Cloud Constellation Corporation and the SpaceBelt Information Ultra-Highway, is an experienced telecom executive with over 25 years in executive management positions, most recently as VP for business development and commercial affairs at International Telecom Advisory Group (ITAG). His previous positions include CEO of TalkBox; VP and GM at Lockheed Martin; and VP, GM and senior economist at Hughes Electronics Corporation. Scott was responsible for closing over $2.3 billion in competitive new business orders for satellite spacecraft systems, mobile-network equipment and rocket-launch vehicles. He has coauthored “Sky Cloud Autonomous Electronic Data Storage and Information Delivery Network System,” “Space-Based Electronic Data Storage and Network System” and “Intermediary Satellite Network for Cross-Strapping and Local Network Decongestion” (each of which has a patent pending). He has an MBA from the University of Southern California and a bachelor’s degree from the University of California, Los Angeles.