Cloud computing has become a fixture in the IT landscape over recent years. Some debate has even arisen over whether the development of cloud computing counts as an evolution or a revolution. Leaving such questions to others (evolution and revolution both signify change, perhaps simply at a different rate or by more or less gradual steps), a brief (and broad) look at the history of the cloud may provide some indicators as to where it’s headed next.
The origins of the cloud are often seen in mainframe computing in the last century. This is a matter also up for some debate, as some cloud proponents like to treat cloud computing as an entirely new phenomenon. Skeptics (or, perhaps, just less excitable types) sometimes see the cloud as nothing new at all, but rather just a rebranding of a computing model that has been around for decades. As with most such arguments, reality is probably somewhere in between. The fundamental model of centralized resources certainly can be traced to mainframe computing.
This trend of centralization is not trivial, however. As computing technology became smaller, cheaper and thus more accessible to more businesses and consumers, computing power began decentralizing. In businesses, for example, getting computer time became less and less a problem—unlike early computers, which were bulky, expensive and difficult to operate. The connectivity of the Internet, leading to the cloud model, essentially involves a recentralization of compute power. “From a business and management perspective [cloud computing] signifies a return of some of the characteristics of mainframe computing, in terms of power, expense, and centralization of operations and expertise,” notes Peter HJ van Eijk at CircleID (“Cloud Is the New Mainframe”).
The cloud is certainly more sophisticated than mainframe systems in, say, the 1970s or 1980s, and the scope of access is greatly increased. Depending on the particular service, you can potentially access the cloud from anywhere with an Internet connection—although even this isn’t fundamentally different from using a remote computer with a modem to dial into the VAX system at company headquarters to access email or run some “online” program. So, is the cloud revolutionary or evolutionary? That may simply depend on which aspects of the cloud you want to emphasize relative to mainframes or other computing models.
Definition of the Cloud
Perhaps you’ve noticed how a word or phrase can spread through a population like a virus (“memes”). In the previous decade, the term cloud or cloud computing became popular, but questions arose as to what exactly it referred to. Is the cloud just a rebranding of the Internet? Does the cloud follow a particular computing model, or is it nothing but a matter of remote access to a centralized service of some sort? Worse, do cloud computing and the cloud mean two entirely different things? How are they related?
The result is naturally that a precise definition of the cloud and cloud computing is difficult to pin down. This may not be all that surprising, however. Try to define the term computer. If you ask a dozen people what it is, you’ll probably get 12 (or more) different answers. They may all be similar but simply focus on different aspects of what a computer is. (Is a slide rule a computer? On some definitions, it is. It’s just not a digital computer.) The same is true with the cloud. Cloud computing most likely falls into the “I know it when I see it” category—and nothing is wrong with a term that is a little fuzzy around the edges.
But what can we say about the cloud? Given the fact that clouds have been meaningfully differentiated into public and private versions—that is, Internet-style networks and private corporate-style networks—the cloud is at its heart a network. That may seem a trivial conclusion, but it provides a good foundation to start with. The metaphor of the cloud comes from network diagrams that depict a portion of the network—which may be too complex or indeterminate to show accurately—as a cloud, with subnetworks, terminals or other devices connecting to one another through unspecified pathways. Cloud computing is often defined as a model using centralized, virtualized compute power to deliver applications and other resources as a service (rather than a product) over the Internet or some similar network. (Unfortunately, many definitions of cloud computing take up entire articles, so you may think this one is lacking on some points, overbearing in others or simply mistaken.)
According to Gartner’s “hype cycle,” cloud computing is currently descending from its Peak of Inflated Expectations, with several years still standing between the Trough of Disillusionment and the Plateau of Productivity. The question of cost seems to have followed a similar path over recent years. Cloud computing was expected to be the inevitable replacement for distributed computing if for no other reason than that it is cheaper. The massing of resources and the application of virtualization technologies are supposed to reduce cost per unit of compute power through greater utilization, and they enable amortization of costs across many users (beyond the company that owns the resources) as well as greater scalability (both up and down). Conceptually, this seems a simple matter, but numerous questions have been raised as to whether cloud computing really is cheaper. There may actually be no hard answer: Forbes notes, for example, “So is cloud computing really cheaper? The answer really does come down to how closely you are able to manage, track and adjust your infrastructure” (“Is Cloud Computing Really Cheaper?”).
Cloud computing can, however, convert capital expenses into operational expenses—a particularly large benefit for companies (and consumers) in an economy heavily laden with debt and short on investment capital. And this may be the greatest benefit to some companies, even if long-term costs differ little.
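The way the cost question hinges on utilization, and the capex-to-opex shift, can be illustrated with a back-of-the-envelope sketch. All figures below are hypothetical assumptions chosen for illustration, not real pricing data:

```python
# Hypothetical cost comparison: every number here is an illustrative
# assumption, not a real vendor price.

def on_prem_annual_cost(capex: float, lifespan_years: int,
                        annual_opex: float) -> float:
    """Straight-line amortization of an up-front purchase plus running costs.

    The capital expense is paid whether the hardware sits idle or not.
    """
    return capex / lifespan_years + annual_opex

def cloud_annual_cost(hourly_rate: float, hours_used: float) -> float:
    """Pay-as-you-go: cost scales with actual usage, with no up-front outlay."""
    return hourly_rate * hours_used

# Assumed figures for a small server fleet (purely hypothetical):
on_prem   = on_prem_annual_cost(capex=60_000, lifespan_years=5, annual_opex=8_000)
full_time = cloud_annual_cost(hourly_rate=2.50, hours_used=24 * 365)   # 24/7
part_time = cloud_annual_cost(hourly_rate=2.50, hours_used=8 * 250)    # business hours

print(f"On-premises (amortized): ${on_prem:,.0f}/yr")    # $20,000/yr
print(f"Cloud, running 24/7:     ${full_time:,.0f}/yr")  # $21,900/yr
print(f"Cloud, business hours:   ${part_time:,.0f}/yr")  # $5,000/yr
```

Under these assumed numbers the cloud is more expensive than owned hardware when instances run around the clock, but far cheaper when usage is tracked and capacity is shut down when idle—which is exactly the point of the Forbes observation above.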
Security is a major concern for all aspects of IT, but it has been one that has dogged cloud computing with a particular vengeance. Like cost, however, security may not be an inherent advantage or disadvantage for the cloud: it may simply be something that users must evaluate in each separate case to determine the safest approach (“Has Network Security for the Cloud Matured?”).
Where Is Cloud Computing Headed?
Given the historical origins of (or, at least, precedents for) cloud computing, as well as the evolution of cost and security concerns, cloud computing is likely to stabilize as one tool in the IT toolbox—neither a complete replacement of other compute models nor a fad that will disappear in a few years. Companies are trying different mixed approaches to cloud computing, ranging from purely public cloud computing, to purely private (company-owned) clouds, to hybrid approaches that attempt to garner the benefits of each while minimizing downsides. No model serves all purposes, and cloud computing is no exception.
Yet to be seen is the effect that environmental concerns will have on the cloud. Distributed computing—although possibly less efficient—forms less of a target than mega cloud data centers, with their multi-megawatt power appetites. Such concerns don’t have an impact on the technical aspects of the model, but they may have practical effects, such as higher costs through regulations and taxes.
However you define it, cloud computing has earned itself a place in IT. It is unlikely ever to take over computing wholesale, but it will remain an indispensable tool. The question is simply when it will reach a plateau of adoption in the market and how companies will use it in conjunction with other computing models.
Photo courtesy of pr_ip