August 25, 2008
Cloud computing is gaining attention among technologists and the wider public, yet its definition remains a bit, well, cloudy. If you wouldn't mind one more cloud computing definition, I rather like mine.
In cloud computing, you care deeply about access to computing, storage, and network resources. You could not care less about where the servers carrying out these processes are located (as long as you keep in mind a few compliance requirements), nor how they are implemented. With cloud computing, what you don't see is what you get, and as long as you get that resource availability for an affordable price, there really is not much need to ask what is going on behind the scenes.
This "traditional" concept of cloud computing has some big stumbling blocks at the moment. My belief, however, is that many of these issues can be addressed by doing something intriguing with the cloud computing idea: bringing it inside your own datacenter.
How Do You Avoid the Current Limitations of "Traditional" Cloud Computing?
The benefits of a “traditional” cloud computing environment tend to center around cost, as well they should. Infrastructure implementers such as Amazon can provide a cloud service with extremely low operating costs, massive economies of scale, and minimal complexity. Yes, it's computing to the extreme and, one way or another, we'll all be better off because of it. End-users benefit from the service provider's economies of scale, the immediate availability of computing resources, and the resulting low costs. Even better, customers are billed for usage; in other words, you only pay for what you use -- and nothing else. This concept is a key to managing datacenter resources efficiently and is friendly not only to IT budgets, but also to the environment, since many users can share a smaller number of servers. Even greater savings and environmental benefits can come from automating power management to turn off servers when they are not needed.
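To make those two cost levers a little more concrete, here is a rough sketch in Python of usage-based billing and an idle-server power-down check. The hourly rate, idle threshold, and server names are invented for illustration; they are not any provider's actual pricing or management interface.

```python
# Illustrative only: hypothetical rate, threshold, and utilization figures.

HOURLY_RATE = 0.10      # assumed cost per server-hour actually consumed
IDLE_THRESHOLD = 0.05   # assumed CPU utilization below which a server counts as idle

def usage_bill(hours_used: float) -> float:
    """Pay only for the hours you use -- and nothing else."""
    return hours_used * HOURLY_RATE

def servers_to_power_off(utilization: dict) -> list:
    """Flag servers whose measured utilization has dropped below the idle threshold."""
    return [name for name, load in utilization.items() if load < IDLE_THRESHOLD]

if __name__ == "__main__":
    print(usage_bill(300))                                        # 30.0
    print(servers_to_power_off({"web-1": 0.62, "web-2": 0.01}))   # ['web-2']
```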
There are, however, some lingering issues that dampen cloud computing's otherwise sunny disposition. Chief among the roadblocks to full-fledged, no-holds-barred cloud computing are security, availability and the threat of lock-in to a single cloud service provider. As soon as any of your computing and storage resources find themselves outside of your four walls, there are new concerns that need to be addressed. The thought of pertinent, sensitive information existing somewhere out of your control is not altogether comforting. Furthermore, availability has become an issue. For example, Amazon's S3 service and a few others have, at times, struggled with resource availability, and service- and operating-level agreements are still unproven.
Along these lines also comes the concern of “platform lock-in.” Depending on how you plan to use cloud computing, lock-in can prove to be a frustrating restriction. While this only negligibly affects the raw, “infrastructure-as-a-service” version of the cloud, lock-in can cause problems at the “platform-as-a-service” and especially “software-as-a-service” levels. Software compatibility is limited, at best, across cloud providers. Platform portability is not an option, nor is architecture portability, at least for the time being.
Another aspect of this is that today's cloud providers only support a limited application architecture which does not accommodate most existing applications. The result? You will usually have to develop new applications compatible with the current limitations of your cloud computing service provider's environment. Now, vendor lock-in does not necessarily have to be a roadblock if you have a plan for managing it. But with the initial cloud providers still in the throes of tweaking their infrastructure, lock-in is a risk that must be evaluated and addressed.
Cloud Computing Inside Your Four Walls
Cloud providers will continue to improve their services, and customers will continue to gain ever more from the cloud. And while the technology is truly still in its infancy, I expect the next decade to usher in a new era of computing with cloud computing as one of its driving forces. In the meantime, careful consideration should be given to the circumstances under which external cloud providers make sense for your business.
However, as cloud computing has taken the fast track and jumped into our day-to-day vernacular, our industry may have gone ahead with a rather peculiar assumption: that the cloud must be delivered by an outside provider. The business of cloud computing boils down to two things: the architecture of the cloud, and the service provider that is responsible for delivering that architecture. The term “cloud computing,” at its core, refers to the architecture for how data is accessed, stored and processed. Cloud computing is a way for IT to be more effective and more efficient. In my opinion, many of the problems surrounding cloud computing come from the approach of the service providers, not from the underlying concept. The more control you have over the way cloud architecture is delivered, the more it will be able to help your business.
All this leads to a very interesting permutation of cloud computing, and perhaps one of the most immediately actionable and practical ones: cloud computing does not have to be limited to external providers.
What if you could construct your own internal cloud within your own four walls, to maximize the benefits of the concept without being held up by its limitations? And what if you didn't have to change your current applications or data while still being able to leverage external cloud resources where they best fit your needs? For many of us, this is, in fact, the ideal application of cloud computing. The internal cloud negates the limitations of current cloud providers, and enhances the cost and availability benefits. Here's your chance to be a little selfish and take advantage of cloud-style architecture for yourself. Set up a cloud within your own shop and you are experiencing, as they say, the best of both worlds -- and maybe getting a jump on the future, to boot.
So, say you did bring cloud-style architecture under your own roof. What, exactly, would an internal cloud look like? Internal cloud architecture, if done right, allows your existing control and security to remain intact. It keeps your proprietary business processes and workflows -- the unique and differentiated way you knit your applications together to deliver business value -- under your control. Your data never leaves the building, and compliance rules and procedures are maintained. Keeping the cloud inside your own environment will mean adding new capabilities to dynamically manage your resources. There's a bit of work involved here, but the result would be the pooling of your datacenter resources -- automatically allocating and de-allocating them based on your priorities and policies, and optimally managing the ebb and flow of both capacity and demand.
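As a sketch of what that policy-driven pooling might look like, the toy example below hands out servers from a shared pool in priority order and reports whatever is left over. The workload names, priorities, and pool size are hypothetical; a real internal cloud would feed this kind of policy with live demand and capacity measurements.

```python
# Illustrative only: a shared pool allocated in priority order.

from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    priority: int        # assumed convention: lower number = more important
    servers_needed: int

def allocate(pool_size: int, workloads: list) -> dict:
    """Grant servers to the highest-priority workloads first; report what is left."""
    allocation = {}
    remaining = pool_size
    for wl in sorted(workloads, key=lambda w: w.priority):
        granted = min(wl.servers_needed, remaining)
        allocation[wl.name] = granted
        remaining -= granted
    allocation["unallocated"] = remaining
    return allocation

if __name__ == "__main__":
    demand = [Workload("payroll", 1, 4), Workload("test-lab", 3, 6), Workload("web", 2, 5)]
    print(allocate(10, demand))
    # {'payroll': 4, 'web': 5, 'test-lab': 1, 'unallocated': 0}
```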
You could achieve economies of scale similar to those being hyped by external providers, but you'd be using your own infrastructure, applications and operations policies. The efficiency improvements are all yours. If you choose wisely, you can also make cloud infrastructure lock-in a non-issue (by virtue of there not being a service provider to lock you in). Similarly, you can implement an internal cloud without having to make any changes to your existing application software or architecture. Finally, implementing an internal cloud could also enable you to weave outside service providers into your strategy as their cloud offerings become more general-purpose.
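One simple way to picture that last point is a placement policy that prefers the internal cloud and spills over to an external provider only when the internal pool is full. The sketch below is purely illustrative -- the capacity figure and workload names are made up, and a real policy would also weigh data sensitivity, compliance, and cost.

```python
# Illustrative only: prefer the internal pool, burst to an external provider on overflow.

def place_workloads(internal_capacity: int, workloads: list) -> dict:
    """Assign workloads to 'internal' until capacity is exhausted, then 'external'."""
    placement = {}
    for i, name in enumerate(workloads):
        placement[name] = "internal" if i < internal_capacity else "external"
    return placement

if __name__ == "__main__":
    print(place_workloads(2, ["erp", "web", "batch-render"]))
    # {'erp': 'internal', 'web': 'internal', 'batch-render': 'external'}
```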
Last, but arguably not least, if tracking and managing datacenter costs is a priority for you, metered billing can be adapted to the internal cloud as well. And with that, you have your very own cloud, delivering all of its potential while minimizing the pitfalls of outsourcing. Easier said than done, to be fair, but most things tend to be. Cloud computing will drastically alter the IT landscape in the coming years, and it will be in your interest to have started experimenting with an internal cloud. Early work in this area will not only give you experience with all the benefits of cloud computing, but also provide some useful ways to steer clear of some of the concept's stormier issues along the way.
About the Author
Bill Coleman is CEO of Cassatt Corp., a San Jose, Calif.-based software company. Previously, he founded and was the first chairman and CEO of BEA Systems Inc.