January 17, 2012
If 2011 was the year that cloud computing gained public acclaim as a revolutionary new way to use computing and storage resources more efficiently, 2012 may be the year that the IT industry's predominant focus on the public cloud begins to give way to a greater appreciation of hybrid cloud computing.
Public clouds – which allow multiple users to share massively scalable arrays of servers, storage and other resources through the Internet for maximum cost savings – dominate the current cloud computing market. Most enterprises, however, still rely largely on either internal or externally hosted private clouds that reserve computing resources exclusively for their own organization.
Hybrid clouds, on the other hand, combine aspects of both these models, providing the control and legacy compatibility of dedicated internal resources where needed, while shifting other less-demanding or less-critical applications to the off-site cloud. Hybrid clouds, as I view them, can encompass not just a blend of internal and external computing resources, but also both dedicated physical servers and virtual servers. This inherent flexibility can make hybrid solutions a good choice for adding new capacity or capabilities to existing legacy systems while still gaining some of the cost and scalability benefits of the public cloud.
Amazon Web Services, with revenues approaching $1 billion a year, has become far and away the industry's largest cloud services provider while concentrating almost exclusively on the public cloud. Yet Amazon, even when combined with a host of lesser-known public cloud providers, is barely scratching the surface of cloud computing’s ultimate market potential.
While the public cloud market is probably at least 10 times larger than the hybrid cloud market today, I expect the hybrid cloud to begin closing that gap. It wouldn't be surprising, in fact, to see hybrid implementations reach 50 percent or more of the total market as mainstream enterprise users grow increasingly serious about moving more of their applications and infrastructure to the cloud.
Enterprise cloud concerns
The truth is, most enterprise computer users aren't willing, or able, to entrust their mission-critical applications to the public cloud. Their reluctance is partly due to concerns about security, reliability and regulatory compliance in a shared computing environment. Equally important, however, is the realization that there are some things the public cloud simply won’t do.
The problem is that while some applications run just fine on virtual servers, others don’t. When multiple virtual machines (VMs) share a physical server, or multiple users share the same hardware, you run the risk of one VM consuming a disproportionate share of the compute, storage or bandwidth resources. The remaining users are then left to vie for a small, fixed pool of resources, and their applications are prone to crash or perform poorly.
Unfortunately, users in a public cloud environment simply don’t know what demands other users are going to place on their shared computing resources. As a result, enterprises often don’t know whether their applications will even work in the public cloud.
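To make that contention concrete, here is a toy Python model of proportional sharing on an oversubscribed host. This is purely illustrative — the tenant names, numbers, and the simple proportional-scaling rule are all invented for this sketch; real hypervisors use far more sophisticated schedulers — but it shows how one heavy tenant shrinks everyone else's allocation even though their own demands never changed.

```python
def allocate(pool, demands):
    """Divide a fixed resource pool among tenant VMs.

    A crude model of an oversubscribed shared host: if total demand
    fits within the pool, every tenant gets what it asked for; if not,
    every tenant's grant is scaled down proportionally, so one noisy
    neighbor squeezes all the others.
    """
    total = sum(demands.values())
    if total <= pool:
        return dict(demands)  # no contention: grant demands in full
    scale = pool / total      # oversubscribed: shrink grants proportionally
    return {vm: demand * scale for vm, demand in demands.items()}

# Quiet day: three tenants each want 20 units of a 100-unit pool.
print(allocate(100, {"vm_a": 20, "vm_b": 20, "vm_c": 20}))
# → {'vm_a': 20, 'vm_b': 20, 'vm_c': 20}

# A noisy neighbor now demands 160 units; the other tenants' grants
# are cut in half even though their own demand never changed.
print(allocate(100, {"vm_a": 160, "vm_b": 20, "vm_c": 20}))
# → {'vm_a': 80.0, 'vm_b': 10.0, 'vm_c': 10.0}
```

The point of the toy model is the asymmetry of information: vm_b and vm_c cannot predict or control vm_a's demand, which is exactly the uncertainty enterprises face when sizing mission-critical workloads for a shared public cloud.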
My experience shows that some applications – particularly large enterprise resource planning databases, accounting systems and other I/O-intensive applications – require the full capabilities of a dedicated hardware system. That’s why offering both physical and virtual dedicated server hosting, in addition to a full range of public and private cloud services, makes sense. I have learned that when it comes to building and operating effective cloud-based systems, “one size fits all” is often not the best approach.
There are situations where it makes sense to keep at least some data and IT resources in-house. But there clearly are significant benefits to be gained from moving web servers, storage and other easily managed functions into the cloud. Just remember that whenever you’re ready to venture beyond your own data center and into the cloud, there are a wide variety of options available.
Start by identifying your specific needs and objectives, and then consider what makes sense to move into the cloud and what doesn’t. And unless you’re ready to make a major change, be sure to choose a solution that gives you the flexibility to run all the applications you already have.
About the Author
Denoid Tucker is Vice President of Technology for StrataScale, Inc., a Sacramento, Calif.-based provider of public and private cloud servers, dedicated servers and hybrid hosting services.