November 05, 2008
I don’t think I’m forcing an analogy by comparing yesterday’s historic election to the current shift toward on-demand computing models. Both can be viewed as referendums on past practices that have left constituents in dicey situations, and both will take some time to bear the promised fruit.
If Barack Obama is successful in delivering the kind of change he promises, we can’t expect it will happen overnight. After all, it took years to get the United States into its current position, and it likely will take years -- assuming all goes well -- to dig it out. Laws need to be passed, plans need to be made, budgets need to be balanced, programs need to be established and processes need to be changed, and all of these will need adequate time to produce their desired effects.
If we consider cloud computing to be the Barack Obama of information technology, then we should grant it comparable time to work its purported magic. Today’s sprawling, over-packed, highly complex, heterogeneous, siloed datacenters have been years in the making. And while they have gotten the job done more often than not, the costs of maintaining them (not to mention growing them) have been skyrocketing at the same time monies allocated to these tasks have been slashed. Oh, and the demands for what IT is expected to deliver have gone up, too. There are no fast or easy fixes here.
Technologies like virtualization (and, on a smaller scale, grid computing) overcame objections and began the process of paring down the size of datacenters, but they also brought with them added complexities and, in some cases, performance issues. Newer technologies, like datacenter automation and rapidly evolving virtualization management techniques, and new paradigms, like cloud computing, promise to take these initiatives to the next level and beyond, and to eliminate some of those added setbacks. But they need time to deliver. Organizations need to practice piecemeal adoption as appropriate (no one is suggesting wholesale abandonment of current IT strategies), while vendors and providers must continue to enhance capabilities and address areas of concern (security and lock-in spring to mind).
In the end, cloud computing and other on-demand styles of computing might live up to their promises (I happen to think they will), but it’s too early to start demanding immediate results. That Microsoft, for example, is committing to cloud computing is a good sign in and of itself, and pressing it for details and deliverables probably isn’t the best idea (Microsoft arguably being guilty of rushing past releases, and all). That VMware is promising a Virtual Datacenter Operating System is groundbreaking, but it’s foolish to assume it will be bug-free -- or even interoperable with other hypervisors -- right off the bat. If the desired results don’t materialize in due time, then we can start calling for heads.
Assuming that in this day and age we like our revolutions bloodless, well-planned, measured transformations are the only option. Whether you supported Barack Obama or believe in the concept of cloud computing, you have to acknowledge there are inherent problems with the current state of affairs. Questioning cloud providers is a good thing and will only help them develop enterprise-ready solutions, but we’ve got to give them the chance to fully -- or even substantially -- mature. If another year or two goes by and cloud computing still isn’t ready for primetime, cries of “Put up or shut up!” might be justified.
Posted by Derrick Harris - November 05, 2008 @ 2:10 PM, Pacific Standard Time
Derrick Harris is the Editor of On-Demand Enterprise