April 18, 2010
We have heard all about "cloud technology" in countless articles, but when we get right down to it, what we call "cloud technology" is actually a collection of technologies, driven by methodologies and styles of computing, adapted to suit the mission-critical demands of enterprise HPC.
There is little consensus about the exact nature of cloud computing in enterprise HPC -- at least in terms of what so many in the community are calling it. Some suggest it is a technology in its own right, while others state that cloud is merely a style of computing. Others, including Addison Snell of Intersect360 Research, take the concept of a "style" or "form" of computing a bit further and call cloud a "methodology" of computing. While on the surface there may seem to be little difference between these terms, with growing adoption it is important that some consistency or consensus emerges. To arrive at a sense of cloud as a technology, a methodology of computing, or a "new" style of computing, the question was posed to a handful of members of the enterprise and HPC community.
Cloud as a Methodology of Computing
Wolfgang Gentzsch, Advisor to the EU project Distributed European Infrastructure for Supercomputing Applications (DEISA) and member of the Board of Directors of the Open Grid Forum, suggests that cloud computing is not a distinct technology, but rather is a combination of technologies that have evolved over the course of decades to create something far more associated with a methodology versus a style. Gentzsch states:
Cloud computing is many things to many people. If, however, you look closer at its evolution, from terminal-mainframe, to client-server, to client-grid, and finally to client-cloud (perhaps to terminal-cloud, or PDA-cloud, next), it is the logical result of a 20-year effort to make computing more efficient and more user friendly -- from self-plumbing to utility.
In my opinion, cloud comes closest to being a methodology, i.e., "a set of methods, practices, procedures, and rules" defined and applied by the IT community to provide user-friendly access to efficient computing. At a high level, these include: computing as a utility; pay-per-use billing; access over the internet -- anytime, anywhere; scaling resources; Opex instead of Capex; and so on.
To a lesser extent, this has to do with a specific technology that you would call cloud technology; the technological bits and pieces you need to build and use a cloud were developed before the term cloud computing was invented, and are thus independent of cloud. In fact, already in the '90s, the idea of the ASP (application service provider) was purest SaaS, and almost all the ingredients were already there: the Internet, secure portals, server farms, ASP-enabled applications, and software companies willing to implement. But all these components were still inefficient: server farms didn't scale, bandwidth was low, portals were clumsy, and most importantly, users weren't mentally ready for ASP.
Today, all the technology is on the table to build a most efficient, scalable, flexible, dynamic cloud. Still, the most severe roadblocks to cloud adoption today are the same as with ASPs and grids: they come from mental barriers and considerations such as privacy, competitiveness, and intellectual property issues. (See a more complete listing of roadblocks in my most recent blog.)
So, in my opinion, cloud computing is a methodology for utility computing, enabled by different modern technologies, supporting a new style of computing, i.e., computing via the Internet.
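Gentzsch's "Opex instead of Capex" point can be made concrete with a back-of-the-envelope comparison. The sketch below is illustrative only; all prices and utilization figures are hypothetical, chosen simply to show the shape of the trade-off:

```python
# Hypothetical Capex vs. Opex (pay-per-use) comparison.
# Every number here is an assumption for illustration, not real pricing.

capex_cluster_cost = 100_000.0  # upfront purchase, amortized over 3 years
amortized_hourly = capex_cluster_cost / (3 * 365 * 24)  # paid 24/7, used or not

cloud_hourly_rate = 8.0   # assumed pay-per-use rate for equivalent capacity
utilization = 0.25        # fraction of hours the capacity is actually needed

hours_per_year = 365 * 24
capex_yearly = amortized_hourly * hours_per_year               # fixed cost
opex_yearly = cloud_hourly_rate * utilization * hours_per_year  # scales with use

print(f"Capex: ${capex_yearly:,.0f}/yr  Opex: ${opex_yearly:,.0f}/yr")

# Break-even utilization: above this, owning wins; below it, renting wins.
break_even = amortized_hourly / cloud_hourly_rate
```

At low utilization the pay-per-use model dominates, which is exactly why bursty enterprise HPC workloads are the canonical cloud argument; sustained, near-100% utilization reverses the conclusion.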
Echoing this view of cloud as a methodology of computing versus a unique set of technologies (albeit using a different approach), Bruce Maches, former director of information technology for Pfizer's R&D division and current CIO for BRMaches & Associates, stated:
There are arguments that can be made on both sides (yes or no) for all three of the possibilities. I would argue no, cloud is not a technology in and of itself. Cloud computing is the natural evolution of the use of the infrastructure built around and supporting the internet and the services it provides. There is no one single technology you can point to and say 'that is cloud computing.' Certainly there are many computing advances that enable the leveraging of hardware and software resources over the internet and allow companies to avoid building out their own expensive infrastructure. To try to lump them into one technology called cloud just doesn't quite work.
Is cloud a style of computing? This is a harder one to define as a style can be a manner or technique. It would be difficult to come up with definitive arguments to say either yes or no. Is it a methodology? Is it a discipline on how computing resources, regardless of source, are appropriately and efficiently applied to solve problems? Are there underlying governance principles that can be used to determine if cloud computing is the right answer to meet a particular need?
I would make the argument that the application of cloud computing is the overall gestalt of using appropriate methodologies to determine when to apply the 'style' of cloud computing, all of which is supported by the underlying computing and networking technologies.
Enterprise and HPC Cloud as a (Not So) New Style of Computing
Weisong Shi is an Associate Professor of Computer Science at Wayne State University, where he directs the Mobile and Internet Systems Laboratory (MIST) and pursues research interests in computer systems and mobile and cloud computing. Shi, who co-authored this article, suggests that from the perspective of end users, cloud is a "new" style of computing, stating:
To discuss this, we need to take a look at the history of computing. I think there have been three phases of computing in the last 60 years. In the first phase (1960-1980), also known as the mainframe era, the common setting was a mainframe with tens of dumb terminals. If a user wanted to use a computer, he or she would have to go to a computer room and submit the job. The advantage of this style is that end users didn't need to maintain the computer, e.g., installing software and upgrading drivers, and so on, but at the cost of flexibility.

In the second phase (1980-2005), also known as the PC era, each user had his or her own computer -- this is what PC stands for, personal computer. The biggest advantage of this computing style is the flexibility it brought us. End users can do computing wherever they want, and don't have to go to a dedicated computer room. We have witnessed the success of this model since the inception of personal computers. However, as computers penetrate ever further into daily life, the computer becomes more and more like an appliance in our home, and end users want to treat a computer the same way they treat a TV or refrigerator. Apparently, the PC model does not work here, since it requires end users to install and maintain the computers by themselves; the PC era is also not very well designed for content sharing among multiple users, since the network is not treated as a first-class entity in this phase.
The fast growth of Internet services, e.g., Google Docs, YouTube, etc., together with the wide deployment of 3G/4G technologies, has stimulated another wave of revolution in the way we use computers, i.e., cloud computing. I think we are entering the cloud computing era, where end users will enjoy the flexibility brought by mobile Internet devices (MIDs) and the ease of management and sharing of their content, i.e., email, documents, photos, videos, and so on, brought by cloud computing. With cloud computing, we will realize the vision of "Computing for the Masses" in the near future.
From the technology point of view, I don't think cloud computing introduces too many new challenges or new ideas. What we need to do in these systems is use the existing techniques more efficiently. For example, the Dynamo system, designed by Amazon, uses the most common techniques in the distributed systems textbook, such as optimistic replication, quorum systems, and so on. In the Google File System (GFS), we don't see too many new ideas, either. The challenge they face is how to get these techniques to work in a large-scale setting, and how to use resources in a more efficient way. In summary, I think cloud computing is more of a "new" style of computing than a new technology or methodology.
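The quorum systems Shi cites from Dynamo rest on one simple invariant. The sketch below is a minimal illustration of that invariant (not Dynamo's actual implementation); the replica counts and helper names are chosen for the example:

```python
# Minimal quorum-replication sketch: with N replicas, writes must reach
# W of them and reads consult R of them. If W + R > N, every read quorum
# overlaps every write quorum, so a read always sees the latest write.

N = 3  # total replicas
W = 2  # replicas that must acknowledge a write
R = 2  # replicas consulted on a read
assert W + R > N  # the overlap guarantee

replicas = [{"version": 0, "value": None} for _ in range(N)]

def write(value, version):
    """Write to any W replicas (here: the first W)."""
    for replica in replicas[:W]:
        replica["version"] = version
        replica["value"] = value

def read():
    """Read from any R replicas (here: the last R, a worst-case choice
    that still overlaps the write set) and return the newest version."""
    responses = replicas[-R:]
    return max(responses, key=lambda r: r["version"])["value"]

write("v1", version=1)
write("v2", version=2)
print(read())  # "v2": the read quorum intersects the write quorum
```

This is exactly the textbook technique Shi describes: nothing new in principle, with Dynamo's engineering effort going into making it work across thousands of nodes and failure modes.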
When Definitions Become Stifling
Jose R. Rodriguez, CIO of San Jose, Calif.-based Metzli Information Technology, a consulting and implementation firm aligned with IBM Dynamic Infrastructure initiatives, suggests that cloud is a style, methodology, and blend of technologies at once, stating:
If we accept Irving Wladawsky-Berger's insight of cloud computing as the evolution of Internet-based computing, it is clear that not a single but multiple technologies are at work facilitating network access to a pool of configurable computing resources (NIST). That hardware-decoupled, virtualized shared resource pool is highly available, provisioned and released on demand (NIST), with a high degree of provider automation so as to minimize management overhead. Revision 15 of the NIST definition lists not one but three styles or models of delivering services via cloud computing: in the first, software as a service (SaaS), provider applications are accessible from heterogeneous end user computing devices; the second, platform as a service (PaaS), provides a homogeneous environment suitable for end user deployed and managed applications; and the third, infrastructure as a service (IaaS), is suitable for end user arbitrary deployment and control of applications, platform, storage, and processing.
It should be noted that in the aforementioned styles or service delivery models, the complexity of the underlying cloud infrastructure is hidden from the end user. Hence, cloud computing is rather a methodology delineating an evolving computing paradigm with characteristics of high availability and broadband access, elasticity, pooling of resources, and a mechanism to measure their usage (NIST). Accordingly, although cloud computing may be logically categorized into private, public, community, and hybrid deployment models, Irving Wladawsky-Berger might describe the evolving paradigm as analogous to the industrialization of the delivery mechanism for cloud services: the datacenter.
As John Hurley, principal investigator and director for the National Nuclear Security Administration and DOE-sponsored Center for Disaster Recovery, notes in his discussion on the topic, "The advancements that have revealed themselves in hardware, software and networking now enable us to solve much different kinds of problems. Of even greater importance is the fact that the potential for solutions to very real, practical and large-scale problems has not only enabled us, but actually requires us to really start from the beginning, in terms of how we define the problems and the resources to address them."
In short, the definitions we need to be most concerned with are those that move end users forward and keep innovation thriving. While it can be dangerous to put forth mixed messages about what cloud is (e.g., consistently calling it a "technology" as if it were rooted in one single innovation), with greater consensus, the overwhelming majority of writing on the topic can clarify cloud for end users by adhering to one definition -- that cloud is a blend of technologies that allow for new styles and methodologies of computing for enterprise and HPC users.