November 26, 2007
I like to spend my holidays thinking about anything but work, so I was a little upset over Thanksgiving when thoughts of cloud computing kept penetrating my mental barriers and infecting my mind.
“Isn’t cloud computing great?” I would think. “Isn’t it bound to be the next big thing in enterprise distributed computing?” Of course, the skeptic in me also posed some questions, such as “Haven’t we learned anything about the risks involved in giving new technologies catchy labels that mean different things to different people, and nothing to others?”
As an example of the latter thought, I read a couple of blogs discussing IBM’s upcoming Blue Cloud solutions. The authors praised the notion of cloud computing for small- and medium-sized companies that want access to computing resources without the hassle of managing their own datacenters, but questioned whether large companies would risk losing their datacenter-derived competitive advantages and becoming part of the greater IT homogeny. It’s a fine take on the future uptake of cloud computing, but it rests on a premise that I don’t think holds much water: that cloud computing is analogous to outsourcing.
After reading IBM’s announcement, I was in no way under the impression that IBM is about to offer businesses access to resources from any source other than their own datacenters. To me, the message IBM seems to be sending is that its Blue Cloud solutions will allow businesses to form a cloud from their own collections of resources, and that those resources will then be available to users in any location without worry about where their jobs are running. In addition, because the architecture mimics what we are seeing on the Web, the cloud also can be used to host services available company-wide, much as the services hosted by Google and Yahoo are available over the Internet. It is very similar to what we already have seen offered by companies like Cassatt, and it does not involve accessing computing resources outside the corporate firewall. In fact, while cloud computing certainly can be an external solution (see Amazon Elastic Compute Cloud), I foresee its early success coming mainly on the in-house front.
And, as we have seen with grid computing, it’s not only an issue of how specific vendors or users define a technology, but it also is an issue of whether entities even decide to use the term. Aside from “cloud computing,” terms like “fabric,” “application virtualization,” “datacenter virtualization,” “grid-based application platforms,” etc., all are used to describe similar, if not nearly identical, approaches. One example of this is my recent discussion with Yahoo about its new university partnership program. Whereas “cloud computing” was everywhere when Google and IBM announced their university project, it did not arise once with Yahoo, which seems content referring to Internet- or Web-scale computing.
Speaking of Yahoo’s new university partnership program, I was told by Yahoo vice president of worldwide research operations Ron Brachman, who also heads up academic relations for the company, that the partnership with Carnegie Mellon University is just the first step in Yahoo’s mission to vastly increase its R&D relationships with universities across the globe. “We know quite well we are not the only ones to advance the science and technology of the field,” said Brachman, which is why the company considers it important to work hand-in-hand with the tremendous research talent academia has to offer. As for this specific program, other universities will be brought on board once testing to determine whether the infrastructure is “up to snuff” is complete, but the company also plans to work with universities to advance open source Web-scale computing in many other ways, including the possibility of fostering a Hadoop summit meeting or funding specific research projects.
“It is very much our intention to go global with this idea because as we’ve seen with other open source work, the broader the community, the more robust the outcome,” said Brachman.
Of course, the program goes beyond what Brachman calls a “deep partnership” around joint research, as it also is intended (as is the Google/IBM project) to alter the computer science curricula at leading universities with the aim of producing future software engineers familiar with the kinds of computing done at Yahoo. According to Jay Kistler, vice president of engineering for the systems tools and services group at Yahoo, there is a “bit of a shortfall” in current curricula when it comes to teaching this style of computing, and most students -- both undergraduate and graduate -- are not exposed to it in any meaningful way.
Brachman echoed this sentiment, adding that, “As we look to hire new graduates, both at the undergraduate and graduate levels, we find that in most cases people are coming in with a good, solid core computer science traditional education … but not a great, broad-based education in all the kinds of computing that are near and dear to our business.”
However, noted Brachman, Yahoo is seeing a lot of excitement among university faculty and students around the real-world data and computing “that Yahoo lives and breathes every day,” and although still isolated, instances of universities updating their curricula to address Internet computing are increasing.
In regard to the similarities with Google’s recently announced program, Brachman didn’t go into much detail other than saying that there are some differences in the approaches of the two companies (although they are not glaringly obvious to me), but in the end, the more people who are involved in spreading Web-scale knowledge, the better off everyone will be.
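For readers unfamiliar with the style of computing Brachman and Kistler are describing, the Web-scale model popularized by Hadoop boils down to MapReduce: a job is expressed as a map step that runs independently across many machines and a reduce step that aggregates the grouped results. Here is a minimal single-machine sketch in Python; the function names and word-count example are illustrative only, not Hadoop’s actual API:

```python
from collections import defaultdict

def map_phase(documents):
    """Map step: emit (word, 1) pairs from each document, the way
    independent map tasks would on separate cluster nodes."""
    for doc in documents:
        for word in doc.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    """Reduce step: sum the counts for each key after the framework
    has grouped pairs by word (the 'shuffle' in a real cluster)."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

docs = ["the cloud", "the grid and the cloud"]
print(reduce_phase(map_phase(docs)))
# → {'the': 3, 'cloud': 2, 'grid': 1, 'and': 1}
```

The appeal at Web scale is that the map step is embarrassingly parallel and the framework, not the programmer, handles distribution, grouping and fault tolerance -- exactly the material the executives say most curricula skip.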
Moving on to this week’s issue, we see something similar to cloud computing pop up in the storage space with Seanodes’ new virtual storage pool solution, Exanodes. And although Thanksgiving makes for a slow news week, there are several announcements worth reading, including: “Digipede, Zeliade Deliver High-Performance Financial Apps”; “Sun, Zeus Team on Application Traffic Management”; “iTKO Announces Service-Oriented Virtualization”; and “DataSynapse Appoints New President.” Also, be sure to check out our Q&A with Layered Technologies president and COO Todd Abrams, who enlightens us on DynaVol, the company’s new online, virtual grid-based storage solution.
Comments about GRIDtoday are welcomed and encouraged. Write to me, Derrick Harris, at email@example.com.
Posted by Derrick Harris - November 26, 2007 @ 11:27 AM, Pacific Standard Time
Derrick Harris is the Editor of On-Demand Enterprise