September 30, 2008
Although I'm not the first one to say it, I'm going to say it nonetheless: Larry Ellison has a point about cloud computing being "gibberish." Currently, the term is driven largely by hype, trendiness and vagueness, and Oracle is far from the first software vendor to use it to its advantage. (If you haven't heard Ellison's comments, you can get the gist here.) But what Ellison -- like the oil and coal stalwarts in the energy sector -- fails to understand (or refuses to acknowledge publicly) is that the long-term benefits of "fashionable" trends like clean energy and cloud computing are so strong they all but guarantee success.
I wrote last week that Oracle was smart to enable use of its products on Amazon's EC2, and I stand by that stance. After all, that is exactly Ellison's point: marketing-wise, at least, Oracle can claim a cloud offering without investing many resources in actually building one. What Ellison's statement made me realize, though, is that this isn't just a first step -- Oracle's cloud strategy really is repackaging and relabeling, with seemingly little effort or innovation thrown into the mix. And that's a problem.
Simply repackaging existing products as cloud-ready is not much different than touting clean coal or nuclear energy as clean energy technologies. Technically, the statements might be accurate, but the underlying reasons for excitement around cloud computing and clean energy run deeper than Internet access or, in the case of energy, being "clean." Clean coal, for example, might be cleaner, but it is not clean -- and don't underestimate the costs of developing the technology. In addition, coal mining is still a dangerous, laborious process that requires untold financial and human resources to pull off -- and the supply will run out eventually. Nuclear energy, for its part, is clean-burning, but it, too, is costly to produce, and waste storage is an issue that is not going away. (Currently residing in Las Vegas, Yucca Mountain's backyard, I can attest to that.) I won't even get into the whole meltdown issue.
Truly clean energies -- wind, solar, geothermal, etc. -- are so lauded because they are both clean and renewable. And once the upfront infrastructure and development costs are offset, the ROI is through the roof. In the end, everyone benefits -- providers get a free and endless supply of energy, customers pay less as a result, and the atmosphere gets a break from exorbitant greenhouse gas emissions. Yes, they are new, different and not quite ready for primetime, but they are the future -- a confluence of environmental, economic and even national security concerns will ensure this outcome.
The same holds true for cloud computing. Repackaging an existing software portfolio and making it available on EC2 is a starting point, but it is not a successful long-term strategy for addressing the needs of enterprise IT consumers. We all know that IT budgets are being slashed at the same time demands for IT performance and capabilities are skyrocketing. So, while being able to host Oracle software in a cloud might let customers save some cash on physical boxes, it is far from the answer to user woes. As Geva Perry points out in his blog (ignore the inherent vendor bias), this solution still requires customers to pay full upfront Oracle licensing fees, and the software doesn't seem to have been optimized to operate in an on-demand environment. From an OPEX standpoint, I can't imagine there is any less demand for DBAs. In fact, they might even need to be more skilled to manage this environment in the cloud.
What customers need, though, is a combination of lower CAPEX, lower OPEX and improved performance. Done right, a cloud solution not only lets customers skip the cost and hassle of managing a bunch of physical boxes, but also lets them pay less for licensing via a utility billing model, work with a lightweight stack optimized for operating in a virtual environment, and scale on demand when workloads require more resources. When the stars do align (including the IT Polaris -- security), these are among the attributes that will make cloud computing the way enterprise IT is done. Software vendors who get "the cloud" get this.
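To see why the utility billing model matters so much to the CAPEX argument, here is a back-of-the-envelope sketch. All figures are made-up illustrations -- not actual Oracle license or EC2 prices -- but they show how metered billing can undercut a perpetual license for workloads that only run part-time:

```python
# Hypothetical cost comparison: perpetual upfront licensing vs. a
# utility (pay-per-use) billing model. Every number below is an
# illustrative assumption, not a real vendor price.

def upfront_cost(license_fee, annual_support_rate, years):
    """Total cost of a perpetual license plus yearly support fees."""
    return license_fee + license_fee * annual_support_rate * years

def utility_cost(rate_per_hour, hours_per_year, years):
    """Total cost when paying only for metered hours of actual use."""
    return rate_per_hour * hours_per_year * years

# Assumptions: $47,500 perpetual license with 22% annual support,
# vs. $2.50/hour metered, with the database busy 2,000 hours/year.
years = 3
traditional = upfront_cost(47_500, 0.22, years)
metered = utility_cost(2.50, 2_000, years)

print(f"Perpetual license over {years} years: ${traditional:,.0f}")
print(f"Utility billing over {years} years:  ${metered:,.0f}")
```

With these (again, hypothetical) numbers, the metered model comes out to a fraction of the perpetual-license cost -- which is exactly the gap a repackaged, full-fee offering leaves on the table.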
Just as oil, coal and even nuclear energy have gotten the job done thus far and made a lot of money doing it, so, too, has Oracle capably handled enterprises' database needs. But external forces -- some of which are direct results of these models (e.g., global warming, or sky-high licensing fees and architectural complexity) -- are making these methods increasingly problematic, and new solutions are stepping up to replace them (as much as is possible, at least). Like clean coal, Oracle's Amazon enablement is a first step, but that's it.
In this case, Ellison's lipstick-on-a-pig vision will not reap results in the long run. In the years to come, he'll either change his tune or become irrelevant. Heck, even Microsoft is taking note.
Posted by Derrick Harris - September 29, 2008 @ 11:27 PM, Pacific Daylight Time
Derrick Harris is the Editor of On-Demand Enterprise