October 03, 2008
It's as if we literally cannot go through a week without some breakthrough announcement in cloud computing. On Wednesday, the news was that both 3Tera and (drum roll, please) Amazon are bringing Windows support into their cloud offerings. (I know, I know: GoGrid has been offering Windows virtual servers for some time -- but Amazon being the cloud poster child and all ...)
First, we have 3Tera, whose AppLogic 2.4 solution includes "support for virtual appliances running Microsoft Windows Server incorporated in all infrastructure components necessary to run Web applications including storage, networking and load balancing." AppLogic already supports Linux, Open Solaris and Solaris 10, and now, says 3Tera, "Users of AppLogic can now utilize the most popular datacenter operating systems in their applications, and even mix and match operating systems within applications as needed."
Possibly bigger news, however, is that Amazon EC2 developers will soon be able to run Windows instances -- Windows Server and SQL Server -- in the Amazon cloud. Amazon cites the ability to run Windows as one of its most requested capabilities, and I can vouch for that. I cannot count the number of times I have been at a conference or on a phone call and heard someone -- either a potential user or an analyst -- bemoan the absence of Windows support within EC2.
What does Windows support for EC2 mean? Well, it means countless millions of .NET-based Web sites and applications will be able to run better in Amazon's cloud, and this includes, I presume, Exchange. Aside from testing and development, the overwhelming majority of cloud commentators -- if not all of them -- understand that non-mission-critical, non-business-differentiating applications like e-mail will be the first production applications to move into the cloud. If potential adopters have been using Amazon as a measuring stick for whether the cloud is ready for their needs, including Windows might be a big step in the right direction.
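For readers wondering what launching one of these Windows instances might actually look like, here is a minimal sketch against the EC2 API using the boto3 SDK (which, I should note, postdates this post; the AMI ID, instance type, and region below are purely illustrative, and the network call itself is commented out since it requires real AWS credentials):

```python
# Sketch: building a RunInstances request for a hypothetical Windows AMI.
# On EC2, the operating system is baked into the machine image (AMI),
# so "running Windows" means launching from a Windows AMI -- and paying
# whatever premium Amazon attaches to it.

def windows_instance_request(ami_id, instance_type="m1.large", count=1):
    """Build the parameter dict for an EC2 RunInstances call."""
    return {
        "ImageId": ami_id,          # hypothetical Windows Server AMI
        "InstanceType": instance_type,
        "MinCount": count,
        "MaxCount": count,
    }

params = windows_instance_request("ami-0123456789abcdef0")

# With credentials configured, the actual launch would be:
# import boto3
# ec2 = boto3.client("ec2", region_name="us-east-1")
# ec2.run_instances(**params)
```

The point of the sketch is that, from the developer's side, a Windows instance is requested exactly like a Linux one; the difference shows up in the image catalog and on the bill.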
But I'm no dummy (or so I like to think), so I'm not ready to declare this the day cloud computing became viable. Amazon still has a long way to go toward making EC2 enterprise-friendly, and I've heard nothing yet about the premium that will be paid on the Windows images. Microsoft might be serious about getting into the cloud game, but it's not about to let potential profit fall by the wayside. When this all becomes a reality and we know how the pricing looks, etc., it'll be a little easier to make a judgment.
All I can say for now is: Curse these providers for not giving me a break from cloud computing. I really need one -- I know there is more to the on-demand ecosystem than this -- but, well, I'm a hypocrite. I'm as sick and tired of cloud hype as the next guy, but as long as it's the next big thing, I'll be here talking about it. And in this case, it's one more step toward being the next thing. Period.
Posted by Derrick Harris - October 02, 2008 @ 11:20 PM, Pacific Daylight Time
Derrick Harris is the Editor of On-Demand Enterprise
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. To address them, we present a novel federation model that enables end-users to aggregate heterogeneous resources for large-scale problems. The feasibility of this federation model has been proven, in the context of the UberCloud HPC Experiment, by gathering the most comprehensive information to date on the effects of pillars on microfluid channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to absorb computational peaks that cannot be contained within their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab's Sergey Panitkin discussed the role of the Google Compute Engine in providing computational support to ATLAS, a detector of high-energy particles at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
May 23, 2013
The study of climate change is one of those scientific problems where it is almost essential to model the entire Earth to attain accurate results and make worthwhile predictions. In an attempt to make climate science more accessible to smaller research facilities, NASA introduced what they call ‘Climate in a Box,’ a system they note acts as a desktop supercomputer.
May 16, 2013
When it comes to cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined those latency issues by running a common CFD code on Amazon EC2 HPC instance types with both CPU and GPU cores.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges -- and opportunities -- afforded by Big Data. Before anyone can exploit these extraordinary data repositories, however, they must first harness and manage their data stores, and do so using technologies that deliver affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore the potential of heterogeneous computing, but the potential for this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.