December 10, 2007
Virtualization certainly has been a hot topic around these parts lately, and this week is no different. In fact, some might say it’s the only technology that matters, leaving all other enterprise infrastructure models in the dust as it moves from basic server consolidation into the higher-concept world of automation and high availability.
The 451 Group is one organization that might back up that statement, especially after the transformation it has witnessed the virtualization market undergo in the past year. Of these changes, the most staggering has to be the more-than-eightfold year-over-year increase in the number of companies peddling virtualization management solutions, from six last December to 50 presently (according to its recently announced report). And not only is the market expanding, but the subsectors within it are, as well, with The 451 Group noting 10 distinct areas: administration; automation; backup and high availability; capacity planning; infrastructure virtualization; monitoring; optimization; security; test lab automation; and workspace virtualization.
The mention of “capacity planning” actually segues into the next item I wanted to mention -- Luke Flemmer’s article on dynamic provisioning with Amazon EC2 or similar internal models. According to Flemmer, managing director and co-founder of Lab49, EC2 takes virtualization to the next level, allowing companies to quickly and easily ramp up additional resources to meet periodic spikes in demand. For companies that can’t use the service due to security concerns, Flemmer says they should implement schemes of their own that maintain a reasonable level of resources and avoid the old trap of buying far too many boxes just to meet occasional peak loads. Virtualization will play a big part in such initiatives.
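Flemmer's internal-provisioning idea boils down to a simple control loop: watch utilization, add capacity ahead of peaks, and release it afterward, rather than buying for the worst case. A minimal sketch of that loop follows; the function name, thresholds, and scaling policy are all illustrative assumptions, not anything from Flemmer's article or a real product.

```python
# Minimal threshold-based provisioning loop: grow a pool of virtual
# servers when utilization runs hot, shrink it when utilization runs
# cold. All names and thresholds are illustrative.

def plan_capacity(active_servers, utilization,
                  scale_up_at=0.80, scale_down_at=0.30,
                  min_servers=2, max_servers=20):
    """Return the number of servers the pool should have next cycle."""
    if utilization > scale_up_at:
        # Running hot: add ~25% headroom instead of buying for the peak.
        target = active_servers + max(1, active_servers // 4)
    elif utilization < scale_down_at:
        # Running cold: release idle capacity instead of letting it sit.
        target = active_servers - 1
    else:
        target = active_servers
    return max(min_servers, min(max_servers, target))

# A spike arrives: a 10-server pool at 92% utilization grows to 12...
print(plan_capacity(10, 0.92))  # 12
# ...while a quiet pool of 10 at 15% utilization shrinks to 9.
print(plan_capacity(10, 0.15))  # 9
```

The interesting design question is the hysteresis gap between the two thresholds: without it, a pool hovering near a single threshold would thrash between scaling up and down every cycle.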
Amazon EC2 also comes into play in Red Hat’s new MRG (Messaging, Real-Time, Grid) distributed computing platform, as EC2 is one of many resource centers across which MRG users can schedule jobs. You’ve probably read about this software already, but if not -- or if you just need to refresh your memory -- you can get the background here.
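The scheduling idea behind a platform like MRG can be illustrated generically: fill local resource pools first, and burst overflow work to a metered pool such as EC2 only when local slots run out. The toy policy below is my own sketch of that idea, not Red Hat's actual scheduler or any real API.

```python
# Toy "burst to cloud" placement policy: jobs land on free local slots
# first; a metered cloud pool absorbs the overflow; anything left waits.
# Purely illustrative -- not Red Hat MRG's real scheduling logic.

def place_jobs(jobs, local_slots, cloud_slots):
    """Map each job name to 'local', 'cloud', or 'queued'."""
    placement = {}
    for job in jobs:
        if local_slots > 0:
            placement[job] = "local"
            local_slots -= 1
        elif cloud_slots > 0:
            placement[job] = "cloud"   # overflow bursts to an EC2-style pool
            cloud_slots -= 1
        else:
            placement[job] = "queued"  # everything is full; wait a cycle
    return placement

print(place_jobs(["a", "b", "c", "d"], local_slots=2, cloud_slots=1))
# {'a': 'local', 'b': 'local', 'c': 'cloud', 'd': 'queued'}
```

A production scheduler would of course weigh cost, data locality, and job priority rather than simple first-fit, but the local-before-metered ordering is the core of the burst model.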
There are plenty of noteworthy items elsewhere in this week’s issue, so be sure to check out: “IBM, CARE Use Grid to Advance Microfinance in Africa”; “Platform Computing Announces Symphony 4”; “European Agencies Unite Grid, Finance”; “Sun Releases xVM Ops Center for Datacenter Automation”; “Fujitsu Announces Grid-Based CentricStor”; and “HPC, Grid Leaders Offer Webinar for Insurance Companies.”
Comments about GRIDtoday are welcomed and encouraged. Write to me, Derrick Harris, at email@example.com.
Posted by Derrick Harris - December 10, 2007 @ 10:58 AM, Pacific Standard Time
Derrick Harris is the Editor of On-Demand Enterprise
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
The private industry least likely to adopt public cloud services for data storage is financial services. Because banks and similar institutions hold one of the most sensitive and heavily regulated data types, personal financial information, they are mostly moving toward private cloud services -- and doing so at great cost.
In this week's hand-picked assortment, researchers explore the path to more energy-efficient cloud datacenters, investigate new frameworks and runtime environments that are compatible with Windows Azure, and design a unified programming model for diverse data-intensive cloud computing paradigms.
May 16, 2013 | When it comes to cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency of running CFD modeling in the cloud, benchmarking a common CFD code on Amazon EC2 HPC instance types with both CPU and GPU cores.
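It is easy to build intuition for why latency hurts CFD in particular: a tightly coupled solver exchanges boundary data every iteration, so the per-message network latency is paid thousands of times and does not shrink as nodes are added. A back-of-envelope model follows; all numbers are illustrative assumptions, not figures from the Bonn study.

```python
# Back-of-envelope model of why network latency dominates tightly
# coupled CFD in the cloud: every iteration pays the message latency,
# and that cost does not shrink with node count. Numbers are made up.

def time_per_iteration(compute_s, nodes, latency_s, msgs_per_iter=4):
    """Seconds per iteration: compute divides across nodes, latency doesn't."""
    return compute_s / nodes + msgs_per_iter * latency_s

compute = 0.5  # seconds of single-node compute per iteration (illustrative)

for name, latency in [("HPC interconnect", 2e-6), ("cloud network", 500e-6)]:
    t = time_per_iteration(compute, nodes=64, latency_s=latency)
    print(f"{name}: {t * 1000:.2f} ms/iteration")
```

At 64 nodes the compute share is under 8 ms per iteration, so a half-millisecond message latency adds a visible fraction of the runtime -- and the penalty only grows as the compute share keeps shrinking with more nodes.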
May 10, 2013 | Australian visual effects company Animal Logic is considering a move to the public cloud.
May 10, 2013 | Program provides cash awards of up to $10,000 for the best open-source end-user applications deployed on a 100G network.
May 08, 2013 | For engineers looking to leverage high-performance computing, the accessibility of a cloud-based approach is a powerful draw, but there are costs that may not be readily apparent.
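One way to surface those less-obvious costs is to put them in the same per-job formula as the headline instance price; data transfer in particular is easy to overlook. The comparison below is deliberately simplified and every price in it is made up for illustration.

```python
# Simplified per-job cost model for cloud HPC: the instance-hour price
# is only part of the bill once data transfer and storage are included.
# All prices here are made up for illustration.

def cloud_job_cost(node_hours, price_per_node_hour,
                   gb_out, egress_per_gb,
                   storage_gb_month=0.0, storage_price=0.10):
    compute = node_hours * price_per_node_hour
    egress = gb_out * egress_per_gb            # results shipped back home
    storage = storage_gb_month * storage_price # results parked in the cloud
    return compute + egress + storage

# 32 nodes for 10 hours at $1.50/node-hour looks like $480 of compute...
base = cloud_job_cost(320, 1.50, gb_out=0, egress_per_gb=0.12)
# ...but shipping 2 TB of results home adds roughly 50% to the bill.
full = cloud_job_cost(320, 1.50, gb_out=2048, egress_per_gb=0.12)
print(base, full)
```

The point of even a crude model like this is that the cheapest-looking instance type is not necessarily the cheapest job once the output data has to leave the provider's network.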
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore the potential of heterogeneous computing, but the potential for this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.