November 27, 2006
Following the conclusion of VMworld 2006 in Los Angeles, virtualization solution providers Entisys Solutions, Foedus, Solutions II, Govplace and The Pinnacle Group announced that "Total Enterprise Virtualization" -- the combination of virtualization solutions into comprehensive, enterprise-wide virtual infrastructures comprising desktops, servers and storage -- will drive the next phase of infrastructure virtualization.
"We see the adoption of Total Enterprise Virtualization quickly becoming the only viable way for companies large and small to gain maximum value from their desktop, server, and storage systems," stated Aaron Schneider, director of Services at The Pinnacle Group. "By virtualizing the total infrastructure versus one element or another, the benefits become exponential."
Total Enterprise Virtualization has been made possible, primarily, by the strong adoption of VMware server virtualization and the maturity and affordability of new virtual SAN software solutions like DataCore's SANmelody. In addition, an ecosystem of virtualization-complementary products and services has developed. The emergence of additional, major entrants into the server virtualization market, such as Microsoft, Virtual Iron and XenSource, is a testament to the tangible business benefits that virtualization affords -- chiefly, consolidation, flexibility and hardware independence, among others.
"We now have the products and services expertise to design and implement Total Enterprise Virtualization solutions," said Mike Strohl, president, Entisys Solutions, a VMware Authorized Consultant and DataCore partner. "VMware server virtualization opened the broader market to understanding that virtualization can liberate IT resources and services from the confines and limitations of physical devices, enabling them to be used more easily and efficiently. Once that is understood, it really doesn't make sense to stop with servers, now that products such as DataCore SANmelody are available."
According to Sean Burke, president, Govplace, "Storage virtualization is the perfect complement to server consolidation. Without it, the freedom and flexibility derived from server virtualization is quickly attenuated when it runs into the physical limitations of storage. Storage virtualization picks up where server virtualization leaves off, extending those benefits across the entire infrastructure of an enterprise."
David Stone, vice-president of business development, Solutions II, a VMware and DataCore partner, has also seen the benefits Total Enterprise Virtualization delivers to his customers. "Fundamentally, server virtualization combined with storage virtualization qualitatively changes, for the better, an organization's productivity and maximizes the use of its IT resources and services," said Stone.
When recently asked to comment on DataCore's growth and momentum in a channel predominantly focused over the last several years on VMware sales, George Teixeira, president and CEO of DataCore Software, responded, "The 'Server Virtualization Plus Storage Virtualization' message we've been advocating has been affirmed by our partners and their success with our solutions. These are very successful VMware partners who have built strong and capable consultancy practices and virtualization expertise. To these partners, our affordable SANmelody software has become the SAN solution of choice, making it possible for them to reach a broader market for server consolidation opportunities and to accelerate the sales cycles of those deals."
Mike Reilly, president and CEO of Foedus, concluded, "Until recently, the broader market wasn't ready to absorb the practical sense of Total Enterprise Virtualization. We are just emerging from the period of initial buy-in to the server consolidation revolution in which the astounding benefits of virtualization were revealed to the market. In addition, for many, storage virtualization had remained an expensive, high-end, data center solution, until DataCore made it practical for all implementations with its SANmelody virtual SAN solutions."
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. We therefore present a novel federation model that enables end-users to aggregate heterogeneous resources for large-scale problems. The feasibility of this federation model has been demonstrated, in the context of the UberCloud HPC Experiment, by gathering the most comprehensive information to date on the effects of pillars on microfluidic channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to handle peak computational loads that cannot be met by their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab’s Sergey Panitkin discussed the role of Google Compute Engine in providing computational support to ATLAS, a detector of high-energy particles at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
May 23, 2013 |
The study of climate change is one of those scientific problems where it is almost essential to model the entire Earth to attain accurate results and make worthwhile predictions. In an attempt to make climate science more accessible to smaller research facilities, NASA introduced what it calls ‘Climate in a Box,’ a system it notes acts as a desktop supercomputer.
May 16, 2013 |
When it comes to the cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of doing CFD modeling in the cloud by running a common CFD solver on Amazon EC2 HPC instance types with both CPU and GPU cores.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.