December 10, 2007
LOWELL, Mass., Dec. 10 -- Virtual Iron Software, a provider of enterprise-class server virtualization and virtual infrastructure software, today announced Version 4.2 of its product. With the new release, Virtual Iron responds to the increased use of its platform in production environments and in support of advanced virtualization use cases such as disaster recovery, high availability and dynamic capacity management. The release, which includes several industry firsts, will be generally available in late December.
Virtual Iron Version 4.2 is the first Xen-based server virtualization solution to include:
The release also adds other new capabilities including:
“Mobility and availability are two of the most important criteria that end users look to in virtualized environments,” said Mark Bowker, senior analyst at Enterprise Strategy Group. “Virtual Iron has focused on providing the software and tools with these considerations in mind. The Virtual Iron 4.2 software suite provides these critical capabilities as part of an enterprise-class virtualization infrastructure solution that delivers benefits to datacenter and test environments of all sizes and across all industries.”
The server virtualization software market is expected to grow to as much as $7 billion in 2011 from $800 million in 2006, yet penetration of virtualization on installed x86 servers is estimated at just 6 percent today. Virtual Iron provides easy-to-use, enterprise-class capabilities on its next-generation architecture, significantly reducing the obstacles to mainstream market adoption. The platform combines the latest version of the Xen open source hypervisor with advanced virtualization services and policy-based automation capabilities such as LiveMigrate, LiveRecovery and LiveCapacity. It also features LiveProvisioning, a “zero touch” automated deployment capability that eliminates the need for physical installation or management of virtualization software on virtualized physical servers. Virtual Iron also takes full advantage of the latest processor-based, hardware-assisted virtualization capabilities, including AMD Rapid Virtualization Indexing and Intel FlexPriority, to deliver near-native performance. Users leverage Virtual Iron to support a broad range of datacenter initiatives including server consolidation, development and test optimization, high availability and disaster recovery, capacity management and server-hosted virtual desktops (VDI).
According to industry analyst IDC, there is an increasing demand for server virtualization solutions that can support advanced use cases such as high availability and disaster recovery. The research firm estimates that by 2010, over 60 percent of server virtualization deployments will support these types of initiatives, versus just 12 percent in 2006. As this shift occurs, user benefits are also expanding beyond reduced hardware costs to include significant increases in manageability and flexibility in the datacenter.
“Well over half of our customers today are leveraging Virtual Iron’s comprehensive virtualization platform to support more advanced use cases such as disaster recovery and high availability,” said Mike Grandinetti, chief marketing officer at Virtual Iron Software. “Version 4.2 adds to these already robust capabilities to extend support for our many end users running demanding workloads in production environments while making Virtual Iron even easier to install, deploy and manage.”
About Virtual Iron Software Inc.
Virtual Iron provides enterprise-class server virtualization software solutions that are easy to use and easy to afford. The software enables organizations of all sizes to dramatically reduce the cost and complexity of managing and operating their datacenters. The software includes advanced capabilities that leverage industry standards, open source economics and built-in hardware-assisted acceleration. Users leverage Virtual Iron to support a broad range of datacenter initiatives. The software is available exclusively through Virtual Iron’s Channel One partner network. Evaluation copies are available for free download at www.virtualiron.com/free. For more information, visit www.virtualiron.com.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to absorb peak computational loads that cannot be contained within their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab’s Sergey Panitkin discussed the role of the Google Compute Engine in providing computational support to ATLAS, a detector of high-energy particles at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
The private industry least likely to adopt public cloud services for data storage is financial services. Because they hold one of the most sensitive and heavily regulated data types, personal financial information, banks and similar institutions are mostly moving toward private cloud services – and doing so at great cost.
May 16, 2013
When it comes to cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of doing CFD modeling in the cloud, running a common CFD workload on Amazon EC2 HPC instance types with both CPU and GPU cores.
May 10, 2013
Australian visual effects company, Animal Logic, is considering a move to the public cloud.
May 10, 2013
Program provides cash awards of up to $10,000 for the best open-source end-user applications deployed on a 100G network.
May 08, 2013
For engineers looking to leverage high-performance computing, the accessibility of a cloud-based approach is a powerful draw, but it carries costs that may not be readily apparent.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.