December 17, 2007
REDMOND, Wash., Dec. 13 -- Microsoft Corp. this morning delivered a holiday surprise for customers and partners, unveiling a public beta of its hypervisor-based server virtualization technology, Hyper-V, a feature of some versions of Windows Server 2008. Customers and partners can now download Windows Server 2008 RC1 Enterprise with the beta version of Hyper-V to evaluate the new technology, test applications, and plan future consolidation, business continuity and high-availability projects. The beta had previously been expected in the first quarter of 2008, with the release to manufacturing (RTM) of Windows Server 2008. The beta is available for download at www.microsoft.com/ws08eval.
"Delivering the high-quality Hyper-V beta earlier than expected allows our customers and partners to begin evaluating this feature of Windows Server 2008 and provide us with valuable feedback as we march toward final release," said Bill Laing, general manager of the Windows Server Division at Microsoft. "Along with Hyper-V, Windows Server 2008 offers cost-effective and flexible licensing for virtualization so that customers and partners can extend the savings realized through server consolidation and deliver on the vision of Dynamic IT."
Currently, the beta for Windows Server 2008 with Hyper-V is available for the x64 Enterprise Edition in English. This beta release provides customers and partners with expanded features and capabilities not previously available in the September 2007 Community Technology Preview of Hyper-V, such as Quick Migration, high availability, Server Core role and Server Manager integration. The final version of Hyper-V remains on target for release within 180 days of the RTM of Windows Server 2008. As a feature of Windows Server 2008, Hyper-V is designed to provide a broad range of customers with familiar and cost-effective virtualization infrastructure software that can help reduce operating costs, increase hardware utilization, optimize infrastructure and improve server availability.
To provide integrated management of physical and virtual environments, Microsoft is also developing the next version of System Center Virtual Machine Manager. Customers will be able to use this integrated management tool to rapidly provision and configure new virtual machines, and centrally manage their virtual infrastructure, running on Hyper-V, Microsoft Virtual Server 2005 R2, VMware ESX Server and Virtual Infrastructure 3 (VI3).
Windows Server 2008 Moves Closer to Launch
The release of the Hyper-V beta marks another significant milestone for Windows Server 2008 as it approaches final release at the "Heroes Happen Here" launch in Los Angeles on Feb. 27, 2008. The launch, featuring Windows Server 2008, Microsoft SQL Server 2008 and Microsoft Visual Studio 2008, will be the largest enterprise launch for the Redmond, Wash.-based company and is just one of many events planned for a worldwide rollout. So far, almost 2 million customers around the world have obtained Windows Server 2008 evaluation code. Backed by such extensive testing by customers and partners, Windows Server 2008 will be the most reliable and secure server platform Microsoft has yet delivered, introducing role-based installation and management that make it easier to deploy and secure specific server roles.
More information about Windows Server 2008 and the Hyper-V beta is available at www.microsoft.com/windowsserver2008/virtualization/default.mspx.
Those wanting to read more about Windows Server 2008 and the Hyper-V beta on TechNet Blogs should go to http://blogs.technet.com/stbnewsbytes.
Founded in 1975, Microsoft is the worldwide leader in software, services and solutions that help people and businesses realize their full potential.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to handle computational loads at peak times that cannot be contained within their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab's Sergey Panitkin discussed the role of the Google Compute Engine in providing computational support to ATLAS, a detector of high-energy particles at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
The private-sector industry least likely to adopt public cloud services for data storage is finance. Because they hold the most sensitive and heavily regulated type of data, personal financial information, banks and similar institutions are mostly moving toward private cloud services, and doing so at great cost.
May 16, 2013 |
When it comes to the cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of doing CFD modeling in the cloud, benchmarking a common CFD code on Amazon EC2 HPC instance types with both CPU and GPU cores.
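The latency penalty described above can be illustrated with a toy alpha-beta communication model, in which the time to exchange a message is a fixed latency term plus a bandwidth term. The numbers below are illustrative assumptions for a local cluster interconnect versus a distant cloud link, not figures from the Bonn study:

```python
# Alpha-beta communication model: time = latency + message_size / bandwidth.
# All numbers below are illustrative assumptions, not measured results.

def exchange_time(n_bytes, latency_s, bandwidth_bps):
    """Estimated time (seconds) for one message exchange."""
    return latency_s + n_bytes * 8 / bandwidth_bps

# A tightly coupled CFD solver exchanging a 1 MB boundary ("halo") per iteration:
msg = 1_000_000  # bytes per halo exchange

lan = exchange_time(msg, latency_s=2e-6, bandwidth_bps=40e9)  # InfiniBand-class cluster
wan = exchange_time(msg, latency_s=50e-3, bandwidth_bps=1e9)  # distant cloud region

# Over a long-distance link the fixed latency term dominates, so each
# per-iteration exchange is far slower than on a local interconnect.
print(f"cluster:      {lan * 1e3:.3f} ms per exchange")
print(f"remote cloud: {wan * 1e3:.3f} ms per exchange")
```

Because a CFD solver performs such an exchange every iteration, the per-message latency compounds across thousands of iterations, which is why distance to the cloud matters even when raw bandwidth is adequate.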
May 10, 2013 |
Australian visual effects company Animal Logic is considering a move to the public cloud.
May 10, 2013 |
Program provides cash awards of up to $10,000 for the best open-source end-user applications deployed on a 100G network.
May 08, 2013 |
For engineers looking to leverage high-performance computing, the accessibility of a cloud-based approach is a powerful draw, but there are costs that may not be readily apparent.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can exploit these extraordinary data repositories, however, they must first harness and manage their data stores, using technologies that emphasize affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.