September 04, 2012
New server is built from the cloud up for the modern datacenter
REDMOND, Wash., Sept. 4 — Today, in a global online launch event, Satya Nadella, president of the Microsoft Server and Tools Business, announced the general availability of Windows Server 2012. In his keynote speech, Nadella described Windows Server 2012 as a cornerstone of the Cloud OS, which provides one consistent platform across private, hosted and public clouds.
"The operating system has always been the heartbeat of IT and is now undergoing a renaissance in the new world of continuous cloud services, connected devices and big data," Nadella said. "Microsoft's unique legacy in the most widely used operating systems, applications and cloud services positions us to deliver the Cloud OS, based on Windows Server and Windows Azure, helping customers achieve a datacenter without boundaries."
Enabling the Modern Datacenter
Microsoft built Windows Server 2012 from the cloud up, applying its experience operating global datacenters that rely on hundreds of thousands of servers to deliver more than 200 cloud services. Windows Server 2012 expands the definition of a server operating system, with significant new advancements in virtualization, storage, networking and automation. Hundreds of new features can help customers achieve a transformational leap in the speed, scale and power of their datacenters and applications. In combination with Windows Azure and System Center, Windows Server 2012 empowers customers to manage and deliver applications and services across private, hosted and public clouds.
Customers Find Success With Windows Server 2012
Customers can use their existing skills and investments in systems management, application development, database, identity and virtualization to take advantage of Windows Server 2012 and realize the promise of cloud computing. Many enterprise customers are already seeing tremendous value in early deployments. A survey of 70 early adopter customers from across the globe revealed that they expect, on average, a 52 percent reduction in downtime, a 41 percent reduction in workload deployment time, and 15 hours of productivity saved per employee per year. Ninety-one percent of the companies surveyed expect a reduction in server administration labor, and 88 percent expect a reduction in network administration labor.*
Menzies Aviation, an airline passenger and cargo handling company that employs more than 17,000 people, is using Windows Server 2012 to provide identity access management and information access policies to its employees as it rapidly incorporates newly acquired businesses.
"We are very impressed by Windows Server 2012 and Microsoft's overall solution to help us manage our systems and applications across our private cloud environments as they scale with our business," said Martin Gallington, senior vice president of IT at Menzies Aviation. "This is a dramatic leap forward, matched by a simple, cost-effective pricing model."
Equifax is a global information solutions provider that organizes and assimilates data on more than 500 million consumers and 81 million businesses worldwide. It now counts on Windows Server 2012 for improved reliability and uptime of its information services to clients.
"Windows Server 2012 revolutionizes how we can operate our datacenter, allowing us to better meet our commitments," said Bryan Garcia, chief technology officer at Equifax. "The new high availability technologies help us deliver 'always-on' applications, and we're betting on Hyper-V as a critical component of our private cloud strategy. We are gaining tremendous efficiencies, which translate into more time to innovate for company growth."
More information about Windows Server 2012 and the Cloud OS is available on Microsoft's website, along with Satya Nadella's post on The Official Microsoft Blog. The conversation on Twitter can be followed at #WinServer.
Founded in 1975, Microsoft (Nasdaq "MSFT") is the worldwide leader in software, services and solutions that help people and businesses realize their full potential.
* "Windows Server 2012 Rapid Deployment Program: TCO Study Whitepaper," Microsoft Corp., June 2012
Source: Microsoft Corp.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to absorb computational demand at peak times that exceeds their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab's Sergey Panitkin discussed the role of Google Compute Engine in providing computational support to ATLAS, a high-energy particle detector at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of using the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds for some of those obstacles.
Financial institutions are the private industry least likely to adopt public cloud services for data storage. Holding the most sensitive and most heavily regulated type of data, personal financial information, banks and similar institutions are mostly moving toward private cloud services, and doing so at great cost.
May 16, 2013
When it comes to the cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of doing CFD modeling in the cloud by benchmarking a common CFD code on Amazon EC2 HPC instance types, including both CPU and GPU instances.
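A rough way to get a feel for such latency effects is simply to sample round-trip times and summarize them. The sketch below is a hypothetical illustration, not taken from the Bonn study: it times a repeated operation (stubbed here with a local no-op so the script is self-contained; in a real experiment it would be a network round trip or an MPI ping-pong between instances) and reports mean and 95th-percentile latency.

```python
import time

def summarize(samples_ms):
    """Return (mean, p95) of a list of latency samples in milliseconds."""
    ordered = sorted(samples_ms)
    mean = sum(ordered) / len(ordered)
    # Nearest-rank 95th percentile, clamped to the last element.
    p95 = ordered[min(len(ordered) - 1, int(0.95 * len(ordered)))]
    return mean, p95

def measure(op, trials=100):
    """Time `op` repeatedly and return per-call latencies in milliseconds."""
    samples = []
    for _ in range(trials):
        start = time.perf_counter()
        op()
        samples.append((time.perf_counter() - start) * 1000.0)
    return samples

# A no-op stands in for the actual round trip to a cloud instance.
latencies = measure(lambda: None)
mean_ms, p95_ms = summarize(latencies)
print(f"mean={mean_ms:.4f} ms  p95={p95_ms:.4f} ms")
```

Reporting a tail percentile alongside the mean matters for tightly coupled codes like CFD solvers, where the slowest message in each exchange gates overall progress.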
May 10, 2013
Australian visual effects company Animal Logic is considering a move to the public cloud.
May 10, 2013
Program provides cash awards of up to $10,000 for the best open-source end-user applications deployed on a 100G network.
May 08, 2013
For engineers looking to leverage high-performance computing, the accessibility of a cloud-based approach is a powerful draw, but there are costs that may not be readily apparent.
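One way to make those less apparent costs concrete is a back-of-the-envelope model that adds data-egress and storage charges to the obvious per-hour compute price. The function and rates below are placeholders for illustration, not any provider's actual pricing:

```python
def monthly_cloud_cost(instance_hours, hourly_rate,
                       egress_gb, egress_rate_per_gb,
                       stored_gb, storage_rate_per_gb):
    """Estimate a monthly bill: compute + data egress + storage.

    All rates are hypothetical; real pricing varies by provider,
    region, and instance type, and often includes further charges
    (I/O requests, support plans, reserved-capacity discounts).
    """
    compute = instance_hours * hourly_rate
    egress = egress_gb * egress_rate_per_gb
    storage = stored_gb * storage_rate_per_gb
    return compute + egress + storage

# Example: 200 node-hours of an HPC instance, moving 500 GB of results
# out, and keeping a 1 TB input dataset online for the month.
total = monthly_cloud_cost(instance_hours=200, hourly_rate=2.40,
                           egress_gb=500, egress_rate_per_gb=0.09,
                           stored_gb=1000, storage_rate_per_gb=0.10)
print(f"estimated monthly cost: ${total:.2f}")
```

With these placeholder rates the example prints $625.00; note that nearly a quarter of it comes from data movement and storage rather than compute time, which is exactly the kind of cost that is easy to overlook.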
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges and opportunities of Big Data. Before anyone can exploit these extraordinary data repositories, however, they must first harness and manage their data stores using technologies that emphasize affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.