July 11, 2012
NEW YORK, July 11 — Uptime Institute, a division of The 451 Group, today announced the complete results of its second annual data center industry survey. The survey was developed to collect annual data on digital infrastructure deployment trends, procurement plans, measurement and standards practices, and other topics that affect the mission-critical data center industry.
"Among many interesting upward trends, we continue to see an increase in data center budgets, which is a pleasant surprise as many budgets in the IT sector are on the decline," said Matt Stansberry, Uptime Institute Director of Content and Publications. "Our survey, which has already gained industry recognition in its early stages, is a true picture of where the industry is headed, as our sample base represents many of the top data center owners and operators across the globe."
Uptime Institute's complete survey report drills down on many of the data points, carving out segments by company size, geography and vertical industry. It includes Uptime Institute's expert analysis on Data Center Infrastructure Management (DCIM) adoption, equipment manufacturer market share, energy saving strategies for data center operators, and much more. Complimentary access to the full report is available for download with registration: http://uptimeinstitute.com/2012-survey-results.
The 2012 survey represents responses from more than 2,000 owners, operators, vendors, consultants and users from around the world. The survey report focuses on the 1,100 owners and operators from this pool. Respondents were largely represented by the financial industry, technology service providers, manufacturing and government agencies. Over 75% manage more than one data center, with North America being the best-represented region.
About Uptime Institute
Uptime Institute provides independent thought leadership, certification, education and professional services for the global data center industry in more than 50 countries. It serves all industry stakeholders, including enterprise and third-party owners and operators, manufacturers, service providers and engineers. Through Uptime Institute Professional Services, Uptime Institute delivers due diligence assessments and Certifications of site infrastructure and site management in accordance with the Tier and Operational Sustainability Standards.
Uptime Institute, a division of The 451 Group, is headquartered in New York, with offices in key locations, including San Francisco, Washington DC, London, Boston, Seattle, Denver, Sao Paulo, Dubai and Singapore. The 451 Group also owns 451 Research, a leading technology-industry syndicated research and data firm focused on the business of enterprise IT innovation.
Source: Uptime Institute
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to handle computational demand at peak times that cannot be met by their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab's Sergey Panitkin discussed the role of Google Compute Engine in providing computational support to ATLAS, a detector of high-energy particles at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
The private industry least likely to adopt public cloud services for data storage is finance. Holding the most sensitive and heavily regulated of data types, personal financial information, banks and similar institutions are mostly moving toward private cloud services – and doing so at great cost.
May 16, 2013 |
When it comes to cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of running CFD modeling in the cloud, benchmarking a common CFD code on Amazon EC2 HPC instance types with both CPU and GPU cores.
May 10, 2013 |
Australian visual effects company Animal Logic is considering a move to the public cloud.
May 10, 2013 |
Program provides cash awards of up to $10,000 for the best open-source end-user applications deployed on a 100G network.
May 08, 2013 |
For engineers looking to leverage high-performance computing, the accessibility of a cloud-based approach is a powerful draw, but there are costs that may not be readily apparent.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.