March 31, 2011
ORANGE, Calif., March 31, 2011 - AFCOM, the world's leading data center association, today announced the release of “The State of the Data Center,” a status report highlighting results from a survey of 358 data center managers from around the world. AFCOM conducted the survey to reveal how data centers are adapting to the most critical challenges, technologies and economic factors. The report shows that top issues include the demands of space, energy efficiency, and physical and logical security.
Key findings in the report include:
“When it comes to disaster recovery, the survey results are indicative of the investment activity we have seen in data centers throughout the recession—focus on immediate needs with business continuity and disaster recovery planning considered a luxury,” said Richard Sawyer, Worldwide Practice Leader, Critical Facilities Assurance at HP Critical Facility Services, and member of the Data Center Institute Board of Directors. “But now, with the regional disasters in Australia, New Zealand, and Japan, and the turmoil in the Middle East, we are reminded it is a management necessity to be prepared for anything.”
“In an environment where change is an accepted part of day-to-day life, it is important to recognize how data center managers are adapting to the new technologies and directions emerging in the industry,” said Jill Yaoz, CEO, AFCOM. “One of the most interesting changes our survey illustrates is the continued transition to the cloud. When we last conducted this survey in October 2009, very few data centers were even interested in the cloud, let alone actually adopting it. However, thanks to information such as the Data Center Institute’s ‘Guide to the Cloud’ report, data center managers are now more familiar with the risks and concepts, and cloud computing is quickly becoming a new standard of operation.”
In August 2010, the Data Center Institute released its findings on cloud adoption, a trend that has continued to grow, according to the recent survey. Because of this continuing trend, AFCOM is offering another educational series on the cloud, as well as hosting a panel discussing the full results of “The State of the Data Center” survey, at the Spring 2011 Data Center World conference, running through March 31, 2011 at the Mirage Hotel in Las Vegas, Nev.
Media interested in viewing the full survey results should contact afcom-at-schwartzcomm-dot-com.
AFCOM (www.afcom.com) is the leading association supporting the educational and professional development needs of data center professionals around the globe. Established in 1980, AFCOM currently boasts more than 4,000 member data centers and 41 chapters worldwide, and provides data center professionals with unique networking opportunities and educational forums and resources through its annual Data Center World Conferences, published magazines, regional chapters, research and hotline services, and industry alliances.
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. Thus, we present a novel federation model that enables end-users to aggregate heterogeneous resources for large-scale problems. The feasibility of this federation model was demonstrated, in the context of the UberCloud HPC Experiment, by gathering the most comprehensive information to date on the effects of pillars on microfluidic channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems to coordinate efforts and to handle peak computational loads that exceed their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab’s Sergey Panitkin discussed the role of the Google Compute Engine in providing computational support to ATLAS, a detector of high-energy particles at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
May 16, 2013
When it comes to the cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of running CFD modeling in the cloud, benchmarking a common CFD code on Amazon EC2 HPC instance types with both CPU and GPU cores.
May 10, 2013
Australian visual effects company Animal Logic is considering a move to the public cloud.
May 10, 2013
Program provides cash awards of up to $10,000 for the best open-source end-user applications deployed on a 100G network.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.