December 11, 2012
IRVINE, Calif., Dec. 11 – Vision Solutions, Inc., the world's leading provider of information availability software and services for the Windows, Linux, IBM Power Systems and Cloud computing markets, has released its "State of Resilience" report for 2012, based on surveys of over 530 IT professionals worldwide representing more than 30 countries. The most significant finding was that 47% of respondents plan to use a hybrid strategy combining storage, software and cloud-based technologies to protect their critical IT infrastructures and applications, compared with roughly 13% each who said they would rely on one of those technologies alone for recovery and continuity.
Alongside this, the report findings include an overview of the increasing pressure to deliver 24/7 access and the acceptance of virtualization as a core component of resilience in the IT environment.
The fifth annual "State of Resilience" provides analysis of new and existing technologies that IT professionals are employing to protect their critical information and ensure business continuity. This year's report addresses the role that cloud has in data protection strategies, future plans for business continuity and the increasing importance of high availability solutions.
"Against a backdrop of natural disasters that had serious personal and business ramifications, as well as a range of high profile IT failures, the theme of resilience has been in the public eye more than usual. While the majority of companies have a disaster recovery strategy for IT, this year's report emphasizes that for most organizations, a traditional strategy is no longer adequate. Business demand for 24/7 access to data and applications drives the need for high availability and disaster recovery solutions and we expect this trend to continue as globalization and around-the-clock access pervade. As each and every IT transaction becomes critical, the need for real-time protection increases, regardless of the platform on which the application itself is running," said Alan Arnold, CTO of Vision Solutions.
Joint State of Resilience Webinar with IBM Corporation
The findings of the fifth annual State of Resilience Report 2012 will be discussed in a webinar presented by Vision Solutions and IBM. Speakers on the webinar will be Jacqueline Woods, Global Vice President of Systems Software and Growth Solutions at IBM, and Kim Kaminski, Vice President of Marketing at Vision Solutions.
Time: 10:00am USA CST / 4:00pm GMT / 5:00pm CET
Date: Tuesday, December 11th
About Vision Solutions
Vision Solutions, Inc. is the world's leading provider of information availability software and services for the Windows, Linux, IBM Power Systems and Cloud Computing markets. Vision's trusted Double-Take, MIMIX and iTERA high availability and disaster recovery brands support business continuity, satisfy compliance requirements and increase productivity in physical and virtual environments. Affordable and easy-to-use, Vision products are backed by worldwide 24/7 customer support centers and a global partner network that includes IBM, HP, Microsoft, VMware and Dell.
Source: Vision Solutions
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. To address this, we present a novel federation model that enables end-users to aggregate heterogeneous resources for large-scale problems. The feasibility of this federation model has been demonstrated, in the context of the UberCloud HPC Experiment, by gathering the most comprehensive information to date on the effects of pillars on microfluidic channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to absorb peak computational loads that cannot be contained within their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab's Sergey Panitkin discussed the role of the Google Compute Engine in providing computational support to ATLAS, a detector of high-energy particles at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
May 23, 2013 |
The study of climate change is one of those scientific problems where it is almost essential to model the entire Earth to attain accurate results and make worthwhile predictions. In an attempt to make climate science more accessible to smaller research facilities, NASA introduced what it calls 'Climate in a Box,' a system the agency describes as a desktop supercomputer.
May 16, 2013 |
When it comes to cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of running CFD modeling in the cloud by benchmarking a common CFD code on Amazon EC2 HPC instance types featuring both CPU and GPU cores.
May 10, 2013 |
Australian visual effects company Animal Logic is considering a move to the public cloud.
May 10, 2013 |
Program provides cash awards of up to $10,000 for the best open-source end-user applications deployed on a 100G network.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore the potential of heterogeneous computing, but the potential for this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.