MAYNARD, MA., December 6, 2010 -- IT groups want uptime assurance and know they need high availability (HA) solutions, but are stymied in their search by a short-sighted view of price and an inadequate accounting of downtime costs, according to a Stratus Technologies/ITIC survey of 367 IT organizations.
The "2010 High Availability & Virtualization Survey" is a follow-up to an identical survey conducted in April/May 2010. The two surveys reveal a consistent ramp-up in demand for high availability solutions. Almost 80 percent of Stratus/ITIC survey respondents -- from industries such as healthcare, manufacturing, high-tech and financial services -- have seen a steady increase in applications that demand high availability over the last two years. Forty-one percent consider half or more of their applications to be business critical.
However, when it comes to buying resilient technology to support those needs, those same IT people balk at what they perceive as the cost and complexity of high availability products. More than half -- 54 percent -- said HA products are too expensive, require special skill sets, or don't deliver on their promises. Worse, the survey shows the cost of downtime -- a crucial piece of the total cost of ownership calculation -- is often omitted from the high availability discussion. A full 51 percent of respondents haven't calculated the cost of one hour of downtime*. That leaves them no way to establish a cost-benefit ratio to determine whether high availability technology is a solid investment. Even those who say they measure downtime costs usually cannot cite an actual figure, or underestimate the cost of critical applications going offline.
"Affordable products for uptime assurance are readily available," said Roy Sanford, Stratus Technologies chief marketing officer. "Many IT organizations' perceptions of high availability solutions are stuck in the conventional cluster era. What may seem cost-justifiable up front quickly escalates in price with more software licensing, code modification, failover scripting, protracted testing, and ongoing management complexity. At best, clusters are a failure-recovery technology with marginal availability, not a failure-prevention technology or true uptime assurance."
The virtualization boom still looms over all of these developments. Eighty-one percent run business-critical applications on virtual machines today, and 84 percent expect the number of virtualized critical applications to increase over the next 12 months. Eighty-five percent believe that virtualization can provide adequate uptime for mission-critical applications. However, if the application requires interruption-free processing, virtualization alone can't provide it.
"There's a lot of misplaced faith in virtualization's high availability potential," said ITIC Principal Laura DiDio. "Virtualization is not an availability solution. It gives IT a lot of flexibility to use virtual machines to back each other up, but it takes time to bring a new virtual machine online, and any data not written to disk is lost in that time. That doesn't work for highly demanding applications that rely on uninterrupted processing delivered by fault tolerant server platforms."
The other hot IT topic of the moment, cloud computing, didn't resonate with the survey audience. Only 15 percent are planning internal cloud projects over the next year. Fifty-one percent aren't planning any kind of cloud implementation through third-party providers, and of the remainder only 17 percent are sure about an implementation in the near future.
(*The average mid-size company experiences 16-20 hours of downtime annually; one study group showed that the cost of downtime averaged $70,000 per hour, according to "Reducing Downtime and Business Loss," IDC white paper, August 2009.)
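The footnote's figures make the missing cost-benefit arithmetic easy to reconstruct. A minimal back-of-the-envelope sketch, assuming the IDC averages cited above (16-20 hours of annual downtime at $70,000 per hour) apply to a given organization -- the function name and structure here are illustrative, not part of the survey methodology:

```python
# Back-of-the-envelope annual downtime cost, using the IDC figures
# cited in the survey footnote as example inputs.

def annual_downtime_cost(hours_down_per_year, cost_per_hour):
    """Estimated annual cost of unplanned downtime."""
    return hours_down_per_year * cost_per_hour

# 16-20 hours/year at $70,000/hour (IDC averages from the footnote)
low = annual_downtime_cost(16, 70_000)
high = annual_downtime_cost(20, 70_000)

print(f"Estimated annual downtime cost: ${low:,} to ${high:,}")
# -> Estimated annual downtime cost: $1,120,000 to $1,400,000
```

At seven figures per year, even a rough estimate like this gives IT a denominator against which to weigh the price of a high availability solution.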
The survey consisted of a Web questionnaire that recorded 367 responses, supplemented by two dozen in-person interviews, from companies in 22 countries. Respondents ranged from small and medium-sized businesses to large enterprises with anywhere from 10 to 1,000 servers and desktops.
For full survey results, please visit: http://www.stratus.com/en/About/News.aspx
About Stratus Technologies
Stratus delivers uptime for the applications its customers depend on most for their success. With its ultra-reliable servers, software and services, Stratus products help to save lives and to protect the business and reputations of companies, institutions, and governments the world over. To learn more about worry-free computing, visit www.stratus.com.
Source: Stratus Technologies