November 30, 2010
SAN FRANCISCO, Nov. 30, 2010 -- InformationWeek Analytics, the leading service for peer-based IT research and analysis, today announced the release of its latest research report, "Storage Utilization: Data Dedupe Gains Ground-Albeit Slowly," which explores the reasons companies are embracing less conventional storage technologies, and discusses trends in tiered storage and what's driving storage growth. The report digs deep into the research on data deduplication's benefits and drawbacks, and assesses the need for the industry to remove the lingering mystery around the technology and set deduplication standards. The report's author, Behzad Behtash, is an independent IT consultant who has served as CIO of Tetra Tech EM and VP of Systems for AIG Financial Products.
As the volume of corporate data continues to grow, IT professionals keep investing in new storage utilization technologies. Compression still ranks No. 1, but newer technologies, including deduplication, thin provisioning, and massive array of idle disks (MAID), continue to make headway in the market.
* 61% of respondents now manage more than 10 TB of data for their organizations, up from 43% of those polled in 2009; 12% manage more than 500 TB of data, up from 8% last year.
* The adoption rate for data deduplication technology hit 37% this year, compared with 24% in 2009; the percentage of respondents who said they're evaluating dedupe rose slightly, to 34% from 32%.
* The percentage of respondents with no plans for immediate use of deduplication dropped from 34% to 22% in 2010, and only 7% of this year's respondents said they won't use dedupe; last year, 10% made that claim.
* 65% of respondents told us they use compression technology for data storage, at least in limited production (that's about even with last year's 64%).
* 27% of respondents use thin provisioning, up from 17% in 2009, and 25% are evaluating it, up from 20% a year ago.
* 16% of people polled said they use MAID, up from 12% in 2009; however, 22% told us they definitely won't use MAID, an intriguing increase from last year's 17%.
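The report itself covers deduplication in depth; as a rough illustration of the underlying idea only, here is a minimal, hypothetical sketch of fixed-block deduplication (the function names and block size are our own for illustration, not drawn from the report; production systems typically use content-defined chunking and more careful collision handling):

```python
import hashlib

def dedupe_blocks(data: bytes, block_size: int = 4096):
    """Split data into fixed-size blocks, storing each unique block once.

    Returns a store of unique blocks keyed by SHA-256 digest, plus an
    ordered 'recipe' of digests from which the data can be rebuilt.
    """
    store = {}    # digest -> block contents (the unique-block store)
    recipe = []   # ordered digests needed to reconstruct the original data
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)   # keep only the first copy
        recipe.append(digest)
    return store, recipe

def restore(store, recipe):
    """Rebuild the original byte stream from the recipe."""
    return b"".join(store[d] for d in recipe)

# Highly repetitive data dedupes well: 100 copies of one 4 KB block
data = b"x" * 4096 * 100
store, recipe = dedupe_blocks(data)
assert restore(store, recipe) == data
print(len(store), len(recipe))  # 1 unique block, 100 references to it
```

The ratio of stored blocks to referenced blocks is the essence of the space savings vendors quote; real-world gains depend heavily on how repetitive the data actually is.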
For full access to the research data, members can download now: http://analytics.informationweek.com/abstract/24/4637/Storage-Server/research-2010-data-deduplication.html
"We probably shouldn't be surprised that these newer storage technologies are getting people's attention," says Lorna Garey, content director of InformationWeek Analytics. "The sheer volume of data being churned out really demands new options in the storage market. And it's only going to get more intense with the ongoing push toward electronic medical records and other digital data."
InformationWeek Analytics is a subscription-based service, offering peer-based technology research. Its site currently houses more than 900 reports and briefs, and includes a dedicated area where technology professionals can access complete issues of InformationWeek Magazine. More than one hundred new reports are slated for release in 2010. InformationWeek Analytics members have access to:
* The full InformationWeek Analytics library of reports
* Peer-based research and analysis to guide buying and implementation decisions
* Over 20 technology and IT business categories
* New reports launched every week
* Signature reports, such as the InformationWeek Salary Survey, InformationWeek 500 and the State of Security report
For more information on our membership programs, please visit: http://analytics.informationweek.com/join
About InformationWeek Business Technology Network (http://www.informationweek.com/)
The InformationWeek Business Technology Network provides IT executives with unique analysis and tools that parallel their workflow, from defining and framing objectives through to the evaluation and recommendation of solutions. Anchored by InformationWeek, the multimedia powerhouse that looks across the enterprise, the network scales across the most critical technology categories with online properties like DarkReading.com (security), IntelligentEnterprise.com (application architecture), NetworkComputing.com (networking and communications) and PlugintotheCloud.com (cloud computing). The network also provides focused content for key IT targets, such as CIOs, developers, SMBs and IT Support Managers via InformationWeek Global CIO, Dr. Dobb's, InformationWeek SMB and HDI, as well as vital vertical industries with InformationWeek Financial Services, Government and Healthcare sites. Content is at the nucleus of our information distribution strategy: IT professionals turn to our experts and communities to stay informed, get advice and research technologies to make strategic business decisions.
About UBM TechWeb (http://www.techweb.ubm.com)
UBM TechWeb, the global leader in technology media and professional information, enables people and organizations to harness the transformative power of technology. Through its core businesses – media solutions, marketing services, and professional information – UBM TechWeb produces the most respected and consumed brands, applications, and services in the technology market. More than 14.5 million business and technology professionals (CIOs, IT and IT Support managers, Web and digital professionals, software and game developers, government decision makers, and telecom providers) actively participate in UBM TechWeb's communities. UBM TechWeb brands include: global face-to-face events such as Interop, Game Developers Conference (GDC), Web 2.0, Black Hat, and VoiceCon; large-scale online networks such as InformationWeek, Light Reading, and Gamasutra; research, training, and certification services, including HDI, Pyramid Research, and InformationWeek Analytics; and market-leading magazines such as InformationWeek and Wall Street & Technology. UBM TechWeb is part of UBM, a global provider of media and information services for professional B2B communities and markets.
Source: UBM TechWeb
Researchers from the Suddhananda Engineering and Research Centre in Bhubaneswar, India developed a job scheduling system, which they call Service Level Agreement (SLA) scheduling, intended to deliver resource provisioning comparable to that of in-house systems. They combined it with an on-demand resource provisioner to optimize the utilization of virtual machines.
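The paper's own algorithm isn't reproduced here; as a purely hypothetical sketch of the general pattern described (deadline-driven scheduling combined with on-demand VM provisioning), consider something like the following, where all names and the earliest-deadline-first policy are our own illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    runtime: float        # estimated hours of compute required
    sla_deadline: float   # hours from now by which the job must finish

@dataclass
class VM:
    busy_until: float = 0.0   # hour at which this VM becomes free

def schedule(jobs, vms, max_vms=10):
    """Place each job on the VM that frees up first; provision a new VM
    on demand whenever no existing VM can meet the job's SLA deadline."""
    placements = {}
    # Earliest-deadline-first keeps the tightest SLAs from starving.
    for job in sorted(jobs, key=lambda j: j.sla_deadline):
        vm = min(vms, key=lambda v: v.busy_until)
        # If even the soonest-free VM would miss the SLA, spin up a new one.
        if vm.busy_until + job.runtime > job.sla_deadline and len(vms) < max_vms:
            vm = VM()
            vms.append(vm)
        vm.busy_until += job.runtime
        placements[job.name] = vm
    return placements
```

The provisioning step is what ties the scheduler to utilization: VMs are only added when an SLA would otherwise be violated, so existing capacity is packed before new capacity is paid for.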
Experimental scientific HPC applications continue to move to the cloud, a topic covered here in several capacities over the past few weeks. Among that coverage, CloudSigma co-founder and CEO Robert Jenkins penned an article for HPC in the Cloud discussing how cloud technologies are emerging to supplement the research capabilities of big scientific initiatives like CERN and the European Space Agency (ESA)...
When considering moving excess or experimental HPC applications to a cloud environment, there will always be obstacles. Were that not the case, the cost effectiveness of cloud-based HPC would rule the high performance landscape. Jonathan Stewart Ward and Adam Barker of the University of St. Andrews produced an intriguing report on the state of cloud computing, devoting significant attention to the problems the field faces.
Jun 17, 2013
With that in mind, Datapipe hopes to establish itself as a green-savvy HPC cloud provider with its recently announced Stratosphere platform. The company markets Stratosphere as a green HPC cloud service, and to that end has partnered with Verne Global and its Icelandic datacenter, which is known for its green computing credentials.
Jun 12, 2013
Cloud computing is gaining ground among mid-sized institutions looking to expand their experimental high performance computing resources. In response, IBM released a set of what it calls Redbooks, in part to assist institutions in moving high performance computing applications to the cloud.
Jun 06, 2013
The San Diego Supercomputer Center launched a public cloud system for universities in the area designed specifically to run on commodity hardware with high performance solid-state drives. The center, which currently holds 5.5 PB of raw storage, is open to educational and research users in the University of California.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so using technologies that emphasize affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.