November 30, 2010
SAN FRANCISCO, Nov. 30, 2010 -- InformationWeek Analytics, the leading service for peer-based IT research and analysis, today announced the release of its latest research report, "Storage Utilization: Data Dedupe Gains Ground, Albeit Slowly," which explores the reasons companies are embracing less conventional storage technologies and discusses trends in tiered storage and the forces driving storage growth. The report digs deep into the research on data deduplication's benefits and drawbacks, and assesses the industry's need to remove the lingering mystery around the technology and set deduplication standards. The report's author, Behzad Behtash, is an independent IT consultant who has served as CIO of Tetra Tech EM and VP of Systems for AIG Financial Products.
As the volume of corporate data continues to grow, IT professionals keep investing in new storage utilization technologies. Compression still ranks No. 1, but newer technologies, including deduplication, thin provisioning, massive array of idle disks (MAID), and others, continue to make headway in the market. Among the survey's findings:
* 61% of respondents now manage more than 10 TB of data for their organizations, up from 43% of those polled in 2009; 12% manage more than 500 TB of data, up from 8% last year.
* The adoption rate for data deduplication technology hit 37% this year, compared with 24% in 2009; the percentage of respondents who said they're evaluating dedupe rose slightly, to 34% from 32%.
* The percentage of respondents with no plans for immediate use of deduplication dropped from 34% to 22% in 2010, and only 7% of this year's respondents said they won't use dedupe; last year, 10% made that claim.
* 65% of respondents told us they use compression technology for data storage, at least in limited production (that's about even with last year's 64%).
* 27% of respondents use thin provisioning, up from 17% in 2009, and 25% are evaluating it, up from 20% a year ago.
* 16% of people polled said they use MAID, up from 12% in 2009; however, 22% told us they definitely won't use MAID, an intriguing increase from last year's 17%.
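Deduplication's appeal is easy to see in miniature: identical blocks of data are stored once and merely referenced thereafter. Below is a minimal, illustrative sketch of fixed-size, hash-based block deduplication. The function names are hypothetical, and production systems typically use variable-size, content-defined chunking plus stronger collision handling; this is only meant to show the core idea behind the technology the survey asks about.

```python
import hashlib

BLOCK_SIZE = 4096  # fixed-size blocks; real systems often use content-defined chunks

def dedupe_store(data: bytes):
    """Split data into blocks; keep each unique block once; return (store, recipe)."""
    store = {}    # block digest -> block bytes (each unique block stored once)
    recipe = []   # ordered digests needed to reconstruct the original data
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # duplicate blocks are not stored again
        recipe.append(digest)
    return store, recipe

def rebuild(store, recipe) -> bytes:
    """Reassemble the original byte stream from the stored blocks."""
    return b"".join(store[h] for h in recipe)

# Highly redundant data deduplicates well: 100 identical blocks -> 1 stored block.
data = b"A" * (BLOCK_SIZE * 100)
store, recipe = dedupe_store(data)
assert rebuild(store, recipe) == data
assert len(store) == 1 and len(recipe) == 100
```

The ratio of recipe length to store size is the deduplication ratio vendors advertise; real-world gains depend heavily on how repetitive the data actually is.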
For full access to the research data, members can download now: http://analytics.informationweek.com/abstract/24/4637/Storage-Server/research-2010-data-deduplication.html
"We probably shouldn't be surprised that these newer storage technologies are getting people's attention," says Lorna Garey, content director of InformationWeek Analytics. "The sheer volume of data being churned out really demands new options in the storage market. And it's only going to get more intense with the ongoing push toward electronic medical records and other digital data."
InformationWeek Analytics is a subscription-based service, offering peer-based technology research. Its site currently houses more than 900 reports and briefs, and includes a dedicated area where technology professionals can access complete issues of InformationWeek Magazine. More than one hundred new reports are slated for release in 2010. InformationWeek Analytics members have access to:
* The full InformationWeek Analytics library of reports
* Peer-based research and analysis to guide buying and implementation decisions
* Over 20 technology and IT business categories
* New reports launched every week
* Signature reports, such as the InformationWeek Salary Survey, InformationWeek 500 and the State of Security report
For more information on our membership programs please visit: http://analytics.informationweek.com/join
About InformationWeek Business Technology Network (http://www.informationweek.com/)
The InformationWeek Business Technology Network provides IT executives with unique analysis and tools that parallel their work flow, from defining and framing objectives through to the evaluation and recommendation of solutions. Anchored by InformationWeek, the multimedia powerhouse that looks across the enterprise, the network scales across the most critical technology categories with online properties like DarkReading.com (security), IntelligentEnterprise.com (application architecture), NetworkComputing.com (networking and communications) and PlugintotheCloud.com (cloud computing). The network also provides focused content for key IT targets, such as CIOs, developers, SMBs and IT Support Managers via InformationWeek Global CIO, Dr. Dobb's, InformationWeek SMB and HDI, as well as vital vertical industries with InformationWeek Financial Services, Government and Healthcare sites. Content is at the nucleus of our information distribution strategy: IT professionals turn to our experts and communities to stay informed, get advice and research technologies to make strategic business decisions.
About UBM TechWeb (http://www.techweb.ubm.com)
UBM TechWeb, the global leader in technology media and professional information, enables people and organizations to harness the transformative power of technology. Through its core businesses – media solutions, marketing services, and professional information – UBM TechWeb produces the most respected and consumed brands, applications, and services in the technology market. More than 14.5 million business and technology professionals (CIOs, IT and IT Support managers, Web and digital professionals, software and game developers, government decision makers, and telecom providers) actively participate in UBM TechWeb's communities. UBM TechWeb brands include: global face-to-face events such as Interop, Game Developers Conference (GDC), Web 2.0, Black Hat, and VoiceCon; large-scale online networks such as InformationWeek, Light Reading, and Gamasutra; research, training, and certification services, including HDI, Pyramid Research, and InformationWeek Analytics; and market-leading magazines such as InformationWeek and Wall Street & Technology. UBM TechWeb is part of UBM, a global provider of media and information services for professional B2B communities and markets.
Source: UBM TechWeb
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. To address them, we present a novel federation model that enables end-users to aggregate heterogeneous resources for large-scale problems. The feasibility of this federation model has been demonstrated, in the context of the UberCloud HPC Experiment, by gathering the most comprehensive information to date on the effects of pillars on microfluidic channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to handle peak computational loads that cannot be met by their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab's Sergey Panitkin discussed the role of Google Compute Engine in providing computational support to ATLAS, a high-energy particle detector at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
May 16, 2013
When it comes to the cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of CFD modeling in the cloud by benchmarking a common CFD application on Amazon EC2 HPC instance types with both CPU and GPU cores.
May 10, 2013
Australian visual effects company, Animal Logic, is considering a move to the public cloud.
May 10, 2013
Program provides cash awards of up to $10,000 for the best open-source end-user applications deployed on a 100G network.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges (and opportunities) afforded by Big Data. Before anyone can exploit these extraordinary data repositories, however, they must first harness and manage their data stores, using technologies that emphasize affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.