November 16, 2012
PARIS, Nov. 16 – At its second EMEA Dell Storage Forum customer and partner event, Dell today unveiled a series of end-to-end storage portfolio updates designed to advance the Fluid Data architecture and to help customers further optimize data at every point in its lifecycle.
"The nature of IT and what a data center should do for an organization is changing rapidly as customers face multiple and often competing models of how technology is delivered – from client to cloud to converged," said Darren Thomas, vice president and general manager, Dell Storage. "Dell is best positioned with an end-to-end storage architecture that can optimize data at every point in its lifecycle through intelligence and automation. Dell's Fluid Data Architecture enables new capabilities that power the data center, so customers can drive real business results."
New Dell Compellent and PowerVault MD3 Software Increases Performance and Automation
Dell Compellent Storage Center 6.3 array software unveiled today offers enhanced scalability and performance for Dell Compellent arrays. Compellent SC8000 controllers with Storage Center 6.3 software can increase performance up to 100 percent over previous versions when running enterprise workloads.1 Additionally, with this release, Dell is the first storage provider to announce end-to-end 16Gb Fibre Channel capability – from server to switch to storage – doubling bandwidth and speeding access to business critical applications and data.
Designed for medium to large enterprises and cloud computing, Dell Compellent arrays offer best-in-class automated tiering, advanced data protection and superior ease of use. These features combine with a highly scalable, modular design that lets users update storage software at no additional cost and expand easily without a "rip and replace" of their existing infrastructure. According to a new IDC White Paper commissioned by Dell, Compellent installations had nearly twice the productive life span of non-Dell storage solutions, with organizations replacing Compellent every 6.9 years compared to 3.5 years for their other storage environments.2
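As a rough illustration of the idea behind automated tiering (a toy policy, not Compellent's actual implementation), a tiering engine might periodically promote the most frequently accessed blocks to a fast SSD tier and leave the rest on slower disks:

```python
# Toy automated-tiering policy: rank blocks by recent access count and
# promote the hottest ones to the limited SSD tier. All names and numbers
# here are illustrative assumptions.

def assign_tiers(access_counts: dict, ssd_slots: int) -> dict:
    """Map each block id to 'ssd' or 'hdd', hottest blocks first."""
    ranked = sorted(access_counts, key=access_counts.get, reverse=True)
    hot = set(ranked[:ssd_slots])
    return {blk: ("ssd" if blk in hot else "hdd") for blk in access_counts}

tiers = assign_tiers({"a": 120, "b": 3, "c": 57, "d": 1}, ssd_slots=2)
# 'a' and 'c' have the highest access counts, so they land on the SSD tier
```

A real array would run a policy like this on a schedule and migrate data in the background, so applications see faster reads on hot data without any manual placement.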
Additional enterprise enhancements from Storage Center 6.3 announced today include:
New Dell PowerVault MD3 software enhancements offer improved data protection, performance, capacity and virtualization capabilities while maintaining up to 99.999 percent availability for these affordable, high-performance arrays.
Data protection enhancements include dynamic disk pools, which simplify disk management and improve rebuild performance by distributing data across all drives, significantly decreasing recovery time after drive failures. In addition to the current Fibre Channel replication capabilities, new IP-based remote asynchronous replication protects against site failures, improves application speeds, and increases supported replication distances.
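The rebuild-time benefit of spreading data across all drives can be sketched with a back-of-the-envelope model. In a traditional RAID set, a failed drive is rebuilt onto a single hot spare, so rebuild speed is capped by one drive's write bandwidth; in a dynamic disk pool, every surviving drive holds a share of the reconstructed data. The drive sizes, bandwidths, and counts below are illustrative assumptions, not PowerVault specifications:

```python
# Back-of-the-envelope rebuild-time comparison: one hot spare versus a pool
# where (n - 1) surviving drives each rebuild a slice of the lost data.

def rebuild_hours(drive_tb: float, drive_mbps: float, active_drives: int) -> float:
    """Estimate hours to rebuild one failed drive.

    drive_tb: capacity of the failed drive in terabytes
    drive_mbps: sustained per-drive bandwidth in MB/s
    active_drives: drives participating in the rebuild (1 = single hot spare)
    """
    total_mb = drive_tb * 1_000_000          # TB -> MB (decimal units)
    return total_mb / (drive_mbps * active_drives) / 3600

# Single hot spare vs. 23 surviving drives in a 24-drive pool:
classic = rebuild_hours(drive_tb=2.0, drive_mbps=100, active_drives=1)
pooled = rebuild_hours(drive_tb=2.0, drive_mbps=100, active_drives=23)
# The pooled rebuild finishes roughly 23x faster than the hot-spare rebuild
```

The model ignores real-world overheads such as parity computation and concurrent host I/O, but it captures why distributing the rebuild shrinks the window of degraded redundancy.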
New PowerVault MD3 software application integration and efficiency capabilities include VMware vStorage APIs for Array Integration (VAAI) support, which allows customers to offload certain storage related tasks from the server to PowerVault arrays. New thin provisioning capabilities enable arrays to automatically allocate storage as needed for simplified and cost-effective expansion of capacity.
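As a rough sketch of how thin provisioning behaves, the toy classes below let a volume advertise a large logical size while drawing physical blocks from a shared pool only on first write. The class and method names are hypothetical and do not reflect the PowerVault API:

```python
# Toy thin-provisioning model: physical capacity is claimed from a shared
# pool only when a block is first written, so volumes can be logically
# over-provisioned relative to the pool.

class StoragePool:
    def __init__(self, capacity_gb: int):
        self.capacity_gb = capacity_gb
        self.used_gb = 0

    def allocate(self, gb: int) -> None:
        if self.used_gb + gb > self.capacity_gb:
            raise RuntimeError("pool exhausted")
        self.used_gb += gb

class ThinVolume:
    def __init__(self, logical_gb: int, pool: StoragePool):
        self.logical_gb = logical_gb
        self.pool = pool
        self.allocated = set()         # block indices backed by real space

    def write(self, block: int) -> None:
        if block >= self.logical_gb:
            raise ValueError("write beyond logical size")
        if block not in self.allocated:
            self.pool.allocate(1)      # claim physical space on first write
            self.allocated.add(block)

    @property
    def physical_gb(self) -> int:
        return len(self.allocated)

pool = StoragePool(capacity_gb=100)
vol = ThinVolume(logical_gb=500, pool=pool)   # over-provisioned 5:1
vol.write(0); vol.write(1); vol.write(0)      # rewriting block 0 costs nothing
```

After the three writes, the volume still advertises 500 GB but consumes only 2 GB of the pool, which is the cost-saving the press release describes.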
Dell Boosts Data Protection Portfolio to Simplify Data Backup and Recovery
Dell's growing data protection portfolio enables customers of all sizes to attain unmatched data and application protection with faster recovery times for virtual, physical and cloud environments. Dell provides complete backup, replication and recovery solutions designed to maximize data efficiency, improve IT agility and help ensure business resiliency. Additions to the Dell data protection portfolio include:
"Dell Compellent storage plays a critical role in our IT infrastructure, and we continue to look forward to new enhancements and features that help us meet enterprise level storage demands," said Dan Marbes, Lead Systems Engineer, Associated Bank. "The opportunity to gain up to two times the performance when running our applications means faster access to important information when our users need it. We are also eagerly anticipating the integration of Active Directory authentication to more efficiently manage access to these resources."
"We've relied on AppAssure for recovery and peace of mind for years on a variety of storage platforms," said Michael Rapp, information security officer, Center for Information Technology in Education, College of Education, University of Houston. "Dell's integration of this data protection software into a purpose-built platform could make it even easier for customers to deploy an AppAssure-optimized solution with a minimal amount of set up and configuration work."
Dell Inc. listens to customers and delivers innovative technology and services that give them the power to do more.
Join us at Dell World 2012 – The Power to Do More. Technology professionals will learn from one another and identify key challenges and opportunities connected to the top forces changing business today.
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. To address this, we present a novel federation model that enables end-users to aggregate heterogeneous resources for large-scale problems. The feasibility of this federation model has been demonstrated, in the context of the UberCloud HPC Experiment, by gathering the most comprehensive information to date on the effects of pillars on microfluidic channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to handle computational workloads at peak times that cannot be contained within their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab's Sergey Panitkin discussed the role of the Google Compute Engine in providing computational support to ATLAS, a high-energy particle detector at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
May 23, 2013
The study of climate change is one of those scientific problems where it is almost essential to model the entire Earth to attain accurate results and make worthwhile predictions. In an attempt to make climate science more accessible to smaller research facilities, NASA introduced what they call ‘Climate in a Box,’ a system they note acts as a desktop supercomputer.
May 16, 2013
When it comes to cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of doing CFD modeling in the cloud by running a common CFD code on Amazon EC2 HPC instance types, including both CPU and GPU cores.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore the potential of heterogeneous computing, but the potential for this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.