June 04, 2007
SANTA CLARA, Calif., May 29 -- Hitachi Data Systems Corp., a wholly owned subsidiary of Hitachi Ltd. and the only provider of Services Oriented Storage Solutions, today announced several industry-leading enhancements to its digital archiving solution, the Hitachi Content Archive Platform. The enhanced platform -- Version 2.0 -- outpaces competitive content-addressed storage (CAS) solutions, delivering the content management services customers are demanding to meet the data retention, scalability, preservation and protection requirements mandated by government regulations and corporate governance.
Just as Hitachi introduced a series of industry breakthroughs in storage virtualization, thin provisioning, scalability and performance with the recently announced Hitachi Universal Storage Platform V, the company has now unveiled several industry firsts for the digital archive.
The enhanced Hitachi Content Archive Platform supports up to 20 petabytes in an 80-node archive system. A single Hitachi Content Archive Platform node can scale to 400 million objects (files + metadata + policies), and an 80-node system can support up to 32 billion objects. Leveraging proven enterprise Hitachi storage functionality such as RAID in a storage area network (SAN) plus array of independent nodes (SAIN) architecture, the Hitachi Content Archive Platform delivers up to 470 percent greater performance than first-generation CAS solutions.
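A quick arithmetic check shows the quoted scaling figures are mutually consistent (the per-node capacity figure below is derived, not quoted):

```python
# Published scaling figures for Content Archive Platform Version 2.0
nodes = 80
objects_per_node = 400_000_000          # files + metadata + policies, per node
system_capacity_pb = 20                 # petabytes across the full 80-node system

total_objects = nodes * objects_per_node
pb_per_node = system_capacity_pb / nodes

assert total_objects == 32_000_000_000  # matches the quoted 32 billion objects
print(pb_per_node)                      # 0.25 PB of addressable capacity per node
```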
“The features and flexibility of a content archive solution are of paramount importance to today’s businesses which are required to store, manage and secure an exploding amount of data,” said Jack Domme, chief operating officer of Hitachi Data Systems. “Since Hitachi’s acquisition of Archivas in February of this year, we have fully integrated the company’s roadmap and delivered the next-generation capabilities that will ease customer pains associated with managing unstructured data. As opposed to burdening customers by introducing yet another island of storage for content archives and another set of software tools and management interfaces, we are delivering unique content archive services within a common management framework that enables customers to take advantage of the established heterogeneous storage virtualization services already available from Hitachi.”
"The Hitachi Content Archive Platform provides one of the best content archive solutions out there on multiple levels, including intelligence, scalability, performance and ease of management," said Tony Asaro, senior analyst with Enterprise Strategy Group. "Version 2.0 of the Hitachi Content Archive Platform is feature-rich, providing a compelling offering in their arsenal and giving them real potential to be a leader in this space. The IT world is moving from the structured data (database) archetype to an environment dominated by unstructured data (files, images, multimedia). HDS is one of the kings of structured data storage with solutions such as the Universal Storage Platform V, and is poised to be a leader in the next wave."
The Hitachi Content Archive Platform is the first solution in the industry to enable customers to scale archive server nodes and storage capacity independently. While other archiving solutions require additional servers and additional processing power to scale storage, the Hitachi Content Archive Platform Version 2.0 reduces the number of server nodes required to scale, resulting in considerably lower heat emissions and power consumption and greatly simplified management.
"As the need to archive and store data continues to grow at an unprecedented rate, customers need safe, scalable, reliable solutions to address their content archiving needs," said Laura DuBois, research director for Storage Software at IDC. "The Hitachi Content Archive Platform Version 2.0 provides an innovative and compelling encryption tool for data protection and long-term scalability with an 80-node archive system, and is the first in the industry to offer customers a completely flexible approach with the ability to scale archive server nodes and storage capacity independently."
The Hitachi Content Archive Platform uses open, standards-based interfaces such as Network File System (NFS), Common Internet File System (CIFS), Web-based Distributed Authoring and Versioning (WebDAV) and Hypertext Transfer Protocol (HTTP), as well as storage management standards such as the Storage Management Initiative Specification (SMI-S). Legacy CAS solutions, by contrast, still force customers to incur the additional development and training costs of the proprietary APIs used to integrate content-producing applications with their systems.
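To illustrate why standard interfaces lower integration cost, the sketch below composes an ordinary HTTP/WebDAV PUT request for an archive object using nothing but string handling -- no vendor SDK required. The object path and the `X-Retention-Days` metadata header are hypothetical, chosen for illustration; a real archive gateway defines its own namespace and header names.

```python
def build_archive_put(host: str, path: str, body: bytes, retention_days: int) -> bytes:
    """Compose a raw HTTP/WebDAV PUT request for an archive object.

    X-Retention-Days is a hypothetical metadata header used only to show
    how per-object policy could ride along on a standard protocol.
    """
    headers = [
        f"PUT {path} HTTP/1.1",
        f"Host: {host}",
        "Content-Type: application/octet-stream",
        f"Content-Length: {len(body)}",
        f"X-Retention-Days: {retention_days}",  # hypothetical policy header
        "",
        "",  # blank line terminates the header block
    ]
    return "\r\n".join(headers).encode("ascii") + body

req = build_archive_put("archive.example.com", "/archive/report.pdf", b"data", 2555)
```

Because the request is plain HTTP, any stock client library (or even `curl`) can send it, which is the practical benefit the paragraph above describes.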
Patent-Pending Data Protection
Hitachi today also introduced a breakthrough encryption solution for customers looking to simplify content management and ensure continuous archived data protection. The patent-pending innovation, also referred to as "secret sharing," allows a customer to store their security key within the Hitachi Content Archive Platform and "secretly share" that key across multiple nodes within the archive. In doing so, only a fully operational system -- with all of its nodes connected to the archive -- is able to decrypt the content, metadata and search index. This new software ensures that if a server or storage device is stolen or removed from the cluster, its contents remain encrypted and immediately unreadable by any other device.
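Shamir-style secret sharing is the classic construction behind this kind of scheme: a key is split into per-node shares such that only a quorum can reconstruct it. The press release does not disclose Hitachi's exact algorithm, so the following is a generic sketch, configured so that all shares (all nodes) are required -- a stolen node's single share reveals nothing about the key.

```python
import random

PRIME = 2**127 - 1  # Mersenne prime; arithmetic is done in GF(PRIME)

def split_secret(secret: int, shares: int, threshold: int):
    """Split `secret` into `shares` points on a random degree-(threshold-1)
    polynomial; any `threshold` points recover it, fewer reveal nothing."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, shares + 1)]

def recover_secret(points):
    """Lagrange interpolation at x = 0 over the prime field."""
    secret = 0
    for i, (xi, yi) in enumerate(points):
        num, den = 1, 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        # pow(den, PRIME-2, PRIME) is the modular inverse (Fermat's little theorem)
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

key = 0xC0FFEE
points = split_secret(key, shares=8, threshold=8)  # all 8 "nodes" required
assert recover_secret(points) == key
```

With `threshold` equal to the node count, the archive matches the behavior described above: decryption works only when every node is present.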
The Hitachi Content Archive Platform Version 2.0 also features enhanced object replication capabilities with advanced configurable options: digital signatures to ensure authenticity, compression for efficient use of network bandwidth, and encryption for the privacy of data both at rest and in flight.
Customers now facing the challenge of storage efficiency can rely on the Hitachi Content Archive Platform to maximize their storage resources with a new data de-duplication service. Also known as single-instance-storage, this new storage service from Hitachi goes beyond competitive offerings by providing both a hash comparison and binary comparison to ensure objects are actual duplicates, therefore avoiding “hash collisions” where different objects could have the same cryptographic hash key. Through metrics generated by the Hitachi Content Archive Platform, customers will be able to see how many duplicates were eliminated and the amount of total storage capacity saved.
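The two-stage check described above -- a fast hash comparison followed by a byte-for-byte binary comparison -- can be sketched as follows. Class and counter names are illustrative, not Hitachi's; the point is that a hash match alone is treated only as a candidate, never as proof of duplication.

```python
import hashlib

class DedupStore:
    """Single-instance store: hash lookup finds candidates, binary
    comparison confirms them, guarding against hash collisions."""

    def __init__(self):
        self._by_hash = {}            # sha256 digest -> list of stored blobs
        self.duplicates_eliminated = 0
        self.bytes_saved = 0          # metrics, as in the platform's reporting

    def put(self, data: bytes) -> str:
        digest = hashlib.sha256(data).hexdigest()
        for blob in self._by_hash.get(digest, []):
            if blob == data:          # binary comparison: confirm true duplicate
                self.duplicates_eliminated += 1
                self.bytes_saved += len(data)
                return digest
        # New object (or a genuine hash collision): store it separately.
        self._by_hash.setdefault(digest, []).append(data)
        return digest

store = DedupStore()
store.put(b"quarterly-report")
store.put(b"quarterly-report")        # duplicate: counted, not stored again
```

The `duplicates_eliminated` and `bytes_saved` counters mirror the capacity-savings metrics the paragraph says customers can inspect.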
"Efficient retention and ready access to a rapidly growing volume of highly diverse business information helps organizations protect against legal action and fulfill evolving compliance requirements," said Galina Datskovsky, senior vice president of development at CA. "By implementing CA's federated records, email, file systems and retention management solutions on the new Hitachi Content Archive Platform Version 2.0, IT organizations can better achieve the scalability and manageability necessary to help them address these information governance challenges."
Hitachi Services Oriented Storage Solutions: Content Archive Services
Hitachi Services Oriented Storage Solutions applies service-oriented architecture (SOA) concepts to storage to deliver a platform that offers sets of automated functions delivered as services to the business that can be invoked as needed. Companies now have the means to replace inefficient capacity-based chargeback models with models that apply more relevant metrics and better “monetize” all of the storage-based services they provide across the enterprise, delivering substantial breakthroughs in efficiency and business agility.
Serving as the foundation for Hitachi Services Oriented Storage Solutions, the all-new Hitachi Universal Storage Platform V -- with its powerful controller-based virtualization engine -- packages and delivers storage services such as content archive services that enable clients to maximize a common management framework, common search and common protection across heterogeneous storage assets.
The Hitachi Content Archive Platform provides an archive tier of storage to which aged data on primary storage can be moved. As part of a virtual pool with the Hitachi Universal Storage Platform V, data in the archive can be offloaded from expensive disk to less expensive Advanced Technology Attachment (ATA) or Serial ATA storage -- further reducing archive costs.
About Hitachi Data Systems
Hitachi Data Systems Corp. provides Services Oriented Storage Solutions that enable heterogeneous storage to be dynamically provisioned according to business needs and centrally managed via industry-leading Hitachi storage virtualization software. As an integral part of the Hitachi Storage Solutions Group, Hitachi Data Systems delivers storage infrastructure platforms, storage management software, and storage consulting services through direct and indirect channels in over 170 countries and regions. Its customers include nearly 60 percent of Fortune 100 companies. For more information, visit the company's Web site at www.hds.com.
About Hitachi Ltd.
Hitachi Ltd., headquartered in Tokyo, is a leading global electronics company with approximately 384,000 employees worldwide. Fiscal 2006 (ended March 31, 2007) consolidated revenues totaled 10,247 billion yen ($86.8 billion). The company offers a wide range of systems, products and services in market sectors including information systems, electronic devices, power and industrial systems, consumer products, materials and financial services. For more information on Hitachi, visit the company's Web site at www.hitachi.com.
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. To address them, we present a novel federation model that enables end-users to aggregate heterogeneous resources for large-scale problems. The feasibility of this federation model has been demonstrated, in the context of the UberCloud HPC Experiment, by gathering the most comprehensive information to date on the effects of pillars on microfluidic channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to absorb peak computational loads that cannot be contained within their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab's Sergey Panitkin discussed the role of the Google Compute Engine in providing computational support to ATLAS, a detector of high-energy particles at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
May 23, 2013 |
The study of climate change is one of those scientific problems where it is almost essential to model the entire Earth to attain accurate results and make worthwhile predictions. In an attempt to make climate science more accessible to smaller research facilities, NASA introduced what it calls 'Climate in a Box,' a system it describes as a desktop supercomputer.
May 16, 2013 |
When it comes to the cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of running CFD modeling in the cloud by benchmarking a common CFD code on Amazon EC2 HPC instance types with both CPU and GPU cores.
May 10, 2013 |
Australian visual effects company, Animal Logic, is considering a move to the public cloud.
May 10, 2013 |
Program provides cash awards of up to $10,000 for the best open-source end-user applications deployed on a 100G network.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore the potential of heterogeneous computing, but the potential for this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.