September 27, 2010
SANTA CLARA, Calif., September 27, 2010 -- Hitachi Data Systems Corporation, a wholly owned subsidiary of Hitachi, Ltd. (NYSE: HIT), today introduced the Hitachi Virtual Storage Platform, the industry’s first three-dimensional (3D) scaling platform, enabling organizations to scale up, out and deep for unprecedented levels of agility and cost savings in their virtualized data centers. The Hitachi Virtual Storage Platform (VSP), in combination with the new Hitachi Command Suite management software, offers best-in-class performance, capacity and open, multivendor storage virtualization for large businesses and enterprise organizations. Together, these solutions represent a major milestone in Hitachi Data Systems’ continued commitment to transform data centers into dynamic information centers where access to blocks, files and content is seamless and resides in a fluid and virtualized environment. (See related press release also issued today: “Hitachi Data Systems Announces Major Milestone on the Road to Enabling Data Center Transformation.”)
“Hitachi has a long history of innovation and industry-leading technologies since first introducing heterogeneous storage virtualization within the storage system,” said Hu Yoshida, chief technology officer and vice president, Hitachi Data Systems. “Today, only the Hitachi Virtual Storage Platform and Hitachi Command Suite provide the scalability and integration that will transform the data center with new levels of agility, flexibility, performance and sustainability. With unique 3D scaling and management, customers can deliver capacity and computing resources as quickly as virtual servers are created. This platform was actually built for virtualized server environments.”
Hitachi VSP: 3D Scaling for Performance, Capacity, External Storage Asset Utilization
The Hitachi Virtual Storage Platform is the only storage architecture that scales in three dimensions, helping customers adapt flexibly for performance, capacity and multivendor storage asset utilization. Its data migration capabilities greatly reduce outage windows. Page-level dynamic tiering automates the page-based movement of data to the most appropriate storage media, simplifying tier management and optimizing cost and performance. With new 2.5-inch SAS hard disk drives, it offers the highest storage density available today. And, with more than 30 percent lower power consumption per unit of capacity stored than the competition, it is the most efficient enterprise storage platform. 3D scaling delivers extreme performance and capacity for robust disaster recovery and high availability systems:
Scale up to meet increasing demands of applications and servers:
Scale dynamically for performance, capacity, and connectivity without disruption
Add incremental resources as demand for storage resources increases by tightly coupling them through a global cache
Scale resources nondisruptively, including caches, ports, and drives to meet demands
Scale out to support multiple servers with changing workload requirements:
Build out the storage system by combining multiple units into a single logical system
Provision storage dynamically to multiple host servers on demand from a common pool of storage resources
Tailor for workloads and server needs without compromise
Achieve the highest performance for open and mainframe environments
Scale deep to extend the platform’s capabilities and value to heterogeneous storage:
Integrate multivendor external storage into the storage pool with scale up and scale out functionality
Provide a single platform for block, file and content with central management, data protection and search
Use lower-cost external storage for less performance-critical data
Reduce OPEX through a common management framework
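The page-level dynamic tiering described above can be sketched with a simple model: track how often each page is accessed during a monitoring window, then promote hot pages toward fast media and demote cold pages toward cheaper (including external) storage. This is a hypothetical illustration in Python; the tier names, thresholds and promotion policy are assumptions for clarity, not the actual VSP implementation.

```python
# Illustrative sketch of page-level dynamic tiering.
# Hypothetical model; not the actual Hitachi VSP algorithm.
from dataclasses import dataclass

TIERS = ["ssd", "sas_15k", "external_sata"]  # fastest to cheapest

@dataclass
class Page:
    page_id: int
    access_count: int = 0
    tier: str = "sas_15k"

def retier(pages, hot_threshold=100, cold_threshold=10):
    """Move hot pages up one tier and cold pages down one tier."""
    for p in pages:
        idx = TIERS.index(p.tier)
        if p.access_count >= hot_threshold and idx > 0:
            p.tier = TIERS[idx - 1]       # promote toward SSD
        elif p.access_count <= cold_threshold and idx < len(TIERS) - 1:
            p.tier = TIERS[idx + 1]       # demote toward external storage
        p.access_count = 0                # reset for the next window
    return pages

pages = [Page(1, access_count=500), Page(2, access_count=3), Page(3, access_count=50)]
retier(pages)
print([(p.page_id, p.tier) for p in pages])
```

Moving data one tier at a time per window is a common way to avoid thrashing pages between extremes on a single burst of activity.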
“Today’s data centers are reaching an inflection point, evolving into something that is more agile, scalable and efficient,” said Roger W. Cox, research vice president, Gartner, Inc. “Storage vendors that deliver on the critical requirements, such as 3D scaling, are what enterprises are looking for to transform their data center into an infrastructure that meets the demanding requirements of today’s virtualized environment.”
Use Cases: Enabling Server Virtual Environments and Cloud Deployments
Companies that will benefit from the Hitachi Virtual Storage Platform are those with massive scalability requirements, such as virtualized server environments or those looking to facilitate and optimize their cloud deployments:
Deep integration with leading server virtualization platforms like VMware vSphere and Microsoft Hyper-V for end-to-end visibility, from individual virtual machines to storage logical units, and protection of large-scale multivendor environments:
Integration with the Hitachi NAS Platform for best-in-class performance and scalability, single namespace, unprecedented intelligent file and objects tiering, and the fastest performing NFS protocol for VMs
Ideal foundation for organizations moving toward the Hitachi Unified Compute Platform, which consists of servers, storage and network assets managed as inclusive business resources to put the right data in the right place at the right time.
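The on-demand provisioning from a common pool of storage resources mentioned above can be sketched as follows. This is a hypothetical thin-provisioning model in Python; the class and its behavior are assumptions for illustration, not a Hitachi API.

```python
# Illustrative sketch of provisioning host volumes from a shared pool.
# Hypothetical model; not an actual Hitachi interface.
class StoragePool:
    def __init__(self, capacity_gb):
        self.capacity_gb = capacity_gb
        self.allocated_gb = 0
        self.volumes = {}          # host -> list of volume sizes (GB)

    def provision(self, host, size_gb):
        """Carve a volume for a host from the shared pool, on demand."""
        if self.allocated_gb + size_gb > self.capacity_gb:
            # In a scale-up design, the remedy is adding drives to the pool.
            raise RuntimeError("pool exhausted: add capacity to scale up")
        self.allocated_gb += size_gb
        self.volumes.setdefault(host, []).append(size_gb)
        return size_gb

pool = StoragePool(capacity_gb=1000)
pool.provision("vm-host-a", 300)
pool.provision("vm-host-b", 500)
print(pool.allocated_gb)  # 800
```

Pooling in this style is what lets multiple host servers with changing workloads draw capacity dynamically instead of being bound to dedicated spindles.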
Hitachi Command Suite: Maximizing IT Asset Utilization in Complex Infrastructures
The Hitachi Command Suite, tightly integrated with reliable hardware, enables “worry free scaling” of large numbers of virtual machines and manages even the largest infrastructures and application deployments. It provides effective reporting and management to maximize IT asset utilization within complex infrastructures, unified management for virtual tiered storage and server environments, and improved business application availability, performance and access to critical data. Organizations powered by the Hitachi Command Suite simply and effectively develop and manage their next generation virtualized data centers to reduce risks and operational costs, operate efficiently, and realize a return on their storage asset investments.
Through unique management in three dimensions, the Hitachi Command Suite helps customers lower costs and properly manage all data types:
Increase management automation and efficiency for storage, computing and virtual infrastructures
Manage more than 5 million objects and 255 PB of virtualized capacity under one management server
Build a single, consolidated multivendor storage management framework with the breadth to manage storage, servers and the IT infrastructure
Unify management for block, file and content across all Hitachi storage
Create end-to-end visibility and correlation of applications, virtual machines and servers, and logical storage devices for traditional and virtualized VMware and Microsoft Hyper-V environments
Manage data center complexities with an integrated suite for the highest operational efficiency
Incorporate both capacity and performance for service level management
Gain application visibility into the virtualized or cloud infrastructure through a dashboard that shows the status of the application’s service level agreement (SLA), actual storage resources being utilized, health of storage assets, and utilization trend lines.
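The SLA dashboard described above can be illustrated with a simple status roll-up: compare each application's observed latency and capacity utilization against its service-level targets and report a health state. The function, metric names and thresholds below are hypothetical, chosen for illustration; they are not the Hitachi Command Suite API.

```python
# Hypothetical sketch of an SLA status roll-up for a management dashboard.
# Metric names and thresholds are illustrative assumptions.
def sla_status(metrics, sla):
    """Return 'ok', 'warning', or 'violated' for one application."""
    if metrics["latency_ms"] > sla["max_latency_ms"]:
        return "violated"               # SLA breach: latency target missed
    if metrics["capacity_used_pct"] > sla["capacity_warn_pct"]:
        return "warning"                # trending toward a capacity problem
    return "ok"

apps = {
    "erp":  {"latency_ms": 4.0,  "capacity_used_pct": 91},
    "mail": {"latency_ms": 12.5, "capacity_used_pct": 40},
}
sla = {"max_latency_ms": 10.0, "capacity_warn_pct": 85}

dashboard = {name: sla_status(m, sla) for name, m in apps.items()}
print(dashboard)  # erp exceeds the capacity warning level; mail misses latency
```

Separating hard violations from early warnings is what makes utilization trend lines useful: a "warning" state gives administrators time to rebalance or add capacity before the SLA is actually missed.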
Built on a heritage of market-leading innovation, the Hitachi Virtual Storage Platform and Hitachi Command Suite provide organizations with a reliable, dynamic and open architecture that helps lower the total cost of ownership (TCO) in the first year and future-proofs their data centers. Hitachi also offers a proven services organization and network of partners to help reduce operational and capital costs, lower implementation risks, ease the transition to a new environment, and extend the life of current assets. (See related press release also issued today: “Hitachi Virtual Storage Platform Embraced by Global Partner Ecosystem.”) Savings include:
Reduced TCO: by up to 33 percent in first year, compared to monolithic architectures
Reduced storage acquisition costs: by up to 70 percent through tiered storage, storage reclamation and dynamic tiering
Reduced storage footprint: by more than 30 percent compared to the competition
Reduced carbon emissions: by more than 30 percent compared to the competition
HDFC Bank: “Over the years, we have found the right partner in Hitachi Data Systems who have helped us consistently transform our IT landscape by leveraging their cutting edge technology. Their proactive approach in updating and helping us deploy new technologies has enabled us to stay ahead of our competition. We are confident that the new Hitachi Virtual Storage Platform and Hitachi Command Suite will further improve our overall business productivity, while delivering significant environmental impact.” ~Harish Shetty, executive vice president, Information Technology
Lloyds Banking Group: “We’ve evaluated the new Hitachi Virtual Storage Platform and Command Suite software to power one of our most important infrastructure programs. Initial results showed that the new platform and software delivered consistent performance improvements, as well as uninterrupted service, which are key enablers for Lloyds Banking Group WMTT infrastructure strategy.” ~Colin Everett, head of IT Infrastructure, Wholesale Markets Treasury and Trading
University of Utah Health Care: “At University of Utah Health Care, our storage requirements are characterized by growth and retention. The Universal Storage Platform integrated virtualization with the storage area network and not as a separate module. Now, the new Hitachi Virtual Storage Platform will help us to meet our needs for higher capacity and performance while protecting our patient records. We see no other storage solution that can match the 3D scaling ability in a single storage product.” ~Jim Livingston, director, IT
The Hitachi Virtual Storage Platform and Hitachi Command Suite are available worldwide.
About Hitachi Data Systems
Hitachi Data Systems provides best-in-class information technologies, services and solutions that deliver compelling customer ROI, unmatched return on assets (ROA) and demonstrable business impact. With a vision that IT must be virtualized, automated, cloud-ready and sustainable, Hitachi Data Systems offers solutions that improve IT costs and agility. With more than 4,200 employees worldwide, Hitachi Data Systems does business in more than 100 countries and regions. Hitachi Data Systems products, services and solutions are trusted by the world’s leading enterprises, including more than 70 percent of the Fortune 100 and more than 80 percent of the Fortune Global 100. Hitachi Data Systems believes that data drives our world – and information is the new currency. To learn more, visit: http://www.hds.com.
About Hitachi, Ltd.
Hitachi, Ltd., (NYSE: HIT / TSE: 6501), headquartered in Tokyo, Japan, is a leading global electronics company with approximately 360,000 employees worldwide. Fiscal 2009 (ended March 31, 2010) consolidated revenues totaled 8,968 billion yen ($96.4 billion). Hitachi will focus more than ever on the Social Innovation Business, which includes information and telecommunication systems, power systems, environmental, industrial and transportation systems, and social and urban systems, as well as the sophisticated materials and key devices that support them. For more information on Hitachi, please visit the company's website at http://www.hitachi.com.
Source: Hitachi Ltd.