December 11, 2006
At Storage Decisions Las Vegas, DataCore Software highlighted customers deploying its storage solutions. DataCore reports thousands of virtual SAN deployments around the world, with customers citing cost savings, hardware independence, better resource utilization, greater productivity and faster response to change as the main reasons they have "gone virtual." Storage virtualization lets storage administrators buy lower-cost storage for use in a tiered environment, and it reduces costs through automated provisioning and storage pooling, which improve overall storage resource utilization. Companies are also just as keen nowadays to adopt a virtual infrastructure for replication and disaster recovery.
New user deployments such as OverStock.com, USGS, Harbor Federal Savings, the United States Treasury Department, DFW International Airport and Physicians Web Link indicate the growing base of support for DataCore's storage virtualization in the USA. Internationally, DataCore has more than one thousand SANsymphony and SANmelody servers installed across Europe. Germany, France, and the UK in particular have shown a large increase in the adoption of virtualization throughout 2006. Designed for small and medium-sized environments, SANmelody 2.0 has been especially successful in Microsoft Exchange and SQL environments, which need fast disk-to-disk backup solutions, as well as in building virtual infrastructure (e.g. VMware) storage networks and in remote-site disaster recovery solutions. Many larger-scale customers have also recently deployed SANsymphony Enterprise Edition software platforms, including the Telegraph Group, the National Health Service (NHS), DBV-Winterthur Insurance, France Telecom, Alcatel, ETO, ABN AMRO, Munich Hospitals, and the Machine and Turbine Union (MTU) Friedrichshafen, among others.
"High availability, performance and flexibility were decisive factors and were the criteria used in choosing a storage solution," said Andreas Trockle, technology specialist, IKEA IT-Services GmbH, Germany. "Having been convinced of the functionality and reliability of the solution through intensive testing, and with the installations proceeding very smoothly so far, we are more than satisfied with our choice to use DataCore for critical storage operations."
According to George Teixeira, president and CEO, DataCore Software, "DataCore's customers expect robust storage solutions that deliver the highest levels of reliability and uptime. Plus, they demand the cost savings and flexibility benefits inherent in storage virtualization solutions. For a virtualization solution to be successful, it must deliver the full spectrum of data protection capabilities -- from snapshots to automated failover to remote disaster-site mirroring. A growing customer base is the final proof, and we are certainly proud of the diversity and the momentum we are seeing worldwide with users protecting their data storage with DataCore SANsymphony and SANmelody solutions."
DataCore Software delivers storage management solutions to companies around the world; however, many DataCore customers are in the company's own backyard of South Florida. By deploying VMware for server virtualization and DataCore for virtualized storage, enterprises across South Florida reflect a national and global trend. DataCore's Florida-based customers include The John S. and James L. Knight Foundation, All Medical Personnel, Harbor Federal Savings Bank, Embraer, Mercy Hospital, Homebuilders Financial Network, Gables Engineering and local universities and colleges, among others.
With DataCore, Homebuilders Financial Network, a division of Fidelity National Financial, has deployed a scalable SAN solution with fast disk backup and recovery capabilities. "In terms of advanced features of DataCore's SANmelody solution, Homebuilders Financial Network is utilizing snapshots to create point-in-time images and high-availability network mirroring that supports automatic and transparent failover of the storage to protect the data and to enhance recovery," said Jaime Soto, information technology manager, Homebuilders Financial Network.
Another Florida customer is one of the twenty-five largest private foundations in the U.S., and the largest in Florida. Its primary data center is in Miami, and the foundation suffered a brief outage after Hurricane Wilma. It decided to build a remote data center in another location as part of an update to its disaster recovery infrastructure. "We explored this option before, but even with the advent of server virtualization eliminating some of the hardware replication issues, we still needed to find a way to keep our data current at both sites. While our HP SAN is a great product, replicating our data was an expensive proposition; that is, until SANmelody came into the picture," said Jorge Martinez, director of information systems at the Knight Foundation.
Added Teixeira, "Key to serving customers' needs in terms of disaster preparedness lies in offering proven data storage management solutions that deliver high-availability and robust business continuity. Across Florida and around the world, DataCore and our partners are helping companies implement virtual infrastructure storage solutions that deliver business continuity and disaster recovery."
Researchers from the Suddhananda Engineering and Research Centre in Bhubaneswar, India, developed a job scheduling system they call Service Level Agreement (SLA) scheduling, intended to deliver resource provisioning comparable to that of in-house systems. They combined it with an on-demand resource provisioner to optimize the utilization of virtual machines.
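At its simplest, SLA-aware scheduling reduces to a capacity check against each job's deadline, provisioning more virtual machines on demand when the current pool cannot meet it. The sketch below is purely illustrative: the names (`Job`, `vms_needed`, `provision`) and the uniform-throughput model are assumptions for this example, not details from the Suddhananda work.

```python
import math
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    work_units: int    # abstract units of compute work
    deadline_s: float  # SLA deadline in seconds

def vms_needed(job: Job, units_per_vm_per_s: float) -> int:
    """Smallest VM count whose combined throughput finishes before the deadline."""
    return max(1, math.ceil(job.work_units / (job.deadline_s * units_per_vm_per_s)))

def provision(jobs, pool_size: int, units_per_vm_per_s: float) -> dict:
    """Admit jobs earliest-deadline-first while pool capacity remains.

    A job assigned 0 VMs would require provisioning extra on-demand capacity.
    """
    plan = {}
    for job in sorted(jobs, key=lambda j: j.deadline_s):
        need = vms_needed(job, units_per_vm_per_s)
        if need <= pool_size:
            plan[job.name] = need
            pool_size -= need
        else:
            plan[job.name] = 0  # shortfall: trigger on-demand provisioning
    return plan
```

For example, with VMs that each process 5 units/s, a 100-unit job due in 10 s needs 2 VMs, while the same job due in 50 s fits on 1.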
Experimental scientific HPC applications are continually being moved to the cloud, as covered here in several capacities over the last couple of weeks. Among that coverage, CloudSigma co-founder and CEO Robert Jenkins penned an article for HPC in the Cloud discussing the emergence of cloud technologies to supplement the research capabilities of big scientific initiatives like CERN and ESA (the European Space Agency)...
When considering moving excess or experimental HPC applications to a cloud environment, there will always be obstacles. Were that not the case, the cost effectiveness of cloud-based HPC would let it rule the high performance landscape. Jonathan Stewart Ward and Adam Barker of the University of St. Andrews produced an intriguing report on the state of cloud computing, paying significant attention to the problems facing it.
Jun 19, 2013 |
Ruan Pethiyagoda, Cameron Boehmer, John S. Dvorak, and Tim Sze, all trained at San Francisco’s Hack Reactor, an institute designed for intensive, fast-paced programming instruction, took a program based on the N-Queens algorithm designed by the University of Cambridge’s Martin Richards and modified it to run in parallel across multiple machines.
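The team's code itself isn't reproduced here, but the general technique of counting N-Queens solutions in parallel, by partitioning the search space on the first queen's column, can be sketched in Python. This sketch uses worker processes on one machine rather than multiple machines; the partitioning idea carries over.

```python
from multiprocessing import Pool

def count_from(n: int, col: int) -> int:
    """Count N-Queens solutions with the first queen in column `col`.

    Uses the standard bitmask backtracking scheme: `cols` marks occupied
    columns, `diag1`/`diag2` mark attacked diagonals shifted row by row.
    """
    full = (1 << n) - 1

    def solve(cols, diag1, diag2):
        if cols == full:          # all n queens placed
            return 1
        total = 0
        free = full & ~(cols | diag1 | diag2)
        while free:
            bit = free & -free    # lowest free square in this row
            free ^= bit
            total += solve(cols | bit,
                           ((diag1 | bit) << 1) & full,
                           (diag2 | bit) >> 1)
        return total

    bit = 1 << col
    return solve(bit, (bit << 1) & full, bit >> 1)

def n_queens(n: int) -> int:
    """Fan the independent first-column subproblems out across processes."""
    with Pool() as pool:
        return sum(pool.starmap(count_from, [(n, c) for c in range(n)]))

if __name__ == "__main__":
    print(n_queens(8))  # 92
```

Because each first-column subproblem is independent, the same split works across machines: each node takes a slice of columns and a coordinator sums the partial counts.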
Jun 17, 2013 |
With that in mind, Datapipe hopes to establish itself as a green-savvy HPC cloud provider with its recently announced Stratosphere platform. Datapipe markets Stratosphere as a green HPC cloud service, partnering with Verne Global, whose Icelandic datacenter is known for its green computing credentials.
Jun 12, 2013 |
Cloud computing is gaining ground among mid-sized institutions looking to expand their experimental high performance computing resources. To that end, IBM released what it calls Redbooks, in part to assist institutions in moving high performance computing applications to the cloud.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.