May 07, 2007
LOWELL, Mass., May 1 -- Acopia Networks Inc., the leader in high-performance intelligent file virtualization, today announced that Wiley Publishing Inc., one of the world's leading providers of print and electronic products in science, technology and medicine, as well as consumer books and subscription services -- including the world-renowned "For Dummies" book series -- has deployed an Acopia solution, including ARX systems and FreedomFabric software, to support its storage tiering and disaster recovery (DR) replication efforts.
Prior to the Acopia deployment, Wiley faced unacceptably long backup windows and needed to reduce the cost and complexity of backing up to tape. Wiley also needed a solution that would automatically and transparently replicate data for disaster recovery.
“We needed to refine our backup strategies to ensure the availability of our growing data stores. We recognized that if one of our filers had a problem, we faced the real possibility of it taking one-to-two days to restore operations. Obviously, this was an unacceptable risk,” said James Sample, director of IT infrastructure for Wiley. “Today, with the Acopia solution, we have a process that automatically tiers our file data, based on its age. This provides a significant savings in both tape costs and backup time. Plus, we can now isolate our older files and intelligently remove or archive as needed. Before, we had no long term strategy to manage this data except to buy more disk.”
He continued, “The Acopia ARX also automatically replicates our most critical files to failover servers, ensuring recoverability from disaster. We looked at competing products, but quite honestly, they simply did not have the scalability nor replication capabilities that the Acopia ARX offered.”
“Wiley has taken a thoughtful and practical approach to designing its file data management strategy,” said Kirby Wadsworth, senior vice president of marketing and business development at Acopia Networks. “Intelligent file virtualization offers a powerful new tool to address burgeoning growth and complexity. Acopia offers customers like Wiley the freedom to choose the most appropriate technology for storing data as its value and access characteristics change over time. The proof, as demonstrated in dramatic success stories like Wiley’s, with its impressive improvements in efficiency and cost, is now irrefutable. We are proud to have partnered with Wiley in building such a powerful infrastructure.”
To date, Wiley has replicated 10TB of file data, an amount it predicts will continue to grow exponentially.
Founded in 1807, John Wiley & Sons Inc., provides must-have content and services to customers worldwide. Its core businesses include scientific, technical, and medical journals, encyclopedias, books, and online products and services; professional and consumer books and subscription services; and educational materials for undergraduate and graduate students and lifelong learners. Wiley has publishing, marketing and distribution centers in the United States, Canada, Europe, Asia and Australia. The company is listed on the New York Stock Exchange under the symbols JWa and JWb. Wiley's Internet site can be accessed at www.wiley.com.
About Acopia Networks
Acopia Networks is the leader in high-performance, intelligent file virtualization. Solutions based on Acopia’s FreedomFabric network operating system software help customers manage the growth, complexity and cost of unstructured, globally distributed, file-based information. By providing automatic, policy-driven data migration, tiering, load balancing, snapshots and replication across multi-vendor storage environments, Acopia helps IT executives reduce management overhead and accelerate business workflow. For further information about Acopia’s products and services, please visit its Web site at www.acopia.com.
Researchers from the Suddhananda Engineering and Research Centre in Bhubaneswar, India, developed a job scheduling system they call Service Level Agreement (SLA) scheduling, intended to provide resource-provisioning guarantees comparable to those of in-house systems. They combined it with an on-demand resource provisioner to optimize virtual machine utilization.
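The paper's exact algorithm is not detailed here, but the basic idea, scheduling jobs against SLA deadlines and provisioning extra VMs on demand only when a deadline would otherwise be missed, can be sketched as follows. All names, numbers, and the earliest-deadline-first policy are illustrative assumptions, not taken from the paper:

```python
from dataclasses import dataclass


@dataclass
class Job:
    name: str
    runtime: float   # estimated runtime in hours
    deadline: float  # SLA deadline, in hours from now


def schedule_with_provisioning(jobs, vm_count=1):
    """Assign jobs to VMs in earliest-deadline-first order, provisioning a
    fresh VM whenever the best available VM would miss a job's SLA deadline.
    Returns (job -> VM index assignments, total VMs used)."""
    vm_free_at = [0.0] * vm_count  # hour at which each VM becomes free
    assignments = {}
    for job in sorted(jobs, key=lambda j: j.deadline):
        # Pick the VM that frees up soonest.
        vm = min(range(len(vm_free_at)), key=lambda i: vm_free_at[i])
        finish = vm_free_at[vm] + job.runtime
        if finish > job.deadline:      # SLA would be violated:
            vm_free_at.append(0.0)     # provision a new VM on demand
            vm = len(vm_free_at) - 1
            finish = job.runtime
        vm_free_at[vm] = finish
        assignments[job.name] = vm
    return assignments, len(vm_free_at)
```

Keeping the VM pool as small as the deadlines allow is what ties the SLA guarantee to the utilization goal: idle capacity is only added when a deadline forces it.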
Experimental scientific HPC applications are continually being moved to the cloud, as covered here in several capacities over the last couple of weeks. Among those stories, CloudSigma co-founder and CEO Robert Jenkins penned an article for HPC in the Cloud discussing the emergence of cloud technologies to supplement the research capabilities of big scientific initiatives like CERN and the European Space Agency (ESA)...
When considering moving excess or experimental HPC applications to a cloud environment, there will always be obstacles. Were that not the case, the cost effectiveness of cloud-based HPC would rule the high performance landscape. Jonathan Stewart Ward and Adam Barker of the University of St. Andrews produced an intriguing report on the state of cloud computing, paying particular attention to the problems the field still faces.
Jun 19, 2013 |
Ruan Pethiyagoda, Cameron Boehmer, John S. Dvorak, and Tim Sze, trained at San Francisco’s Hack Reactor, an institute designed for intensive, fast-paced programming instruction, put together a program based on an N-Queens algorithm designed by the University of Cambridge’s Martin Richards and modified it to run in parallel across multiple machines.
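The Hack Reactor code itself is not reproduced here, but the standard way to parallelize N-Queens is to split the search by the column of the queen in row 0: each resulting subtree is independent, so the subproblems can be farmed out to separate machines. A minimal sketch, with threads standing in for remote workers and all names hypothetical:

```python
from concurrent.futures import ThreadPoolExecutor


def count_completions(first_col, n):
    """Count solutions of n-queens whose row-0 queen is in column `first_col`."""
    def search(partial):
        row = len(partial)
        if row == n:
            return 1
        total = 0
        for col in range(n):
            # Safe if no placed queen shares this column or diagonal.
            if all(col != c and abs(col - c) != row - r
                   for r, c in enumerate(partial)):
                total += search(partial + [col])
        return total
    return search([first_col])


def parallel_nqueens(n, workers=4):
    """Each first-column subproblem is independent, so in a distributed
    setup each could be shipped to a different machine; threads stand in
    for those machines in this sketch."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(lambda c: count_completions(c, n), range(n)))
```

Because the subproblems share no state, the only coordination needed is summing the per-subtree counts at the end, which is what makes this problem scale cleanly across machines.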
Jun 17, 2013 |
With that in mind, Datapipe hopes to establish itself as a green-savvy HPC cloud provider with its recently announced Stratosphere platform. Datapipe markets Stratosphere as a green HPC cloud service, partnering with Verne Global and its Icelandic datacenter, which is known for its green computing credentials.
Jun 12, 2013 |
Cloud computing is gaining ground among mid-sized institutions that are looking to expand their experimental high performance computing resources. Accordingly, IBM released what it calls Redbooks, in part to assist institutions in moving high performance computing applications to the cloud.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore the potential of heterogeneous computing, but the potential for this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.