February 28, 2013
SANTA CLARA, Calif., Feb. 28 — Silver Peak Systems, the leader in accelerating data over distance, today announced that its Virtual Acceleration Open Architecture (VXOA) software in collaboration with Amazon Web Services (AWS) delivers cloud-ready data acceleration for enterprise customers. Silver Peak software enables AWS users to accelerate data moving to and from AWS data centers, drastically improving data transfer efficiency by overcoming common network challenges such as limited WAN bandwidth, distance and network quality.
Whether backing up data to the cloud or accessing cloud-hosted applications and services, Silver Peak makes AWS fast and affordable. AWS data centers can be hundreds to thousands of miles away from a company’s core data center, and accessing them over lower-quality Internet connections can result in poor application performance and slow data transfers. Silver Peak software deployed in the Amazon cloud saves customers thousands of dollars every month in AWS expenses by eliminating over 50% of cloud traffic and overcoming the adverse effects of distance and network quality. With Silver Peak, Amazon customers can migrate data to the cloud faster than ever with 20x more throughput—reducing the largest data transfers from days to hours.
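To put the "days to hours" claim in perspective, here is a minimal back-of-the-envelope sketch of how a 20x effective-throughput gain changes bulk-transfer time. The 10 TB payload and 100 Mbps WAN link are assumed example figures, not numbers from the announcement.

```python
def transfer_hours(payload_bytes, link_bps, speedup=1.0):
    """Hours needed to move payload_bytes over a link running at
    link_bps, scaled by an effective-throughput multiplier."""
    return payload_bytes * 8 / (link_bps * speedup) / 3600

TEN_TB = 10 * 10**12      # 10 terabytes (assumed example payload)
LINK = 100 * 10**6        # 100 Mbps WAN link (assumed example rate)

baseline = transfer_hours(TEN_TB, LINK)          # ~222 hours (~9 days)
accelerated = transfer_hours(TEN_TB, LINK, 20)   # ~11 hours
```

At these assumed figures, the unaccelerated transfer takes roughly nine days, while a 20x throughput improvement brings it to about eleven hours—consistent with the "days to hours" framing.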
“The benefits of cloud computing and being able to replicate data to the cloud can be negated by the WAN bottleneck and lower-quality public Internet connections,” said Vivian Xu, director of cloud and virtualization product management for Silver Peak. “Silver Peak software installs quickly and easily to accelerate data to and from the Amazon cloud and make it seem like your cloud-based applications are hosted right next door.”
The Silver Peak Amazon Machine Image (AMI) in the Amazon Virtual Private Cloud (VPC) enables AWS users to accelerate traffic between AWS storage, AWS-hosted applications and services, and on-premises data centers. The Silver Peak AMI accelerates all data and applications, including Web and replication traffic. The software puts full design flexibility and control in the hands of enterprise users, enabling them to meet their business objectives more quickly and cost-effectively. Silver Peak supports all major hypervisors and delivers the highest-capacity virtual appliances available on the market today (gigabit and multi-gigabit-per-second). This avoids vendor lock-in and ensures that enterprises can choose the virtual data center and cloud offering best suited to their business needs.
About Silver Peak Systems
Silver Peak software accelerates data between data centers, remote offices and the cloud. The company’s software-defined acceleration solves network quality, capacity and distance challenges to provide fast and reliable access to data anywhere in the world. Leveraging its leadership in data center class wide area network (WAN) optimization, Silver Peak is a key enabler for strategic IT projects like virtualization, big data and disaster recovery.
Source: Silver Peak Systems
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. Thus, we present a novel federation model that enables end-users to aggregate heterogeneous resources and apply them to large-scale problems. The feasibility of this federation model has been proven, in the context of the UberCloud HPC Experiment, by gathering the most comprehensive information to date on the effects of pillars on microfluid channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to handle computational loads at peak times that exceed their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab’s Sergey Panitkin discussed the role of the Google Compute Engine in providing computational support to ATLAS, a detector of high-energy particles at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
May 23, 2013
The study of climate change is one of those scientific problems where modeling the entire Earth is almost essential to attaining accurate results and making worthwhile predictions. In an attempt to make climate science more accessible to smaller research facilities, NASA introduced what it calls ‘Climate in a Box,’ a system it describes as a desktop supercomputer.
May 16, 2013
When it comes to cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of running CFD modeling in the cloud by benchmarking a common CFD code on Amazon EC2 HPC instance types with both CPU and GPU cores.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.