March 20, 2013
SANTA CLARA, Calif., March 20 — Silver Peak Systems, a leader in accelerating data over distance, today announced the industry’s first and only multi-gigabit-per-second (Gbps) data acceleration solution for Citrix XenServer environments. Silver Peak’s virtual offering for Citrix XenServer now extends from small remote offices to large data centers, giving Citrix hypervisor customers a highly flexible and cost-effective solution for maximizing the performance of application and storage traffic over a wide area network (WAN).
Silver Peak’s unprecedented data acceleration performance on Citrix XenServer means networking, virtualization and storage administrators can more easily meet their objectives for strategic IT initiatives such as remote data replication, virtualization, cloud, big data, and server/storage centralization. Silver Peak is the only company to offer multi-Gbps virtual data acceleration for any standard hypervisor, delivering 20x the capacity of alternative solutions.
“Citrix continues to advance its hypervisor performance and capabilities, and we are excited to be on the leading edge of those advancements with multi-gigabit data acceleration on Citrix XenServer,” said Vivian Xu, director of virtualization product management for Silver Peak. “Silver Peak is the undisputed leader in WAN acceleration and replication acceleration performance. This latest development further advances Silver Peak’s lead over the competition in software-based data acceleration by delivering products that are easy to use with unmatched performance on any standard hypervisor.”
Based on Silver Peak’s Virtual Acceleration Open Architecture (VXOA), Silver Peak software delivers maximum performance with no hardware dependencies. A recent independent analysis highlights that Silver Peak virtual products perform as well as or better than physical devices from Riverbed. In addition, Silver Peak offers the only virtual software capable of running on every major hypervisor, delivering maximum flexibility to customers.
About Silver Peak Systems
Silver Peak software accelerates data between data centers, remote offices and the cloud. The company’s software-defined acceleration solves network quality, capacity and distance challenges to provide fast and reliable access to data anywhere in the world. Leveraging its leadership in data center class wide area network (WAN) optimization, Silver Peak is a key enabler for strategic IT projects like virtualization, disaster recovery and cloud computing.
Source: Silver Peak Systems
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. Thus, we present a novel federation model that enables end-users to aggregate heterogeneous resources for large-scale problems. The feasibility of this federation model has been proven, in the context of the UberCloud HPC Experiment, by gathering the most comprehensive information to date on the effects of pillars on microfluidic channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to absorb peak computational loads that cannot be handled by their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab’s Sergey Panitkin discussed the role of Google Compute Engine in providing computational support to ATLAS, a high-energy particle detector at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
May 23, 2013
The study of climate change is one of those scientific problems where it is almost essential to model the entire Earth to attain accurate results and make worthwhile predictions. In an attempt to make climate science more accessible to smaller research facilities, NASA introduced what it calls ‘Climate in a Box,’ a system it says acts as a desktop supercomputer.
May 16, 2013
When it comes to the cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of running CFD modeling in the cloud by benchmarking a common CFD code on Amazon EC2 HPC instance types that include both CPU and GPU cores.
May 10, 2013
Australian visual effects company, Animal Logic, is considering a move to the public cloud.
May 10, 2013
Program provides cash awards of up to $10,000 for the best open-source end-user applications deployed on a 100G network.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.