January 22, 2013
NEW YORK, N.Y., Jan. 22 – Dimension Data, the $5.8 billion global ICT solutions and services provider, announced today it has joined the Riverbed Authorized Support Partner Program (RASP). This allows Dimension Data to build closer relationships with its global clients and provide high-value services and support for Riverbed deployments. Dimension Data will support the full portfolio of Riverbed performance solutions spanning WAN optimization, edge virtual server infrastructure (edge-VSI), application-aware network performance management, application delivery controllers, Web content optimization (WCO), and cloud data protection for backup and disaster recovery.
More than 20,000 organizations worldwide depend on Riverbed to understand, optimize and consolidate their IT infrastructure through solutions that overcome performance issues caused by distance, distributed computing and ever-increasing amounts of data. As IT organizations embark on strategic initiatives to virtualize, consolidate, and migrate workloads into cloud environments, users are moved farther from their data. Slow applications, slow file transfers and inefficient websites can negatively impact the performance and success of these initiatives.
Gary Middleton, Dimension Data's Global Business Development Manager, Network Integration, said, "We are proud to become one of only a handful of Riverbed partners in the world with global RASP status. We have a long history with Riverbed. This, coupled with the strategic investments made by Dimension Data, provides the best audited coverage of Riverbed support globally.
"What's even more exciting is our new Network Optimization Assessment (NOA) offering, which is based on Riverbed Cascade network performance management. NOA was field tested with a select group of clients and became generally available in early December. NOA gives organizations visibility into, and a breakdown of, their network traffic patterns. In addition, it provides accurate statistics on how the existing network is performing; identifies top network resource consumers (users and applications) as well as network performance problems; and enables more informed network planning, such as assessing the network's readiness for voice, video or cloud computing. This in turn allows for better selection of performance optimization and network performance management solutions, ensuring these solutions have a higher potential for delivering on each client's unique business outcomes," continued Middleton.
"Enterprises benefit from Dimension Data's global expertise in deploying, integrating and managing significant technology implementations, which feature Riverbed performance solutions," said Scott Downie, senior vice president of worldwide support at Riverbed. "As we scale our support organization to include services from our valued partners, Riverbed is able to focus on our core competencies – providing enterprises with award-winning performance solutions that enable them to accomplish strategic IT initiatives, including IT consolidation, cloud computing and virtualization."
About Dimension Data
Founded in 1983, Dimension Data plc is an ICT services and solutions provider that uses its technology expertise, global service delivery capability, and entrepreneurial spirit to accelerate the business ambitions of its clients. Dimension Data is a member of the NTT Group.
Source: Dimension Data
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to handle computational workloads at peak times that cannot be contained within their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab's Sergey Panitkin discussed the role of Google Compute Engine in providing computational support to ATLAS, a high-energy particle detector at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
Financial institutions are the private-sector industry least likely to adopt public cloud services for data storage. Because they hold one of the most sensitive and heavily regulated data types, personal financial information, banks and similar institutions are instead moving largely toward private cloud services, and doing so at great cost.
May 16, 2013 |
When it comes to the cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of doing CFD modeling in the cloud by running a common CFD code on Amazon EC2 HPC instance types with both CPU and GPU cores.
May 10, 2013 |
Australian visual effects company Animal Logic is considering a move to the public cloud.
May 10, 2013 |
The program provides cash awards of up to $10,000 for the best open-source end-user applications deployed on a 100G network.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so with technologies that emphasize affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore the potential of heterogeneous computing, but the potential for this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.