October 22, 2012
JOHANNESBURG, Oct. 22 — Dimension Data, the global ICT solutions and services provider, today announced it has been positioned by Gartner, Inc. in the Leaders quadrant of the Magic Quadrant for Cloud Infrastructure-as-a-Service. Compiled by Gartner analysts, the Magic Quadrant examined 15 providers of Infrastructure-as-a-Service (IaaS) against two main evaluation criteria: 'ability to execute' and 'completeness of vision'.
"Over the next 12 months, Dimension Data will extend its platform, develop new services offerings, and aggressively execute on the strategic plans we have set for the Group"
Brett Dawson, Dimension Data CEO said, "We are extremely honoured to be positioned as a Leader in Gartner's Magic Quadrant for Cloud Infrastructure-as-a-Service. The leading trend shaping the global IT services landscape is the emergence of cloud computing. We believe our investments in our cloud capabilities are testament to Dimension Data's commitment to the cloud market, and we remain on track to become a formidable force in cloud computing."
Steve Nola, CEO of Dimension Data's Cloud Solutions Business Unit said, "We are very proud to be recognised as a Leader in Gartner's Magic Quadrant for Cloud Infrastructure-as-a-Service. We designed our cloud IaaS offerings to address the performance and security concerns of enterprises and service providers. We wanted to make it easier and less complex for organisations of every size to benefit from the flexibility and scale of cloud technologies through automation and broad global coverage. Dimension Data will continue to invest in its go-to-market resources and intellectual property, as well as its platforms and processes, so that we continue to grow our leadership position in the cloud market."
As a global ICT and systems integrator with more than 15,000 employees and USD 5.8 billion in annual revenues, Dimension Data has moved fast to advance its cloud strategy. In 2011, the Group acquired cloud hosting leader OpSource and concluded the buy-out of the remaining shares of BlueFire, a managed and cloud IT and telecommunications services company. Building on the strong cloud technology and operational expertise of the acquired companies, Dimension Data introduced a broad portfolio of public and private cloud services and rolled out five Managed Cloud Platforms (MCPs) in the US, The Netherlands, South Africa and Australia. An MCP in Hong Kong will follow later this year. In June this year, Dimension Data launched its OneCloud Partner Programme to enable service providers to bring cloud services to market more quickly.
"Over the next 12 months, Dimension Data will extend its platform, develop new services offerings, and aggressively execute on the strategic plans we have set for the Group," concludes Dawson.
About Dimension Data
Founded in 1983, Dimension Data plc is an ICT services and solutions provider that uses its technology expertise, global service delivery capability, and entrepreneurial spirit to accelerate the business ambitions of its clients. Dimension Data is a member of the NTT Group.
Source: Dimension Data
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. Thus, we present a novel federation model that enables end-users to aggregate heterogeneous resources for large-scale problems. The feasibility of this federation model has been proven, in the context of the UberCloud HPC Experiment, by gathering the most comprehensive information to date on the effects of pillars on microfluid channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to handle computational loads at peak times that cannot be contained within their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab’s Sergey Panitkin discussed the role of the Google Compute Engine in providing computational support to ATLAS, a detector of high-energy particles at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
May 16, 2013
When it comes to cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of doing CFD modeling in the cloud by running a common CFD code on Amazon EC2 HPC instance types featuring both CPU and GPU cores.
May 10, 2013
Australian visual effects company, Animal Logic, is considering a move to the public cloud.
May 10, 2013
Program provides cash awards of up to $10,000 for the best open-source end-user applications deployed on a 100G network.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can exploit these extraordinary data repositories, however, they must first harness and manage their data stores, using technologies that emphasize affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore the potential of heterogeneous computing, but the potential for this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.