July 19, 2012
Company appoints Chris Roberts as Channel Director for EMEA
JERSEY CITY, NJ, July 19 — Datapipe, a global provider of managed services and infrastructure for outsourced IT and cloud computing, today announced that Chris Roberts has joined the company as Channel Director for EMEA. Roberts will work with existing Datapipe EMEA partners and develop new business across the VAR/SI, technology vendor and service provider channels.
Datapipe's world-class managed services and experience in delivering complex IT solutions provide a competitive advantage for UK channel partners seeking additional opportunities for growth. Roberts explains: "Cloud computing has given the VAR/SI and service provider industries a new revenue stream to enhance their businesses. Datapipe's award-winning managed services provide a competitive advantage in the UK and EMEA channel market, enabling partners to quickly take advantage of cloud computing and virtualization, business continuity, disaster recovery, Infrastructure as a Service (IaaS) and other solutions to strengthen their businesses."
Previously, Roberts oversaw Iomart's development of a multi-million pound channel business. In that role, he was responsible for building, promoting and delivering an indirect program for the channel community and for delivering some of Iomart's largest-ever managed hosting contracts. He also created and executed a training program for the company's VAR community and secured Iomart's accreditation as a Dell Managed Services Provider.
"The EMEA market presents an important opportunity for Datapipe," said Robb Allen, Datapipe's CEO. "To serve this market properly, we need a seasoned channel veteran who clearly understands the unique needs of UK VAR/SIs and can deliver measurable value to all parties. Chris's extensive knowledge of the EMEA channel market makes him a welcome addition to our team."
Interested in becoming part of Datapipe's Partner Program? For more information, visit http://www.datapipe.com/partners.
Datapipe offers a single provider solution for managing and securing mission-critical IT services, including cloud computing, infrastructure as a service, platform as a service, colocation and data centers. Datapipe delivers those services from the world's most influential technical and financial markets including New York metro, Silicon Valley, London, Hong Kong and Shanghai. For more information about Datapipe visit www.datapipe.com.
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. We therefore present a novel federation model that enables end-users to aggregate heterogeneous resources for large-scale problems. The feasibility of this federation model has been demonstrated, in the context of the UberCloud HPC Experiment, by gathering the most comprehensive data to date on the effects of pillars on microfluidic channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to handle peak computational demands that cannot be met by their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab’s Sergey Panitkin discussed the role of the Google Compute Engine in providing computational support to ATLAS, a high-energy particle detector at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
May 23, 2013 |
The study of climate change is one of those scientific problems where modeling the entire Earth is all but essential to obtaining accurate results and making worthwhile predictions. In an effort to make climate science more accessible to smaller research facilities, NASA has introduced ‘Climate in a Box,’ a system it describes as a desktop supercomputer.
May 16, 2013 |
When it comes to cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of running CFD modeling in the cloud by benchmarking a common CFD code on Amazon EC2 HPC instance types with both CPU and GPU cores.
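As a rough illustration of the latency concern, the round-trip time to a remote endpoint can be estimated by timing a TCP connection. This is a minimal sketch, not the researchers' benchmark; the host and port are placeholders the reader would replace with an actual cloud endpoint.

```python
# Sketch: estimate mean TCP connect round-trip time to an endpoint.
# Host and port are hypothetical placeholders, not real EC2 addresses.
import socket
import time


def measure_rtt(host: str, port: int, trials: int = 5) -> float:
    """Return the mean TCP connect round-trip time in milliseconds."""
    samples = []
    for _ in range(trials):
        start = time.perf_counter()
        # Time only the connection handshake, then close immediately.
        with socket.create_connection((host, port), timeout=5):
            pass
        samples.append((time.perf_counter() - start) * 1000.0)
    return sum(samples) / len(samples)
```

For a distant region the measured value would typically be tens to hundreds of milliseconds, which is the kind of overhead that matters for tightly coupled CFD runs.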
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – of Big Data. Before anyone can exploit these extraordinary data repositories, however, they must first harness and manage their data stores, using technologies that emphasize affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.