March 06, 2012
Three of Europe's most prominent research centers, CERN, the European Space Agency (ESA), and the European Molecular Biology Laboratory (EMBL), have teamed up to launch a massive cloud computing project. Helix Nebula – the Science Cloud, which takes its name from a large planetary nebula in the Aquarius constellation, will support the fast-growing IT requirements of European scientists. After an initial two-year pilot phase, the project will be expanded to include governmental organizations and industry.
The launch is part of the wider Digital Agenda for Europe. Europe's cloud-first goals are outlined in the Strategic Plan for a Scientific Cloud Computing infrastructure for Europe, which includes this ambitious vision statement:
In 2020, all scientists of all disciplines will choose the European Cloud Computing Infrastructure as their first option to store and access data, for data processing and analysis. This infrastructure will be considered as a natural infrastructure for the global science community similar to the road or telecommunication infrastructure for the general public today.
This infrastructure will contain vast quantities of data, an unrivalled array of open source tools, and a literally infinite amount of computing power accessible and usable from any kind of computer, smart phone or tablet device. Science will make significant progress by applying data sharing and interdisciplinary research using this infrastructure as the fundamental tool. Important articles for leading publications, such as Nature and Science, will be derived from this infrastructure and it will be the source of a drastic increase in patents in Europe.
This infrastructure will be so reliable, and its security/privacy scheme so widely recognized, that commercial companies will also use this "high security area" to derive patents.
For now, at least, the Helix Nebula project is a Europe-only endeavor due to concern over US laws like the Patriot Act, which conflict with European data security and privacy mandates. Commercial partners include Atos, Capgemini, CloudSigma, Interoute, Logica, Orange Business Services, SAP, SixSq, Telefonica, Terradue, Thales, The Server Labs and T-Systems, as well as the Cloud Security Alliance, the OpenNebula Project and the European Grid Infrastructure (EGI.eu).
The participants are currently working on creating a common framework, documenting the process, and running real workloads; as the project gains steam, other scientific organizations and service providers will be invited to join.
"Assuming this phase is successful, an expansion to include more applications, more research and public organizations and more cloud computing suppliers is foreseen. Of particular interest is to stimulate a market where SME can make use of the computing platform to provide new services," notes Bob Jones, head of CERN openlab.
CloudSigma, an Infrastructure-as-a-Service (IaaS) provider based in Zurich, Switzerland, is supplying the cloud infrastructure for the project. CEO Robert Jenkins explains that the research partners were frustrated by a lack of communication among cloud providers, and decided to use their collective buying power to commission a pan-European cloud for HPC and scientific computing. CloudSigma has been working with supply- and demand-side partners since June 2011 to assess the HPC requirements of the research institutions and, from there, design a cloud computing environment that meets these specific needs.
At this stage, they've completed a successful proof-of-concept pilot with CERN, which is using the additional computing power to process data from the Large Hadron Collider as part of the search for the theoretical Higgs boson. They're currently working with EMBL to enable more accurate gene sequencing methodology, and with ESA to process large amounts of earth science data in support of natural disaster research. While Jenkins was reluctant to comment on the individual compute requirements of the three partners, he estimates that the total combined computing power for the project will be in the neighborhood of 50,000–100,000 CPU cores.
The data requirements of these institutions are accelerating rapidly, like a car pulling away in front of them, as Jenkins puts it. At EMBL, wet-lab output and the data from DNA extraction are doubling roughly every six months. This puts pressure on the later stages of the pipeline: assembly, sequence analysis, and so forth. The research sites must spend ever more time and effort chasing extra computing capacity, which increasingly distracts them from their primary mission, the science.
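That back-of-the-envelope point is worth making concrete: output that doubles every six months grows sixteen-fold in two years. A minimal sketch, where the starting figure is purely illustrative and not EMBL's actual number:

```python
def projected_volume(current_tb: float, months: float,
                     doubling_months: float = 6.0) -> float:
    """Project data volume assuming it doubles every `doubling_months` months."""
    return current_tb * 2 ** (months / doubling_months)

# Hypothetical starting point of 10 TB per month of wet-lab output:
for m in (6, 12, 24):
    print(f"after {m} months: {projected_volume(10.0, m):.0f} TB")
```

A site provisioned for today's load is, on this curve, provisioned for only a sixteenth of the load two years out, which is why capacity planning keeps pulling staff away from the science.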
In a nutshell, there's a shortfall in computing capacity. Moreover, some of the very important problems these institutions are trying to solve are limited by the amount of computing power they can actually deploy, owing to coordination problems and practical constraints. As just one example of the latter, CERN cannot do more science at the moment because Geneva cannot supply it with more electricity.
"It's kind of crazy that these problems are holding back some of the most important scientific research areas for mankind," remarks Jenkins, "The idea of Helix Nebula is that we can bring the collective computing power of these different providers and the cloud delivery mechanism, with the flexibility and transparency that it enables, to be able to allow these institutions to essentially burst into cloud and pull down those extra computing resources."
A committee of supply-side and demand-side partners meets regularly to map out the cloud system architectures. They evaluate the information coming in from the different proof-of-concepts to determine how the various partners are getting their work done and the requirements involved. Then they document the performance requirements in terms of networking, CPU, RAM, and so on; all these aspects of computing are captured and fed back to the group.
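The kind of record such a committee might capture per proof-of-concept can be sketched as a simple profile. The field names and values here are assumptions for illustration, not the project's actual requirements template:

```python
from dataclasses import dataclass

@dataclass
class WorkloadProfile:
    """One proof-of-concept's measured requirements, fed back to the group.
    Fields are illustrative; the real template is not public."""
    partner: str
    cpu_cores: int
    ram_gb: int
    network_gbps: float
    storage_tb: float

# Hypothetical entry, not real figures:
profiles = [WorkloadProfile("CERN", 2000, 8000, 10.0, 500.0)]
```

Capturing each partner's workload in a common shape is what lets the group compare requirements across very different sciences and hand them to providers as a single specification.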
Jenkins stresses the importance of coordination and the role of networking in supporting it. "One of the things we're working on at Helix Nebula is creating proper coordination between the cloud and the different providers so that we're able to hand off data and transfer it to each other very efficiently and reliably," he says. The participants all sit down to optimize their networking, to make it easier for data to get where it needs to go.
"There's a big win to be had from cloud providers coordinating to make their clouds much more user-friendly when you're actually using more than one cloud. That doesn't generally happen," notes Jenkins.
CERN's Bob Jones shares a similar outlook: "The extreme scale of the computing needs of CERN, in terms of processing power, data transfer rates and data storage capacity, pushes what can be done with cloud computing beyond its current limits. Science relies on collaboration, so the cloud services being deployed need to be able to allow groups of researchers around the world to share their data and results in a secure manner. The sharing of resources in a secure manner is challenging what can be done with cloud computing today."
When it comes to cloud technology, one of the strengths of Helix Nebula is its openness, Jenkins explains. It doesn't have any specific cloud technology requirements stipulating what software providers must run. It's more about the use case: being able to process a given data type at a specified level of performance. The methodology is cross-technology, with some providers using VMware and others using KVM, OpenStack or OpenNebula, all coordinating to create a common framework. In the future, Jenkins says, they may choose a cross-cloud driver, but during the current proof-of-concept phase they want to capture all the requirements first.
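Being technology-neutral in this way amounts to programming against a use case rather than a stack: the framework says what must run, not which hypervisor runs it. A minimal sketch of what such a provider-agnostic interface could look like; the class names and method signature are assumptions, not Helix Nebula's actual framework:

```python
from abc import ABC, abstractmethod

class CloudProvider(ABC):
    """Provider-neutral contract: specify the workload (image, cores, RAM),
    leave the underlying technology to each supplier."""
    @abstractmethod
    def launch(self, image: str, cores: int, ram_gb: int) -> str:
        ...

class KVMProvider(CloudProvider):
    def launch(self, image: str, cores: int, ram_gb: int) -> str:
        # A real implementation would call libvirt or the provider's own API.
        return f"kvm:{image}:{cores}c:{ram_gb}g"

class OpenNebulaProvider(CloudProvider):
    def launch(self, image: str, cores: int, ram_gb: int) -> str:
        return f"one:{image}:{cores}c:{ram_gb}g"
```

Because every provider satisfies the same contract, the demand side can move a workload between VMware, KVM, OpenStack and OpenNebula back ends without rewriting it, which is exactly the cross-technology coordination the project is after.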
The initial work dovetails with the project's strategic objective, which states: "The European Research Area shall drive the development and implementation of a secure and globally recognised European Cloud Computing Infrastructure, initially targeting science users. This infrastructure will become 'the' platform for Europe, under public governance, ensuring open standard and interoperability and adhering to European policies, norms and requirements."