September 27, 2010
Brussels, September 27, 2010 – RESERVOIR enables the migration of resources across distributed administrative domains, maximizing resource exploitation and minimizing end-user costs while guaranteeing quality of service. How does it work? RESERVOIR defines an open federated infrastructure cloud architecture and delivers a framework of open source components that you can download from the RESERVOIR website and integrate to build your own open source cloud infrastructure.
Open Source Components
Several key components of the RESERVOIR architecture are being released as open source middleware.
The Claudia platform offers a Service Management toolkit to deploy services and control their scalability across public, private or hybrid IaaS clouds. It provides a Dashboard and a standard TCloud API, based on OVF, to support provisioning of PaaS and SaaS offerings. The Claudia platform is available through the Morfeo open source community.
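As a sketch of what an OVF-based service definition might look like, the following Python snippet assembles a minimal OVF-style descriptor. The element names and structure are simplified for illustration and do not reproduce the actual TCloud/OVF schema:

```python
# Sketch: building a minimal OVF-style service descriptor of the kind an
# OVF-based provisioning API accepts. Element names are illustrative only.
import xml.etree.ElementTree as ET

def build_service_descriptor(service_name, vm_count, memory_mb):
    """Return an OVF-flavoured XML string describing a simple service."""
    envelope = ET.Element("Envelope")
    service = ET.SubElement(envelope, "VirtualSystemCollection",
                            {"id": service_name})
    for i in range(vm_count):
        vm = ET.SubElement(service, "VirtualSystem",
                           {"id": f"{service_name}-vm{i}"})
        hw = ET.SubElement(vm, "VirtualHardwareSection")
        ET.SubElement(hw, "Memory").text = str(memory_mb)
    return ET.tostring(envelope, encoding="unicode")

descriptor = build_service_descriptor("webtier", vm_count=2, memory_mb=1024)
```

A real descriptor would carry the OVF namespaces, disk and network sections, and deployment parameters; the point here is only the envelope/virtual-system nesting.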
The Claudia platform can also be integrated with the OpenNebula cloud management framework. OpenNebula is an open source toolkit that offers the performance and scalability to manage tens of thousands of virtual machines, high integration capabilities to fit into any existing data center, and advanced functionality for building private, public and hybrid clouds. It provides the most common cloud interfaces to expose its functionality for virtual machine, storage and network management. The OpenNebula platform is available under the Apache license on its community site and through the Morfeo open source community. Explanations are available on how to integrate the Claudia and OpenNebula platforms.
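OpenNebula's core management functionality is exposed over XML-RPC. As a minimal sketch, the snippet below serialises a `one.vm.allocate` request without sending it; the credentials, template values and endpoint are placeholders, not a working deployment:

```python
# Sketch: serialising an OpenNebula-style XML-RPC call without network access.
# OpenNebula's core API is XML-RPC; the values below are placeholders.
import xmlrpc.client

session = "oneadmin:password"                       # placeholder credentials
template = 'NAME = "demo-vm"\nCPU = 1\nMEMORY = 512'

# dumps() builds the XML-RPC request body; a real client would POST this
# to the OpenNebula front-end endpoint (typically http://frontend:2633/RPC2).
payload = xmlrpc.client.dumps((session, template, False),
                              methodname="one.vm.allocate")
```

Serialising the call this way separates "what request is being made" from transport, which is also handy when logging or auditing management traffic.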
To help secure the integrated Claudia and OpenNebula platforms, security services are also planned for release on Morfeo. These services provide access control for the public interfaces of the IaaS cloud and allow an IaaS federation to be secured. Role-based access control, combined with X.509 certificates for authentication, authorisation and integrity checks, will protect both the Claudia and OpenNebula public interfaces. Further services secure the federation itself, providing authentication between data centres within a cloud federation and enforcing global security policies across it.
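The role-based access control described above can be sketched as a simple deny-by-default check. The roles, permissions and certificate subjects below are invented for illustration; a real deployment would derive the identity from a verified X.509 certificate rather than a plain dictionary:

```python
# Sketch of a role-based access-control check. All names are illustrative.

ROLE_PERMISSIONS = {
    "admin":    {"deploy_service", "undeploy_service", "list_vms"},
    "operator": {"list_vms"},
}

# Maps an already-authenticated certificate subject to a role.
CERT_SUBJECT_ROLES = {
    "CN=alice,O=SiteA": "admin",
    "CN=bob,O=SiteB":   "operator",
}

def is_authorized(cert_subject: str, action: str) -> bool:
    """Allow the action only if the subject's role grants it."""
    role = CERT_SUBJECT_ROLES.get(cert_subject)
    if role is None:
        return False          # unknown identity: deny by default
    return action in ROLE_PERMISSIONS.get(role, set())
```

The deny-by-default stance matters in a federation: a request arriving from a peer data centre with an unrecognised certificate subject is refused rather than mapped to a guest role.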
RESERVOIR at ICT2010 - Learn More about these Components Developed for Building Clouds
The RESERVOIR R&D stand called "Deploying Complex Multi-tier Applications on a Federated Cloud Infrastructure" will be placed in the ICT Connects zone of the ICT2010 conference in Brussels on September 27-29, 2010. This exhibit will show, via an interactive demonstration, how complex multi-tier applications can be securely deployed on a federated cloud infrastructure. It will also demonstrate how virtualisation and business service management techniques can be used to manage resources and services on-demand, at competitive costs with a high quality of service. This demonstration will present how RESERVOIR innovation will improve consumers’ accessibility to government and business services.
The networking session "Research/Industry Collaboration on Open Source Cloud Middleware" is presented by RESERVOIR in collaboration with SLA@SOI and takes place on September 28, from 11:00 to 12:30. The session aims to identify better ways to collaborate for developing European open source technology for building clouds.
Building a Cloud Infrastructure Using the RESERVOIR Framework
RESERVOIR also provides training that gives insight into the RESERVOIR Framework. Our experts offer consulting on the architecture, the individual RESERVOIR components, and how to integrate them to build an open source cloud infrastructure. Training also aims to teach users how to create service definitions and submit them to a RESERVOIR infrastructure for deployment. Details on RESERVOIR training can be accessed from the ‘Technical Information’ section on www.reservoir-fp7.eu.
RESERVOIR, Open Source and European Perspectives
The European Commission recently highlighted, in an Expert Group’s Report on the Future of Cloud Computing, the need for coordinating open source initiatives between Research and Industry, aimed at promoting the emergence of flexible cloud-based infrastructure-as-a-service offerings. The FP7 RESERVOIR project is making significant contributions in this direction by defining an open federated infrastructure cloud architecture, and delivering a framework of open source components for building infrastructure clouds.
Further information on RESERVOIR can be found at www.reservoir-fp7.eu
The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under grant agreement n° 215605.
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. Thus, we present a novel federation model that enables end-users to aggregate heterogeneous resources to tackle large-scale problems. The feasibility of this federation model has been proven, in the context of the UberCloud HPC Experiment, by gathering the most comprehensive information to date on the effects of pillars on microfluid channel flow.
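The aggregation idea behind such a federation model can be sketched as a broker that draws resources from several administrative domains until a job's demand is met. The site names, capacities and costs below are invented; real brokers also weigh data locality and policy:

```python
# Sketch of federated resource aggregation: satisfy a core count by drawing
# from multiple sites, cheapest first. All figures are illustrative.

def aggregate(sites, cores_needed):
    """Greedily allocate cores from the cheapest sites first.

    sites: list of (name, free_cores, cost_per_core_hour)
    Returns a list of (name, cores_taken), or None if demand can't be met.
    """
    plan, remaining = [], cores_needed
    for name, free, _cost in sorted(sites, key=lambda s: s[2]):
        take = min(free, remaining)
        if take:
            plan.append((name, take))
            remaining -= take
        if remaining == 0:
            return plan
    return None  # the federation cannot satisfy the request

sites = [("campus-hpc", 64, 0.0), ("cloud-a", 256, 0.10), ("cloud-b", 128, 0.08)]
```

In this toy model the free in-house cluster is drained first and the cheaper of the two clouds absorbs the overflow, which mirrors the hybrid "burst" pattern the federation enables.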
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to absorb peak computational demand that exceeds their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab’s Sergey Panitkin discussed the role of the Google Compute Engine in providing computational support to ATLAS, a detector of high-energy particles at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
May 23, 2013
The study of climate change is one of those scientific problems where it is almost essential to model the entire Earth to attain accurate results and make worthwhile predictions. In an attempt to make climate science more accessible to smaller research facilities, NASA introduced what they call ‘Climate in a Box,’ a system they note acts as a desktop supercomputer.
May 16, 2013
When it comes to cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of doing CFD modeling in the cloud by running a common CFD code on Amazon EC2 HPC instance types with both CPU and GPU cores.
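A back-of-the-envelope model illustrates why long distances hurt: if each solver iteration pays a fixed communication latency, parallel speedup collapses as that latency grows. The figures below are illustrative, not measurements from the Bonn study:

```python
# Toy latency model: each iteration splits compute across nodes but pays a
# fixed communication latency. All timings are illustrative.

def speedup(n_nodes, t_compute, t_latency):
    """Speedup over a single node for one solver iteration."""
    t_serial = t_compute
    t_parallel = t_compute / n_nodes + t_latency
    return t_serial / t_parallel

# 1 ms of compute per iteration, LAN-like vs WAN-like latency on 8 nodes:
lan = speedup(8, t_compute=1e-3, t_latency=1e-4)   # latency mildly erodes speedup
wan = speedup(8, t_compute=1e-3, t_latency=1e-2)   # latency dominates: slower than 1 node
```

With 0.1 ms latency the 8-node run still gains roughly 4.4x, but at 10 ms of latency per iteration the "parallel" run is an order of magnitude slower than a single node, which is why tightly coupled CFD suffers on distant cloud resources.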
May 10, 2013
Australian visual effects company, Animal Logic, is considering a move to the public cloud.
May 10, 2013
Program provides cash awards of up to $10,000 for the best open-source end-user applications deployed on a 100G network.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore the potential of heterogeneous computing, but the potential for this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.