December 05, 2011
The ability to run HPC workloads inside a public or private cloud offers valuable opportunities to both industry and academia. If you focus on the latter, and on universities in particular, you will see that we professors do not spend our entire day on research exclusively, as teaching is also among our duties. This article explains how students can also take advantage of cloud computing, complementing their education with Master Thesis projects built on this technology. What follows is a showcase of the splendid work my students have done recently.
Every academic year, computer science students file into their professors' offices to inquire about proposed Master Thesis projects. When students come to me, I respond with another question: "What would you like to work on that may benefit from cloud computing?"
If motivating Master Thesis students by having their work be an extension of their natural interests is a must, involving them in a bleeding-edge technology like cloud computing is the icing on the cake.
As I allude to in the title of this article, one of the features of this technology is its high accessibility. It opens up a world of research possibilities and enables a fast learning process, allowing students to develop, in a reasonable time, projects like the ones outlined below:
RSA@Cloud: Efficient Cryptanalysis on the Cloud (2010-2011)
When I asked these three students about a research field that would benefit from cloud-based cycles, they answered "cryptanalysis," reasoning that public cloud providers would offer the best infrastructure, in terms of both performance and cost, for efficient security auditing of RSA keys (used in many fields, such as e-commerce).
RSA@Cloud: Alberto Megia, Antonio Molinera and Jose Antonio Rueda at one of the UCM classrooms.
They developed a system that takes advantage of parallel programming and cloud computing to factorize very large integers, a hard problem that is the real basis of the RSA cryptosystem's security, by executing different mathematical algorithms such as trial division and the quadratic sieve. A combination of Amazon EC2 and private cloud machines was chosen to achieve this goal, since together they make it possible to reach the efficiency mentioned above.
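To give a flavor of the simplest of these algorithms, below is a minimal trial-division sketch in Python. It is illustrative only: the actual Engine distributes this kind of work across many cloud machines and relies on far more sophisticated methods for keys of realistic size.

```python
def trial_division(n):
    """Factor n by testing successive divisors; only practical for small n."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)  # whatever remains is prime
    return factors

# Toy "RSA modulus" built from two known primes
print(trial_division(104729 * 1299709))  # -> [104729, 1299709]
```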
The RSA@Cloud system relies on three main modules:
The Forecaster, a simulation tool that estimates the required processing time and cost on Amazon EC2, allowing the user to find a compromise between these two parameters (a minimal sketch of this estimation follows the figure below).
The Engine, the RSA key factorization and parallelization module. New factorization algorithms can be easily added, as they only need a wrapper script for their integration into the system.
The Keyswarm, a graphic representation of the interactions between cloud machines during the process. It builds on Code Swarm (http://code.google.com/p/codeswarm/) by translating RSA@Cloud events into the CVS/SVN commit events that tool expects.
RSA@Cloud: Forecaster module.
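A minimal sketch of the kind of estimation the Forecaster performs is shown below. All parameters (work units, per-instance throughput, hourly price) are hypothetical placeholders of mine; the real module derives its figures from simulation.

```python
import math

def forecast(total_work, instances, work_per_hour, price_per_hour):
    """Estimate wall-clock hours and total cost of a factorization on EC2."""
    hours = math.ceil(total_work / (instances * work_per_hour))
    cost = hours * instances * price_per_hour
    return hours, cost

# Sweep the instance count to expose the time/cost compromise
for n in (1, 4, 16, 64):
    hours, cost = forecast(total_work=10000, instances=n,
                           work_per_hour=50, price_per_hour=0.34)
    print("%3d instances: %4d h, $%.2f" % (n, hours, cost))
```

Renting more instances shortens the wall-clock time while the cost stays roughly constant (hourly billing granularity aside), which is exactly the compromise the Forecaster helps the user explore.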
An interesting outcome of this Master Thesis project was the preparation and presentation of a research paper at a Spanish conference by my students. The experience put them in touch not only with the national scientific landscape but also with the beautiful natural landscape of Tenerife (Canary Islands), where the conference took place. All in all, this trip was a fitting reward for a job well done.
RSA@Cloud: Keyswarm module showing factorization of a 49 digit RSA key using 12 machines. Each parallel task is represented by a dot.
CONSTRUCTOR: Cloud PaaS for Startups (2011-2012, ongoing)
These students plan to orient their professional careers toward industry once they leave the university. With this in mind, they researched how a software startup could begin its activity and continue to prosper.
CONSTRUCTOR: Isabel Espinar, Adrian Escoms and Esther Rodrigo with a "tuned" set of servers from the UCM Computer Science Museum.
They understood that a startup with a low budget could rely solely on the on-demand resources provided by a public cloud infrastructure. Each development phase of a given product has specific, sometimes restrictive, software requirements, so developers need a tool to easily deploy and (re)configure the necessary machines. Developers also need a standard entry point to the development platform, allowing them to switch contexts as quickly as possible. Finally, the startup needs to calculate the exact development cost in order to assign a competitive price to the final product.
With these factors taken into account, my students' framework, named CONSTRUCTOR as a reference to "The Matrix" saga, will provide a graphical user interface with the following features:
Virtual machine management using the Amazon EC2 API (see the sketch below).
Software deployment using Chef (http://www.opscode.com/chef/).
Instant generation of an Eclipse (http://www.eclipse.org/) configuration file corresponding to the virtual machine and software the developer wants to work with.
Differentiated accounting (virtual machine usage) for each development phase and software product.
CONSTRUCTOR's key features.
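As an illustration, launching a development machine through the EC2 API (the first feature) and tagging it for differentiated accounting (the last one) can be done with a few boto calls, as in the hedged sketch below. The AMI ID, key pair name, and tag values are placeholders of mine; CONSTRUCTOR wraps this kind of logic behind its graphical interface.

```python
import boto.ec2

# Connect to a region and start a development VM
# (credentials are read from the standard boto configuration)
conn = boto.ec2.connect_to_region("us-east-1")
reservation = conn.run_instances(
    "ami-12345678",             # hypothetical AMI with the base toolchain
    instance_type="m1.small",
    key_name="constructor-dev"  # hypothetical key pair
)
instance = reservation.instances[0]

# Tag the instance so usage can later be accounted per product and phase
instance.add_tag("product", "my-startup-product")
instance.add_tag("phase", "development")
```

Tagging each machine with its product and development phase is one simple way to support the differentiated accounting listed above.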
POPULOUS: Biodiversity on the Cloud (2011-2012, ongoing)
The team behind this project has a single member, as he was my student in the UCM Master in Bioinformatics and Computational Biology (http://bbm1.ucm.es/masterbioinfo/) rather than in the Faculty's Computer Science programs. He is currently preparing his Master Thesis at a department of the Spanish National Museum of Natural Sciences (http://www.mncn.csic.es/).
POPULOUS: Gonzalo Santana and one of the National Natural Sciences Museum exhibits, a Snow Leopard.
His project, named POPULOUS as a reference to the 1989 video game, aims to understand the effects of climatic elements on the evolution, migration and extinction of species and on biodiversity. This is accomplished with complex mathematical models that derive physiological responses of the studied species from climatic data. Survival odds for each species in a given area are then obtained.
The resulting system will verify whether biodiversity in mountains is higher than in other zones, and whether climate change forces species to migrate to zones where their survival odds are higher.
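The actual models are far more complex, but a toy sketch conveys the idea of mapping a climatic variable to a physiological response and deriving survival odds per area. The Gaussian-shaped response and every parameter below are illustrative assumptions of mine, not the project's real model.

```python
import math

def survival_odds(temperature, optimum=12.0, tolerance=4.0):
    """Toy physiological response: survival probability peaks at the
    species' optimal temperature and decays with the deviation from it."""
    deviation = (temperature - optimum) / tolerance
    return math.exp(-deviation ** 2)  # value in (0, 1]

# Hypothetical mean temperatures (degrees Celsius) for three areas
areas = {"lowland": 18.0, "foothills": 14.0, "mountain": 11.0}
for name, temperature in areas.items():
    print("%-10s %.2f" % (name, survival_odds(temperature)))
```

In this toy setting the mountain area yields the highest survival odds, which is the kind of hypothesis the real system will test against actual climatic data.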
The computational resources initially available at my student's department consisted of an NVIDIA GPU cluster that is not dedicated to this project. The final application will be very compute-intensive and should not discard any available resource if it is to maximize overall performance. For this reason, my student turned his attention to public cloud infrastructures that would complement the local resources.
At a lower level, the parallel application will be executed on both CPU and GPU resources provided by local and cloud infrastructures. For this reason, my student has chosen the OpenCL API (http://www.khronos.org/opencl/).
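Keeping with Python for consistency, the PyOpenCL sketch below shows the portability that motivates this choice: the very same kernel runs unchanged on every CPU and GPU device the machine exposes. The kernel itself is a trivial placeholder, not POPULOUS code.

```python
import numpy as np
import pyopencl as cl

KERNEL = """
__kernel void square(__global const float *src, __global float *dst) {
    int gid = get_global_id(0);
    dst[gid] = src[gid] * src[gid];
}
"""

data = np.arange(16, dtype=np.float32)

# Run the same kernel on every available device, CPU or GPU alike
for platform in cl.get_platforms():
    for device in platform.get_devices():
        ctx = cl.Context([device])
        queue = cl.CommandQueue(ctx)
        program = cl.Program(ctx, KERNEL).build()
        mf = cl.mem_flags
        src = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=data)
        dst = cl.Buffer(ctx, mf.WRITE_ONLY, data.nbytes)
        program.square(queue, data.shape, None, src, dst)
        result = np.empty_like(data)
        cl.enqueue_copy(queue, result, dst)
        print(device.name, result[:4])
```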
The next generation of "cloud shapers"
Cloud computing is a reality we are using on a daily basis, but now it's time to start considering the next generation that will fill our ranks.
Master Thesis projects are a great opportunity, supported by the many applications of cloud computing and boosted by the students' imagination. If the cloud's high accessibility and open-source tools are added to the equation, we get the best methodology for helping students implement their awesome ideas.
As I say to my students year after year: "It's not only education but ideas that will help you survive outside this classroom."
About the Author
Dr. Jose Luis Vazquez-Poletti is Assistant Professor in Computer Architecture at Complutense University of Madrid (UCM, Spain), and a Cloud Computing Researcher at the Distributed Systems Architecture Research Group. He is (and has been) directly involved in EU-funded projects, such as EGEE (Grid Computing) and 4CaaSt (PaaS Cloud), as well as many Spanish national initiatives.
From 2005 to 2009, Professor Vazquez-Poletti's research focused on porting applications to Grid Computing infrastructures, an activity that let him be "where the real action was." These applications pertained to a wide range of areas, from Fusion Physics to Bioinformatics. During this period he acquired the skills needed to profile applications and make them benefit from distributed computing infrastructures. Additionally, he shared these skills at many training events organized within the EGEE Project and similar initiatives.
Since 2010, his research interests have centered on different aspects of cloud computing, always with real-life applications in mind, especially those pertaining to the High Performance Computing domain.