January 15, 2007
NCSA staff and UIUC biophysics researchers collaborate on new techniques for studying molecular dynamics, managing large numbers of simulations with a powerful Grid application.
"It started with the challenge in technology."
That's how Klaus Schulten, Swanlund Professor of Physics and director of the Theoretical and Computational Biophysics Group at the University of Illinois at Urbana-Champaign, describes the impetus for what became the collaborative scientific efforts conducted by his research group and NCSA. From that collaboration came two things: successful science and NAMD-G.
One of the main challenges the world faces today is the need to change our economy by using fuel sources other than petroleum. Hydrogen gas produced by light-absorbing algae is one potential fuel source, but there's a catch: the protein that bubbles hydrogen gas out also permits oxygen to enter, thereby switching off hydrogen production.
Researchers from the National Renewable Energy Laboratory in Colorado believed that the protein could be redesigned so that it would still permit hydrogen to bubble out without letting oxygen in. They consulted Schulten and his group, which includes Jim Phillips, senior research programmer, and physics graduate students Jordi Cohen and Anton Arkhipov. They set to work trying to discover how gas atoms travel through proteins and how hydrogen and oxygen atoms can be differentiated so that hydrogen passes through and oxygen does not.
It was Cohen who hypothesized that the gas atoms do not alter the protein, but instead take advantage of existing suitable cavities inside the protein. From that premise, one could look for protein cavities large enough for hydrogen and even larger cavities for oxygen. But this line of inquiry only led to a larger question: How does any protein conduct gases?
That question, Cohen says, has not really been investigated, because in the field of molecular dynamics the study of one protein at a time is itself a challenge. A satisfactory answer would require the study of a large number of proteins, taking into account their variability. The simulations themselves don't require much computer time, but managing between 20 and 50 different proteins would take an enormous number of person-hours.
Whereas many NCSA users' simulations involve one large calculation, Schulten's group knew that theirs would involve utilizing multiple computing resources to perform thousands of calculations simultaneously. "And that was the moment when we realized that our savior would be the grid," says Schulten. "We did not want to use just one computer to look at one protein at a time, maybe one tomorrow or one next year. We wanted to do many at the same time, or as many as we felt compelled to examine."
However, managing workflow for such an enormous research problem was itself an enormous task. Assistance came from NCSA's cyberenvironments division, including Rick Kufrin, who for more than a decade has supported Schulten's group's use of NAMD, a parallel molecular dynamics code designed for high-performance biomolecular modeling, and NCSA's Grid Application Support Group, led by Doru Marcusiu. "One way in which we distinguish ourselves is in trying to map the use of our resources to the newer tools that are part of available Grid technologies," says Marcusiu.
Following exploratory discussions in early 2005, Michelle Gower, a research programmer in Marcusiu's group, was enlisted for her expertise in Grid technologies and began working with Kufrin and the biophysics researchers to make an initial assessment of how Schulten's group was using NAMD. The team realized that their workflow could be streamlined significantly. "They were spending quite a bit of time doing relatively trivial management tasks, such as moving data back and forth, or resubmitting failed jobs," says Kufrin. After gaining familiarity with the existing "human-managed" tasks that are required to carry out lengthy, computationally intensive simulations of this nature, the team identified Condor-G, an existing, proven Grid-enabled implementation of Condor, as a possible solution. "Our goal has always been to do whatever we can to improve the resources and increase productivity for the scientists, whether it involves an existing tool, a new technology, or a technology of the future," explains Kufrin.
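The "human-managed" bookkeeping described above -- moving files, resubmitting failed runs, and chaining one simulation segment into the next across many proteins -- follows a simple pattern. The sketch below illustrates that pattern in Python; all function and file names here are hypothetical stand-ins, not NAMD-G's actual API, and a local subprocess stands in for a remote grid job.

```python
import subprocess

MAX_RETRIES = 3

def run_segment(config, max_retries=MAX_RETRIES):
    """Run one simulation segment, resubmitting on failure."""
    for attempt in range(1, max_retries + 1):
        # A real grid submission would go through Condor-G/Globus;
        # here a local command stands in for the remote job.
        result = subprocess.run(["echo", f"simulating {config}"],
                                capture_output=True, text=True)
        if result.returncode == 0:
            return result.stdout.strip()
    raise RuntimeError(f"{config} failed after {max_retries} attempts")

def run_chain(protein, segments):
    """Chain segments: each run restarts from the previous one's output."""
    outputs = []
    for i in range(segments):
        outputs.append(run_segment(f"{protein}.seg{i}.conf"))
    return outputs

# Doing this by hand for 20-50 proteins at once is what made the
# manual workflow so costly in person-hours.
results = {p: run_chain(p, segments=2) for p in ["protein_a", "protein_b"]}
```

Automating exactly this retry-and-chain loop, per protein and per segment, is what freed the researchers from the "relatively trivial management tasks" Kufrin describes.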
Schulten says it was Gower who "made this idea of using the grid a workable proposition." She used her experience with Condor-G to develop a prototype system that integrated it with Globus, NAMD, and other technologies. Schulten's group provided feedback every step of the way. NAMD-G automatically handles authentication, file transfer, job submission, and job-chaining work. It also alerts users when something goes wrong, notifies them when the job is completed, and transfers all files onto users' computers for analysis. Furthermore, NAMD-G interacts with the queuing system and can distribute jobs to multiple sites around the world.
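The kind of Grid submission NAMD-G automates can be illustrated with a Condor-G submit description. The fragment below is a hypothetical sketch: the host name, job manager, and file names are placeholders, not NAMD-G's actual configuration.

```
# Hypothetical Condor-G submit description for one NAMD run.
# Host, job manager, and file names are illustrative placeholders.
universe      = grid
grid_resource = gt2 grid.example.edu/jobmanager-pbs
executable    = namd2
arguments     = protein1.conf
transfer_input_files    = protein1.conf, protein1.psf, protein1.pdb
should_transfer_files   = YES
when_to_transfer_output = ON_EXIT
output       = protein1.out
error        = protein1.err
log          = protein1.log
notification = Complete
queue
```

NAMD-G's value was in generating and managing descriptions like this behind the scenes, so users never had to write them or track the resulting jobs by hand.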
"One thing the prototype system does for them is to allow them to have their input files on their local machine and submit the job to the remote machine without having to understand the underlying Grid architecture," says Gower. Soon thereafter, Cohen was able to begin submitting jobs via the grid.
The collaboration was so successful that it became part of an NCSA demonstration prominently featured at SC05 in Seattle. Whereas technology demonstrations are often mere exhibitions that use a "mock" experiment to show what could potentially be done, this one demonstrated how NAMD-G actually helped achieve genuine scientific results.
"I can select, say, 50 proteins which I think will be interesting and send them out to NAMD-G with very little preparation," says Cohen. "For each of the proteins, NAMD-G will do a long sequence of operations and give me all the data I want. As far as I know, this process just isn't being used anywhere else in the field of molecular dynamics." Cohen is still in the process of analyzing data and finding out what it all means, but he expects to publish the scientific results later this year. While the results will be a small but significant part of the solution to the problem of hydrogen gas production, they could shed light on a broader question that up to now has received little consideration.
Teamwork is key
"What needs to be stressed," says Schulten, "is that we were very demanding of Michelle because we didn't just want to engage in a Grid experiment, we wanted to get some science done. And that of course made it much harder for her. The result is that we not only got our science done, but we also have NAMD-G, a program that we can offer to any other users who want to use a grid." Indeed, the NAMD program underlying NAMD-G already has thousands of registered users worldwide, a huge user group that can now tap into NAMD-G.
Marcusiu believes the importance of NAMD-G's success goes beyond the science that it's helping to accomplish. It also illustrates the potential value of community cyberenvironments in two important ways. "One is to provide scientists with the capability to do something they've never been able to do before, whether it's scaling up to a larger simulation or using multiple resources instead of just one," he says. "The other way is what this project is about -- to streamline what they're currently doing. That gives them the opportunity to have more time to spend on the science instead of on these mundane tasks."
Members of both groups emphatically agree that it was the way the collaborative process was conducted that made the project so successful. Building on their longstanding relationship with Kufrin at NCSA, Schulten's group was quickly able to familiarize Gower with the problem to be solved. Gower then identified an approach that could make a significant impact in a short amount of time, got a working tool into the hands of Schulten's group as rapidly as possible, and worked closely with them to make it robust and reliable.
It's a lesson, says NCSA's Marcusiu, that he has learned from prior collaborations with science and engineering communities, such as NEESgrid, which produced a cyberinfrastructure providing earthquake engineers around the world with the ability to share knowledge and experimental equipment and to integrate physical and computational simulations. "We continue to say that building relationships between scientists and technologists is extremely important, and it often takes time. You build trust in the people you're working with in terms of competency and reliability. You also learn each other's technology or science -- or even just their vocabulary -- and IT people start to better understand what problems they're really trying to solve."
This work was supported in part by the National Science Foundation. Funding for the Resource for Macromolecular Modeling and Bioinformatics is provided by the National Institutes of Health.
For further information: http://www.ks.uiuc.edu/Research/hydrogenase/
Source: Kathleen Ricker & Herb Morgan, NCSA