BERKELEY, Calif., October 12, 2010 -- With more than 200 researchers in attendance at the seventh annual eScience Workshop, Microsoft Research is showcasing two technologies that facilitate data-driven research: the Microsoft Biology Foundation (MBF) and a MODISAzure-based environmental service.
Programmers and developers in bioscience now have access to the first version of MBF, part of the Microsoft Biology Initiative. With this platform, Microsoft Research is bringing new technology and tools to the area of bioinformatics and biology, empowering scientists with the resources needed to advance their research. This programming-language-neutral bioinformatics toolkit, built as an extension to the Microsoft .NET Framework, serves as a library of commonly used bioinformatics functions. MBF implements a range of parsers for common bioinformatics file formats; a range of algorithms for manipulating DNA, RNA and protein sequences; and a set of connectors to biological Web services such as the National Center for Biotechnology Information's BLAST.
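To make the building blocks concrete: a toolkit of this kind pairs file-format parsers with sequence-manipulation routines. The sketch below is a generic illustration in Python, not MBF's actual .NET API -- a minimal FASTA parser plus a reverse-complement function, the kind of "pre-existing functions" such a library supplies so researchers need not rewrite them.

```python
# Generic illustration of two bioinformatics-toolkit building blocks
# (hypothetical names; not the MBF API): a FASTA parser and a simple
# DNA sequence-manipulation algorithm.

def parse_fasta(text):
    """Parse FASTA-formatted text into a dict of {header: sequence}."""
    records = {}
    header = None
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        if line.startswith(">"):
            header = line[1:]
            records[header] = ""
        elif header is not None:
            records[header] += line
    return records

# Translation table mapping each base to its Watson-Crick complement.
_COMPLEMENT = str.maketrans("ACGTacgt", "TGCAtgca")

def reverse_complement(dna):
    """Return the reverse complement of a DNA sequence."""
    return dna.translate(_COMPLEMENT)[::-1]

sample = """>seq1 demo record
ACGT
TTAA
"""
records = parse_fasta(sample)
print(records["seq1 demo record"])  # ACGTTTAA
print(reverse_complement("AACG"))   # CGTT
```

A real toolkit adds many more formats (GenBank, FASTQ, GFF) and streams large files rather than holding them in memory, but the parser-plus-algorithm shape is the same.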
"Biologists face a number of issues today, such as detecting correlations across human genome sequences or identifying the likelihood that a patient will develop a certain disease," said Tony Hey, corporate vice president of Microsoft External Research. "The MBF aims to provide healthcare research facilities with the tools needed to help scientists advance their research and ensure data accuracy."
Several universities and companies are already using MBF as a foundation for a wide range of experimental tools that could equip scientists and clinicians with the technologies needed to make critical advances in healthcare.
The Informatics Group at Johnson & Johnson Pharmaceutical Research and Development leveraged MBF to extend its Advanced Biological & Chemical Discovery informatics platform to seamlessly integrate small and large molecule discovery data.
"The bioinformatics features and functionality within the MBF equipped us with pre-existing functions so we didn't have to re-invent the wheel," said Jeremy Kolpak, senior analyst at Johnson & Johnson Pharmaceutical Research and Development. "Ultimately, it saved us a tremendous amount of time, allowing us to focus on the development of higher-level analysis and visualization capabilities, and delivering them faster to our scientists, thus improving their ability to make data-driven discoveries and critical diagnoses."
Another service available for researchers leverages MODISAzure and was created by Dennis Baldocchi, biometeorologist at UC Berkeley; Youngryel Ryu, biometeorologist at Harvard University; and Catharine van Ingen, Microsoft eScience researcher. This MODISAzure-based environmental service combines state-of-the-art biophysical modeling with a rich cloud-based dataset of satellite imagery and ground-based sensor data to support global-scale carbon-climate synthesis analysis.
Using this research, scientists from different disciplines can share data and algorithms to better understand and visualize how ecosystems behave as climate change occurs. This service is built on MODISAzure, an image-processing pipeline on the Microsoft Windows Azure cloud computing platform.
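The essential shape of such an image-processing pipeline is a map-reduce over satellite-image tiles: a per-tile computation fans out across the cloud, and the results are reduced to a regional summary. The sketch below is a deliberately tiny, hypothetical illustration of that shape in Python -- the tile data is fabricated, and a real pipeline like MODISAzure pulls MODIS imagery from cloud storage and runs the per-tile step on many machines in parallel.

```python
# Hypothetical sketch of an image-processing pipeline's map-reduce
# shape: apply a per-tile computation to each tile, then combine the
# per-tile results into a regional summary. Tiles here are tiny
# in-memory grids of pixel values.

def tile_mean(tile):
    """Per-tile (map) step: average the pixel values in one tile."""
    pixels = [value for row in tile for value in row]
    return sum(pixels) / len(pixels)

def regional_summary(tiles):
    """Reduce step: average the per-tile results into one number."""
    means = [tile_mean(t) for t in tiles]
    return sum(means) / len(means)

tiles = [
    [[1.0, 2.0], [3.0, 4.0]],  # tile A, mean 2.5
    [[5.0, 5.0], [5.0, 5.0]],  # tile B, mean 5.0
]
print(regional_summary(tiles))  # 3.75
```

Because each tile is processed independently, the map step parallelizes naturally across cloud worker nodes, which is what makes global-scale analyses tractable.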
"To study Earth science we need to have systems that are everywhere, all of the time, and today with our MODISAzure-based environmental service, we have taken a giant step toward that goal," Baldocchi said.
Available under an open source license, the MBF is freely downloadable at http://research.microsoft.com/bio.
The eScience Workshop is organized by Microsoft Research. This year, the event is presented in partnership with the Berkeley Water Center, the Colleges of Engineering and Natural Resources at UC Berkeley, and the Lawrence Berkeley National Laboratory. More information about the event, the Microsoft Biology Foundation, the MODISAzure environmental service or Phil Bourne -- this year's recipient of the third annual Jim Gray eScience Award -- can be found at http://research.microsoft.com/en-US/events/escience2010/default.aspx.
About Microsoft Research
Founded in 1991, Microsoft Research is dedicated to conducting both basic and applied research in computer science and software engineering. Researchers focus on more than 55 areas of computing and collaborate with leading academic, government and industry researchers to advance the state of the art. Microsoft Research has expanded over the years to eight locations worldwide and a number of collaborative projects that bring together the best minds in computer science to advance a research agenda based on their unique talents and interests. Microsoft Research has locations in Redmond, Wash.; Cambridge, Mass.; Silicon Valley, Calif.; Cambridge, England; Beijing, China; and Bangalore, India, and also conducts research at the Cairo Microsoft Innovation Center in Egypt; European Microsoft Innovation Centre in Aachen, Germany; and the eXtreme Computing Group in Redmond. Microsoft Research collaborates openly with colleges and universities worldwide to enhance the teaching and learning experience, inspire technological innovation, and broadly advance the field of computer science. More information can be found at http://www.research.microsoft.com.
Founded in 1975, Microsoft (Nasdaq "MSFT") is the worldwide leader in software, services and solutions that help people and businesses realize their full potential.
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. We therefore present a novel federation model that enables end users to aggregate heterogeneous resources and apply them to large-scale problems. The feasibility of this federation model has been demonstrated, in the context of the UberCloud HPC Experiment, by gathering the most comprehensive information to date on the effects of pillars on microfluidic channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to absorb computational peaks that cannot be contained within their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab's Sergey Panitkin discussed the role of the Google Compute Engine in providing computational support to ATLAS, a detector of high-energy particles at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
May 23, 2013 |
The study of climate change is one of those scientific problems where it is almost essential to model the entire Earth to attain accurate results and make worthwhile predictions. In an attempt to make climate science more accessible to smaller research facilities, NASA introduced what it calls 'Climate in a Box,' a system it describes as a desktop supercomputer.
May 16, 2013 |
When it comes to the cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of running CFD modeling in the cloud, benchmarking a common CFD code on Amazon EC2 HPC instance types with both CPU and GPU cores.
May 10, 2013 |
Australian visual effects company, Animal Logic, is considering a move to the public cloud.
May 10, 2013 |
Program provides cash awards of up to $10,000 for the best open-source end-user applications deployed on a 100G network.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.