November 15, 2004
IBM and Massachusetts General Hospital (MGH), a teaching hospital of Harvard Medical School, announced that they are working together to study how a Grid-based, distributed computing infrastructure can improve collaboration and information sharing among cancer researchers.
Working with leading cancer researcher Thomas Deisboeck, of MGH's Martinos Center for Biomedical Imaging -- who is also affiliated with the Division of Health Sciences and Technology (HST) of Harvard University and Massachusetts Institute of Technology (MIT) -- IBM computer scientists at the company's Cambridge, Massachusetts, research lab have built a Grid of high-performance computers designed to improve information sharing and help researchers gain new insight through advanced brain tumor modeling and simulation.
The Grid includes IBM eServer pSeries supercomputers on Harvard's Crimson Grid and at MIT, linked with multiple IBM eServer BladeCenter servers at the company's Cambridge facility.
"Effective tools for information management, integrated tightly with underlying computing and data infrastructures, are key to life sciences researchers gaining new insights into complex problems," said David Grossman, distinguished engineer for the IBM Internet Technology Group. "In addition, the use of semantic web technologies to integrate many sources and formats of data with advanced modeling algorithms is particularly helpful for this type of large-scale collaborative project."
In October, Deisboeck was one of nine research leaders receiving a total of $14.9 million in National Cancer Institute (NCI) funding to establish an Integrative Cancer Biology Program (ICBP). Centers participating in that multi-institutional program will incorporate a spectrum of new approaches and technologies -- including genomics, proteomics, and molecular imaging -- to design mathematical models and generate computer simulations that could improve the understanding of tumor growth.
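One of the simplest examples of the kind of mathematical model such centers build is Gompertz growth, a classic description of how tumor volume saturates as it approaches a carrying capacity. The sketch below is a generic textbook illustration, not one of the ICBP centers' models, and all parameter values are invented.

# Euler integration of the Gompertz tumor-growth model
# dV/dt = a * V * ln(K / V); parameters are invented for illustration.
import math

def gompertz_step(v, a=0.1, k=1e9, dt=1.0):
    """Advance tumor size v (cells) by one time step of length dt (days)."""
    return v + dt * a * v * math.log(k / v)

v = 1.0  # start from a single cell
for day in range(301):
    if day % 50 == 0:
        print(f"day {day:3d}: {v:14.1f} cells")
    v = gompertz_step(v)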
"There is an urgent need to develop a common, unifying infrastructure that enables the integration and sharing of knowledge about cancer -- both in terms of disparate data and distinct computational tools -- with the goal of modeling cancer as a complex dynamic system," said Deisboeck. "While advances in cancer research and new technologies have generated a wealth of new data and insight, all too often the lack of shared systems and standards makes integration of this crucial knowledge difficult or impossible. "
By establishing the ICBP, the NCI has acknowledged the need to generate complex synthetic models of cancer. At the same time, the NCI has identified the lack of common technical standards and tools for information sharing among cancer researchers and institutions as a significant inhibitor to more rapid progress in the fight against cancer.
"The NCI's important mission can only be achieved through these types of public-private partnerships, which leverage the strengths of different institutions in a variety of disciplines," said Dan Gallahan, associate director of the Division of Cancer Biology at the National Cancer Institute. "There is nowhere that this is more true than in our battle against cancer.
Over the next three to five years, Deisboeck will work with IBM and an international team of collaborating scientists to develop a multiscaled "virtual tumor," which will model a tumor from its earliest stage as a single cell up to a neoplasm with millions of interacting cells. The goal is to better understand and ultimately to predict the growth patterns of patient-specific tumors accurately enough to allow successful targeting.
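A toy version of the agent-based approach behind such a "virtual tumor" might look like the sketch below. It is a deliberate simplification on a 2-D lattice; real multiscale models couple genetic, cellular, and tissue-level dynamics, and none of these rules or parameters come from Deisboeck's work.

# Toy agent-based tumor growth: cells on a 2-D lattice divide into
# free neighboring sites. Rules and parameters are invented.
import random

SIZE, DIV_PROB, STEPS = 51, 0.2, 60
grid = [[0] * SIZE for _ in range(SIZE)]
grid[SIZE // 2][SIZE // 2] = 1  # start from a single cell

def free_neighbors(x, y):
    """Return empty lattice sites orthogonally adjacent to (x, y)."""
    return [(x + dx, y + dy)
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
            if 0 <= x + dx < SIZE and 0 <= y + dy < SIZE
            and grid[y + dy][x + dx] == 0]

for step in range(STEPS):
    cells = [(x, y) for y in range(SIZE) for x in range(SIZE) if grid[y][x]]
    for x, y in cells:
        nbrs = free_neighbors(x, y)
        if nbrs and random.random() < DIV_PROB:
            nx, ny = random.choice(nbrs)
            grid[ny][nx] = 1  # daughter cell occupies a free site

print("tumor size after", STEPS, "steps:", sum(map(sum, grid)), "cells")

Even this crude rule set reproduces a hallmark of real tumors that simple growth curves miss: once the interior fills, only cells at the rim can divide, so growth slows from exponential to roughly linear in radius.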
In addition to the Grid, IBM has developed a Linux-based, high-resolution video wall -- with a combined resolution of 9.2 million pixels across its monitors -- to provide MGH with the visualization capabilities required for advanced tumor modeling.