February 28, 2012
As part of its Powering the Possible charitable program, Dell is providing the secure cloud-based IT infrastructure to support the first FDA-approved personalized medicine clinical trial for pediatric cancer. The research is being conducted by the Neuroblastoma and Medulloblastoma Translational Research Consortium (NMTRC) at the Van Andel Research Institute (VARI) and supported by the Translational Genomics Research Institute (TGen). HPC in the Cloud spoke with Jamie Coffin of Dell to learn more about this life-saving work.
Advancements in DNA sequencing are set to transform the use of genetic information in mainstream medicine. The Human Genome Project, launched in 1990, took 13 years and cost almost $3 billion to complete. Now, with the latest work of companies like Life Technologies and Illumina, it is possible to sequence a genome in under two weeks at a cost of roughly $10,000 to $50,000, depending on the technology.
The industry has reached an inflection point: the combination of new high-performance computing technology and new genome sequencing tools makes possible something never before seen in clinical medicine – using genomic data to make care decisions for the individual patient. No doubt about it – this is a big deal.
To accomplish this important work, Dell has partnered with the Translational Genomics Research Institute (TGen), a Phoenix, Arizona-based non-profit focused on moving medical discoveries from the research lab into the clinic. The trial is being conducted by the Neuroblastoma and Medulloblastoma Translational Research Consortium, a group of 11 universities and children's hospitals offering a nationwide network of childhood cancer clinical trials. In November 2011, the team launched a first-of-its-kind genomic-based clinical trial to treat and study pediatric cancer – specifically relapsed and refractory neuroblastoma, a particularly deadly form of childhood cancer. According to Dell representative Jamie Coffin, the disease kills about 97% of its victims within one to two years. With such a fast-moving cancer, there is no time for the conventional "trial and error" method of drug treatment, but personalized medicine is offering new hope to these young patients.
The trial is testing the hypothesis that molecular aberrations in the tumors of individual patients can be identified in real time through genomic analysis to predict responsiveness to targeted therapies. Researchers use genomic data to build a unique medical profile for each patient, which allows them to predict which of the 150-200 available chemotherapy drugs will be most effective. Each patient is then matched with a personalized drug cocktail. With TGen's translational technology and the Dell cloud platform, work that used to take a year can now be accomplished in two weeks. This literally means the difference between life and death for some patients. According to Coffin, early results show success rates of 24-30%.
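To make the matching idea concrete, here is a minimal sketch of how a patient's molecular profile might be scored against a catalog of drugs. This is purely illustrative – the gene names, drug names, and target sets below are hypothetical stand-ins, not the NMTRC trial's actual decision logic, which rests on far more sophisticated genomic analysis.

```python
def rank_drugs(patient_aberrations, drug_targets):
    """Score each drug by how many of the patient's molecular
    aberrations it targets; return highest-scoring drugs first."""
    scores = []
    for drug, targets in drug_targets.items():
        hits = patient_aberrations & targets  # set intersection
        if hits:
            scores.append((drug, len(hits), sorted(hits)))
    return sorted(scores, key=lambda s: s[1], reverse=True)

# Hypothetical aberrant genes identified by sequencing one tumor
patient = {"ALK", "MYCN", "PHOX2B"}

# Hypothetical catalog mapping drugs to the genes they target
catalog = {
    "drug_a": {"ALK", "MET"},
    "drug_b": {"TOP1"},
    "drug_c": {"MYCN", "ALK"},
}

for drug, n_hits, genes in rank_drugs(patient, catalog):
    print(drug, n_hits, genes)
```

In this toy example, the drug whose target set overlaps most with the patient's profile ranks first, mirroring in miniature how a profile can narrow 150-200 candidates down to a personalized cocktail.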
Dell's Powering the Possible program is involved in this work on many levels, from building the high-end computing platform to supporting the research itself. Each test generates about 30 terabytes of data per patient; Daniel Von Hoff, TGen's Physician-in-Chief, explains that analyzing it is like shredding an entire law library and then putting it back together. Computational chemists from Dell are addressing ways to deal with all this data: how to move it off the instrument, process it as quickly as possible, and then transfer it to the cloud so it can be shared.
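The staged workflow described above – offload raw data from the instrument, process it locally, then push it to shared storage – can be sketched as follows. This is an assumption-laden illustration, not Dell's or TGen's actual pipeline: the file layout, function names, and simulated run directory are all invented for the example, and the "instrument" here is just a temporary folder.

```python
import gzip
import pathlib
import shutil
import tempfile

def offload(instrument_dir: pathlib.Path, staging_dir: pathlib.Path):
    """Step 1: move raw run files off the instrument so sequencing
    can continue while downstream processing begins."""
    staging_dir.mkdir(parents=True, exist_ok=True)
    moved = []
    for f in sorted(instrument_dir.glob("*.fastq")):
        dest = staging_dir / f.name
        shutil.move(str(f), str(dest))
        moved.append(dest)
    return moved

def compress_for_upload(files):
    """Step 2: compress before the cloud transfer to cut bandwidth."""
    archives = []
    for f in files:
        gz = f.with_suffix(f.suffix + ".gz")
        with open(f, "rb") as src, gzip.open(gz, "wb") as dst:
            shutil.copyfileobj(src, dst)
        archives.append(gz)
    return archives

# Simulated run using temporary directories (no real sequencer involved)
root = pathlib.Path(tempfile.mkdtemp())
(root / "run").mkdir()
(root / "run" / "sample1.fastq").write_text("@read1\nACGT\n+\nFFFF\n")

staged = offload(root / "run", root / "staging")
archives = compress_for_upload(staged)
print([a.name for a in archives])
```

At the scale the article describes – tens of terabytes per patient – the real challenge is throughput rather than logic, which is why moving, processing, and uploading are treated as distinct stages that can be overlapped.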
Cloud technology is being used to speed computation, as well as to manage and store the resulting data. The cloud also enables the high degree of collaboration that research at this level demands. The scientists hold video-conferences where they work off of "tumor boards" to make clinical decisions for patients in real time. Previously, they would have had to ship hard drives to one another to achieve that degree of collaboration; now the data is always accessible through the cloud platform.
"We expect to change the way that clinical medicine is delivered to pediatric cancer patients, and none of this could be done without the cloud," Coffin says emphatically. "With 12 cancer centers collaborating, you have to have the cloud to exchange the data."
Dell relied on donations to build the initial 8.2 teraflop high-performance machine. A second round of donations has since expanded the resources devoted to this important work to an estimated 13 teraflops of sustained performance.
"Expanding the size of the footprint means we can treat more and more patients in the clinical trial, so this is an exciting time for us," Coffin adds. "This is the first pediatric clinical trial ever done using genomic data. And Dell is at the leading edge, driving this work from an HPC standpoint and from a science standpoint."
The donated platform comprises Dell PowerEdge blade servers, PowerVault storage arrays, Dell Compellent Storage Center arrays, and Dell Force10 network infrastructure. It features 148 CPUs, 1,192 cores, 7.1 TB of RAM, and 265 TB of disk storage. Dell Precision workstations are available for data analysis and review. TGen's computation and collaboration capacity has increased by 1,200 percent compared with the site's previous clinical cluster. In addition, the new system has reduced tumor mapping and analysis time from a matter of months to days.
Personalized medicine is the future of healthcare. With continued support for its charitable arm, Dell hopes to make this life-saving technology accessible to hundreds of children over the next three years. The even bigger goal here, according to the company, is to establish an information framework that could one day help thousands of pediatric cancer patients.