December 07, 2006
Sun Microsystems has announced that Cedars-Sinai Medical Center is now using a Sun Grid Rack system to process and analyze vast amounts of complex data in the pursuit of medical discoveries that could lead to new treatments for life-threatening and chronic diseases. The system, pre-integrated by Sun Customer Ready Systems, comprises 400 Sun Fire x64 servers, Sun StorageTek storage solutions, and Sun N1 software. One of the largest academic medical centers in the United States and a leader in clinical research, Cedars-Sinai expects the Sun high-performance computing grid to more than quadruple its previous data processing capacity while also decreasing cost and power consumption.
At the new Spielberg Family Center for Applied Proteomics at Cedars-Sinai Medical Center in Los Angeles, researchers are performing highly complex analyses of the proteins in patient blood samples in order to discover and develop treatments -- for cancer, heart disease, epilepsy, high cholesterol and other diseases -- based on an individual's biochemical makeup and medical history. To undertake this task, Cedars-Sinai sought a supercomputer with the computational power and data storage to process multiple terabytes of raw data daily and reveal patterns that could be correlated with clinical outcomes.
"Sun looked at the tasks and the computational needs we had and was able to provide an optimal solution. They were able to meet our needs at every level," said Jonathan Katz, senior scientist and director of operations, Spielberg Family Center for Applied Proteomics, Cedars-Sinai Medical Center.
The 400 Sun servers form a supercomputer with a compact footprint that performs huge volumes of statistical and data analysis. The system is expected to process four terabytes of data daily by 2007 -- four times the grid's previous capacity -- and eight terabytes daily by 2008. This processing power enables researchers to analyze complex data sets in days rather than weeks or months and to cross-compare data to uncover new disease connections.
"Leading research organizations come to Sun for our proven expertise in high-performance computing. Sun's technologies and services enable organizations to accelerate real-world problem solving, while also decreasing cost and power consumption," said Marc Hamilton, senior director of High Performance Computing, Sun Microsystems. "Sun's commitment goes beyond the technology though -- what drives our innovation in HPC is the belief that our supercomputers can help organizations like Cedars-Sinai solve some of the most pressing healthcare challenges that humans face."
Cedars-Sinai estimates that it saved $60,000 and two months' time by having the Sun Customer Ready Program integrate and deploy the pre-assembled grid. Moreover, the Sun servers provide further cost savings by scaling down to one-third their normal power when not active.
"We have a remarkable relationship with Sun. The passion of the employees goes far beyond selling equipment. They offer to come in on weekends to help us. The enthusiasm and dedication is something I haven't experienced with any company -- ever," said David Agus, director, Spielberg Family Center for Applied Proteomics, Cedars-Sinai Medical Center.