March 14, 2005
IBM announced the availability of its world-renowned Blue Gene supercomputing system, currently the world's most powerful supercomputer, at its newest Deep Computing Capacity on Demand Center in Rochester, Minn. The new Center will allow customers and partners, for the first time, to remotely access the Blue Gene system through a highly secure, dedicated Virtual Private Network and pay only for the amount of capacity reserved.
While some of the most ambitious supercomputing work still takes place in government labs, the potential for Deep Computing breakthroughs in new commercial markets, such as drug discovery, product design, simulation and animation, and financial and weather modeling, is growing rapidly. Deep Computing Capacity on Demand will also serve customers in market segments that have traditionally lacked affordable access to supercomputing.
The IBM eServer Blue Gene Solution gives customers the capability to advance science and business with unprecedented speed, ultrascale performance and extreme efficiency. The system delivers a peak performance of 5.7 teraflops from a single full-rack system and is optimized for compute density, low power consumption and scalability.
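The quoted 5.7-teraflop figure can be checked with back-of-the-envelope arithmetic from publicly documented Blue Gene/L rack parameters (1,024 compute nodes per rack, two PowerPC 440 cores per node, a 700 MHz clock, and a double-pipeline floating-point unit delivering four floating-point operations per cycle per core); the sketch below assumes those figures:

```python
# Back-of-the-envelope check of the 5.7-teraflop-per-rack claim,
# using publicly documented Blue Gene/L parameters.
nodes_per_rack = 1024    # compute nodes in one full rack
cores_per_node = 2       # dual-core PowerPC 440 chip per node
clock_hz = 700e6         # 700 MHz clock
flops_per_cycle = 4      # double FPU: two fused multiply-adds per cycle

peak_flops = nodes_per_rack * cores_per_node * clock_hz * flops_per_cycle
print(f"Peak per rack: {peak_flops / 1e12:.1f} teraflops")  # → 5.7 teraflops
```

Multiplying out gives roughly 5.73 × 10^12 floating-point operations per second, matching the announced per-rack peak.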
With a footprint of less than one square meter, the system delivers more than ten times the performance of other supercomputers now on the market while occupying less floor space and consuming significantly less power.
IBM, working with its business partners, is making Blue Gene applicable to workloads across a variety of disciplines. Many national laboratories and universities are enabling a growing list of HPC applications in life sciences, hydrodynamics, quantum chemistry, molecular dynamics, astronomy and space research, and climate modeling. Other areas of interest include financial modeling, business intelligence, risk and compliance, aerodynamics study and testing, and manufacturing processes.
Key software vendors such as Novell SUSE, LSTC and Allinea have expressed interest in enabling their applications on Blue Gene. Etnus is already in the process of porting its TotalView parallel debugger to Blue Gene.
"IBM has been reaching out and working with a number of our key business partners to port applications to build the Blue Gene ecosystem," said David Gelardi, vice president of IBM Deep Computing Capacity on Demand. "Customers have been able to access IBM's Deep Computing solutions via our centers for more than 18 months. Today's announcement provides customers another venue to 'test drive' Blue Gene, helping them decide whether to purchase their own racks or whether it is more financially beneficial to continue to buy time through IBM's Deep Computing Capacity on Demand centers."
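The buy-versus-rent decision described above comes down to a break-even calculation. The sketch below illustrates the shape of that analysis only; the purchase price, ownership costs, and per-CPU-hour rate are hypothetical assumptions, not figures from IBM's announcement:

```python
# Hypothetical rent-vs-buy break-even sketch. All dollar figures are
# illustrative assumptions, not IBM pricing.
purchase_cost = 1_500_000        # assumed cost of buying a rack ($)
annual_ownership_cost = 200_000  # assumed power, cooling, admin per year ($)
on_demand_rate = 1.00            # assumed price per CPU-hour ($)
cpus_reserved = 2048             # roughly one rack's worth of CPUs

def years_to_break_even(hours_per_year: float) -> float:
    """Years after which owning a rack beats renting the same capacity."""
    annual_rental = on_demand_rate * cpus_reserved * hours_per_year
    annual_saving = annual_rental - annual_ownership_cost
    if annual_saving <= 0:
        return float("inf")  # renting stays cheaper at this usage level
    return purchase_cost / annual_saving

# Under these assumptions, a lightly used rack (50 hours/year) never
# pays for itself, while a heavily used one (2,000 hours/year) does
# in well under a year.
print(years_to_break_even(50))
print(years_to_break_even(2000))
```

The crossover point depends entirely on sustained utilization, which is why on-demand access is pitched at occasional and budget-constrained users.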
IBM's other United States-based centers in Poughkeepsie, N.Y., and Houston, as well as the European center in Montpellier, France, are accessible to customers worldwide via a secure VPN connection over the Internet. Clients have on-demand access to more than 5,200 CPUs based on Intel, AMD Opteron and IBM POWER technology, running the Linux, Microsoft Windows and IBM AIX operating environments. The new center in Rochester introduces more than 2,000 CPUs of IBM PowerPC-based Blue Gene technology to run Linux-based workloads.
IBM's Deep Computing Capacity on Demand centers offer scalable, secure systems that customers can access on demand from anywhere in the world.
This combination of high performance with smaller size, cost and power consumption has brought supercomputing technology to the point where it can now be made more widely available and applied to a broader set of applications. As a result, IBM has begun to offer the technology as a commercial offering.