February 15, 2012
Enabling business to tap into all the benefits of HPC
PROVO, Utah, Feb. 15 — Adaptive Computing, which manages some of the world's largest supercomputing environments and specializes in HPC workload management solutions, is the latest company to join NCSA's Private Sector Program, which puts the center's expertise to work on some of the toughest challenges faced by industry.
NCSA's Private Sector Program applies the center's expertise and technological innovation to the real-world challenges faced by business and industry. The program brings the promise of HPC to a broad segment of the market, enabling businesses to tap into the benefits HPC has to offer and to draw on the wealth of knowledge within the HPC community.
"Adaptive Computing is taking advantage of a partnership category for software developers," says Merle Giles, leader of NCSA's Private Sector Program. "We think these partnerships will help develop an important three-way dialogue among software developers, industrial end users, and large HPC providers like NCSA."
NCSA's computing resources, including the iForge system designed specifically for industrial use, give industrial power users a valuable platform for maximizing productivity. Adaptive Computing's Moab intelligence engine is central to this collaboration, helping to accelerate, automate, and self-optimize IT workloads, resources, and services in large, complex, heterogeneous computing environments. The partnership will also allow Adaptive Computing to advance its HPC workload management products and services to better serve the industrial market.
"NCSA's Private Sector Program will give users access to Adaptive Computing's patented and battle-tested intelligence engine in a real-world environment, letting them bench-test code that might not otherwise have been so easily accessible," comments Dave Jackson, CTO of Adaptive Computing. "Our goal is to help the enterprise make better use of HPC by collaborating with companies like ours to learn how to transform a standard data center environment into a high performance computing environment."
About NCSA
The National Center for Supercomputing Applications (NCSA), located at the University of Illinois at Urbana-Champaign, provides powerful supercomputers and expert support that help thousands of scientists and engineers across the country improve our world. NCSA is supported by the state of Illinois, the University of Illinois, the National Science Foundation, and grants from other federal agencies. www.ncsa.illinois.edu.
About Adaptive Computing
Adaptive Computing manages some of the world's largest supercomputing environments with its self-optimizing dynamic cloud management solutions and HPC workload management systems, driven by Moab, a patented multi-dimensional intelligence engine. Moab delivers policy-based governance, allowing customers to consolidate and virtualize resources, allocate and manage applications, optimize service levels, and reduce operational costs. Adaptive Computing offers a portfolio of Moab cloud management and Moab HPC workload management products and services that accelerate, automate, and self-optimize IT workloads, resources, and services in large, complex, heterogeneous computing environments such as HPC, data centers, and cloud. These products act as a brain on top of existing and future infrastructure and middleware, enabling it to self-optimize and deliver higher ROI to the business.
For more information, call (801) 717-3700 or visit www.adaptivecomputing.com.
Source: Adaptive Computing Enterprises Inc.
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. We therefore present a novel federation model that enables end-users to aggregate heterogeneous resources for large-scale problems. The feasibility of this federation model was demonstrated, in the context of the UberCloud HPC Experiment, by gathering the most comprehensive information to date on the effects of pillars on microfluidic channel flow.
Large-scale, worldwide scientific initiatives increasingly rely on cloud-based systems both to coordinate efforts and to absorb peak computational demand that cannot be met by their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab's Sergey Panitkin discussed the role of Google Compute Engine in providing computational support to ATLAS, a high-energy particle detector at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
May 16, 2013
When it comes to cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined those latency issues for CFD modeling in the cloud by running a common CFD code on Amazon EC2 HPC instance types with both CPU and GPU cores.
May 10, 2013
Australian visual effects company, Animal Logic, is considering a move to the public cloud.
May 10, 2013
The program provides cash awards of up to $10,000 for the best open-source end-user applications deployed on a 100G network.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – of Big Data. Before anyone can exploit these extraordinary data repositories, however, they must first harness and manage their data stores, using technologies that emphasize affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.