October 20, 2010
Although it’s not news that Autodesk has been providing users access to its line of software products via its own cloud, an article that appeared today in a publication covering the plastics industry served as a reminder of how the landscape for complex, compute-intensive applications is changing as the way they are accessed and used shifts. The article also includes a detailed interview with an Autodesk official, who shares some details about the company’s cloud and what it enables for users, both for tests and trials and for the “real deal” of mission-critical modeling and simulation tasks.
On the design and engineering front, Autodesk, maker of AutoCAD and a number of other applications for similar audiences, is a software front-runner thanks to its wide adoption. It is also a leader in delivering software as a service, giving a broader range of researchers, designers and engineers access to the resources required to handle large-scale or particularly compute- or data-intensive operations.
According to Keith Perrin, a senior industry manager for manufacturing at Autodesk, “we’re providing a whole bunch of servers to allow companies to undertake simulation out there, in their analysis.” He says this “broadens the scope of simulation that can be undertaken and the ease of use; an analysis that frankly wasn’t possible before suddenly is possible.”
Autodesk has been working on Project Cumulus, its own version of hosted HPC services delivered to users through a simplified interface. The project makes access to its Moldflow software, used by many plastics engineers and designers, simpler, opening the door to users who were once barred from entry by a lack of computational power.
As an Autodesk release stated, “Project Cumulus aims to help design engineers and plastics specialists that use Autodesk’s Moldflow Insight software to optimize their injection molded plastic part designs and manufacturing processes.” Along with this offering, Autodesk is taking a number of its other popular software products to the cloud market, providing the CAD/CAE industry with a fresh influx of resources and, perhaps, new users of its products.
Jeff Wymer, a product manager with Autodesk, stated that the company is “letting users leverage the untapped potential compute power of the cloud to bring optimization into the equation…we’re allowing the MoldFlow designer to optimize his design and get the best results with unrivaled performance and capacity compared to the desktop.”
Full story at Plastics Today