August 23, 2010
To read that one of the leaders at Autodesk Labs is not only broadly stating the company’s ambitions for Computer-Aided Design (CAD) and Computer-Aided Engineering (CAE) applications in the cloud, but calling this movement “a disruptive change,” signals that the traditional model of CAD/CAE software delivery might, like other segments of the software industry, be losing steam. While some can argue about the pace of this shift and what it implies for the industry over the next year (or even five), the fact remains that such movement in an application segment many have declared cloud-incompatible is compelling news, if nothing else.
The general consensus is that CAD and CAE applications are not well-suited to the cloud for a number of reasons, not least because such applications tend to require high-performance networks and GPU capabilities that the public cloud cannot accommodate. Still, a number of vendors, from startups to the undisputed leader in the space, Autodesk, are venturing into the cloud to find new ways to deliver their software to designers and engineers.
As Beth Stackpole noted today in DesignNews, the cloud is viewed as incompatible with the goal of “delivering the performance and interactivity required for data-intensive, graphically demanding CAD and CAE applications, especially when it comes to handling complex assemblies and larger models.” Beyond that are a host of other perceived technical barriers, along with the security concerns that typically arise when the matter of losing “control” over processes comes into play.
Brian Matthews, Vice President of Autodesk Labs, told DesignNews that many “are focusing on the cloud to do the old method better, cheaper and faster, but the real implication is to do what couldn’t have been done with the traditional model.”
Matthews went on to note that Autodesk is boosting its efforts to bring more of its products to a cloud-ready state and that the beauty of the model is that “with the cloud, you can ask for 10, 100, or even 1,000 CPUs and rent them for minutes, seconds or hours and you don’t have to buy a supercomputer. Not only do you get answers much more quickly, you end up asking more questions…so the machine can give you the optimal answer to your optimization rather than an acceptable [one]. This is a disruptive change.”
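The economics behind Matthews’ point can be sketched with a back-of-the-envelope calculation. Under simple linear pricing, a massive short burst costs the same as a long serial run, but returns the answer far sooner; the hourly rate below is a made-up figure for illustration, not Autodesk’s or any provider’s actual pricing:

```python
HOURLY_RATE = 0.10  # assumed price per CPU-hour, purely illustrative


def rental_cost(cpus, hours, rate=HOURLY_RATE):
    """Total cost of renting `cpus` machines for `hours` at a flat rate."""
    return cpus * hours * rate


# A 1,000-CPU burst for one hour...
burst = rental_cost(cpus=1000, hours=1)
# ...costs the same as one CPU grinding away for 1,000 hours,
# but the answer comes back roughly 1,000x sooner.
serial = rental_cost(cpus=1, hours=1000)

assert burst == serial  # both $100 at the assumed rate
```

The simplification, of course, is that pricing is rarely perfectly linear and not every workload parallelizes cleanly, but it captures why elastic rental can substitute for owning a supercomputer for bursty design and simulation work.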
Full story at DesignNews
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. To address them, we present a novel federation model that enables end-users to aggregate heterogeneous resources and scale up their problems. The feasibility of this federation model has been demonstrated, in the context of the UberCloud HPC Experiment, by gathering the most comprehensive information to date on the effects of pillars on microfluidic channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to manage peak computational workloads that cannot be contained within their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab’s Sergey Panitkin discussed the role of the Google Compute Engine in providing computational support to ATLAS, a high-energy particle detector at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so using technologies that emphasize affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.