May 11, 2011
Chances are, if you’ve been lurking around here for some time, you’re already quite familiar with cloud computing in the HPC context. However, it’s easy to get lost in the minutiae that constitute those clouds—the management layers, virtualization, latency, and beyond.
To put things into perspective, we’re posting a decent overview (and a link for some free time on Azure, which is running in tandem with the free Amazon trials) from a researcher focused directly on the practical side of running HPC applications on remote resources.
Rob Gillen, a cloud computing researcher with Planet Technologies out of Knoxville, Tennessee, spent a few moments on video laying down some of the core concepts behind scientific uses for HPC clouds.
In the brief video below, he carves out the concept of cloud as it applies to the technical and research computing space and provides a few details about how clouds signal the democratization of large-scale computing.
Gillen’s host asks him what HPC encompasses generally, to which he provides a litany of examples. However, he notes that HPC cloud computing occupies the “lower end of the HPC space,” working well for average researchers or academics who lack access to high-end machines.
Using Microsoft’s Windows Azure as a starting point, he offers the example of the genome sequence alignment tool BLAST, which is driven from an Excel worksheet used to define a problem, fill in details, and send the job off for remote processing. This, he notes, is where the democratization layer comes in. For instance, a professor can use actual BLAST in a class and, when it’s over, simply shut everything down and stop incurring charges.
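To make the pay-per-use point concrete, here is a minimal sketch (not the Azure BLAST worksheet itself) of the billing model Gillen describes, where charges accrue only while instances are running. All class names, instance counts, and rates here are hypothetical, for illustration only.

```python
class CloudAllocation:
    """Tracks billable hours for a temporary pool of compute instances."""

    def __init__(self, instances, hourly_rate_usd):
        self.instances = instances
        self.hourly_rate_usd = hourly_rate_usd
        self.billable_hours = 0.0
        self.running = False

    def start(self):
        self.running = True

    def run_job(self, wall_clock_hours):
        # Charges accrue only while the allocation is live.
        if not self.running:
            raise RuntimeError("allocation not started")
        self.billable_hours += wall_clock_hours * self.instances

    def shut_down(self):
        # After shutdown, no further charges accrue -- the point Gillen
        # makes about a professor ending a class exercise.
        self.running = False

    def cost(self):
        return self.billable_hours * self.hourly_rate_usd


# A hypothetical class exercise: 8 small instances for a 2-hour lab.
lab = CloudAllocation(instances=8, hourly_rate_usd=0.12)
lab.start()
lab.run_job(wall_clock_hours=2.0)
lab.shut_down()
print(f"${lab.cost():.2f}")
```

The contrast with owning a cluster is that once `shut_down()` is called, the meter stops; there is no idle hardware to amortize between classes.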
Outside of the rapid-fire definitions, did you happen to wonder who you would contract, right this moment, to build you a wall-to-wall dry-erase room like the one shown?
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
The private industry least likely to adopt public cloud services for data storage is finance. Holding the most sensitive and heavily regulated of data types, personal financial information, banks and similar institutions are mostly moving toward private cloud services—and doing so at great cost.
In this week's hand-picked assortment, researchers explore the path to more energy-efficient cloud datacenters, investigate new frameworks and runtime environments that are compatible with Windows Azure, and design a unified programming model for diverse data-intensive cloud computing paradigms.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore the potential of heterogeneous computing, but the potential for this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.