July 29, 2010
After a hectic day at SIGGRAPH 2010, it's time to reflect on some of the highlights from the week thus far.
First of all, a lot of the focus has been on movies and the movie workflow: everything from modeling and prototyping the storyline and computer-generated characters and scenes, to shooting the movie and capturing and ingesting the digital film, meta-tagging and archiving it, letting artists loose on it for post-production enhancements, additions, and manipulation, and finally transcoding to the growing number of formats needed these days for distribution. It is a complex pipeline of work, with lots of specialization and specific requirements at each stage.
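As an illustrative sketch only (not something any vendor showed), the staged pipeline described above can be modeled as an ordered list of steps that an asset passes through. All names here are hypothetical placeholders for the specialized tools used at each step:

```python
# Hypothetical sketch of the movie production pipeline described above.
# Each stage name stands in for a specialized tool or department.

PIPELINE_STAGES = [
    "model_and_prototype",   # storyline, CG characters/scenes
    "shoot",                 # principal photography
    "ingest",                # capture the digital film
    "meta_tag_and_archive",  # tagging and archival storage
    "post_production",       # artist enhancements/manipulation
    "transcode",             # output to the many distribution formats
]

def run_pipeline(asset, stages=PIPELINE_STAGES):
    """Thread an asset through each stage in order, recording the path."""
    history = []
    for stage in stages:
        history.append(stage)  # a real system would dispatch to a tool here
        asset = {"name": asset["name"], "last_stage": stage}
    return asset, history

final, log = run_pipeline({"name": "feature_film"})
```

The point of the sketch is simply that each stage consumes the output of the previous one, which is why a bottleneck at any step (ingest, archive, transcode) slows the whole production.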
To get the full picture of this area, in addition to SIGGRAPH one has to attend the National Association of Broadcasters (NAB) show and the Game Developers Conference. But SIGGRAPH is where a lot of the high-end and cool technology is on display, and its association with the ACM gives it a nice academic foundation with a rich conference program and poster sessions.
Certainly, after a couple of hours trolling the booth space, one rendering software company after another began to look and sound the same. Pixar with RenderMan was more notable, probably because it is a well-recognized brand and had one of the best show giveaways.
Some high points or booths that created an experience:
Hardcore Computer, with their Intel-based liquid submersion cooling technology, reminded me of the Cray-2, only in desktop workstation and blade packaging. Very nice. It also reminded me of the famous line from the movie Top Gun:
Maverick: I feel the need...
Maverick, Goose: ...the need for speed!
Now here was a product completely different from anything else at the event, with great product names that conjure up all kinds of visions: Detonator and Reactor.
These systems use patented liquid submersion cooling technology to remove the main barrier to electronics performance: heat. The result is increased speeds.
NVIDIA sported one of the largest booths at the event and is riding high on its launch of computational visualization workstations based on the NVIDIA Fermi architecture and NVIDIA 3D Vision Pro, all of which makes for some very slick demos. I wonder when a car was last actually photographed for an ad rather than computer-generated; the Aston Martin imagery was very slick, very cool.
Quietly going about their business was BlueArc with several large screen displays showing movie clips and ads from some of their customers in this industry, including the Nike World Cup Ad. Wayne Rooney looked great. Pity he didn't look as good in the real thing!
BlueArc's main focus was on their success and momentum in this industry with flagship production and post production companies including: The Whitehouse of Chicago; Digital Post Ltd of Auckland, New Zealand; Goldtooth Creative Agency of Vancouver, British Columbia; Turner Studios of Atlanta; Just for Laughs of Montreal, Quebec; and Mercury Filmworks of Ottawa, Ontario.
Finally, numerous award-winning studios, software houses, and animation houses were strutting their stuff: BlueSky, Rhythm & Hues Studios, Pixar, and Sony Pictures. All had fabulous and sometimes riveting demos.
SIGGRAPH gets the vote for best demos anywhere.
Posted by Steve Campbell - July 29, 2010 @ 8:43 PM, Pacific Daylight Time
An HPC industry consultant and cloud evangelist, Steve Campbell is a seasoned senior HPC executive.