August 14, 2012
Now that the Curiosity rover has made an action hero's entrance onto the red planet, members of NASA's Jet Propulsion Laboratory (JPL) can breathe a quick sigh of relief. The car-sized rolling laboratory has already transmitted a small but breathtaking collection of images of the planet's landscape, and it will continue to do so over the course of its mission. Eager earthlings who want a glimpse of Mars from Curiosity's point of view can turn to the JPL website and browse the pictures captured by the rover's cameras. All of this has been made possible by a collection of tools provided by Amazon, which recently published a case study about the Curiosity mission.
NASA and Amazon have an established history of working together on missions to Mars. The cloud provider handles images transmitted by the Opportunity exploration rover, which continues to function after eight years of service. Amazon also had a role in handling Web traffic during the new rover's complex landing procedure.
In preparation for Curiosity's big debut, NASA asked Amazon to help serve the estimated hundreds of thousands of visitors looking to watch the landing operation. A complex system was devised, incorporating load balancing, traffic monitoring, and a method to de-provision resources after the event took place. The system was benchmarked by SOASTA, which verified that the stream could handle requests on the order of hundreds of gigabits per second.
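The scale-then-release pattern described here can be sketched very schematically. This is an illustrative toy, not NASA's or Amazon's actual system: the capacity figure, the floor of two instances, and the function names are all assumptions made for the example.

```python
# Toy sketch of traffic-driven provisioning with post-event de-provisioning.
# Numbers (10 Gb/s per instance, floor of 2) are illustrative assumptions.

def desired_instances(gbps, capacity_per_instance_gbps=10, floor=2):
    """Size the fleet to current monitored traffic, never below a small floor."""
    needed = -(-int(gbps) // capacity_per_instance_gbps)  # ceiling division
    return max(needed, floor)

def scale(current_fleet, gbps, event_over=False):
    """Return the next fleet size; release everything once the event ends."""
    if event_over:
        return 0  # de-provision all resources after the landing broadcast
    return desired_instances(gbps)

print(scale(2, 250))       # fleet sized for 250 Gb/s of traffic
print(scale(25, 0, True))  # event over: everything released
```

The interesting part of the real system is the last step: unlike a fixed in-house deployment, the cloud fleet drops back to zero the moment the one-off event is over, so NASA pays nothing for idle capacity.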
There was good news all around as the landing was a success and the stream worked without any noticeable issues. Now that Curiosity is on the red planet, AWS will process pictures taken by the new inhabitant, making them available to JPL researchers and the public.
The workflow is slightly more complicated than sharing a photo from a smartphone. Although the rover has 17 cameras in total, the panoramic pictures are assembled from images gathered by a stereoscopic camera mounted on its mast. An Amazon blog entry explains the process:
In order to produce a finished image, each pair (left and right) of images must be warped to compensate for perspective, then stereo matched to each other, stitched together, and then tiled into a larger panorama.
This process is carried out using Amazon's Simple Workflow Service (SWF) and the AWS Flow Framework, producing the graphics available to the public. The service provider says accelerated analysis of these images will lead to better decision-making, ultimately increasing the amount of exploration the new rover can undertake.
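The quoted steps can be sketched as a chain of workflow activities of the kind SWF schedules and retries. This is a minimal stand-in, assuming toy list-of-lists "images" and placeholder arithmetic in place of real computer-vision code; none of the function names come from JPL's pipeline.

```python
# Schematic of the warp -> stereo-match -> stitch -> tile chain.
# Each function plays the role of one workflow activity; the image
# "processing" is stand-in arithmetic, not real vision code.

def warp(image):
    """Compensate for perspective (stand-in: identity transform)."""
    return image

def stereo_match(left, right):
    """Match a left/right pair (stand-in: average the pixel values)."""
    return [[(l + r) / 2 for l, r in zip(lrow, rrow)]
            for lrow, rrow in zip(left, right)]

def stitch(frames):
    """Stitch matched frames side by side into one panorama."""
    return [sum((f[i] for f in frames), []) for i in range(len(frames[0]))]

def tile(panorama, tile_w):
    """Cut the panorama into fixed-width tiles for web delivery."""
    width = len(panorama[0])
    return [[row[x:x + tile_w] for row in panorama]
            for x in range(0, width, tile_w)]

def run_workflow(pairs, tile_w=2):
    """Run the full chain over a list of (left, right) image pairs."""
    matched = [stereo_match(warp(l), warp(r)) for l, r in pairs]
    return tile(stitch(matched), tile_w)

# Two 2x2 left/right pairs -> one 2x4 panorama -> two 2x2 tiles
pairs = [([[1, 1], [1, 1]], [[3, 3], [3, 3]]),
         ([[5, 5], [5, 5]], [[7, 7], [7, 7]])]
print(len(run_workflow(pairs)))  # 2
```

In the real system, the payoff of expressing the chain this way is that a workflow service can fan the per-pair steps out across many machines and retry any step that fails, which is what makes the accelerated analysis possible.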
Researchers from the Suddhananda Engineering and Research Centre in Bhubaneswar, India, developed a job scheduling system, which they call Service Level Agreement (SLA) scheduling, intended to provision resources about as effectively as a comparable in-house system would. They combined it with an on-demand resource provisioner to optimize virtual machine utilization.
Experimental scientific HPC applications are continually being moved to the cloud, as covered here in several capacities over the last couple of weeks. Among those stories, CloudSigma co-founder and CEO Robert Jenkins penned an article for HPC in the Cloud discussing the emergence of cloud technologies to supplement the research capabilities of big scientific initiatives like CERN and the European Space Agency (ESA)...
When moving excess or experimental HPC applications to a cloud environment, there will always be obstacles. Were that not the case, the cost-effectiveness of cloud-based HPC would rule the high-performance landscape. Jonathan Stewart Ward and Adam Barker of the University of St Andrews produced an intriguing report on the state of cloud computing, paying significant attention to the problems facing the field.