December 07, 2010
The International Climate Change Conference in Cancun, Mexico, this year hosted a number of conversations about tackling climate change through policy and actionable research. As one might expect, the vast majority of those research objectives depend on access to high-performance computing resources.
Google was on hand for the United Nations-sponsored event to announce its contribution to continuing efforts in the form of a new cloud-based platform backed by a stockpile of CPU hours.
The platform, called Google Earth Engine, will put Landsat data, only recently made available via the Web, online for use by scientists and researchers around the world.
According to the company's announcement, the project will place "an unprecedented amount of satellite imagery and data" spanning the last 25 years into the cloud, allowing large-scale monitoring and measurement of the environment and of changes caused by ecological and human-related events.
The massive datasets will be available via Google's cloud-based platform so that scientists around the world can analyze the data, both within the platform itself and in specialized research projects built on top of it as custom applications.
Google will donate 20 million CPU hours over the next two years to research conducted through the Google Earth Engine platform. Notably, the donation is worldwide in scope, with emphasis on developing nations and their role in contributing research.
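For a sense of scale, a quick back-of-the-envelope calculation shows what 20 million CPU hours over two years amounts to in sustained capacity:

```python
# Back-of-the-envelope: how many cores could 20 million CPU hours
# keep busy continuously for two full years?
cpu_hours = 20_000_000
hours_in_two_years = 2 * 365 * 24  # 17,520 hours

sustained_cores = cpu_hours / hours_in_two_years
print(f"~{sustained_cores:.0f} cores running nonstop for two years")
```

In other words, the donation is roughly equivalent to dedicating a thousand-plus-core cluster to the platform around the clock.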
As Rebecca Moore, engineering manager for Earth Engine, stated in the company's formal announcement, "For the least developed nations, Google Earth Engine will provide critical access to terabytes of data, a growing set of analytical tools and our high-performance processing capabilities."
Until this point, one of the hurdles scientists faced was rooted in data volume alone; the sheer number and complexity of current satellite images is staggering, and as a result it has been difficult for researchers to manage and analyze them.
There are numerous benefits to this platform beyond enabling instant collaboration among researchers worldwide as they mine the images. First, the data reaches back 25 years, allowing researchers to trace climate and ecological change over that span and painting a vivid portrait of deforestation, water and soil alterations, and historical land development.
Google states that among other benefits, this platform will result in "reduced time to do analyses using Google's computing infrastructure [since] by running analyses across thousands of computers, for example, unthinkable tasks are now possible for the first time."
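The "thousands of computers" model works because the workload is naturally parallel: each Landsat scene can be analyzed independently, so the job maps cleanly onto a pool of workers. A minimal single-machine sketch of that pattern, where the scene IDs and the per-scene analysis are hypothetical stand-ins rather than anything from Earth Engine itself:

```python
from concurrent.futures import ProcessPoolExecutor

def analyze_scene(scene_id):
    # Hypothetical per-scene analysis: a real task might compute a
    # vegetation index or classify land cover for one Landsat scene.
    # Here we just return a placeholder result keyed by scene ID.
    return scene_id, len(scene_id)

if __name__ == "__main__":
    # Stand-in scene identifiers; a real run would list tens of
    # thousands of scenes and fan them out across many machines.
    scenes = [f"scene_{i:05d}" for i in range(8)]
    with ProcessPoolExecutor() as pool:
        results = dict(pool.map(analyze_scene, scenes))
    print(f"processed {len(results)} scenes")
```

Because the scenes share no state, scaling the same pattern from one machine's cores to a fleet of machines is mostly a scheduling problem, which is exactly what a cloud platform abstracts away.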
Additionally, the company states that there will be new features that aid in satellite image analysis, including tools that will automatically pre-process the images to remove obstructions in the picture like haze or clouds.
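Google doesn't describe its pre-processing algorithms, but a common first-pass approach to cloud screening is a brightness threshold: clouds are bright across the visible bands, so unusually bright pixels get flagged and masked. A toy illustration with NumPy, where the threshold and the synthetic tile are invented purely for the example:

```python
import numpy as np

def simple_cloud_mask(band, threshold=0.6):
    """Flag pixels brighter than `threshold` (toy cloud screen).

    Real pipelines use multi-band tests, thermal data, and image
    time series; this single-band threshold is only illustrative.
    """
    return band > threshold

# Synthetic 4x4 "reflectance" tile with one bright (cloudy) corner.
tile = np.full((4, 4), 0.2)
tile[:2, :2] = 0.9
mask = simple_cloud_mask(tile)
print(int(mask.sum()), "of", mask.size, "pixels flagged as cloud")
```

Masked pixels would then be excluded from analysis or filled from cloud-free acquisitions of the same location on other dates.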
The San Francisco-based Gordon and Betty Moore Foundation, which is focused on advancing scientific research in the realm of environmental conservation, was one of the primary partners behind Earth Engine with nearly $12 million in investment. This foundation was also one of the forces supporting the United States Geological Survey's Landsat archive project to bring the enormous wealth of Landsat data online and into Google's Earth Engine.
One of the focus points of the Moore Foundation's investment is on addressing deforestation with a chunk of the funding aimed at helping scientists integrate existing software to work within the confines of Earth Engine.
Scientists from the Carnegie Institution for Science and the Geographic Information Science Center at South Dakota State University were involved in these early efforts to integrate forest monitoring systems into the new platform.
Furthermore, as Rebecca Moore stated, in collaboration with CONAFOR, which is Mexico's National Forestry Commission, there has already been progress made producing water and forest maps of Mexico, which represent "the finest-scale forest map produced of Mexico to date."
According to Moore, this map "required 15,000 hours of computation, but was completed in less than a day on Google Earth Engine, using 1,000 computers over more than 53,000 Landsat scenes."
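Moore's figures are self-consistent: 15,000 CPU hours spread across 1,000 machines comes to about 15 wall-clock hours of near-perfectly parallel work, comfortably "less than a day":

```python
cpu_hours = 15_000
machines = 1_000

# Assumes the work parallelizes with negligible overhead.
wall_hours = cpu_hours / machines
print(f"~{wall_hours:.0f} wall-clock hours")
```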