January 27, 2012
Premier Apache Hadoop community event opens call for papers
SUNNYVALE, Calif., Jan. 26 — Hadoop Summit 2012, the premier Apache Hadoop community event, will take place at the San Jose Convention Center, June 13-14, 2012. The event, now expanded to two days, is co-sponsored by Yahoo! and Hortonworks, and will outline the evolution of Apache Hadoop into the next-generation enterprise data platform. Hadoop Summit will feature presentations from community developers, experienced users and administrators, and a vast array of ecosystem solution providers.
Apache Hadoop is the open source technology at the epicenter of big data and cloud computing. It enables organizations to more efficiently and cost-effectively store, process, manage and analyze the ever-increasing volume of data being created and collected every day. With Apache Hadoop, companies can connect thousands of servers to process and analyze data at supercomputing speed. Yahoo! pioneered Apache Hadoop and remains both a contributor to and one of the leading users of the big data platform, while Hortonworks is an independent company founded by key Hadoop architects and the leading contributor to the Apache Hadoop technology.
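For readers new to the platform, Hadoop's programming model pairs a map phase, which runs in parallel across the cluster's servers, with a reduce phase that aggregates the mappers' output. The canonical WordCount job below is a minimal sketch against the MapReduce Java API of that era; it counts word occurrences in text files, with input and output paths supplied as command-line arguments.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Mapper: emits (word, 1) for every token in each input line.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {

    private final static IntWritable one = new IntWritable(1);
    private Text word = new Text();

    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, one);
      }
    }
  }

  // Reducer: sums the per-word counts produced across all mappers.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {

    private IntWritable result = new IntWritable();

    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = new Job(conf, "word count");      // Hadoop 1.x-era API
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);  // local pre-aggregation on each node
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

Compiled into a jar, the job is launched with: hadoop jar wordcount.jar WordCount <input dir> <output dir>. The framework handles distributing the work, shuffling map output to reducers, and retrying failed tasks across the cluster.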
Hadoop Summit presentations will be organized into multiple tracks spanning the Apache Hadoop ecosystem.
The event's call for papers is now open. The deadline to submit an abstract for consideration is February 22, 2012. The community-driven selection committee will choose the presentations that best highlight innovative use cases, showing how to implement Hadoop to extract value from massive data sets and building momentum for the burgeoning Apache Hadoop market. Accepted presenters will be notified on or before March 9, 2012.
Discounted early bird registration is available now through March 30, 2012. To register for the event or to submit a speaking abstract for consideration, visit www.hadoopsummit.org.
Sponsorship packages are also now available. For more information on how to sponsor this year's event, please visit www.hadoopsummit.org/sponsors.
Hortonworks was formed in July 2011 by Yahoo! and Benchmark Capital in order to promote the development and adoption of Apache Hadoop, the leading open source platform for storing, managing and analyzing large volumes of data. Together with the Apache community, Hortonworks is making Hadoop more robust and easier to install, manage and use. The company and its founders are leaders in designing and delivering current and future generations of Apache Hadoop and leverage their expertise to provide unmatched technical support, training and certification programs for enterprises, systems integrators and technology vendors, including ISVs, OEMs and service providers. For more information, visit www.hortonworks.com.
Yahoo! (NASDAQ:YHOO) is the premier digital media company, creating deeply personal digital experiences that keep more than half a billion people connected to what matters most to them, across devices and around the globe. And Yahoo!'s unique combination of Science + Art + Scale connects advertisers to the consumers who build their businesses. Yahoo! is headquartered in Sunnyvale, California. For more information, visit the pressroom or the company's blog, Yodel Anecdotal.
Source: Yahoo! Inc.
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. To meet them, we present a novel federation model that enables end-users to aggregate heterogeneous resources for large-scale problems. The feasibility of this federation model has been demonstrated, in the context of the UberCloud HPC Experiment, by gathering the most comprehensive information to date on the effects of pillars on microfluidic channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to absorb computational loads at peak times that cannot be contained within their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab’s Sergey Panitkin discussed the role of Google Compute Engine in providing computational support to ATLAS, a high-energy particle detector at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of using the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds for some of them.
May 23, 2013
The study of climate change is one of those scientific problems where modeling the entire Earth is almost essential to attaining accurate results and making worthwhile predictions. In an attempt to make climate science more accessible to smaller research facilities, NASA introduced what it calls ‘Climate in a Box,’ a system it describes as a desktop supercomputer.
May 16, 2013
When it comes to the cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of CFD modeling in the cloud by benchmarking a common CFD code on Amazon EC2 HPC instance types with both CPU and GPU cores.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can exploit these extraordinary data repositories, however, they must first harness and manage their data stores, using technologies that deliver affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on the topic.