February 28, 2013
SAN FRANCISCO, Calif., Feb. 28 — The Linux Foundation, the nonprofit organization dedicated to accelerating the growth of Linux, today announced the keynote speakers for The Linux Foundation Collaboration Summit to be held April 15-17, 2013 in San Francisco.
Leaders from the Linux developer, industry and end-user communities will gather at the invitation-only Linux Foundation Collaboration Summit to advance the state-of-the-art of Linux and open source software. The summit is the only place where leaders from the developer, end-user and industry communities can come together to tackle the most pressing issues facing the platform today. It is also the place that brings together The Linux Foundation's Collaborative Projects and workgroups to meet and work more broadly with other community leaders to achieve breakthrough innovation.
The following confirmed keynote speakers and presentations will set the stage for discussion on such topics as automotive engineering, big data, cloud computing, virtualization, mobile and embedded development, filesystems, kernel development, legal topics, the Linux Standard Base, SPDX, parallel processing, Tizen, tools, and tracing.
Confirmed keynote speakers and topics include:
On days two and three of the event, hundreds of summit participants will break into workgroup meetings to expand on topics presented during the keynotes and to take a deep dive into additional topics. Highlights include:
"This exclusive event enables the growing, vibrant Linux and open source communities to re-imagine the possibilities for collaborative development," said Jim Zemlin, executive director at The Linux Foundation. "Witnessing the energy between collaborating groups who normally don't get to meet face-to-face is electric, and the expansion of the topics and content covered reflects the growth of Linux and our Collaborative Projects, workgroups and membership."
Immediately following The Linux Foundation Collaboration Summit, a second event, the Linux Storage, Filesystem and Memory Management Summit will take place April 18-19, 2013. Holding the summits in close proximity increases the opportunities for collaborative engagement among developers and researchers focused on advancing Linux in these areas.
For more information about this invitation-only event, please visit: http://events.linuxfoundation.org/events/lsfmm-summit
The complete Linux Foundation Collaboration Summit schedule can be viewed here: http://events.linuxfoundation.org/events/collaboration-summit/schedule
To request an invitation, please visit: http://events.linuxfoundation.org/events/collaboration-summit/request-an-invitation
The Linux Foundation Collaboration Summit is made possible with generous support from all of our sponsors, including Platinum sponsors HP and Intel.
About The Linux Foundation
The Linux Foundation is a nonprofit consortium dedicated to fostering the growth of Linux and collaborative software development. Founded in 2000, the organization sponsors the work of Linux creator Linus Torvalds and promotes, protects and advances the Linux operating system and collaborative software development by marshaling the resources of its members and the open source community. The Linux Foundation provides a neutral forum for collaboration and education by hosting Collaborative Projects and Linux conferences, including LinuxCon, and by generating original research and content that advances the understanding of Linux and collaborative software development.
Source: Linux Foundation
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. Thus, we present a novel federation model that provides end-users with the ability to aggregate heterogeneous resources to address large-scale problems. The feasibility of this federation model has been proven, in the context of the UberCloud HPC Experiment, by gathering the most comprehensive information to date on the effects of pillars on microfluid channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to manage computational loads at peak times that cannot be handled by their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab's Sergey Panitkin discussed the role of the Google Compute Engine in providing computational support to ATLAS, a detector of high-energy particles at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
May 23, 2013 |
The study of climate change is one of those scientific problems where it is almost essential to model the entire Earth to attain accurate results and make worthwhile predictions. In an attempt to make climate science more accessible to smaller research facilities, NASA introduced what they call ‘Climate in a Box,’ a system they note acts as a desktop supercomputer.
May 16, 2013 |
When it comes to cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of doing CFD modeling in the cloud by running a common CFD code on HPC instance types of Amazon EC2, including both CPU and GPU cores.
May 10, 2013 |
Australian visual effects company Animal Logic is considering a move to the public cloud.
May 10, 2013 |
Program provides cash awards of up to $10,000 for the best open-source end-user applications deployed on a 100G network.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so using technologies that emphasize affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.