February 17, 2012
Innovative open source interoperability toolkit provides any-platform access and support for all major cloud service providers
FOREST HILL, Md., Feb. 15 — The Apache Software Foundation (ASF), the all-volunteer developers, stewards, and incubators of nearly 150 open source projects and initiatives, today announced that Apache Deltacloud has graduated from the Apache Incubator to become a Top-Level Project (TLP), signifying that the Project's community and products have been well-governed under the ASF's meritocratic process and principles.
Apache Deltacloud defines a RESTful Web Service application programming interface (API) for interacting with Cloud service providers and resources in those clouds in a unified manner. It also includes implementations of this API for the most popular Clouds such as Amazon, Eucalyptus, GoGrid, IBM, Microsoft, OpenStack, Rackspace, and more. In addition to the API server, the project provides client libraries for a wide variety of languages.
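Because Deltacloud exposes one REST API in front of many providers, a client talks to the Deltacloud server the same way regardless of the backing cloud. The sketch below is illustrative only: the server URL, port, and credentials are assumptions (a locally running Deltacloud server is commonly reached at a base URL like `http://localhost:3001/api`, authenticating with the backing cloud's credentials via HTTP Basic auth), not values taken from this announcement.

```python
# Hedged sketch: building a request against a Deltacloud server's REST API.
# The base URL and credentials are placeholder assumptions.
import base64
import urllib.request

def build_instances_request(base_url, user, password):
    """Build a GET request for the /instances collection of a Deltacloud server.

    The same request shape works whichever provider driver the server is
    configured with -- that is the point of the unified API.
    """
    req = urllib.request.Request(base_url.rstrip("/") + "/instances")
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req.add_header("Authorization", "Basic " + token)
    # Ask for XML; the server can also render other representations.
    req.add_header("Accept", "application/xml")
    return req

req = build_instances_request("http://localhost:3001/api",
                              "ACCESS_KEY", "SECRET_KEY")
print(req.full_url)  # the collection URL the server would receive
# urllib.request.urlopen(req) would perform the call against a running server.
```

Listing a different collection (images, realms, hardware profiles) is the same pattern with a different path segment, so client code stays provider-neutral.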
"We are thrilled to have the project's growth and maturity recognized by The Apache Software Foundation," said David Lutterkort, chair of the Apache Deltacloud Project Management Committee and principal software engineer at Red Hat. "Deltacloud has shown the value of vendor-neutral cloud management APIs, and we are excited that, as an Apache Top-Level Project, we will continue to provide choice to cloud users; graduation is a great milestone for our community."
"Ubiquity between clouds will be critical in the coming years and Deltacloud enables developers to only have to code towards one API," said Mark Worsey, CIO and EVP of Technology at GoGrid. "At GoGrid, we feel it is extremely important to continue openness within the cloud community and provide APIs that are accessible, useful and adopted by other 3rd party libraries like Deltacloud."
"Deltacloud is another example of how open source development continues to drive cloud innovation and development," said Tim Cramer, vice president of engineering at Eucalyptus Systems.
"Deltacloud delivers an elegant ReST API, focused on exposing the differences between cloud services. jclouds and Deltacloud have a strong history together, starting with collaborating on abstraction design in 2009, spiking with our interface to Deltacloud released last year. I'm excited to see Deltacloud's graduation, and looking forward to more shared code this year," said Adrian Cole, founder of jclouds.org and CTO jclouds at CloudSoft.
Initially proposed for development within the ASF by Lutterkort in May 2010, Deltacloud was seeded with code developed by Red Hat. Since then, the project has continued to innovate by expanding both its committer base and the diversity of clouds it can manage.
Availability and Oversight
Apache Deltacloud software is released under the Apache License v2.0, and is overseen by a self-selected team of active contributors to the project. A Project Management Committee (PMC) guides the Project's day-to-day operations, including community development and product releases. Apache Deltacloud source code, documentation, mailing lists, and related resources are available at http://deltacloud.apache.org/.
About The Apache Software Foundation (ASF)
Established in 1999, the all-volunteer Foundation oversees nearly one hundred fifty leading Open Source projects, including Apache HTTP Server — the world's most popular Web server software. Through the ASF's meritocratic process known as "The Apache Way," more than 350 individual Members and 3,000 Committers successfully collaborate to develop freely available enterprise-grade software, benefiting millions of users worldwide: thousands of software solutions are distributed under the Apache License; and the community actively participates in ASF mailing lists, mentoring initiatives, and ApacheCon, the Foundation's official user conference, trainings, and expo. The ASF is a US 501(c)(3) not-for-profit charity, funded by individual donations and corporate sponsors including AMD, Basis Technology, Cloudera, Facebook, Google, IBM, HP, Hortonworks, Matt Mullenweg, Microsoft, PSW Group, SpringSource/VMware, and Yahoo!. For more information, visit http://www.apache.org/.
Source: Apache Software Foundation
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. Thus, we present a novel federation model that enables end-users to aggregate heterogeneous resources and tackle problems at scale. The feasibility of this federation model has been proven, in the context of the UberCloud HPC Experiment, by gathering the most comprehensive information to date on the effects of pillars on microfluid channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to absorb peak computational loads that cannot be contained within their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab's Sergey Panitkin discussed the role of the Google Compute Engine in providing computational support to ATLAS, a detector of high-energy particles at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
May 16, 2013
When it comes to cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of doing CFD modeling in the cloud by running a common CFD code on Amazon EC2 HPC instance types with both CPU and GPU cores.
May 10, 2013
Australian visual effects company Animal Logic is considering a move to the public cloud.
May 10, 2013
Program provides cash awards of up to $10,000 for the best open-source end-user applications deployed on a 100G network.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.