August 17, 2012
New usage models outline requirements for interoperability in the cloud, extending coverage to PaaS and SaaS
BEAVERTON, Ore., Aug. 17 — The Open Data Center Alliance (ODCA) today published two new usage models focused on PaaS and SaaS interoperability, accompanied by a foundational document on interoperability requirements for the cloud. These documents form a clear picture of customer demands for interoperable solutions, regardless of the type of service delivered through the cloud, and join previously published usage models addressing the interoperability of IaaS solutions. The documents detail expectations for market delivery in support of the organization's mission of open, industry-standard cloud solution adoption, and discussions have already begun with providers to help accelerate delivery of solutions based on these new requirements. This suite of requirements was joined by a best practices document from National Australia Bank (NAB) outlining carbon footprint reductions in cloud computing. NAB's paper illustrates the bank's leadership in innovative methods for reporting carbon emissions in the cloud and aligns its best practices with underlying Alliance requirements. All of these documents are available in the ODCA Documents Library.
Because cloud applications and services must coexist and interact across service providers and enterprise deployments, it is increasingly important to define and implement interoperability requirements. The new usage models, focusing on PaaS (Platform as a Service) and SaaS (Software as a Service) interoperability, define requirements to ensure that applications and services operate seamlessly across clouds and providers.
The PaaS interoperability usage model outlines requirements for rapid application deployment, application scalability, application migration, and business continuity. The SaaS interoperability usage model addresses making applications available on demand and encourages consistent mechanisms that enable cloud subscribers to consume SaaS efficiently through standard interactions. In concert with these usage models, the Alliance published the ODCA Guide to Interoperability, which describes proposed requirements for interoperability, portability, and interconnectivity. Together, the documents are designed to ensure that companies can move workloads across clouds.
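To illustrate the kind of consistent, standard interaction the SaaS usage model encourages, the minimal sketch below consumes two SaaS providers through the same plain HTTP-and-JSON call. The endpoints, token, and resource names are illustrative assumptions, not part of any ODCA document.

    # Hypothetical sketch of provider-agnostic SaaS consumption: the same
    # standard HTTP/JSON interaction works against either provider.
    # Endpoints, token, and resource paths are illustrative assumptions,
    # not ODCA-specified.
    import json
    import urllib.request

    def list_resources(base_url, token):
        # Fetch a resource collection with plain HTTP + JSON, independent
        # of which provider hosts the service.
        req = urllib.request.Request(
            base_url + "/v1/resources",
            headers={"Authorization": "Bearer " + token,
                     "Accept": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)

    # A subscriber switches providers by changing only the base URL and
    # credentials; the interaction pattern stays the same.
    for provider in ("https://saas.example-a.com", "https://saas.example-b.com"):
        print(list_resources(provider, token="EXAMPLE_TOKEN"))

The value of such consistency lies in substitutability: a subscriber's integration code does not need to change when the underlying provider does.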
The "National Australia Bank Carbon Neutral White Paper" outlines the bank's efforts, in line with ODCA usage models, to become carbon neutral. NAB's efforts in technology and environmental efficiency are seen by the company as essential business practices and keys to the company's success. The white paper examines the development of a new strategic data center, set to come online in 2013, and provides insight into how other organizations can follow NAB's lead to decrease carbon footprints.
These new publications join more than 14 previously published customer requirements for the cloud and come as the Alliance prepares for its first Solutions Summit, an event focused on accelerating proofs of concept (POCs) and cloud deployments through facilitated networking between enterprises and solution providers. For more information about the Summit or Alliance membership, or to access the usage model requirement documents published by the ODCA and its members, visit www.opendatacenteralliance.org.
About The Open Data Center Alliance
The Open Data Center Alliance is an independent IT consortium made up of global IT leaders who have come together to provide a unified customer vision for long-term data center requirements. The Alliance is led by a twelve-member steering committee that includes IT leaders BMW, Capgemini, China Unicom, Deutsche Bank, JPMorgan Chase, Lockheed Martin, Marriott International, Inc., National Australia Bank, T-Systems, Terremark, Disney Technology Solutions and Services, and UBS. Intel serves as technical advisor to the Alliance.
In support of its mission, the Alliance has delivered the first customer requirements for cloud computing, documented in eight Open Data Center Usage Models that identify member-prioritized requirements to resolve the most pressing challenges facing cloud adoption. Find out more at www.opendatacenteralliance.org.