March 04, 2013
LAS VEGAS, March 4 — IBM today announced that its cloud services and software will be based on an open cloud architecture. This move will ensure innovation in cloud computing is not hampered by locking businesses into proprietary islands of unsecured and difficult-to-manage offerings. Without industry-wide open standards for cloud computing, businesses will not be able to fully take advantage of the opportunities associated with interconnected data, such as mobile computing and big data.
As the first step, the company today unveiled a new cloud offering based on open cloud standards, including OpenStack, that significantly speeds and simplifies managing an enterprise-grade cloud. For the first time, businesses have a core set of open source-based technologies to build enterprise-class cloud services that can be ported across hybrid cloud environments.
"History has shown that standards and open source are hugely beneficial to end customers and are a major catalyst for innovation," said Robert LeBlanc, IBM senior vice president of software. "Just as standards and open source revolutionized the Web and Linux, they will also have a tremendous impact on cloud computing. IBM has been at the forefront of championing standards and open source for years, and we are doing it again for cloud computing. The winners here will be customers, who will not find themselves locked into any one vendor but will be free to choose the best platform based on the best set of capabilities that meet their needs."
Based on customer-driven requirements, the new software, called IBM SmartCloud Orchestrator, gives clients greater flexibility by removing the need to develop specific interfaces for different cloud services. With the new software, companies can quickly combine and deploy various cloud services onto their cloud infrastructure, lining up compute, storage and network resources through an easy-to-use graphical interface.
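The kind of resource "lining up" the release describes can be pictured with a short sketch: assembling the request an OpenStack-style orchestrator would send to a compute service, pairing a machine image and flavor with a network before deployment. This is a minimal illustration based on the shape of the OpenStack Nova v2 server-create API, not IBM's actual SmartCloud Orchestrator interface; the names and IDs are hypothetical.

```python
import json

def build_server_request(name, image_ref, flavor_ref, network_id):
    """Assemble the JSON body an OpenStack-style orchestrator would POST
    to a compute endpoint (e.g. /v2/{tenant_id}/servers) to provision a
    server on a chosen network. Payload shape follows the Nova v2 API."""
    return {
        "server": {
            "name": name,
            "imageRef": image_ref,      # which machine image to boot
            "flavorRef": flavor_ref,    # which compute size (CPU/RAM) to use
            "networks": [{"uuid": network_id}],  # which network to attach
        }
    }

# Hypothetical values for illustration only.
payload = build_server_request(
    name="web-tier-01",
    image_ref="ubuntu-12.04-image-id",
    flavor_ref="m1.small",
    network_id="11111111-2222-3333-4444-555555555555",
)
print(json.dumps(payload, indent=2))
```

Because the request is plain JSON against a standard API, the same payload works against any OpenStack-compatible cloud, which is the portability argument the release is making.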
The development of open industry standards has proven a critical turning point in the success of many technologies, such as the Internet and operating systems. For cloud computing to grow and mature as its predecessors did, vendors must stop creating new cloud services that are incompatible with one another. A recent report by Booz & Company warned that without a more concerted effort to agree on such standards, and leadership on the part of major companies, the promise of cloud computing may never be realized.
IBM is applying the experience it gained supporting and validating open standards in Linux, Eclipse and Apache to cloud computing. Working with the IT community, IBM is helping to drive the open cloud forward.
IBM is one of the world's largest private cloud vendors, with more than 5,000 private cloud customers in 2012, an increase of 100 percent year-over-year. IBM's cloud portfolio, called SmartCloud, is built on a common code base for interoperability, allowing clients to move between IBM's private, hybrid and public cloud services.
IBM SmartCloud Orchestrator is now available through a beta program and is expected to be generally available later this year.