November 24, 2008
ARMONK, N.Y., Nov. 24 -- Today, IBM
The old one-server-per-application model has created a dire situation in the datacenters of large enterprises. Their infrastructures are becoming too complex and expensive to maintain, and smaller organizations want new ways to grow and expand their businesses without falling victim to these same issues.
Cloud computing, or network-delivered services and software, can save customers up to 80 percent on floor space and 60 percent on power and cooling costs, and deliver triple asset utilization(1). While the economics are compelling to businesses of all sizes, concerns over security, data portability and reliability are causing reluctance among enterprise customers.
New Services for Cloud Computing
IBM's new business consulting services use economic modeling to assess the total cost of ownership for building and integrating clouds. Initial research indicates that organizations will employ both public and private clouds to achieve business goals, and IBM can help companies find the most effective balance, and manage it all as one integrated strategy.
In addition, cloud technology consulting services will help clients create roadmaps for restructuring their IT environments so they can take advantage of cloud computing models to improve operational efficiency, overall carbon posture and return on investment. With new cloud implementation services, IBM will apply expert-level skills, methods, guidance and project management techniques to help clients plan, configure and test the servers, storage and technologies necessary to support a dynamic technology environment.
"Cloud strategies need to be in line with business strategies," said Willy Chiu, vice president of High Performance on Demand Solutions at IBM. "Over the last year in our 13 cloud computing centers worldwide, we've worked with clients to understand how to help them take advantage of both public and private clouds to get the best economics."
New Clients Move into the Cloud with IBM
In addition to new services, IBM is helping new clients move into the cloud. One of Houston's largest and fastest-growing human services agencies, Neighborhood Centers serves over 200,000 citizens in Southwest Texas and delivers key services including economic development services, citizenship and immigration services, early childhood development programs, a K-5 charter school and seniors programs. The non-profit organization depends on IBM cloud services to back up server and PC data from distributed environments and store it in secure off-site locations.
"Neighborhood Centers is dedicated to helping citizens cope with disruption and plan for contingencies in life -- as second responders in emergencies we simply cannot afford to be shut down, or slowed down, by a data loss," said Tom Comella, chief information officer of Neighborhood Centers Inc. "IBM cloud services were critical in our community recovery efforts following Hurricane Ike. Since we experienced no business interruptions in any of our 20 facilities, we were able to focus on bringing the community, our services and our citizens back online. But the benefits of cloud services reach far beyond disaster recovery. Better data protection -- demonstrating that we are good stewards of information -- has become a selling point for us in winning contracts."
IBM Research is working directly with clients to create replicable, cloud-delivered, industry-specific services like Lender Business Process Services or Healthcare Process Services, as well as horizontal business services like CRM and supply chain management.
In China, for example, IBM Research is piloting a newly developed cloud computing platform, codenamed Project Yun (Chinese for "cloud"), for companies to access business services, designed to make the selection and implementation of new cloud services as easy as selecting an item from a drop-down menu. With no need for back-end provisioning, the IBM platform stands to cut the time required to deliver new services dramatically. The Yun platform allocates storage, server and network resources for the customer application with zero human input, achieving top performance, availability and power utilization.
One of China's largest retailers, with more than 10 million customers per day, Wang Fu Jing Department Store has deployed several key cloud services from Project Yun, including a supply chain management solution that lets its vast network of retail stores easily share supply chain information and visualize the execution of B2B business processes with thousands of SMB suppliers via the cloud.
Securing Enterprise Cloud Computing
To ensure the widespread adoption of cloud computing services, IBM has initiated a company-wide project to form a unified and comprehensive security architecture for cloud computing environments. The effort, which spans Systems, Software, Services and IBM's lauded Research and X-Force arms, is aimed at re-architecting and redesigning technologies and processes to build in security and shield against threats and vulnerabilities. Security is built into the cloud, not added as an afterthought.
The project incorporates next-generation security and cloud service management technologies, as well as simplified security management and enforcement, offering enterprise customers security and compliance guarantees equivalent to or better than those of traditional computing environments.
Built upon IBM's extensive industry security leadership, the project focuses on developing trusted virtual domains, authentication, isolation management, policy and integrity management, and access control technologies designed specifically for cloud computing.
For more information on IBM's cloud computing initiatives, visit www.ibm.com/cloud.
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. We therefore present a novel federation model that enables end-users to aggregate heterogeneous resources and scale their problems across them. The feasibility of this federation model has been demonstrated in the context of the UberCloud HPC Experiment, by gathering the most comprehensive information to date on the effects of pillars on microfluidic channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate work and to absorb computational load at peak times that cannot be handled by their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab's Sergey Panitkin discussed the role of the Google Compute Engine in providing computational support to ATLAS, a high-energy particle detector at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
May 23, 2013
The study of climate change is one of those scientific problems where it is almost essential to model the entire Earth to attain accurate results and make worthwhile predictions. In an attempt to make climate science more accessible to smaller research facilities, NASA introduced what they call ‘Climate in a Box,’ a system they note acts as a desktop supercomputer.
May 16, 2013
When it comes to cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of running CFD models in the cloud, benchmarking a common CFD code on Amazon EC2 HPC instance types with both CPU and GPU cores.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore the potential of heterogeneous computing, but the potential for this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.