October 17, 2012
LAS VEGAS, Oct. 16 — SAP AG (NYSE: SAP) today unveiled innovations on the SAP HANA platform, including the introduction of the SAP HANA One platform, a deployment of SAP HANA certified for production use on the Amazon Web Services (AWS) Cloud and immediately available on AWS Marketplace. SAP also announced one of the world's largest in-memory database systems, with the ability to process 1 petabyte of raw uncompressed data. Additionally, the company has embedded application server capabilities in SAP HANA for developers and launched the "SAP HANA Academy" to enable self-learning at scale. These updates mark a major step for SAP to enable real-time business operations and attract more businesses, independent software vendors (ISVs) and startups to build solutions on SAP HANA. The announcement was made at SAP TechEd 2012, being held October 15-19 in Las Vegas.
SAP HANA One: Instant Provisioning of SAP HANA Platform on AWS
SAP is introducing SAP HANA One, a deployment option for the SAP HANA platform available for production use on the elastic AWS Cloud. SAP HANA One is provisioned by AWS on advanced hardware with up to 60 GB of RAM per instance. This enables companies of any size to deploy business-critical or consumer-facing applications based on SAP HANA and take advantage of the in-memory transactional and analytical data processing the platform supports. With this update, the company further reaffirms its commitment to supporting enterprise businesses, start-up companies and ISVs, enabling them to quickly and cost-effectively deliver a wide range of innovative applications based on SAP HANA One on AWS. Developers can go directly to AWS Marketplace — the online store where they can find, buy and immediately begin using software that runs on the AWS Cloud — to provision and instantly access SAP HANA. SAP HANA One on AWS pricing is US$0.99 per hour for the SAP software.
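To illustrate the pay-as-you-go model described above, a minimal sketch: only the US$0.99-per-hour software rate comes from the announcement; the usage pattern below is a hypothetical example, and AWS infrastructure charges are billed separately on top of the software fee.

```python
# Estimate the SAP software portion of an SAP HANA One bill on AWS.
# Only the $0.99/hour rate is from the announcement; the usage pattern
# (8 hours/day, 22 business days/month) is a hypothetical example.
SOFTWARE_RATE_USD_PER_HOUR = 0.99

def software_cost(hours):
    """Software cost in USD for a number of instance-hours
    (excludes separately billed AWS infrastructure fees)."""
    return round(hours * SOFTWARE_RATE_USD_PER_HOUR, 2)

# One instance running 8 hours a day, 22 business days a month:
monthly_hours = 8 * 22                # 176 instance-hours
print(software_cost(monthly_hours))   # → 174.24
```

The point of the model, as the release emphasizes, is that this is an operating expense with no upfront capital commitment: the cost scales linearly with hours actually used.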
The first SAP offering available via AWS Marketplace is the SAP Enterprise Performance Management OnDemand solution for expense insight. The solution gives department managers real-time insight into their daily, weekly, monthly or yearly expenses in a software-as-a-service (SaaS) model, starting at US$0.49 per hour for the SAP software. SAP plans to make additional applications available on AWS Marketplace.
"SAP has been an important AWS partner as we continue to enable enterprises to run their SAP software on AWS," said Andy Jassy, Senior Vice President, Amazon Web Services. "Today we're extending the relationship to provide customers quick access to powerful software for their real-time business needs. For the first time, customers of any size can deploy SAP HANA One on the cloud in minutes and pay just $0.99 per hour for the software. There are so many great ideas that have been stranded on enterprises' white boards because teams cannot get the requisite capital, people resources or services provisioned in a reasonable time frame. The combination of in-memory transactional and analytical data processing of SAP HANA One with AWS's immediately accessible, no capital expenditure, pay-as-you-go, reliable infrastructure changes the possibilities for so many companies."
Start-up software company Taulia will take advantage of SAP HANA One on AWS to enhance its dynamic discounting capabilities and to streamline its software deployment and operation efforts.
"The transformative capabilities of SAP HANA are enabling Taulia to revolutionize supply chain financing and dynamic discounting processes," said Bertram Meyer, CEO, Taulia. "Now, making SAP HANA One available for production use on AWS will enable us to deliver our solutions at scale to our customers and significantly ease our go-to-market ability."
Announcing One of the World's Largest In-Memory Clusters: SAP HANA Proves Scalability and Speed, Processing 1 Petabyte of Raw Data in Real Time
SAP HANA has shown extreme scalability with one of the world's largest in-memory clusters, capable of handling large data sets with fast real-time responses. It is a 100-node, 100 TB in-memory system that can hold 1 petabyte of raw uncompressed data, representing 1.2 trillion records. This size represents approximately 10 years of transactions for a large company with an average of 328 million transactions per day. Performance tests on this "big data" show a fast response time of 0.43 to 0.50 seconds for ad-hoc sales and distribution queries and 1.2 to 3.1 seconds for more complex year-over-year trending queries over different time periods.¹
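The headline figures above are internally consistent, as a quick back-of-the-envelope check shows; this is a sketch, and the implied compression ratio and per-record size are derived from the quoted numbers rather than independently measured.

```python
# Sanity-check the figures quoted for the 100-node SAP HANA cluster.
raw_data_bytes    = 1e15      # 1 PB of raw, uncompressed data
cluster_ram_bytes = 100e12    # 100 TB of DRAM across 100 nodes
records           = 1.2e12    # 1.2 trillion records
tx_per_day        = 328e6     # average transactions per day
years             = 10

# Implied compression: raw data size vs. in-memory footprint.
compression_ratio = raw_data_bytes / cluster_ram_bytes
print(compression_ratio)              # → 10.0 (about 10x compression)

# Implied average raw record size.
bytes_per_record = raw_data_bytes / records
print(round(bytes_per_record))        # → 833 bytes per record

# 328 million transactions/day over 10 years:
total_tx = tx_per_day * 365 * years
print(total_tx / 1e12)                # ≈ 1.2 trillion, matching the release
```

Holding a petabyte of raw data in 100 TB of DRAM implies roughly 10x compression, in line with what columnar in-memory storage typically achieves.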
Introducing New Extended Application Services in SAP HANA and SAP HANA® Studio Enhancements
Developments built on the extended application services in SAP HANA are intended to be exposed via newly enabled data access services, such as OData, Atom, JSON, XML or XMLA. These exposed services can then be consumed by any business-critical or consumer-facing application.
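To illustrate what consuming such a service might look like, a hypothetical sketch: the host, service path and entity set below are invented for illustration, and only the OData query conventions ($filter, $select, $format) are standard.

```python
# Sketch of composing an OData-style query URL against a hypothetical
# service exposed by SAP HANA extended application services. The host,
# path and entity set ("Expenses") are invented for illustration.
from urllib.parse import urlencode

def odata_query(base_url, entity_set, filter_expr=None, select=None, fmt="json"):
    """Build an OData query URL using the standard $filter/$select/$format options."""
    params = {}
    if filter_expr:
        params["$filter"] = filter_expr
    if select:
        params["$select"] = ",".join(select)
    params["$format"] = fmt
    return f"{base_url}/{entity_set}?{urlencode(params)}"

url = odata_query(
    "https://hana.example.com/sap/expenses.xsodata",  # hypothetical endpoint
    "Expenses",
    filter_expr="Amount gt 1000",
    select=["Department", "Amount"],
)
print(url)
```

Because the service speaks a standard protocol, any HTTP-capable client — a browser dashboard, a mobile app or another server — can issue such a query without SAP-specific client libraries.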
Free Sharing of In-Memory Expertise With SAP HANA Academy
SAP HANA Academy is a free online resource of high-quality technical instruction to empower anyone to learn how to operate the SAP HANA platform. It provides online instructional videos and live and recorded webinars, as well as virtual access to on-site training sessions as outreach channels. Technical content is contributed by SAP and the ecosystem community. Videos are divided into two sections, "Projects" and "How-To's" — projects are designed to help users learn SAP HANA by performing a task, while how-to videos teach a particular function or feature.
"SAP HANA, the industry-leading platform for real-time analytics and applications, is now available in 'real-time' on AWS," said Dr. Vishal Sikka, member of the SAP Executive Board, Technology & Innovation. "With the introduction of SAP HANA One on AWS, the application and database services in SAP HANA and SAP HANA Academy for knowledge sharing, we have dramatically simplified the application development experience by collapsing the artificial layers between transactions, analytics and application servers."
¹This test was conducted in Santa Clara on 100 IBM x5 server nodes, each with 1 TB of DRAM and 40 cores. In the petascale performance test of the SAP HANA in-memory cluster, 1 PB of raw data was loaded onto 95 active nodes; the remaining 5 served as standby nodes for failover.
SAP® TechEd 2012 in Las Vegas, Madrid, Bangalore, and Shanghai
SAP customers, partners, and technical experts are expected to convene at SAP® TechEd 2012, the company's premier technical conference. Hands-on workshops, demo-driven lectures, and Q&A sessions on the latest developments in analytics, mobile, cloud, database, and in-memory computing enable SAP TechEd attendees to enhance their skills while making valuable connections with peers and IT experts from the SAP community. SAP TechEd is being held in Las Vegas, Nevada, from October 15-19, and will be held in Madrid, Spain, from November 13-16; Bangalore, India, from November 28-30; and Shanghai, China, from December 4-5. Follow SAP TechEd on Twitter at @SAPTechEd and join the conversation at #SAPTechEd.
Note to Editors:
Webcasts, announcements, media roundtables, keynote presentations and blog posts from SAP TechEd will be available in the Events Newsroom at: www.events.news-sap.com. To preview and download broadcast-standard stock footage and press photos digitally, please visit www.sap.com/photos. On this platform, you can find high resolution material for your media channels. To view video stories on diverse topics, visit www.sap-tv.com. From this site, you can embed videos into your own Web pages, share video via email links, and subscribe to RSS feeds from SAP TV. Follow SAP on Twitter at @sapnews.
As market leader in enterprise application software, SAP helps companies of all sizes and industries run better. From back office to boardroom, warehouse to storefront, desktop to mobile device – SAP empowers people and organizations to work together more efficiently and use business insight more effectively to stay ahead of the competition. SAP applications and services enable more than 195,000 customers (includes customers from the acquisition of SuccessFactors) to operate profitably, adapt continuously, and grow sustainably. For more information, visit www.sap.com.
Source: SAP AG