April 30, 2012
New members join Open Data Center Alliance to help enterprise IT use cutting-edge database solutions
PORTLAND, Ore., April 30 — Recognizing the challenge IT faces in managing a forecasted 6.7-fold growth of enterprise data over the next five years, the Open Data Center Alliance (ODCA) today announced the formation of a Data Services Workgroup chartered to document the most urgent requirements IT faces in turning data management into competitive opportunity. The workgroup will focus initially on big data, documenting usage model requirements that increase enterprises' ability to securely collect, manage and analyze this data. Its primary objectives are to accelerate enterprise IT's ability to use cutting-edge database solutions for near-real-time insight into business decisions, and to drive interoperability between big data frameworks and traditional business intelligence (BI) solutions.
As part of the new charter, the Alliance will collaborate with big data industry leaders Cloudera, Hortonworks and MapR Technologies, all of which announced membership in the organization today, as well as leading data management solution provider members, including Teradata and SAS, to match industry delivery of open solutions with prioritized IT requirements.
The Data Services Workgroup has several objectives. It will create usage models, targeted for release in the second half of 2012, that define requirements addressing the security, manageability and interoperability of emerging big data frameworks with traditional data management and data warehouse solutions. These usage models will be shaped in part by implementation considerations within enterprise IT environments and by the skill sets required to transform business intelligence solutions. Based on the usage models, workgroup members will develop reference architectures and proofs of concept (POCs) for commercial distribution with independent software vendors (ISVs) and OEM partners to test deployments and establish solutions for the enterprise market. The Alliance will also collaborate with the open source community to drive benchmarking suites.
"With the huge opportunity of big data extending across geographies and industries, the ODCA's work will help drive interoperable standards that enhance big data frameworks to be cloud ready – from manageability to security. This will benefit ODCA's members by accelerating solution delivery," said Denis Curran, head of strategy and innovation at National Australia Bank (NAB), and a member of the ODCA's Steering Committee. "Collaboration with such an impressive group of solutions providers will ensure we help enterprise IT adopt cloud technologies that harness big data to make good business decisions and deliver great customer outcomes."
"Through our discussions with customers, we have identified a common need for more industry best practices on when and how to implement new data solutions and what critical skill sets will be needed," said Marvin Wheeler, chairman of the ODCA. "By working with end users directly in the ODCA we can share our experiences and start documenting tools to enable more companies to benefit from business intelligence transformation within their enterprise."
Big data will also be among the topics addressed at the Open Data Center Alliance Forecast 2012 event in New York City on June 12. At the event, members and non-members will gather to discuss and debate the next steps in developing customer-based open requirements for cloud computing. All attendees will also receive a complimentary pass to International Cloud Expo as part of their ODCA Forecast 2012 registration. Early-bird registration discounts for ODCA Forecast 2012 have been extended to April 30 to accommodate late-breaking schedule updates.
For more information on the Alliance and the new workgroup, or to register for ODCA Forecast 2012, visit www.opendatacenteralliance.org.
"We are excited to be part of the Open Data Center Alliance because we know it's critical to work with end users to help them benefit from business intelligence transformation," said Alan Saldich, vice president of marketing at Cloudera. "Knowing industry best practices, when and how to implement big data solutions, and what skill sets will be needed is critical."
"With the massive growth in data in the next five years, the opportunity to harness new data sources for business advantage has never been greater," said Mitch Ferguson, vice president of business development at Hortonworks. "Hortonworks is joining the Open Data Center Alliance to work with its new data services workgroup to ensure open, industry standard solutions can meet the most pressing requirements of enterprise customers in their adoption of new data management solutions like Apache Hadoop. We look forward to working with the ODCA to create implementation best practices and set requirements to speed these solutions to the marketplace."
"Enterprises are undergoing tremendous challenges because of the explosion in data," said Jack Norris, vice president of marketing, MapR Technologies. "They are looking at the use of Big Data services as the best way to tap into their unstructured data sources in order to fuel insight. The Open Data Center Alliance's new charter towards requirements for enterprise adoption of Big Data services will help accelerate adoption by IT organizations. We're excited to join ODCA as a solutions provider member and work with organizations on delivery of solutions based on member priorities."
"Data analytics is in our foundational DNA here at SAS. The opportunity to collaborate with a large group of enterprise analytics customers on data services innovation drove us to join the ODCA," said Paul Kent, SAS vice president of big data. "We look forward to working with other members in the Data Services Workgroup toward delivery of common data frameworks and industry-standard testing methodologies to accelerate market adoption."
"The explosion of new types of data, and the corresponding big data analytic applications, has created business opportunities. Businesses that know how to leverage big data technology will be able to improve operational efficiency and drive innovation," said Scott Gnau, president, Teradata Labs. "The ODCA's efforts to identify best practices for big data technology deployment will go a long way toward helping enterprises cut through the big data hype and find the real business value to support their initiatives."
About The Open Data Center Alliance
The Open Data Center Alliance is an independent IT consortium composed of global IT leaders who have come together to provide a unified customer vision for long-term data center requirements. The Alliance is led by a twelve-member steering committee that includes IT leaders BMW, Capgemini, China Life, China Unicom, Deutsche Bank, JPMorgan Chase, Lockheed Martin, Marriott International, Inc., National Australia Bank, Terremark, Disney Technology Solutions and Services, and UBS. Intel serves as technical advisor to the Alliance.
In support of its mission, the Alliance has delivered the first customer requirements for cloud computing, documented in eight Open Data Center Usage Models that identify member-prioritized requirements to resolve the most pressing challenges facing cloud adoption. Find out more at www.opendatacenteralliance.org.
Source: Open Data Center Alliance