May 23, 2012
Latest advances expand use cases, ease integration and speed development
SAN JOSE, Calif., May 23 — MapR Technologies, the provider of the industry's most advanced distribution for Apache Hadoop, today announced a range of initiatives that further establish MapR as the most open distribution for Hadoop. This announcement follows MapR's inclusion in the Open Data Center Alliance (ODCA) as a solution provider member of the Data Services Workgroup, which defines enterprise requirements for interoperability, security and manageability of Big Data frameworks.
Also announced today was the release of a fully compliant ODBC 3.52 driver as part of the MapR Distribution. The driver allows users to leverage hundreds of commercial and open source SQL-based tools, such as query builders and BI applications including Excel, Tableau and MicroStrategy, as well as a variety of 100% open source SQL tools such as Kaimon.
Open Data Access
An important dimension for any Big Data framework is the degree of openness with respect to data access. The standard Open Database Connectivity (ODBC) access announced today allows SQL-based applications to run SQL queries (via Hive) and provides open database access through a standard API. This complements the standard file access MapR already provides through its complete NFS protocol support, which allows any file-based application to read and write data directly on a MapR cluster.
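The value of ODBC access is that SQL tools talk to Hive through the same connect/cursor/execute pattern they use for any database. The sketch below illustrates that pattern; sqlite3 stands in for the ODBC driver so the example runs anywhere, and the table name and query are hypothetical — with the MapR driver installed, the same code would instead open an ODBC connection (e.g. via a client library such as pyodbc with a configured DSN).

```python
# Standard SQL access pattern: connect, get a cursor, execute a query.
# sqlite3 is a stand-in here; an ODBC connection to Hive works the same way.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE weblogs (page TEXT, hits INTEGER)")
cur.executemany(
    "INSERT INTO weblogs VALUES (?, ?)",
    [("/index.html", 40), ("/index.html", 2), ("/about.html", 7)],
)
# A BI tool or query builder would issue SQL like this over ODBC:
rows = cur.execute(
    "SELECT page, SUM(hits) FROM weblogs GROUP BY page ORDER BY page"
).fetchall()
print(rows)  # → [('/about.html', 7), ('/index.html', 42)]
```

Because the interface is a standard, swapping the backing database for Hive changes only the connection step, not the query code.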
"Unlike other Hadoop distributions, there is no need to deploy agents or specialized connectors to get data into and out of a Hadoop cluster," said Tomer Shiran, director of product management at MapR Technologies. "MapR opens up Hadoop by allowing users to leverage a broad array of standard commands, tools and applications."
MapR also announced today full support for Linux Pluggable Authentication Modules (PAM), enabling MapR to authenticate users with common authentication back-ends, including Active Directory, LDAP, NIS, Kerberos and a variety of third-party services. These capabilities augment MapR's REST APIs, which enable functions and features of the MapR Control System to be integrated into third-party system management tools and dashboards. MapR also ships components for Nagios and Ganglia to further ease monitoring integration.
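PAM support means administrators plug in a back-end by editing a standard PAM stack rather than configuring anything Hadoop-specific. A minimal sketch of what such a stack might look like for LDAP-backed authentication (the service name is hypothetical and the module choices illustrative; real deployments vary):

```
# /etc/pam.d/mapr-admin  -- hypothetical service name, illustrative stack
auth     required   pam_ldap.so          # verify credentials against LDAP
account  required   pam_ldap.so          # check account validity/expiry
session  optional   pam_mkhomedir.so     # create a home directory on first login
```

Swapping LDAP for Kerberos or NIS is a matter of substituting the corresponding PAM module (e.g. pam_krb5.so), which is exactly the flexibility the announcement describes.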
Open Source Components
Finally, MapR announced the immediate availability of source code for the various components of the MapR Distribution. MapR includes the broadest set of open source components as part of its distribution, including Cascading, Flume, HBase, Hive, Pig, Mahout, Oozie, Sqoop, Whirr and ZooKeeper. All of the source code, patches and updates for these modules are available, and users can make changes and enhancements while still enjoying the reliability, availability and performance of the underlying MapR storage and compute framework.
About MapR Technologies
MapR delivers on the promise of Hadoop, making managing and analyzing Big Data a reality for more business users. The award-winning MapR Distribution brings unprecedented dependability, speed and ease-of-use to Hadoop. Combined with data protection and business continuity, MapR enables customers to harness the power of Big Data analytics. The company is headquartered in San Jose, Calif. Investors include Lightspeed Venture Partners, NEA and Redpoint Ventures. To download the latest MapR Distribution for Apache Hadoop, please visit http://www.mapr.com/products/download. Connect with MapR at http://www.facebook.com/maprtech, http://www.linkedin.com/company/mapr-technologies and http://twitter.com/mapr.
Source: MapR Technologies