November 29, 2012
LAS VEGAS, Nov. 29 – Kognitio, driving the convergence of Big Data, in-memory analytics and cloud computing, today improved its support for the Amazon Elastic Compute Cloud (EC2) cluster compute environment. The company is making multi-node instances generally available for rapid deployment, enabling companies to run intensive analytics on large volumes of data, using the newly enhanced AWS cloud computing platform, more cost-effectively than ever before.
The announcement was made today at the Amazon re:Invent global customer and partner conference, being held this week in Las Vegas, where attendees will be able to learn how to leverage Amazon Web Services (AWS) and its features for a variety of popular use cases, such as Big Data analytics. Kognitio is a silver sponsor of the event.
Kognitio Cloud has been specifically developed to take advantage of the Amazon environment, and is not a "cloud-washed" version of the Kognitio Analytical Platform. Company officials said that the Kognitio software has been running across multiple nodes in production on AWS EC2 "cluster compute 2" (CC2) environments, with several customers and partners already taking advantage of this capability today.
Companies with AWS accounts can implement multi-node instances of Kognitio Cloud with a simple web-based utility that provisions EC2 servers immediately. Signing up for this demonstration service at kognitio.kognitiocloud.com creates a CloudFormation template that automatically builds a multi-node environment, which appears as a single appliance to the end user.
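The mechanics are worth sketching. A CloudFormation template of this kind declares the cluster resources so AWS can build them in one operation. The Kognitio-generated template itself is not public; the fragment below is a minimal illustrative sketch in which every resource name, the AMI ID, and the node count are hypothetical, chosen only to show how a fixed-size multi-node group on CC2 instances might be declared.

```json
{
  "AWSTemplateFormatVersion": "2010-09-09",
  "Description": "Hypothetical sketch of a multi-node cluster template (not the actual Kognitio template)",
  "Resources": {
    "ClusterPlacementGroup": {
      "Type": "AWS::EC2::PlacementGroup",
      "Properties": { "Strategy": "cluster" }
    },
    "NodeLaunchConfig": {
      "Type": "AWS::AutoScaling::LaunchConfiguration",
      "Properties": {
        "ImageId": "ami-00000000",
        "InstanceType": "cc2.8xlarge"
      }
    },
    "ClusterNodes": {
      "Type": "AWS::AutoScaling::AutoScalingGroup",
      "Properties": {
        "AvailabilityZones": { "Fn::GetAZs": "" },
        "LaunchConfigurationName": { "Ref": "NodeLaunchConfig" },
        "PlacementGroup": { "Ref": "ClusterPlacementGroup" },
        "MinSize": "4",
        "MaxSize": "4"
      }
    }
  }
}
```

Launching a stack from such a template brings up all the nodes together; the placement group keeps them on a low-latency interconnect, which is what lets a cluster of separate EC2 servers present itself to the end user as a single appliance.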
"By combining the power of in-memory analytics with the ease of cloud-based implementation, Kognitio is again taking the lead in delivering powerful performance to the widest range of companies," said John Coppins, senior vice president, Kognitio Cloud. "Increasingly, there is a consensus that all data is Big Data. The ability to rapidly develop environments where huge amounts of information can be analyzed within seconds, at minimal cost, is emerging as a primary key to business success. With our focus on the cloud, Kognitio is well-positioned to deliver that vital capability."
Today's announcement again confirms Kognitio's consistently innovative leadership in the Big Data marketplace. The company recently introduced free, full-featured perpetual-use licenses for up to 128GB of data, in the cloud or on-premises, allowing organizations of any size to gain insight from Big Data and supporting the "information anywhere" approach that more companies are demanding.
"Kognitio and Amazon Web Services are both key partners for VivaKi because they are industry leaders committed to transforming Big Data into actionable insights," said Pradeep Ananthapadmanabhan, Chief Technology Officer of the VivaKi Nerve Center. "We are excited to see these two key partners come together to deliver an even more compelling data offering. Kognitio Cloud multi-node provisioning will allow us to deploy and scale consumer intelligence solutions quickly and more efficiently."
Kognitio executives added that they are eagerly anticipating the launch of Amazon's "Cluster Compute 3" (CC3) environment, expected to offer enhanced RAM and compute capabilities. This will improve the economics for users, who will be able to scale linearly to a greater number of servers for more analytical power, or use fewer servers for the same capability.
Kognitio is driving the convergence of Big Data, in-memory analytics and cloud computing. The company delivered the first in-memory analytical platform in 1989; the software was designed from the ground up to provide highly scalable compute power, allowing rapid execution of complex analytical queries without the administrative overhead of manipulating data. Kognitio software runs on industry-standard x86 servers, as an appliance, or in Kognitio Cloud, a ready-to-use analytical platform. Kognitio Cloud is a secure, private or public cloud Platform-as-a-Service (PaaS), leveraging the cloud computing model to make the Kognitio Analytical Platform available on a subscription basis. Clients span industries including market research, consumer packaged goods, retail, telecommunications, financial services, insurance, gaming, media and utilities.
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
Financial institutions are the private-sector industry least likely to adopt public cloud services for data storage. Holding the most sensitive and heavily regulated data type of all, personal financial information, banks and similar institutions are mostly moving toward private cloud services – and doing so at great cost.
In this week's hand-picked assortment, researchers explore the path to more energy-efficient cloud datacenters, investigate new frameworks and runtime environments that are compatible with Windows Azure, and design a unified programming model for diverse data-intensive cloud computing paradigms.
May 16, 2013 |
When it comes to cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of running CFD modeling in the cloud, benchmarking a common CFD code on Amazon EC2 HPC instance types with both CPU and GPU cores.
May 10, 2013 |
Australian visual effects company Animal Logic is considering a move to the public cloud.
May 10, 2013 |
Program provides cash awards of up to $10,000 for the best open-source end-user applications deployed on a 100G network.
May 08, 2013 |
For engineers looking to leverage high-performance computing, the accessibility of a cloud-based approach is a powerful draw, but there are costs that may not be readily apparent.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore the potential of heterogeneous computing, but the potential for this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.