January 30, 2013
LONDON, Jan. 30 – Kognitio, driving the convergence of Big Data, in-memory analytics and cloud computing, today announced its support for Amazon Web Services' new High Memory Cluster instance, with pricing beginning at an unprecedented one cent per gigabyte per hour.
In doing so, Kognitio Cloud becomes the first big data analytics platform specifically designed to take full advantage of the new Amazon Web Services high-memory cluster instances on the Amazon Elastic Compute Cloud (EC2) that were introduced earlier this month.
Continuing the tradition of improved price-performance and compute capacity, high-memory cluster instances advance the economics of cloud computing, providing even greater analytical capability at a smaller investment.
With its High Memory Cluster instances, Amazon has recognized the growth and importance of in-memory computing. Kognitio clients can benefit from that change immediately, pinning four times as much data into memory at a significantly lower cost. For example, companies can now pin a terabyte of data into memory and make it available for in-depth, active cloud-based analytics for less than $100 of software licensing costs per business day.
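The per-day figure follows directly from the advertised rate. A back-of-the-envelope sketch (the 8-hour business day and 1,024 GB working set are assumptions for illustration, not figures from the announcement):

```python
# Rough licensing cost of pinning data in memory at the announced rate.
RATE_PER_GB_HOUR = 0.01   # USD per GB per hour, from the announcement
DATA_GB = 1024            # one terabyte of pinned data
BUSINESS_DAY_HOURS = 8    # assumed length of a business day

daily_cost = RATE_PER_GB_HOUR * DATA_GB * BUSINESS_DAY_HOURS
print(f"${daily_cost:.2f} per business day")  # prints "$81.92 per business day"
```

At these assumed figures the daily cost stays under the $100 threshold quoted above; a longer running day or larger working set scales the cost linearly.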
Kognitio made the announcement today at Cloud Expo Europe, where it is a sponsor. Polly Gowers, CEO of Everyclick, has spoken at the conference about how her company uses Kognitio Cloud for analytics to support Everyclick's efforts on behalf of UK-based charities.
"We have been eagerly awaiting these new high-memory instances from AWS," said John Coppins, senior vice president of Kognitio Cloud. "Our clients' real-time applications will benefit from the ample CPU and RAM available—it is exactly what is needed to support the business analytics we provide for media, healthcare, retail and social media analysis."
Kognitio Cloud was developed to take advantage of the Amazon environment; company officials said the Kognitio software has been running across multiple nodes in production on AWS, with several customers and partners already taking advantage of this capability today. Companies with AWS accounts can deploy multi-node instances of Kognitio Cloud through a simple web-based utility that provisions AWS servers immediately.
Today's announcement again confirms Kognitio's consistent innovative leadership in the Big Data analytics marketplace. Last year, it began offering its first-of-its-kind in-memory analytical platform software free of charge. Full-featured, perpetual-use licenses for up to 128GB of RAM can be used in the cloud or on-premises, allowing organizations of any size to gain insight from Big Data and supporting the "information anywhere" approach that more companies are demanding.
Kognitio is driving the convergence of Big Data, in-memory analytics and cloud computing. It delivered the first in-memory analytical platform in 1989, designed from the ground up to provide highly scalable compute power for rapid execution of complex analytical queries without the administrative overhead of manipulating data. Kognitio software runs on industry-standard x86 servers, as an appliance, or in Kognitio Cloud, a ready-to-use analytical platform. Kognitio Cloud is a secure, private or public cloud Platform-as-a-Service (PaaS), leveraging the cloud computing model to make the Kognitio Analytical Platform available on a subscription basis. Clients span industries including market research, consumer packaged goods, retail, telecommunications, financial services, insurance, gaming, media and utilities.
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. Thus, we present a novel federation model that enables end-users to aggregate heterogeneous resources for large-scale problems. The feasibility of this federation model has been demonstrated, in the context of the UberCloud HPC Experiment, by gathering the most comprehensive information to date on the effects of pillars on microfluid channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to manage computational loads at peak times that cannot be handled by their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab's Sergey Panitkin discussed the role of the Google Compute Engine in providing computational support to ATLAS, a detector of high-energy particles at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
May 23, 2013 |
The study of climate change is one of those scientific problems where it is almost essential to model the entire Earth to attain accurate results and make worthwhile predictions. In an attempt to make climate science more accessible to smaller research facilities, NASA introduced what it calls "Climate in a Box," a system it describes as a desktop supercomputer.
May 16, 2013 |
When it comes to cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of doing CFD modeling in the cloud by running a common CFD code on Amazon EC2 HPC instance types with both CPU and GPU cores.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.