October 04, 2012
NEW YORK, Oct. 4 — Kognitio, driving the convergence of in-memory analytics, Big Data and cloud computing, today announced its continued momentum in the second quarter of its fiscal year, reporting better-than-projected revenue and customer growth. These results are consistent with the company's streamlined positioning in the Big Data and business analytics sectors.
For the second quarter of the fiscal year, Kognitio, a privately held firm, reported a 27 percent year-over-year increase in revenue and a 42 percent quarter-over-quarter increase. The company also exceeded its new revenue and order targets, delivering its highest number of new orders in years. Kognitio additionally reported the strongest sales pipeline in its history, with a greater focus on cloud-based subscriptions and software licensing, which provide significant margins and are key to building a long-term recurring revenue stream.
Company officials said Kognitio achieved its significant growth through a strategy of targeting emerging firms for which data analysis is central to operations. The Kognitio product offerings, including the Kognitio Analytical Platform and the industry's first SaaS-based business intelligence offering, Kognitio Cloud, deliver the robust performance those companies need at a fraction of the cost of competing solutions.
Kognitio continued to receive strong praise from clients during the quarter.
In its second quarter of the fiscal year, Kognitio also worked to expand its channel partner program, enabling integrators who serve small and medium-sized enterprises to recommend and deliver business intelligence implementations. The company said it will continue to grow that program in the coming quarters.
"Our second quarter was the most impressive quarter we have ever turned in as a company, and I believe it's indicative of things to come," said Steve Millard, Kognitio president and chief executive officer. "The market is more competitive than ever, but firms that are seeking a powerful and cost-effective analytical solution are beginning to seek us out. We have revitalized Kognitio so that we can quickly respond to emerging opportunities. We are in the right markets, selling the right products and consistently growing our revenues."
Kognitio is driving the convergence of in-memory analytics, Big Data and cloud computing. Kognitio delivered the first in-memory analytical platform in 1989; the software was designed from the ground up to provide the highest degree of scalable compute power, allowing rapid execution of complex analytical queries without the administrative overhead of manipulating data. Kognitio software runs on industry-standard x86 servers, as an appliance, or in the Kognitio Cloud, a ready-to-use analytical platform. The Kognitio Cloud is a secure, private or public cloud Platform-as-a-Service (PaaS) that leverages the cloud computing model to make the Kognitio Analytical Platform available on a subscription basis. Clients span industries including market research, consumer packaged goods, retail, telecommunications, financial services, insurance, gaming, media and utilities.
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. To address them, we present a novel federation model that provides end-users with the ability to aggregate heterogeneous resources for large-scale problems. The feasibility of this federation model has been demonstrated, in the context of the UberCloud HPC Experiment, by gathering the most comprehensive information to date on the effects of pillars on microfluidic channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to handle computational demands at peak times that cannot be met by their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab's Sergey Panitkin discussed the role of the Google Compute Engine in providing computational support to ATLAS, a detector of high-energy particles at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
May 23, 2013 |
The study of climate change is one of those scientific problems where it is almost essential to model the entire Earth to attain accurate results and make worthwhile predictions. In an attempt to make climate science more accessible to smaller research facilities, NASA introduced what it calls 'Climate in a Box,' a system it describes as a desktop supercomputer.
May 16, 2013 |
When it comes to cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of doing CFD modeling in the cloud by running a common CFD code on Amazon EC2 HPC instance types featuring both CPU and GPU cores.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, using technologies that prioritize affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.