October 04, 2012
NEW YORK, Oct. 4 — Kognitio, driving the convergence of in-memory analytics, Big Data and cloud computing, today announced its continued development in the second quarter of its fiscal year, reporting better-than-projected revenue and customer growth. These numbers are consistent with the company's streamlined positioning in the Big Data and business analytics sectors.
For the second quarter of the fiscal year, Kognitio, a privately held firm, reported a 27 percent increase in year-over-year revenue and a 42 percent quarter-over-quarter increase. The company also exceeded its new revenue and order targets, delivering the highest number of new orders in years. Kognitio also reported the strongest sales pipeline in its history, with a greater focus on cloud-based subscriptions and software licensing, which provide significant margins and are key to building a long-term recurring revenue stream.
Company officials said Kognitio achieved its significant growth through a strategy of targeting emerging firms for which data analysis is central to operations. The Kognitio product offerings, including the Kognitio Analytical Platform and the industry's first SaaS-based business intelligence offering, Kognitio Cloud, deliver the robust performance those companies need at a fraction of the cost of competing solutions.
Kognitio continued to receive strong praise from clients.
In its second quarter of the fiscal year, Kognitio also worked to expand its channel partner program, enabling integrators who work with small and medium-sized enterprises to help recommend business intelligence implementations. The company said it will continue to work to grow that program in the coming quarters.
"Our second quarter was the most impressive quarter we have ever turned in as a company, and I believe it's indicative of things to come," said Steve Millard, Kognitio president and chief executive officer. "The market is more competitive than ever, but firms that are seeking a powerful and cost-effective analytical solution are beginning to seek us out. We have revitalized Kognitio so that we can quickly respond to emerging opportunities. We are in the right markets, selling the right products and consistently growing our revenues."
Kognitio is driving the convergence of in-memory analytics, Big Data and cloud computing. The company delivered the first in-memory analytical platform in 1989; it was designed from the ground up to provide maximum scalable compute power for rapid execution of complex analytical queries without the administrative overhead of manipulating data. Kognitio software runs on industry-standard x86 servers, as an appliance, or in the Kognitio Cloud, a ready-to-use analytical platform delivered as a secure, private or public cloud Platform-as-a-Service (PaaS) that makes the Kognitio Analytical Platform available on a subscription basis. Clients span industries, including market research, consumer packaged goods, retail, telecommunications, financial services, insurance, gaming, media and utilities.
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of using the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds for some of those obstacles.
The private industry least likely to adopt public cloud services for data storage is financial services. Because they hold one of the most sensitive and heavily regulated data types, personal financial information, banks and similar institutions are mostly moving toward private cloud services, and doing so at great cost.
In this week's hand-picked assortment, researchers explore the path to more energy-efficient cloud datacenters, investigate new frameworks and runtime environments that are compatible with Windows Azure, and design a unified programming model for diverse data-intensive cloud computing paradigms.
May 16, 2013
When it comes to cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of running CFD modeling in the cloud, benchmarking a common CFD code on Amazon EC2 HPC instance types with both CPU and GPU cores.
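A back-of-the-envelope model (all numbers hypothetical, not drawn from the Bonn study) illustrates why wide-area latency hurts tightly coupled workloads like CFD: each solver iteration interleaves local compute with synchronizing network round trips, so the round-trip time (RTT) multiplies directly into wall-clock time.

```python
# Hypothetical sketch: per-iteration cost of a tightly coupled solver
# that performs several synchronizing exchanges between compute steps.

def iteration_time(compute_s, round_trips, rtt_s):
    """Wall-clock seconds for one solver iteration: local compute
    plus the latency of each synchronizing network round trip."""
    return compute_s + round_trips * rtt_s

# Same work, two networks: a LAN-class link inside one datacenter
# versus a wide-area link to a distant cloud region.
lan = iteration_time(compute_s=0.050, round_trips=20, rtt_s=0.0001)  # 0.1 ms RTT
wan = iteration_time(compute_s=0.050, round_trips=20, rtt_s=0.050)   # 50 ms RTT

print(f"LAN: {lan:.3f} s/iter")  # compute-dominated
print(f"WAN: {wan:.3f} s/iter")  # latency-dominated, ~20x slower
```

With a 0.1 ms RTT the iteration stays compute-dominated; at 50 ms the same 20 exchanges cost a full second, which is the kind of distance penalty the article describes.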
May 10, 2013
Australian visual effects company, Animal Logic, is considering a move to the public cloud.
May 10, 2013
Program provides cash awards of up to $10,000 for the best open-source end-user applications deployed on a 100G network.
May 08, 2013
For engineers looking to leverage high-performance computing, the accessibility of a cloud-based approach is a powerful draw, but there are costs that may not be readily apparent.
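One example of a cost that is easy to overlook is data transfer: the compute bill is quoted up front, but egress charges for pulling large result sets back on-premise can rival it. A minimal sketch, with purely illustrative prices rather than any provider's actual rates:

```python
# Hypothetical cost sketch for a cloud HPC job: visible compute cost
# plus the less obvious data-egress cost. All rates are illustrative.

def job_cost(instance_hours, hourly_rate, egress_gb, egress_rate_per_gb):
    """Return (compute_cost, egress_cost, total_cost) in dollars."""
    compute = instance_hours * hourly_rate
    egress = egress_gb * egress_rate_per_gb
    return compute, egress, compute + egress

compute, egress, total = job_cost(
    instance_hours=64,        # e.g. 8 nodes for 8 hours
    hourly_rate=1.30,         # illustrative per-instance-hour price
    egress_gb=500,            # result files transferred back on-premise
    egress_rate_per_gb=0.12,  # illustrative egress price per GB
)
print(f"compute ${compute:.2f}, egress ${egress:.2f}, total ${total:.2f}")
```

Under these assumed rates the egress charge adds well over half again on top of the compute bill, which is exactly the kind of non-obvious cost the article warns about.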
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.