February 25, 2008
LAS VEGAS, Feb. 18 -- Kognitio, a
leading global provider of business intelligence (BI) and data
warehousing solutions, today announced its entry into the North
American market, where it will make its WX2 analytical relational
database management system (RDBMS) available to companies seeking to
rapidly gain competitive insight from enterprise-wide data assets. The
announcement was made at the start of The Data Warehousing Institute's
World Conference, held this week in Las Vegas, where Kognitio is
an exhibiting vendor.
The company also said it has named industry veteran John K. Thompson
to head its North American operations. In a career spanning two dozen
years, Thompson has held senior sales, technology, marketing and
business development positions at companies such as Nielsen, WhiteCross
Systems, PLATINUM technology, IBM and Metaphor Computer Systems.
Kognitio WX2 is among the most powerful and scalable analytical database software products in the industry, enabling enterprises and organizations to query, in detail, vast amounts of granular data in seconds. WX2 breaks new ground in delivering high-speed access to large or complex data volumes.
More than 20 years of development enables Kognitio WX2 users to
gain greater benefits in less time; tests have consistently shown WX2
to run as much as 100 times faster than traditional databases.
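The kind of on-the-fly analytical aggregation over granular data described above can be sketched with ordinary SQL. The example below uses Python's built-in sqlite3 as a generic stand-in for a SQL RDBMS (it is not WX2's own client interface, and the table and column names are invented for illustration):

```python
# Sketch: an analytical roll-up computed directly over granular rows,
# with no pre-built summary tables. sqlite3 stands in for any SQL RDBMS.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE calls (region TEXT, duration_sec INTEGER)")
cur.executemany(
    "INSERT INTO calls VALUES (?, ?)",
    [("north", 120), ("north", 300), ("south", 60), ("south", 240)],
)
# Aggregate the detail rows on the fly, grouped by region.
cur.execute(
    "SELECT region, COUNT(*), AVG(duration_sec) "
    "FROM calls GROUP BY region ORDER BY region"
)
print(cur.fetchall())  # → [('north', 2, 210.0), ('south', 2, 150.0)]
```

An analytical database earns its keep by running queries of exactly this shape against billions of rows rather than four, which is where parallelism and column-oriented storage come in.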
"As the market moves rapidly toward transactional BI, Kognitio may
have an inside track toward meeting the analytical needs of firms and
assuming a leadership role in the industry," said industry veteran Bill
Inmon, the "father of data warehousing." In a new whitepaper, "Driving
increased performance and lowering the total cost of ownership (TCO) in
the analytical environment," Inmon writes, "Kognitio should be strongly
considered by companies where BI is needed to help them gain and
maintain a competitive advantage."
Kognitio has emerged as a leader in the markets it serves, with
blue-chip organizations like British Telecom (BT), Scottish Power and
the Tattershall Castle Group (TCG) relying on WX2. Moreover, Kognitio's
innovative flexible pricing and deployment strategies allow companies
to better allocate the assets they need across multiple locations
without requiring them to buy separate licenses for each site, a key
advantage for larger firms.
"We installed WX2 in just two hours, and loaded our data in 50
minutes instead of the 36 hours it took us previously. Now, we can run
analytical queries with sub-second response times," said Philip
Papadopoulos, program director for grid and cluster computing at the
California Institute for Telecommunications and Information Technology
(Calit2), Kognitio's first customer in the United States. Calit2 is
using WX2 to help bioinformatics researchers analyze data from its
Cyberinfrastructure for Advanced Marine Microbial Ecology Research and
Analysis (CAMERA) project. "As a result, the research community we
serve has the ability to analyze the CAMERA data sets in a dramatically
improved manner," he added.
Calit2 chose Kognitio WX2 based on a number of factors, including
the projected levels of data management and query complexity, variable
workload and data loading requirements, the variety of data types and
sources and the need to have the system available at all times. In
addition, Calit2 wanted a data management and analysis environment that
provided mature and robust systems management tools, and required
minimal recovery time in the event of a system crash.
"Analytics nowadays are all about space, time, and cost," said
Philip Russom, senior manager at TDWI Research. "Multiple terabytes of
data take up a lot of space, and users have very little time to
integrate the data into a data warehouse platform. Then, queries have
to scan multi-terabyte data volumes in a few seconds or less. Since
this kind of analytic application is usually funded by a departmental
budget, low cost is critical. For these and other types of
applications, user organizations are today turning to alternative MPP
data warehouse platforms that are based on database management systems
built specifically for data warehousing and coupled with commodity
hardware and open source software. Kognitio WX2 is a data warehouse
platform that - among other things - satisfies the demanding
requirements for space, time, and cost seen in modern analytic
applications," he added. Kognitio officials noted that in many cases,
the total cost of ownership (TCO) of a WX2 implementation can be
less than half that of competing solutions.
"There is an explosion of companies that need insights hidden in
diverse enterprise-wide data assets. Traditional methods no longer work
in the time frame, or at the cost, that these companies want. They are looking
for help from powerful analytical tools, deployed in a way that makes
sense for them, as an on-premise or service-based solution," said
Thompson. "With WX2 providing the most flexible ways to implement, and
virtually obliterating the wait time for results in many cases,
Kognitio is uniquely qualified to help these companies meet their
requirements of quickly putting their data and analysts to work,
delivering solid business benefits."
Kognitio (www.kognitio.com) is an innovative, technology-rich company, providing leading-edge solutions to business problems that require the acquisition, rationalization and analysis of large or complex data. Kognitio's WX2 is the industry's fastest and most scalable analytical database, giving firms the ability to turn their raw data into valuable business insight quickly, and empowering customers to find comprehensive answers to critical business questions. Globally headquartered in Marlow, England, with North American headquarters in Chicago, Kognitio delivers competitive advantage to its clients across a wide range of industries, including telecommunications, financial services and utilities.