June 25, 2012
FOSTER CITY, Calif., June 25 — As growing numbers of industry-leading companies look to new technologies to speed their real-time services and big data applications, GridGain Systems, a leading developer of scalable, distributed in-memory compute and data grid technologies for real-time data processing, is strongly positioned to serve that demand.
This week, GridGain announced the release of an entirely new suite of solution-focused products built around the company's leading in-memory technologies and the world's fastest MapReduce engine. The release also brings expanded enterprise consulting and other professional services, advanced DevOps management and monitoring capabilities now available with all products, and out-of-the-box integration with virtually any existing data source, Hadoop-based systems, and Java, C++, .NET, Android and iOS applications.
Specifically, the company now offers three new products, each built around its in-memory data and compute grid technologies. GridGain's "Data Grid" provides distributed in-memory data caching and storage for dramatically faster access to any information. Its "Compute Grid" enables highly distributed and massively parallel computation on virtually any amount of data, and also leverages the company's data grid technology. The "In-Memory Big Data Edition" bundles both the Data Grid and Compute Grid products, along with Hadoop (HDFS, ZooKeeper, HBase) integration and high-performance data loading components. All GridGain products include comprehensive enterprise DevOps management tools and "zero deployment" options compatible with all major cloud platforms.
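The "Data Grid" idea described above rests on partitioning keys across the memory of many nodes so that any given key lives on a predictable node. The following is a conceptual sketch only — the class and method names are illustrative and are not GridGain's actual API — showing how hash-based partitioning routes each key to a node-local store:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

/**
 * Conceptual sketch of a distributed in-memory data grid: keys are
 * hash-partitioned across node-local maps, the way a real grid spreads
 * data over a cluster. Illustrative only; not GridGain's API.
 */
public class DataGridSketch {
    // Each map stands in for the memory of one cluster node.
    private final List<Map<String, String>> nodes = new ArrayList<>();

    public DataGridSketch(int nodeCount) {
        for (int i = 0; i < nodeCount; i++) {
            nodes.add(new HashMap<>());
        }
    }

    /** Route a key to a node by hashing, as a grid's affinity function would. */
    private Map<String, String> nodeFor(String key) {
        return nodes.get(Math.floorMod(key.hashCode(), nodes.size()));
    }

    public void put(String key, String value) {
        nodeFor(key).put(key, value);
    }

    public String get(String key) {
        return nodeFor(key).get(key);
    }

    public static void main(String[] args) {
        DataGridSketch grid = new DataGridSketch(4);
        grid.put("user:42", "Alice");
        System.out.println(grid.get("user:42")); // prints Alice
    }
}
```

A production data grid layers replication, affinity-aware task routing, and rebalancing on node join/leave on top of this; the key-to-node routing step is the core idea.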
GridGain has already helped more than 500 enterprise customers achieve entirely new levels of value for their end-users, and thousands of other organizations have downloaded GridGain's free, open source Community Edition. GridGain technologies are used in applications where data and computation must be highly available, distributed and scalable, with extremely low latency. The company offers Java-based middleware used for transactional in-memory processing of terabytes to petabytes of data, while enabling on-demand scaling of data storage and computational power from one server to thousands of machines – on virtually any cloud or grid-enabled infrastructure, commodity hardware or high-speed flash arrays.
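The compute-grid side of this description – distributing computational tasks across many machines and scaling from one server to thousands – follows a split/execute/reduce pattern. The sketch below is illustrative only (local threads stand in for cluster nodes, and nothing here is GridGain's API): a job is split into chunks, each chunk runs on a parallel worker, and the partial results are reduced into a final answer.

```java
import java.util.List;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

/**
 * Conceptual split/execute/reduce sketch of a compute grid.
 * Local threads stand in for cluster nodes. Illustrative only.
 */
public class ComputeGridSketch {
    public static long parallelSum(int[] data, int workers) {
        ExecutorService pool = Executors.newFixedThreadPool(workers);
        try {
            // Split: divide the input into one chunk per worker.
            int chunk = (data.length + workers - 1) / workers;
            // Execute: submit each chunk as an independent task.
            List<Future<Long>> parts = IntStream.range(0, workers)
                .mapToObj(w -> pool.submit(() -> {
                    long sum = 0;
                    int end = Math.min(data.length, (w + 1) * chunk);
                    for (int i = w * chunk; i < end; i++) {
                        sum += data[i];
                    }
                    return sum;
                }))
                .collect(Collectors.toList());
            // Reduce: combine the partial results.
            long total = 0;
            for (Future<Long> f : parts) {
                try {
                    total += f.get();
                } catch (InterruptedException | ExecutionException e) {
                    throw new RuntimeException(e);
                }
            }
            return total;
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) {
        int[] data = IntStream.rangeClosed(1, 1000).toArray();
        System.out.println(parallelSum(data, 4)); // prints 500500
    }
}
```

In a real grid the tasks would be serialized and shipped to remote nodes – ideally the nodes already holding the relevant data partitions, so computation moves to the data rather than the other way around.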
GridGain's founder and CEO, Nikita Ivanov, says "GridGain is being used to build systems that perform better, much better than traditionally architected systems – and not by percentages, but by significant orders of magnitude. Our customers are achieving far greater value as a direct result of this faster performance."
Ivanov continues, "GridGain is helping companies gain significant advantage in their respective markets, enabling them to get faster answers from their data. We are very excited about this new line of products, expanded capabilities and consulting services – all of which strongly support the vision and the objectives of our enterprise and leading customers."
About GridGain Systems
GridGain develops scalable, distributed, in-memory compute and data grid technologies for real-time data processing. The company's Java-based middleware products enable development of applications and services that can instantly access terabytes to petabytes of information from any data source or file system, distribute computational tasks across any number of machines, and produce results orders of magnitude faster than traditionally architected systems. GridGain's customers include innovative web and mobile businesses, leading Fortune 500 companies, and top government agencies. The company is headquartered in Foster City, California. Learn more at http://www.gridgain.com and follow GridGain on Twitter @gridgain.
Source: GridGain Systems
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
The private-sector industry least likely to adopt public cloud services for data storage is financial services. Because banks and similar institutions hold the most sensitive and heavily regulated of data types – personal financial information – they are mostly moving toward private cloud services, and doing so at great cost.
In this week's hand-picked assortment, researchers explore the path to more energy-efficient cloud datacenters, investigate new frameworks and runtime environments that are compatible with Windows Azure, and design a unified programming model for diverse data-intensive cloud computing paradigms.
May 16, 2013
When it comes to cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined these latency issues by running a common CFD code on Amazon EC2 HPC instance types, including both CPU and GPU cores.
May 10, 2013
Australian visual effects company, Animal Logic, is considering a move to the public cloud.
May 10, 2013
Program provides cash awards of up to $10,000 for the best open-source end-user applications deployed on a 100G network.
May 08, 2013
For engineers looking to leverage high-performance computing, the accessibility of a cloud-based approach is a powerful draw, but there are costs that may not be readily apparent.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore the potential of heterogeneous computing, but the potential for this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.