December 17, 2007
BOSTON, Dec. 12 -- xkoto Inc., a provider of data virtualization solutions, today announced immediate availability of version 3.5 of its flagship middleware application, GRIDSCALE.
The new GRIDSCALE release extends xkoto’s ability to deliver continuous availability and horizontal scalability for its customers’ most critical data and applications. This patent-pending technology enables commercial, off-the-shelf databases to run in a geographic grid with reliability and performance equal to or better than that of far more expensive proprietary systems. With GRIDSCALE, customers gain a number of benefits, including 24x7 continuous availability, data scalability on demand, the elimination of downtime and maintenance windows, complete safety for data across multiple sites, and better utilization of hardware assets.
“Zero tolerance for downtime and the imperative to access enterprise data around the clock have become the operative watchwords for all businesses today, and many forward-looking organizations have realized that virtualization technologies -- and not simply more hardware -- are fundamental to ensuring 24x7 data availability,” said Albert Lee, chief strategy officer of xkoto. “Recognizing this inherent issue, xkoto continues to set the pace for technology innovation in the increasingly critical area of continuous availability. Unlike point products targeting data protection and replication at a departmental level, GRIDSCALE 3.5 is a proven solution for ensuring data access, availability and protection across the enterprise, including at remote sites.”
GRIDSCALE 3.5 is feature-packed with the following new capabilities:
"As a DB2 Gold Consultant, I see a variety of database solutions but I have found none that could match xkoto in offering continuous availability, load balancing and horizontal scalability. We lead with this solution when our customers have these types of requirements," said Frank Fillmore, president of The Fillmore Group. "The enhancements in GRIDSCALE 3.5 further extend xkoto's technology leadership."
Founded in 2005, xkoto provides a technology platform for the replication and virtualization of data. The GRIDSCALE Database Load Balancer manages a geographic grid of standard, commercial databases, making them appear to applications as a single local database, so that applications become continuously available and horizontally scalable. Through its proven disruptive technology, xkoto currently empowers SMBs and Fortune 100 companies to decrease costs and increase utilization. Recently named one of Deloitte's 2007 Companies-to-Watch, xkoto counts IBM, Microsoft, Novell, Hewlett-Packard, Sun Microsystems and Sybase among its technology partners. xkoto is based in Toronto and Boston. For more information about xkoto, visit www.xkoto.com.
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. Thus, we present a novel federation model that enables end-users to aggregate heterogeneous resources for large-scale problems. The feasibility of this federation model has been demonstrated, in the context of the UberCloud HPC Experiment, by gathering the most comprehensive information to date on the effects of pillars on microfluid channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to handle peak computational loads that cannot be contained within their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab’s Sergey Panitkin discussed the role of the Google Compute Engine in providing computational support to ATLAS, a detector of high-energy particles at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
May 16, 2013
When it comes to cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of doing CFD modeling in the cloud by running a common CFD code on Amazon EC2 HPC instance types with both CPU and GPU cores.
May 10, 2013
Australian visual effects company Animal Logic is considering a move to the public cloud.
May 10, 2013
Program provides cash awards of up to $10,000 for the best open-source end-user applications deployed on a 100G network.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can exploit these extraordinary data repositories, however, they must first harness and manage their data stores, and do so using technologies that deliver affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.