October 22, 2012
SAN JOSE, Calif., Oct. 22 — GigaSpaces Technologies, a pioneer of next-generation application and cloud enablement platforms for business-critical applications, announced today that it is partnering with IBM to bring the benefits of cloud economics and developer agility to IBM's InfoSphere BigInsights product. With BigInsights bringing the power of Hadoop to enterprises, Cloudify enables BigInsights to run consistently alongside additional big data services such as NoSQL databases, simplifying the deployment of these systems through consistent management and improving cost efficiency through cloud enablement and portability.
"The intersection of big data and the cloud holds a lot of promise in terms of optimization of costs and development cycles. GigaSpaces set out to make this possible with Cloudify for BigInsights," says Leon Katsnelson, Program Director, IM Cloud Computing Center of Competence and Evangelism at IBM. "This integration enables you to run your BigInsights Hadoop distribution on the cloud of your choice, reducing overhead and infrastructure costs exponentially, in addition to minimizing the complexity involved with managing these systems through unified and consistent management."
Cloudify's consistent management means consistent deployment, configuration, and management across the entire stack, applying not only to the deployment phase but also to post-deployment operations including failover, scaling, and upgrades, all of which are accomplished through Cloudify's built-in recipe mechanism. With agile development becoming increasingly critical to big data distributions, the single-click application deployment that Cloudify provides makes the much-needed DevOps automation of these systems a real possibility.
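To picture the recipe mechanism described above, here is a minimal sketch loosely modeled on Cloudify's Groovy-based recipe DSL from that era. The service name, scripts, and property values are hypothetical illustrations, not taken from an actual BigInsights recipe.

```groovy
// Hypothetical Cloudify-style service recipe (illustrative sketch only).
// A recipe describes how to install, start, and scale a service, so the
// same definition drives both deployment and post-deployment management.
service {
    name "biginsights-node"      // hypothetical service name
    numInstances 4               // initial cluster size; scaling adjusts this

    lifecycle {
        install "install-biginsights.sh"   // hypothetical install script
        start   "start-biginsights.sh"     // hypothetical start script
    }
}
```

Because the recipe captures the full lifecycle rather than just installation, the same definition can be reused for failover and upgrades across different clouds.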
Additionally, big data systems tend to consume large amounts of infrastructure resources, easily growing to thousands of nodes. GigaSpaces set out to optimize the infrastructure cost of running big data systems through cloud enablement and cloud portability. Cloud portability lets you choose the right cloud for the job: for example, a bare-metal cloud for I/O-intensive workloads, or a virtualized or public cloud for more sporadic workloads.
"Big data systems tend to be complex to manage and operate, as they often include other services such as relational databases, other NoSQL databases, stream processing, web front ends and more, with each framework coming with its own management, installation, configuration, and scaling solutions," says Nati Shalom, CTO at GigaSpaces. "Managing each component of your big data system separately is an operational nightmare, and that complexity only grows as the system gets bigger, and with big data that's just to be expected."
With consistent management, the experience of managing each tier and service in the big data system is consistent throughout the entire stack, and enabling your BigInsights systems for all cloud environments means you can leverage the agility, efficiency, and flexibility of the cloud while still choosing the right cloud for the job.
Visit GigaSpaces at booth #634-19 at the IBM Information on Demand Conference at the Mandalay Bay Convention Center in Las Vegas, October 21-24, 2012.
GigaSpaces' Nati Shalom will present this joint solution with Leon Katsnelson of IBM at the event, in the session titled "The Elephant in the Cloud: Bring True Cloud Economics to InfoSphere BigInsights," on October 24th at 2:30 PM in the Business Partner Theater.
GigaSpaces Technologies is the pioneer of a new generation of application virtualization platforms and a leading provider of end-to-end scaling solutions for distributed, mission-critical application environments and cloud-enabling technologies.
Hundreds of organizations worldwide use GigaSpaces' technology to enhance IT efficiency and performance, among which are Fortune Global 500 companies, including top financial service enterprises, e-commerce companies, online gaming providers, and telecom carriers.
Source: GigaSpaces Technologies
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. To address them, we present a novel federation model that enables end-users to aggregate heterogeneous resources for large-scale problems. The feasibility of this federation model was demonstrated in the context of the UberCloud HPC Experiment, which gathered the most comprehensive information to date on the effects of pillars on microfluidic channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to handle computational loads at peak times that cannot be contained within their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab's Sergey Panitkin discussed the role of Google Compute Engine in providing computational support to ATLAS, a high-energy particle detector at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
May 16, 2013 |
When it comes to the cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of CFD modeling in the cloud by running a common CFD code on Amazon EC2 HPC instance types with both CPU and GPU cores.
May 10, 2013 |
Australian visual effects company, Animal Logic, is considering a move to the public cloud.
May 10, 2013 |
Program provides cash awards of up to $10,000 for the best open-source end-user applications deployed on a 100G network.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore the potential of heterogeneous computing, but the potential for this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.