May 20, 2011
The intersection of the “big data buzz” and cloud computing holds particular interest for those considering HPC clouds, especially as it raises a number of concerns about data movement and management.
This week we spent some time weighing the possibilities that await big data clouds with Impetus Technologies, a software product engineering company that straddles the cloud/big data line.
While Impetus is but one of a growing number of companies on the big data cloud track, Vineet Tyagi, who heads the company’s labs division, has been watching the lead-up to this merger of cloud computing and massive information for some time.
We talked about this momentum and the role of key open source data management technologies several months ago, but since then, conversations about the big data cloud have gathered more steam. In light of the increased interest, his company is hosting a web-based session that will address the numerous scalability, maintenance, performance, and security challenges that big data clouds involve. Impetus hopes to put these challenges in context with real-world examples, which can be more helpful for such problems than abstract discussion.
We reached out to Tyagi to get some early answers to some of the questions they will be addressing on May 27 (details about the big data cloud session here).
HPCc: Cloud computing and big data are increasingly used in the same sentence since, in many ways, cheap offsite storage saves space and other costs in the short term. However, don’t you think the future of “big data in the cloud” is limited by significant data movement issues? How can these be overcome?
Tyagi: Cloud-based deployments and big data problems certainly go hand in hand when the big data is already present in the cloud. Network usage is still a major revenue earner for cloud vendors, which means that getting your data in and out of the cloud remains relatively expensive. Some cloud vendors are trying to make this easier by providing alternative bulk upload mechanisms. For example, AWS allows you to ship your data on hard drives by courier or post so it can be loaded offline into AWS cloud storage.
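A rough back-of-envelope calculation shows why shipping drives can beat the network for bulk uploads. The sketch below uses illustrative numbers (a 10 TB dataset and a sustained 100 Mbit/s link are assumptions, not figures from the interview):

```python
# When does shipping drives beat uploading over the network?
# Illustrative arithmetic only -- not actual AWS pricing or transfer rates.

def upload_days(data_tb: float, mbps: float) -> float:
    """Days needed to upload data_tb terabytes at a sustained rate of mbps megabits/s."""
    bits = data_tb * 1e12 * 8          # terabytes -> bits
    seconds = bits / (mbps * 1e6)      # megabits/s -> bits/s
    return seconds / 86400             # seconds -> days

# Example: 10 TB over a sustained 100 Mbit/s link
days = upload_days(10, 100)
print(f"{days:.1f} days")  # prints "9.3 days" -- often longer than a courier takes
```

At that point a shipped drive, despite its handling overhead, delivers the data sooner, which is the economic argument behind offline bulk upload services.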
HPCc: What are some big data problems that are best suited to the cloud (and by cloud we mean a service like EC2) and on the flip side, which ones are best kept in house?
Tyagi: Any business problem where the big data sources can be generated or captured in the cloud is a perfect fit for cloud deployment. Web applications, web crawlers, data churning applications, and OLTP or OLAP solutions deployed in the cloud can benefit from its elastic compute capacity for running the application, and can also reap the additional benefits of big data analytics solutions deployed on the same cloud.
HPCc: What is your opinion of the quality and usability of Amazon’s HPC-geared Cluster Compute Instance type? How often have you worked with customers creating or migrating applications to it, and what were the biggest problems and benefits?
Tyagi: The AWS HPC cloud offering is certainly going to be a major game changer, opening new opportunities for software developers to harness this on-demand computing power. The biggest benefit of HPC-centric solutions is that they allow problems to be solved at considerably lower cost than traditional solutions. This benefit is amplified by the HPC cloud, since the CAPEX costs for HPC are reduced to zero.
The biggest concerns include assuaging cloud security issues, building customer confidence in the cloud, and readying the application for cloud deployment.
Choosing the right cloud is also a problem when planning to move an application to a cloud PaaS or SaaS, where the application might need to be rewritten. Even with IaaS offerings, various factors have to be taken into consideration with respect to cloud resource quality, SLAs, support, security compliance, and so on.
HPCc: As your company plays a different role in several arenas, we are curious—what is the role of open source software in the cloud computing paradigm shift as a whole?
Tyagi: Open source is one of the key drivers for cloud computing since the open source community recognized its potential early on and was able to quickly transform and evolve around the cloud.
HPCc: Can you provide one of the best examples you know of that involves very large datasets and the cloud?
Tyagi: There are plenty of examples, but one of the most famous is the New York Times’ use of AWS. The New York Times used 100 Amazon EC2 instances and a Hadoop application to process 4TB of raw TIFF image data (stored in S3) into 11 million finished PDFs in the space of 24 hours, at a computation cost of about $240.
And one funny anecdote, or rather a rumor making the rounds, is that the NY Times engineers made a mistake the first time they ran the process, so they had to run it twice and ended up paying $480.
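The quoted figures are internally consistent: 100 instances running for 24 hours imply an hourly rate of about $0.10 per instance, which is inferred from the totals rather than stated in the interview. A quick sanity check:

```python
# Sanity-checking the NY Times figures quoted above. The $0.10/hour
# instance rate is an assumption inferred from the totals, not a
# figure given in the article.
instances = 100
hours = 24
rate_per_hour = 0.10          # USD per instance-hour (inferred)

cost = instances * hours * rate_per_hour
print(f"${cost:.0f}")         # prints "$240" -- or double that if the job runs twice
```

Running the job twice, as the rumor has it, simply doubles the instance-hours and hence the bill to $480.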
As a side note: as mentioned above, last year we spent some time talking with Vineet in person during the Cloud Expo event in Santa Clara, California, where he shared some insights on using cloud computing to tackle large data problems. The “mafia connection” to the cloud is a rather interesting one; just as carmaker Chevy once named a car the Nova, which in Spanish reads as “no go” (quite an oversight), so too does the phrase “in the cloud” have some interesting associations outside the English language.
Posted by Nicole Hemsoth - May 20, 2011 @ 9:43 AM, Pacific Daylight Time
Nicole Hemsoth is the managing editor of HPC in the Cloud and will discuss a range of overarching issues related to HPC-specific cloud topics in posts.