July 16, 2007
When South Brunswick, N.J.-based Infosolve Technologies talks about offering its customers “zero-based solutions,” the company is referring, said vice president Subbu Manchiraju, to “zero software licenses, zero defect data, zero-term commitments and, with Network.com, zero upfront hardware costs.” And while the first three benefits are strictly business strategies meant to ease customers’ minds, the last one probably puts the biggest smile on the face of Infosolve itself.
The company offers data integration and quality services to clients ranging from Bank of America all the way down to non-profit organizations -- lending credence to Manchiraju’s assertion that all companies, regardless of size, need data solutions. In 2006, it decided a change was necessary to handle increasing customer demand for Infosolve’s compute-intensive data matching and de-duplication services. When it discovered the Sun Grid Compute utility, Infosolve knew it had found the highly scalable, high-performance solution it was looking for -- and at a cost, by company estimates, $150,000 less per year per server node than traditional hosting services, and $300,000 less per year than building and maintaining an in-house cluster.
Speaking about the data matching service, which can take hours to run as it analyzes data of many types and from many sources, Manchiraju explained, “When we have one customer and we have N number of systems in our datacenter, it’s very easy to do for the customer and still manage to keep the resources OK, but once you have 10 customers and you are supposed to get back to them within a few hours … it becomes very, very difficult. You have to keep on adding hardware.” By offloading this work to the Sun Grid, he said Infosolve is able to bring as many resources as needed to the table, and they only have to pay for what they use.
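Infosolve’s actual matching algorithms aren’t described in the article, but the reason this kind of work eats hardware is easy to illustrate: a naive de-duplication pass compares every record against every other, so cost grows quadratically with dataset size. The sketch below is purely illustrative -- the sample records and the 0.8 similarity threshold are hypothetical, not anything from Infosolve:

```python
from difflib import SequenceMatcher
from itertools import combinations

def similarity(a, b):
    """Rough similarity ratio in [0, 1] between two record strings."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def find_duplicates(records, threshold=0.8):
    """Naive O(n^2) pairwise fuzzy matching -- the per-job cost that
    multiplies quickly once ten customers each bring their own data."""
    dupes = []
    for (i, a), (j, b) in combinations(enumerate(records), 2):
        if similarity(a, b) >= threshold:
            dupes.append((i, j))
    return dupes

# Hypothetical customer records with one near-duplicate pair.
records = [
    "Acme Corp, 123 Main St, Springfield",
    "ACME Corporation, 123 Main Street, Springfield",
    "Globex Inc, 9 Elm Ave, Shelbyville",
]
print(find_duplicates(records))
```

Production-grade matching would use blocking or indexing to avoid the full pairwise comparison, but the quadratic baseline is exactly why a pay-per-use grid beats buying hardware for peak load.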
And when it comes to how much they pay, well, Sun’s $1 per CPU per hour hardly breaks the bank. In fact, as noted earlier, Manchiraju said that as part of its “zero-based solutions” motto, Infosolve doesn’t even bill customers for the “negligible” hardware resources required to run their jobs -- although a cynic might assume the cost is factored, to some degree, into the bill they receive for “professional services.”
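A quick back-of-envelope calculation shows why that metered rate is “negligible” next to the fixed-cost figures quoted above. The workload numbers here are hypothetical assumptions for illustration; only the $1 per CPU per hour rate comes from the article:

```python
# Sun Grid's published utility rate (from the article).
RATE = 1.00  # USD per CPU per hour

# Hypothetical workload: 10 customers, each running a
# 50-CPU matching job for 4 hours every week.
cpu_hours_per_week = 10 * 50 * 4  # 2,000 CPU-hours

annual_utility_cost = cpu_hours_per_week * 52 * RATE
print(f"Annual utility bill: ${annual_utility_cost:,.0f}")
```

Even under these assumed usage levels, the annual metered bill stays well under the $150,000-per-node savings Infosolve estimates versus traditional hosting -- and the company pays nothing in the weeks no jobs run.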
However, the most interesting thing about Infosolve’s case isn’t that it is a service provider using utility computing to the benefit of its customers -- an increasing number of companies now offer on-demand services that use the Sun Grid, among others, for backend resources. What’s so interesting is that the utility model is the norm at Infosolve, which currently runs data matching jobs across the Sun Grid for every single customer, whereas at many ISVs, software-as-a-service offerings tend to be among the less-popular options. Infosolve customers actually do have the option of running jobs on their own resources, but Manchiraju said that not one has requested that ability thus far.
By essentially treating Sun’s resources like its own personal datacenter, Manchiraju believes Infosolve has been able to create a brand new business model based around the aforementioned idea of “zero-based solutions.” In the world of data quality, he says, there are so many complications and variations that most organizations really need specialists to manage the process for them, and if Infosolve can save big money on resources by partnering with Sun and then pass those savings on to its customers, he believes the company will achieve even greater success as time goes on. There is, after all, a huge market for data quality solutions, he added.
In fact, the company has ramped up use of the grid in the past three months and, as might be expected, is looking for ways to expand its use into other lines of business. Why not? It’s been nothing if not profitable thus far. “To put it simply,” said Manchiraju, “I think it’s amazing that we have been able to leverage the Sun Grid, and also make money off of it -- [which is] the most important thing.”