June 11, 2007
It has been a while since I spoke with Digipede boss John Powers, so I must say I was pleased when he gave me a call last week to discuss his company’s recent plays in the hedge fund space. Powers always has some interesting things to say, and with the success Digipede seems to be experiencing, I can’t very well blame him for being as optimistic as he seems to be.
For starters, as referenced in this week’s feature article, Digipede has secured a handful of hedge fund customers within the last couple of months, including III Offshore Advisors, and these customers, according to Powers, have been more than impressed with how painlessly they were able to get up and running with the Digipede Network, as well as with how quickly they have been seeing results. A couple, he said, already have become repeat customers. But you can read the article to find out more about this, and to learn why, at least from the point of view of III Offshore Advisors CTO Paul Algreen, grid computing is such an ideal solution for hedge funds.
What really impressed me in speaking with Powers is how much the company has grown since we spoke last summer. While even he acknowledges that the base numbers for growth weren’t the biggest around, you have to respect that from June 2006 to June 2007, Digipede has seen a 500 percent increase in the number of installed Digipede Agent Processor Licenses and a 100 percent increase in the number of customers. On top of that, revenue for the first half of 2007 is up more than 600 percent from the first half of 2006. Not too shabby!
Of the new customers, Powers says that while the company has been seeing uptake from a variety of vertical markets, the big one has been financial services. In fact, he joked that when describing where business has come from in the past year, he breaks it down into “Financial Services” and “Other.” “It’s really clear,” he told me, “how the financial guys translate more computing power into money.” Not surprisingly, he added, this makes the sales process pretty easy: they just have to prove the product works.
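To put a little flesh on that compute-to-money point, here is a minimal sketch, in plain Python, of the kind of embarrassingly parallel pricing job these firms farm out to a grid. It is a generic illustration only -- the function and parameter names are hypothetical, and a local process pool stands in for a grid of agents; it does not use any actual Digipede API.

```python
# Generic illustration (not the Digipede API): compute-bound financial
# workloads like Monte Carlo pricing split into independent batches, so
# adding workers cuts wall-clock time almost linearly.
import math
import random
from concurrent.futures import ProcessPoolExecutor  # stand-in for a grid of agents


def price_batch(args):
    """Estimate a European call payoff via Monte Carlo for one batch of paths."""
    n_paths, spot, strike, rate, vol, maturity, seed = args
    rng = random.Random(seed)
    payoff_sum = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)
        # Terminal price under risk-neutral geometric Brownian motion
        terminal = spot * math.exp((rate - 0.5 * vol ** 2) * maturity
                                   + vol * math.sqrt(maturity) * z)
        payoff_sum += max(terminal - strike, 0.0)
    return payoff_sum / n_paths


if __name__ == "__main__":
    # 16 independent batches; a grid scheduler would farm these out to agents.
    batches = [(100_000, 100.0, 105.0, 0.05, 0.2, 1.0, seed) for seed in range(16)]
    with ProcessPoolExecutor() as pool:
        batch_means = list(pool.map(price_batch, batches))
    price = math.exp(-0.05 * 1.0) * sum(batch_means) / len(batch_means)
    print(f"Estimated option price: {price:.4f}")
```

Each batch is independent of the others, which is exactly why these shops see more processors translate so directly into faster answers.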
Of course, even with these significant increases in customer base and revenue, Digipede is hardly a Fortune 500 company -- but that doesn’t mean it isn’t establishing itself as a player in the grid market. With a potential customer base of just about every Windows shop in the country, ease of use that has been heralded by customers since day one, and the promise of even more functionality in the upcoming Digipede Network 2.0, Digipede might have a brighter future than its detractors could ever have imagined. For what it’s worth, I’ve heard from several customers, including III Offshore Advisors, that they chose Digipede over a variety of more-established vendors.
Moving on to other news, I want to share a discussion I had with Objectivity, whose distributed, object-oriented Objectivity/DB was recently certified as Level 6 grid-compliant by IBM. What’s interesting here is the timing, as it has been only a few months since Oracle announced its acquisition of Tangosol, whose flagship Coherence software provides distributed, in-memory database capabilities. According to Rich Shelley, vice president of worldwide sales for Objectivity, IBM actually has approached the company to see how the two might work together on the grid computing front. Now, I’m not suggesting Big Blue has any immediate plans to purchase Objectivity, but if it’s looking for a quality distributed database technology, it definitely could do worse.
Although the software, which dates back to 1988, obviously wasn’t designed with grid computing in mind, Shelley says it’s a “glove fit” for grid, and it doesn’t seem to have a problem with scale. In fact, Shelley told me, Objectivity/DB’s peer-to-peer architecture can run as one federated database across thousands of processors. This was the case at the Stanford Linear Accelerator Center, where the database was used to handle a terabyte of data across several thousand nodes.
For now, though, Objectivity/DB is simply certified to interoperate with IBM grid software, and that’s fine with Objectivity, which wanted its foray into the grid market to be hand-in-hand with the “big boy” in the space. However, after seeing how this partnership plays out, Shelley is confident more opportunities will arise, and he foresees the company seeking certification with more middleware vendors. "It's quite simple,” he told me, “for us to operate in any grid environment, but there is a certification process that we have to go through before we can go out and wave our flag and say, 'Oh, yes, we work ... with a Sun grid and Platform Computing's grid and everybody else's.' We will certainly go after that, but our idea here was you first go after the IBM grid."
Fair enough, but I’ll be interested to see how this partnership plays out and whether Objectivity does, in fact, become certified with Sun, Platform or any other middleware vendor. With the increasing importance of real-time data access in grid and next-generation datacenter environments, it might be in IBM’s best interests that Objectivity’s technology doesn’t find its way into too many other camps. After all, while Objectivity was previously certified as Level 3 compliant, which indicated an ability to run in batch programs, Level 6 compliance means Objectivity can run in increasingly important service-oriented architectures, and thus adds even more value for IBM.
Finally, I’ll point you to some other worthwhile items from the issue. For starters, I would check out the three reports from last month’s EGEE User Forum: “Getting Grid Users Together,” “Grid Challenges for Business” and “Grid Power in Five Minutes?” I’d like to thank the EGEE team for putting those together and sending them our way. Beyond those, other noteworthy announcements include: “Singapore Forms High-Level Council to Lead Grid Adoption”; “Gigaspaces Optimizes Gallup's Web Application Framework”; “IU Data Capacitor Achieves 977 MBps Across TeraGrid”; “Platform Computing Announces LSF 7.0.1”; “HP Speeds Adoption of Virtualization With New Software”; and “OGF Seeking Presenters for Seattle Meeting.”
Comments about GRIDtoday are welcomed and encouraged. Write to me, Derrick Harris, at email@example.com.
Posted by Derrick Harris - June 11, 2007 @ 11:02 AM, Pacific Daylight Time
Derrick Harris is the Editor of On-Demand Enterprise