September 17, 2007
With the changes we’ve seen in the grid market in the past few years (and, as you’ll see this morning, there are more to come), time really has flown by. The Open Grid Forum under Mark Linesch has been a microcosm of this, from Linesch himself taking over for grid vet Charlie Catlett, to OGF having its relevance questioned by the formation of the Enterprise Grid Alliance, to the two bodies merging around this time last year. With this in mind, it seems a little unreal to me that Linesch’s reign is already up and the OGF has officially named its new president, Craig Lee of the Aerospace Corporation.
Lee is an OGF veteran with experience across all sorts of distributed systems, so I hope he'll be able to parlay that into real, tangible results for the grid market's preeminent standards body. For his part, Lee certainly thinks he has what it takes, and he makes a very good point when discussing the need for commercial-off-the-shelf grid tools. Although he frames the issue in terms of its importance to companies like Aerospace Corporation, Boeing and Lockheed Martin, I believe this notion spans industries and occupies the mind of every decision-maker who has implemented or is planning to implement a distributed platform. These things are becoming more and more prevalent -- often they're not even called grids -- and although the OGF has been working toward the goal of interoperability since its inception, it might be now or never in terms of really making something happen. As long as companies keep installing proprietary solutions that aren't necessarily designed around open standards, it will be all the more difficult to get them on board with pushing the OGF agenda -- especially if users are pleased with their solutions and don't see any immediate need for interoperation. However, if this issue matters to Lee on a professional level beyond his role with the OGF, the organization might just be able to make some significant strides along this front under his leadership.
Of course, the issue of companies adopting grid technologies regardless of what is being done in the standards world is part of the OGF's identity crisis. The consortium is dedicated to balancing the needs of scientific and mainstream commercial users, but I think any analyst or industry expert would agree that the real key to driving success is getting enterprise end-users involved by making them realize just how crucial open standards will be as they look to expand or evolve their current distributed architectures. No one likes vendor lock-in, but that is a very real possibility given the current state of things. The problem is that while the OGF definitely has industry support from organizations to whom distributed standards are of the utmost importance (eBay, for one), the OGF's reputation and continued, although by no means sole, focus on research grids has to make average commercial users question whether the standards body really has their best interests in mind. Because the Aerospace Corporation is a non-profit entity and much of its computing work would be deemed HPC, the appointment of a new OGF president from this space might add fuel to the fire for already skeptical industry users. I'm sure the OGF thought about this when making the decision, though, and Lee himself doesn't see it as a hindrance, so I have to give them the benefit of the doubt for the time being.
As you probably noticed, we also have another interview in this issue, this time with Digipede president and CEO John Powers. Powers does a good job detailing the new and improved Digipede Network 2.0 product, so if you’ve been following the company or thinking about implementing the solution in your datacenter, this one might be worth reading. It might not be the biggest, fanciest or most bleeding-edge product on the market, but Digipede has always been about delivering a very reliable and effective solution in an easy-to-use, cost-effective manner -- and if Powers is to be believed, Digipede will almost certainly further that reputation with the new release. You can see the official announcement in this morning’s breaking news.
Also, as most of our readers are likely aware, the VMworld conference was held last week in San Francisco, resulting in a glut of virtualization (generally VMware-related) items in this week's issue. Now, I'm not about to pick and choose which ones might be the most important -- depending on your needs, that determination probably is unique -- but I would like to point out the two announcements, from Cisco and start-up Xsigo Systems, around network and I/O virtualization. Cisco has been peddling its VFrame software since acquiring TopSpin a few years back, but Xsigo just began talking about its take on I/O virtualization last week. Server virtualization, while far from flawless at this point, has pretty much been proven battle-ready, so now seems like a good time to start really showing those early adopters what virtualization can do for their networks, whether we're talking about LANs, SANs or whatever. By sticking to the same selling points as its server-centric cousin -- less hardware and more efficiency -- I/O virtualization's promise could be too good to pass up, as well.
Finally, we had big news outside of the OGF and VMworld, so be sure to check out these items: “Pioneer Investments Selects Egenera for Mission-Critical Apps”; “Britons Favor World Health as Research Focus”; “Platform Computing, XenSource Sign OEM Agreement”; “BEA's Project Genesis to Enable Dynamic Business Apps”; and “AMD Breaks Out Quad-Core Opteron.”
Comments about GRIDtoday are welcomed and encouraged. Write to me, Derrick Harris, at firstname.lastname@example.org.
Posted by Derrick Harris - September 17, 2007 @ 11:29 AM, Pacific Daylight Time
Derrick Harris is the Editor of On-Demand Enterprise