April 20, 2011
The phrase "walled gardens of innovation" has stuck with me since I first heard it some time ago. If I recall correctly, it was during a conversation about the closed nature of technological progress, a product of the high cost of infrastructure and of the software licenses needed to make use of it, and about what the shift from traditional to Internet-delivered everything would mean.
That image of secret, lush gardens tucked away and open only to a select few is rich enough in a technological context. But a second image follows close behind, perhaps because I cannot reread Alan Weisman's "The World Without Us" enough times: the proven power of new, ambitious growth to reduce even the most admirable feats of architecture to seeded rubble, and in short order.
Here at the start of spring, with gardens (walled or otherwise) coming to life, the metaphor feels even more pertinent. To extend it: this year's new growth is beginning to upend the roots of the old, twisted trees inside the confines, and it is tearing away at the gap-riddled blocks that have, for so long, kept everything neatly contained.
If we stick with this image a moment longer, we can see those gnarled, deep-rooted trees as the monolithic elements that have allowed the walls (and what happens within them) to persist. But with so many new offshoots able to change the rules by burrowing under the walls, sending their tangled fronds over them, or simply pulling them down bit by bit, it is becoming clear that the monoliths' roots will soon be choked out.
Let me be presumptuous and speak for all of us when I say we're bored with literal interpretations of clouds. Despite the nature/technology ties I've been drawing thus far, the metaphorical buck stops here. But really: something is happening that subverts old patterns of growth, something casting a shadow from above over those walled gardens (okay, I said I wouldn't do that…).
Some could argue that cloud computing has reversed (or at least rerouted) the staid patterns of technology innovation, particularly in terms of where that innovation originates. Arguably, this is the first decade in which signs of a "democratized" technology future have appeared on the horizon, because the drive behind competition and choice is coming, on the software side at least, from the bottom up.
Like any other movement toward democratization, political, social, or otherwise, this one is characterized by a bottom-up approach that is upending the old system of technology delivery and access. It brings democratization via much wider competitive playing fields and more choices (the open-source-plus-cloud angle alone could fill a book), but most importantly via a reseating of innovative power. It hasn't completely happened yet, and the days of the software giants are perhaps not over, but they could be numbered. If the monoliths cannot begin to truly match what all the little (power-in-numbers) offshoots are delivering, competitive offerings in easily accessible formats and pricing setups (clouds), they could be sapped in what will be, relatively speaking, the blink of an eye.
By a reseating of innovation, I mean that the IBMs, Microsofts, and other household names of this world have to work just a little harder. A lot harder, actually. They will be downright scrambling by this time next year if they don't find a way to market intelligently around this cloud-driven power shift and then, most importantly, deliver on those promises.
In other words, until relatively recently, new products and cutting-edge ideas were realized only in arenas that enjoyed government funding or the backing of Fortune 500-level companies. The tide has turned, however, with the cloud-delivered democratization of everything from massive-scale hardware infrastructure to complementary changes in software development aimed at this new class of users with sudden access, and, for that matter, of the licenses themselves, whose branches have formed strangleholds over entire IT infrastructures.
In an editorial in the New York Times, Steve Lohr remarked that clouds are "more than a hyper-efficient means of distributing digital services. The cloud model is animated by a set of Internet technologies for juggling computing workloads in data centers far more efficiently than in the past--potentially reducing costs by about half." Despite these positive statements, however, he opined that the major technology players are not acting fast enough to jump aboard the cloud bandwagon in meaningful ways.
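The "juggling" Lohr describes is essentially workload consolidation: packing many partially idle jobs onto fewer shared machines instead of giving each its own dedicated box. Here is a minimal Python sketch of that idea using a simple first-fit packing heuristic; the server capacities and workload sizes are entirely hypothetical, chosen only to illustrate why consolidation can roughly halve (or better) the hardware bill.

```python
# Minimal sketch of cloud-style workload consolidation: pack many
# partially idle workloads onto fewer shared servers (first-fit packing).
# All capacities and workload demands are hypothetical, for illustration.

def first_fit(workloads, server_capacity=1.0):
    """Pack workloads (fractional CPU demands) onto servers, first fit."""
    servers = []  # each entry tracks one server's remaining capacity
    for demand in sorted(workloads, reverse=True):
        for i, remaining in enumerate(servers):
            if demand <= remaining:
                servers[i] -= demand  # place job on an existing server
                break
        else:
            servers.append(server_capacity - demand)  # open a new server
    return len(servers)

# Twelve workloads that would each occupy a dedicated machine in the
# one-app-per-box model fit onto far fewer shared machines in the cloud.
demands = [0.5, 0.4, 0.3, 0.3, 0.2, 0.2, 0.2, 0.1, 0.1, 0.1, 0.1, 0.1]
print(f"Dedicated servers needed: {len(demands)}")        # 12
print(f"Consolidated servers needed: {first_fit(demands)}")  # 3
```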
Lohr argues that the tech giants, not to mention their biggest customers, which used to lead the way in financing and trailblazing cutting-edge technologies, are lagging behind in the cloud shift. He reminds readers of what so many analysts have been saying since the beginning of the cloud boom: the marketing is perfectly in place, but the actual offerings are still in early adolescence.
Lohr claims that IBM, as the "bellwether in the corporate technology market," is one of the leaders in the big-tech-gone-cloud movement, pointing to the company's release last week of its own suite of cloud-driven services that let users select from a "menu" of enterprise-grade services and pay via the hallmark cloud utility billing models.
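For readers less familiar with the term, a utility billing model simply means metering consumption and charging per unit, the way a power company does, rather than selling an up-front license. A minimal Python sketch of the concept follows; the service names and per-unit rates are hypothetical, not IBM's actual pricing.

```python
# Minimal sketch of a utility-style cloud billing model: the customer
# pays only for metered consumption, with no up-front license fee.
# All service names and rates are hypothetical, for illustration only.

UNIT_RATES = {
    "compute_hour": 0.12,      # price per instance-hour (hypothetical)
    "storage_gb_month": 0.10,  # price per GB-month stored (hypothetical)
    "data_transfer_gb": 0.08,  # price per GB transferred (hypothetical)
}

def utility_bill(usage: dict) -> float:
    """Compute a monthly bill, in dollars, from metered usage."""
    return sum(UNIT_RATES[item] * quantity for item, quantity in usage.items())

# A customer who ran 500 instance-hours, stored 200 GB, and moved 50 GB
# pays only for that consumption.
monthly_usage = {"compute_hour": 500, "storage_gb_month": 200, "data_transfer_gb": 50}
print(f"Monthly bill: ${utility_bill(monthly_usage):.2f}")  # Monthly bill: $84.00
```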
In IBM's view, the market for its own brand of cloud computing services and products will reach $7 billion in revenue by 2015. Along those lines, the company figures that a little over half of that will come from customers switching over from traditional on-site software and services, while the remainder will be driven by entirely new ways of doing business.
There are certainly some vague elements in these projections, especially for a company that dips across the software line into hardware, which is increasingly being marketed as "cloud ready" (by many others, not necessarily IBM). Still, Big Blue is looking upward, and so are any number of technology giants. Most companies that have been around since the dawn of the Internet era and before are finding new ways to capitalize on this dramatic market, even if that just means retailoring their old offerings a little and hiring cloud-savvy marketeers to work their buzzword magic.
On the one hand, it's easy to agree that the cloud swell left the technology giants floundering for a while, trying to figure out how to rush lipstick onto pigs (private clouds) without selling customers an idea they weren't quite comfortable with yet (anything delivered via public cloud).
What I'm saying here, and what Lohr addresses more gracefully, is that there is a certain air of desperation in the effort to keep the walls of those gardens upright and airtight: to somehow preserve the status quo for the select few while paradoxically trying to fit into the organic democratization that has been underway since the first Web services (Gmail and the like) began to enter daily conversation.
The two operational modes aren't compatible. You can't keep that old business model and expect it to work while at the same time playing to something that isn't a native part of how business has always been done.
Posted by Nicole Hemsoth - April 19, 2011 @ 10:34 PM, Pacific Daylight Time
Nicole Hemsoth is the managing editor of HPC in the Cloud and discusses a range of overarching issues related to HPC-specific cloud topics in her posts.