February 27, 2006
These statistics are embarrassing, if not downright scary, and illustrate the reason there is so much energy around the market phenomenon and community of users and developers using the term "Grid" to describe their approaches to IT solutions. This group of companies and individuals, and the technology they are developing, offers hope that we can improve the utility of both the infrastructure and IT budget that is used to fund it. Without their efforts, a critical component in the global economic engine could throw a rod and along with it stall the continuous march in productivity benefits we have enjoyed for the last 25 years. While the initial intent of the Grid community might have been high-end science, the real end benefit might be the salvation of corporate computing.
While the immediate future might be ominous, it's been amazing to see the economic benefits that have accrued from the convergence of computing and communications over the past five years. As a wise man said: "If it doesn't kill you, it will make you stronger" -- the dotcom bust weeded out the wheat from the chaff, and the hyper-investments in infrastructure of the late '90s are now paying off.
Much like the over-investment in railroad capacity at the end of the 19th century, IT is the track on which ideas run, and its importance in a knowledge economy cannot be overstated. The Wall Street Journal now tracks Cyber Monday -- the Monday after the Thanksgiving holiday, when folks go back to work and start to shop online (during lunch breaks, of course) -- as closely as Black Friday -- the day immediately after Thanksgiving, which is typically the biggest brick-and-mortar shopping day of the year. Shopping over the Internet has become as important as many of us proclaimed a few years ago. Companies in multiple industries are seeing double-digit improvements in productivity with wireless mobile usage models, while some of the top 100 suppliers to Wal-Mart who made structural changes along with their RFID implementations are seeing 30 percent improvements in top-line revenue growth. The open question is this: Can this era of positive benefits from IT continue?
MIT economist Erik Brynjolfsson and his colleague Lorin Hitt, from Wharton, have modeled the ROI of IT and concluded that it generally takes five years to see real payback. Hence, the benefits we saw from 2002 to 2005 were at some level the result of the hyper-investment in IT infrastructure from 1997 to 2000. For the past five years, IT investment has been flat to down as companies slashed costs. The 15 percent solution says that companies can probably eke out gains for the next couple of years, but my premise is that demand will far outstrip supply at this level of investment and, given marginal utility at 15 percent, IT will hit the wall like a marathon runner. Lest we forget, the original marathon runner died as he finished. I think IT might suffer a similar fate if it doesn't change its diet and training regimen soon.
To put this in perspective, let's look at some data. Businesses and governments worldwide invest about $1.2 trillion in IT every year. If we round a little and exchange greenbacks for euros, it's an even trillion. A trillion euros is about 40 percent less than the total gross domestic product of the United Kingdom or France, midway between the total GDP of Canada and Mexico, and almost identical to the GDP of Russia. To be spectacularly redundant, a trillion is a lot of anything and a whole lot of euros. Some analysts peg IT investment at about 49 percent of overall capital investment in mature economies. So it's a lot, but it isn't going to grow very much: IT investment has been roughly flat to down for the last five years and is forecast to track or slightly lag GDP growth for the next three years.
This investment level is huge. One can debate whether companies gain competitive advantage relative to each other, but the verdict is generally that IT is one of the key technology ingredients in overall economic growth. The economist who figured this out, Robert Merton Solow, won the John Bates Clark Medal in 1961 and the Nobel Prize in economics in 1987. As a side note, have you ever wondered why famous economists usually have three names? Seriously: John Maynard Keynes, John Kenneth Galbraith, Robert Merton Solow -- and the list goes on.
Michael Mandel, on the other hand, has only two names -- and is not named John. However, he has overcome this limitation to become the chief economic editor at Business Week. In his book Rational Exuberance, he points out the benefits to the overall economy in terms of productivity gains and commensurate increase in GDP that would not have been possible without continual improvements in technology and high technology in particular. Those companies and economic regions that invested in technology and IT infrastructure did well. And those that didn't? Well, they didn't do so well.
In summary, more money than the GDP of many large nations is spent each year on assets that generate significant benefit but are more idle than the average house pet. It's no wonder Nick Carr wrote an article for the Harvard Business Review, and a subsequent book, with the premise that IT doesn't matter. Imagine if a manufacturing company utilized only 15 percent of its factory infrastructure. Can you imagine the reaction of, say, T. Boone Pickens or Carl Icahn? I have a vivid image in my mind of Michael Douglas, playing Gordon Gekko in the movie Wall Street, describing why the executive leadership of said company was sucking the blood out of shareholder value while demanding a complete restructuring of the board and executive management. I don't see Warren Buffett inviting executives who delivered 15 percent utilization of their factories to jump on his plane and join his buddy William Gates III at an upcoming celebrity bridge tournament.
It doesn't take a winner of the John Bates Clark Medal to see the gross inefficiency here and extrapolate the impact on the business processes IT is intended to improve. When infrastructure consumes half of every euro of capital investment and delivers only 15 percent utilization, there is a big problem. John Bates Clark's claim to fame was the theory of "marginal productivity," a foundational element in the basic theory of capitalism. Even in Clark's time, 15 percent utility would have been marginal. There is a whole body of economic theory devoted to macro- and microeconomic efficiency, but this situation is too obvious to weigh down with those details.
Bentley's second law of economics states: "The only thing more dangerous than an economist is an amateur economist." I'm not sure what Bentley's first law states, but the third law almost certainly covers the fact that amateurs should never ever publish their hypotheses. So, along with being an optimist, I'm a thrill seeker and will attempt to forecast the implications of the current macroeconomic investment in IT, the demand function and evolution in IT with the current state of architecture.
I'll begin by grabbing a crayon (Peter Lynch, the famous investor who ran the Fidelity Magellan Fund, claimed that all investment ideas should be simple enough to describe with a crayon -- who am I to argue?) and a used envelope from a Christmas card I received last year. The hypothesis is simple: Legacy IT architecture utilization is fixed at 15 percent; the overall budget is growing at a best-case rate of 5 percent; the amount of budget required to keep the lights on increases at 2 percent each year; and demand is increasing due to an increased need to mobilize and digitize the workforce and engage in e-commerce with business partners and customers. The demand variable might be controversial, as solid data is hard to come by. Some might say that the solution is to keep usage in check in the near term -- essentially, to throttle innovation to levels that can be supported within the other constraints. Unfortunately, the global economy is dynamic and won't stand still for static levels of innovation. In other words: In an unconstrained world, demand would be increasing even faster, but is artificially throttled by the limitations of the existing infrastructure. Guess what? Most of the new entrants into the worldwide economy are unconstrained, and investment inflows are at record levels. If the times are a-changin', the competition is a-increasin'.
Leading indicators are raising their ugly heads from companies introducing wireless mobility into their workforces or completing the RFID technology pilots I referred to earlier. In many cases, they are seeing double-digit improvements in productivity or top-line revenue, but can't deploy on an enterprise-wide basis. Why not? The usual cop-out is security and privacy, and it has the same rhythm and backbeat as a Ray Charles song: "I need some security healing ... c'mon I can get some privacy right across town ... it feels so good in the mornin' with the curtains down." Sounds good, feels good, but it is ultimately a distraction. IT can't meet the pent-up demand for new levels of productivity or comparative advantage because it can't afford to, and the current infrastructure can't handle the event-driven workloads, most of which originate outside the legacy firewall.
Anecdotal evidence is showing cracks in one of our economic cornerstones. It's time to go back to the crayons and envelope. Voila! The global IT economy ends in 2012. Oops, or is that 2021? Damn crayons. Well, crude tools or not, you don't have to be Nostradamus or Alan Greenspan to forecast the problems coming for IT. One of the motivations for Bentley's law is that amateur, and some professional, economic forecasters do not factor in improvements in basic technology. The classic example is the forecast at the turn of the 20th century that New York City would be buried in horse manure in less than a generation. They missed the impact of World War I and the eventuality of the horseless carriage.
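For readers who prefer a spreadsheet to a crayon, the envelope arithmetic can be sketched in a few lines of Python. Only the 5 percent best-case budget growth comes from the discussion above; the starting maintenance share, the yearly creep in that share, and the demand curve are purely hypothetical parameters chosen for illustration, not figures from any real IT budget.

```python
# Back-of-envelope projection of flat-ish IT budgets versus growing demand.
# Only the 5 percent best-case budget growth comes from the article; every
# other number below is a hypothetical, illustrative assumption.

def project(start_year=2006, years=20,
            budget=1000.0,        # total IT budget, indexed to 1000
            budget_growth=0.05,   # best-case growth cited in the article
            maint_pct=70,         # assumed keep-the-lights-on share, percent
            creep_pct=2,          # assumed creep in that share, points/year
            demand=170.0,         # assumed demand for new capability, indexed
            demand_growth=0.08):  # assumed growth rate of that demand
    crunch = exhausted = None     # years when the two walls are hit
    for t in range(years + 1):
        year = start_year + t
        total = budget * (1 + budget_growth) ** t
        share = min(100, maint_pct + creep_pct * t)
        innovation = total * (100 - share) / 100   # left over for new work
        need = demand * (1 + demand_growth) ** t   # unconstrained demand
        if crunch is None and need > innovation:
            crunch = year         # demand outruns the innovation budget
        if exhausted is None and share >= 100:
            exhausted = year      # maintenance consumes the entire budget
    return crunch, exhausted

print(project())  # with these made-up numbers: (2012, 2021)
```

With these made-up parameters, demand outruns the innovation budget around 2012 and maintenance swallows the entire budget by 2021 -- the same ballpark as the crayon's two answers. The point is not the exact dates, but how quickly a fixed 15 percent utilization plus creeping overhead closes the gap.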
Whither Grid? The fundamental principle of Grid computing is virtualization: abstracting and simplifying the underlying infrastructure so that it can be fully utilized. By adopting some of the autonomic tools required to manage a large-scale Grid infrastructure, IT can start to whittle away at the bloated overhead associated with legacy architectures. As utility improves, we can simultaneously reduce waste and invest in innovation.
I often get the question, "Will Grid ever become part of mainstream enterprise IT?" The question is almost amusing, in a Lenny Bruce sort of way. My answer, with a dose of Lenny's attitude, is: "Holy crap! How can enterprise IT survive without adopting the technologies and architectural concepts of the Grid community?" Otherwise, to paraphrase Lenny (who at the time was deriding telecommunications monopolies), you end up like a schmuck with a Dixie cup on a string!
It's kind of a harsh approach to advocate Grid solutions this way, but it's time to "rip and replace." These are three words you will never hear from a vendor. No one ever closed a deal by scaring the hell out of their customer. But the crayon doesn't lie. The strategies of "surround and conquer" or "embrace and extend" might have worked if incremental investments had continued at dotcom boom levels, but they haven't. We've just finished the fifth year of lagging incremental investment and the cracks are already showing. It's time to start to make some dramatic changes right now. Get the standards right, stop any nonsensical vendor infighting and get on with the hard task of overhauling legacy with Grid solutions. Economic prosperity hangs in the balance.
About Tom Gibbs
Tom Gibbs is director of worldwide strategy and planning in the sales and marketing group at Intel Corp. He is responsible for developing global industry marketing strategies, building cooperative market development, and marketing campaigns with Intel's partners worldwide. Gibbs joined Intel in 1991 in the Scalable Systems division as a sales segment manager. He then worked in Intel's Enterprise Server group, where he was responsible for business growth with all OEM customers with products that scaled greater than 4-way. Finally, just prior to joining the Solutions Market Development group, he was in the Workstation Products group, responsible for all board and system product development and sales. Prior to Intel, Gibbs held technical marketing management and industry sales management positions with FPS Computing, and did engineering design and development for airborne radar systems at Hughes Aircraft Company. He is a graduate in electrical engineering from California Polytechnic State University, San Luis Obispo, and was a member of the graduate fellowship program at Hughes Aircraft Company, where his areas of study included non-linear control systems, artificial intelligence and stochastic processes. He also previously served on the President's Information Technology Advisory Committee panel on open source computing.