February 06, 2012
According to a recent Forrester Research report, not only will computing become more centralized as the bulk of storage and processing shifts from local systems to cloud datacenters, but ownership of those datacenters will consolidate as well. The end result, according to Forrester's business and technology outlook for 2020, is a future in which a handful of computing cartels control millions of servers in datacenters around the world.
The gist of the report, as expressed in a New York Times piece, is that "cloud computing will come on quicker than you think, it will be controlled by a very few companies that will fight for the right to own your data, and businesses need to think about what software they can write that will differentiate them from all the other customers of these giants."
The cartels will gain market share by offering inexpensive computing that is easy to use and cheap to maintain. Aside from these similarities, each vendor will likely seek to differentiate its product by targeting specific technology needs as well as key verticals:
"Consumer and other data will become the mainstay of firms such as Amazon, Google, Microsoft," the report says. "Big technology vendors will provide processing, analytical capabilities and global reach, while credit-card processing, logistics and social firms will provide transaction data, logistical supply and consumer demand and consumption."
Forrester's short-list of cartel candidates includes Amazon, Cisco Systems, Google, IBM, Microsoft, and Oracle, all US-based corporations. (Dell and HP did not make the cut.) In an interview with the Times, Forrester Vice President Kyle McNabb, one of the study's authors, noted that while the US is leading the drive toward cloud consolidation, the same adoption model could occur in other countries.
If the bulk of computing becomes commoditized, businesses will need to look elsewhere for their competitive advantage. Most notably, analytics software will take on additional significance as it will boost firms' ability to identify and respond quickly to changing customer tastes.
McNabb notes that businesses will need to accept the changing IT landscape if they are to continue to compete. In his opinion, however, most are not ready for what's to come and spend the bulk of their resources simply maintaining legacy systems.
The "cloud cartel" idea is not a new one. Remember the famous Thomas J. Watson quote (or misquote): "I think there is a world market for maybe five computers." Try substituting "clouds" for "computers." The same premise was discussed here (in 2011) and here (in 2006). The latter, a 2006 blog entry from Sun's/Oracle's Greg Papadopoulos, predicted that "there will be, more or less, five hyperscale, pan-global broadband computing services giants." The likely candidates? Google, Microsoft, Yahoo!, Amazon.com, eBay, and Salesforce.com. In the more recent blog, John Battelle writes, "over the next ten or so years, I wonder if perhaps the market won't shake out in such a way that we have just a handful of 'computers' – brands we trust to manage our personal and our work storage, processing, and creation tasks." He names Google, Amazon, Microsoft, Apple, and IBM as contenders.
If you're keeping score, Google, Amazon, Microsoft, and IBM appear to be leading the charge to rule the cloud.
No utility computing discussion would be complete without referencing Nicholas Carr, who has written a large body of work on the subject. His 2005 article "The End of Corporate Computing" predicts that "computing utilities will bring to an end the traditional model of 'corporate computing' in which computing is carried out within individual corporations - just as electric utilities made 'corporate electricity generation' obsolete. And utility computing will represent 'the end' toward which business computing in general is heading. It's IT's destination."