July 10, 2012
Gartner yesterday released a report covering global IT spending. The research firm tracked growth over the last year and made a number of prognostications regarding future investments. Most notably for cloud watchers, the report anticipates a significant increase in worldwide public cloud spending: from $91 billion in 2011 to $109 billion in 2012, a 20% increase. Furthermore, the report predicts that enterprise cloud spending will hit $207 billion in 2016.
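The "20% increase" is easy to verify from the two dollar figures quoted above; a minimal sketch (the numbers are the article's, the variable names are illustrative):

```python
# Year-over-year growth implied by Gartner's public cloud figures.
spend_2011 = 91e9   # worldwide public cloud spending, 2011 (USD)
spend_2012 = 109e9  # projected worldwide public cloud spending, 2012 (USD)

growth = (spend_2012 - spend_2011) / spend_2011
print(f"Year-over-year growth: {growth:.1%}")  # → 19.8%, i.e. roughly 20%
```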
Richard Gordon, research vice president at the firm, explained that all major forms of cloud services have become increasingly popular with enterprises:
Business process as a service (BPaaS) still accounts for the vast majority of cloud spending by enterprises, but other areas such as platform as a service (PaaS), software as a service (SaaS) and infrastructure as a service (IaaS) are growing faster.
Gartner is not the only firm foretelling exponential growth. Last year, a report from Forrester Research forecast a $241 billion cloud services market in 2020. Its researchers also expect certain areas to crest in the interim. For example, they say that infrastructure-as-a-service (IaaS) spending will peak at $5.9 billion in 2014, at which point multiple forces, including commoditization, falling margins, and pricing pressure, will stunt further growth.
Similar claims were made about software-as-a-service (SaaS), which was on pace for $21.2 billion of revenue in 2011. Increased enterprise adoption has prompted predictions of a $92.8 billion SaaS market in 2016, at which point analysts believe these services will reach a saturation point.
Despite increased popularity in enterprise markets, these services have not been as popular in the highest echelons of computing. An Intersect360 report from 2011 highlighting HPC trends showed a relatively minor impact from cloud-based offerings. Specifically, the document said that public cloud services account for just 3 percent of HPC spending. The most data-intensive applications will continue to require custom-tuned, big iron systems; however, there are still many use cases for HPC cloud. Cycle Computing made news when it spun up a 50,000-core cluster on Amazon EC2, and earlier this month, Google used 600,000 cores to run a genomics application on its cloud infrastructure.
Advancements in technology, in addition to economic realities, should continue to boost cloud adoption in all areas. As for global IT spending, Gartner says it will grow to reach $3.6 trillion this year, up 3% from $3.5 trillion last year. Based on these figures, public cloud services ($109 billion) make up 3 percent of total IT spend. It must be stressed that this figure only takes into account public cloud; a broader definition of cloud would account for an even greater share of the overall IT market.
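Both percentages above follow directly from the quoted dollar figures; a quick arithmetic check (all inputs are the article's numbers):

```python
# Public cloud as a share of total IT spend, and the year-over-year
# growth of total IT spending, per Gartner's 2012 figures.
public_cloud_2012 = 109e9    # worldwide public cloud spending (USD)
it_spend_2012 = 3.6e12       # total global IT spending, 2012 (USD)
it_spend_2011 = 3.5e12       # total global IT spending, 2011 (USD)

cloud_share = public_cloud_2012 / it_spend_2012
it_growth = (it_spend_2012 - it_spend_2011) / it_spend_2011
print(f"Public cloud share of IT spend: {cloud_share:.1%}")  # → 3.0%
print(f"Total IT spending growth: {it_growth:.1%}")          # → 2.9%
```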
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to handle peak computational loads that exceed their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab’s Sergey Panitkin discussed the role of Google Compute Engine in providing computational support to ATLAS, a detector of high-energy particles at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds for some of them.
Financial institutions are the private-sector players least likely to adopt public cloud services for data storage. Because they hold one of the most sensitive and heavily regulated data types, personal financial information, banks and similar institutions are mostly moving toward private cloud services, and doing so at great cost.