September 20, 2011
According to the House Subcommittee on Technology and Innovation, which is holding a hearing on the future of cloud computing in the United States, an important step down the cloudy path is being overlooked. Security and data protection aside, the more immediate question is whether the U.S. Internet infrastructure will be capable of handling a vast influx of government, business and citizen bandwidth use.
The hearing charter, titled “The Next IT Revolution? Cloud Computing Opportunities and Challenges,” will address issues of innovation and efficiency, as well as the broader goals and roadblocks the U.S. federal government faces as it seeks to fulfill its cloud computing initiatives.
The charter argues: “Users of cloud computing services will require access to services at any time from any device with an Internet connection. However, there are concerns that current broadband networks may not be able to provide constant, on-demand access if cloud adoption grows…Lack of adequate broadband access in areas where business are located or in areas where users want to access services remotely will likewise limit further widespread cloud computing adoption.”
The federal government notes that there are a number of advantages to a swift move to the cloud, including the general benefits of “providing new ways of managing information for the public and private sector…cost savings on IT infrastructure and maintenance, increased access to high-powered computing applications for both business and academic researchers, and greater data and file accessibility for consumers.” Issues of data movement and portability are also mentioned, but these are considered in light of a more imminent problem—general, ubiquitous accessibility and reliable infrastructure.
While the expected security, privacy and portability concerns are certainly valid, the subcommittee plans to address the “elephant in the room” that so often gets buried beneath conversations about security: the need for an immediate overhaul of Internet infrastructure to support an increasingly cloud-driven technology landscape.
As the hearing document states, “Users want assurances that they will have ubiquitous access to cloud services. Therefore, network resiliency and broadband accessibility are crucial factors in determining cloud adoption.”
Witnesses for the committee hearing will provide expert insight and comments. They include Dan Reed of Microsoft; Nick Combs, federal CTO at EMC; David McClure, associate administrator of the Office of Citizen Services and Innovative Technologies within the General Services Administration; and Michael Capellas, CEO of the Virtual Computing Environment Company and co-chair of the Commission on the Leadership Opportunity in U.S. Deployment of the Cloud (CLOUD2), a commission launched by the TechAmerica Foundation to lay the groundwork for federal cloud policies.
Before we in the United States get too far ahead of ourselves, we need to step back and look at the critical infrastructure that will have to support the grand cloud visions espoused by our government, businesses, and ordinary citizens who increasingly rely on cloud-based services for nearly every action on the web.
As Derrick Harris argued in advance of the committee’s hearings:
"But even in areas with broadband access, the availability, price and restrictions of fast connections is problematic. As more people use more cloud services, our country’s 3.9 Mbps average broadband connections will soon become saturated in the last mile. And as more business — and even the federal government — utilize cloud computing literally infinitely more than they did when our current infrastructure was built, that means even more congestion over the Internet backbone.
The government is pushing for 100 million homes to have 100 Mbps connections by 2020, but we need that today. There’s also the issue of broadband caps limiting access to the cloud and forcing consumers and web workers to troubleshoot and act as cops on their home networks."
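Harris’s point can be made concrete with a back-of-envelope calculation. The 3.9 Mbps and 100 Mbps figures come from the quote above; the 100 GB backup size is an illustrative assumption, not a figure from the article:

```python
# Back-of-envelope: time to move a household cloud backup over last-mile links.
# Link speeds (3.9 Mbps average, 100 Mbps target) are from the article;
# the 100 GB backup size is an assumed, illustrative workload.

def transfer_hours(size_gb: float, link_mbps: float) -> float:
    """Hours to move size_gb gigabytes over a link_mbps connection."""
    size_megabits = size_gb * 8 * 1000  # 1 GB = 8,000 megabits (decimal units)
    return size_megabits / link_mbps / 3600

backup_gb = 100  # assumed size of a household cloud backup
for mbps in (3.9, 100):
    print(f"{mbps:>5} Mbps: {transfer_hours(backup_gb, mbps):.1f} hours")
```

At today’s 3.9 Mbps average, that single backup monopolizes the line for well over two days; at the 100 Mbps target it takes a couple of hours — and neither figure accounts for other household traffic or broadband caps.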
Full story at House Subcommittee on Technology and Innovation