November 22, 2010
For the most part, it’s been all quiet on the cloud front for HP until the recent announcement of its G-Cloud Theatre, which formally introduces its vision of secure, flexible cloud computing for government use.
Currently in the testbed stage, HP Labs is presenting the G-Cloud as a demo to IT professionals in the public sector to show how services can be constructed from hosted components with a management layer on top—and of course, with one security feature piled atop another.
On the security front, the HP G-Cloud Theatre facility, located in Bristol, England, is intended to “demonstrate ways in which cloud-based systems can withstand even the most serious threats and attack” and is based on technology that’s still in HP’s research pipeline. Targets for this facility include honing how virtual machines monitor other VMs, how suspicious behavior originating in the CPU and I/O is identified and handled, and how automation policies can be put into place to protect systems via shutdown, cloning and isolation if a threat is detected.
As Martin Sadler, director of the Cloud and Security Lab at HP said, the security responses are similar to those of a human body. “The G-Cloud can automatically respond to a threat, making a calculation of its seriousness and producing the equivalent of white blood cells to counteract it. When the threat has been removed and those resources are no longer required, it goes back to its previous state.”
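To make the “white blood cell” analogy concrete, the policy HP describes amounts to scoring a detected threat and mapping that score to a graduated response (clone for forensics, isolate, or shut down), then reverting once the threat clears. The sketch below is purely illustrative; the function and threshold values are assumptions, not HP’s actual implementation.

```python
from enum import Enum

class Action(Enum):
    NONE = "none"          # benign: system stays in (or returns to) its previous state
    CLONE = "clone"        # suspicious: snapshot the VM for later analysis
    ISOLATE = "isolate"    # serious: cut the VM off from the rest of the estate
    SHUTDOWN = "shutdown"  # severe: take the VM offline entirely

def respond(threat_score: float) -> Action:
    """Map a threat-seriousness score in [0.0, 1.0] to an automated response.

    Hypothetical thresholds: the G-Cloud's real scoring model and policy
    tiers are not public; these cutoffs only illustrate the graduated idea.
    """
    if threat_score >= 0.9:
        return Action.SHUTDOWN
    if threat_score >= 0.6:
        return Action.ISOLATE
    if threat_score >= 0.3:
        return Action.CLONE
    return Action.NONE
```

The key design point, per Sadler’s description, is that the response is proportional and reversible: resources spun up to counter the threat are released when it passes, rather than leaving the system permanently degraded.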
Highly secure government facilities are indeed a priority, but to some, this could sound like a system administrator’s worst nightmare and best friend wrapped into one. While possible threats could be thwarted, extensive shutdown and isolation procedures would be automated across the system based solely on abnormal CPU patterns. Anyone who has used security software on a home computer is already familiar enough with this paradigm.
Despite this questionable aspect, the G-Cloud gets bonus points for its futuristic factor. As an elaborate Philip K. Dick-esque complement, “visibility of the server estate is provided by an innovative administrative console with a touchscreen user interface that generates 3-D images of server activity. For example, it depicts levels of utilization, the ease with which new services can be deployed or taken down, and how resources can be dynamically reallocated from one services cell to another.”
In Sadler’s view, “It’s like a game or a movie where you have a virtual walkthrough of everything that’s going in the cloud…you can zoom in for more granular detail or pan out to get a panoramic view.”
Wow. Government IT just got a lot more interesting today…
Then again, there was no real mention of glass Minority Report interfaces per se, but if they’re calling this touchscreen UI a “console” (gaming, anyone?) then what are we supposed to think? Bling? Yes. Big government… hmmm.
Granted, this is just a tad off topic but the only governments that have what amounts to what’s pictured on the left for general purpose use are fictional ones in the far-off future. It will take far more than cinematic bells and whistles and what could potentially be cripplingly sensitive security protocols to lure governments to the cloud.
Governments have been pitched the cloud computing idea by any number of vendors recently, and as a relative latecomer to the niche-defining race, HP has lagged somewhat behind. Unfortunately, it will need to compete with the slew of offerings put forth for this same sector by IBM, another household name—and one with strong ties to government and research already.
While it might not have the Minority Report appeal that HP’s initial release seems to boast, IBM’s government cloud initiative, which is blanketed under what it calls the Federal Community Cloud, provides a cloud platform that runs across agencies and also treats security policy as a primary goal. IBM made inroads in the United States with this effort when it signed on 15 agencies, including the coveted Departments of Defense and Homeland Security (what better way to associate your brand with high-security priorities?) as well as less James Bond-like agencies like the Department of Housing and Urban Development.
IBM has also stretched out a hand to smaller subsets of government via its Municipal Shared Services Cloud that extends the same services to local governments.
HP might be able to gain some traction in the U.K., where the research center supporting the G-Cloud effort is based, but for now, government cloud adoption in Europe is lagging behind that in the United States. If the company is able to secure a strong foothold and convince government IT leaders of its usefulness (and not just wow them with the cool interface), this could mark a new phase in HP’s overall cloud strategy market segment-wise.
Posted by Nicole Hemsoth - November 22, 2010 @ 7:24 AM, Pacific Standard Time
Nicole Hemsoth is the managing editor of HPC in the Cloud and will discuss a range of overarching issues related to HPC-specific cloud topics in posts.