September 28, 2012
According to recent surveys conducted among enterprise professionals, security concerns have been a major roadblock in the path to cloud adoption. However, new developments show that users and certain government agencies have started warming to the idea of using cloud services to handle more sensitive data.
Take, for example, the General Services Administration's (GSA) FedRAMP program, a collaborative effort aimed at increasing confidence in the security capabilities of cloud service providers. FedRAMP involves members from the National Institute of Standards and Technology, Department of Homeland Security, Department of Defense, National Security Agency, and Office of Management and Budget, along with the Federal CIO Council and private industry professionals.
One of the benefits of the program is the government's ability to assess and certify the security practices of cloud service providers. This accomplishes a number of tasks.
1) Creates a uniform system for testing cloud service providers.
2) Increases transparency between providers and government agencies.
3) Generates more confidence in cloud providers that achieve certification.
This week Federal Times reported that since the FedRAMP program was launched, more than 50 cloud service providers have applied for the government's stamp of approval. Unfortunately, fewer than a handful will have the chance of receiving that recognition anytime soon. GSA member Dave McClure expected to complete reviews for just three operators by January.
If any of those lucky providers do pass the security test, they will receive a provisional authority to operate (ATO). With an ATO in hand, these companies will be certified for use by the Department of Homeland Security, Department of Defense and General Services Administration. The ATO makes other agencies aware of a provider's capabilities, which in turn, speeds up the process of adoption.
The system seems both rigorous and hopeful, but the devil is in the details. For example, a number of cloud providers struggle with certain federal security requirements. If an operator is to receive FedRAMP certification, it must show that systems housing government data are accessed with two-factor authentication. Also, employees with access to government data have to undergo extensive background investigations. Seems like a lot of legwork to receive a shiny certification.
While the government works to bring cloud vendors up to speed on security, commercial outfits are coming up with some creative solutions for securing data in a public cloud environment. On Wednesday, the NASDAQ OMX group launched a service called FinQloud, powered by Amazon Web Services and aimed at the needs of the financial services sector. The platform is hosted by Amazon and protected by a robust encryption key management system.
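NASDAQ has not published FinQloud's internals, but key management systems for cloud-hosted data commonly follow an envelope-encryption pattern: each object is encrypted with its own fresh data key, and only that data key is wrapped under a master key the provider never stores alongside the data. The sketch below illustrates the pattern with a deliberately toy XOR keystream so it runs on the standard library alone; the cipher is NOT secure and a real system would use AES via a vetted cryptography library:

```python
import os
import hashlib

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher: SHA-256 in counter mode, XORed with data.
    For illustration only -- do not use for real encryption."""
    out = bytearray()
    for block in range((len(data) + 31) // 32):
        ks = hashlib.sha256(key + block.to_bytes(8, "big")).digest()
        chunk = data[block * 32:(block + 1) * 32]
        out.extend(b ^ k for b, k in zip(chunk, ks))
    return bytes(out)

def envelope_encrypt(master_key: bytes, plaintext: bytes):
    data_key = os.urandom(32)                           # fresh key per object
    ciphertext = keystream_xor(data_key, plaintext)     # encrypt the payload
    wrapped_key = keystream_xor(master_key, data_key)   # wrap key under master
    return wrapped_key, ciphertext

def envelope_decrypt(master_key: bytes, wrapped_key: bytes, ciphertext: bytes):
    data_key = keystream_xor(master_key, wrapped_key)   # unwrap the data key
    return keystream_xor(data_key, ciphertext)
```

The payoff of the pattern is operational: rotating or revoking the master key only means re-wrapping small data keys, never re-encrypting the stored data itself.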
Researchers from the Suddhananda Engineering and Research Centre in Bhubaneswar, India developed a job scheduling system, which they call Service Level Agreement (SLA) scheduling, designed to provision cloud resources at a level of service comparable to what an in-house system could deliver. They paired it with an on-demand resource provisioner to optimize the utilization of virtual machines.
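The paper's exact algorithm isn't reproduced here, but the core idea of combining SLA-aware scheduling with on-demand provisioning can be sketched simply: order jobs by SLA deadline, place each on the least-loaded VM, and spin up an additional VM whenever the deadline would otherwise be missed. The job model and greedy policy below are our assumptions for illustration, not the researchers' implementation:

```python
from dataclasses import dataclass, field

@dataclass(order=True)
class Job:
    deadline: float                          # SLA deadline, seconds from now
    name: str = field(compare=False)
    runtime: float = field(compare=False)    # estimated runtime on one VM

def schedule(jobs, vm_count):
    """Earliest-deadline-first placement onto the least-loaded VM,
    provisioning an extra VM on demand when an SLA would be violated."""
    vms = [0.0] * vm_count                   # projected finish time per VM
    placement = []
    for job in sorted(jobs):                 # EDF order via deadline field
        i = min(range(len(vms)), key=vms.__getitem__)
        if vms[i] + job.runtime > job.deadline:
            vms.append(0.0)                  # on-demand provisioning
            i = len(vms) - 1
        vms[i] += job.runtime
        placement.append((job.name, i))
    return placement, len(vms)
```

For example, two 6-second jobs with 10-second deadlines cannot share one VM, so the scheduler provisions a second; a later job with a looser deadline then fills the slack on the first VM, which is the utilization-optimization half of the design.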
Experimental scientific HPC applications are continually being moved to the cloud, as covered here in several capacities over the last couple of weeks. Included in that rundown, CloudSigma co-founder and CEO Robert Jenkins penned an article for HPC in the Cloud discussing the emergence of cloud technologies to supplement the research capabilities of big scientific initiatives like CERN and the European Space Agency (ESA)...
When moving excess or experimental HPC applications to a cloud environment, there will always be obstacles. Were that not the case, the cost effectiveness of cloud-based HPC would rule the high performance landscape. Jonathan Stewart Ward and Adam Barker of the University of St Andrews produced an intriguing report on the state of cloud computing, paying significant attention to the problems the field still faces.