July 21, 2008
Although cloud computing presents new risks of security fiascos, there are many things providers and customers can do to take advantage of on-demand resources in a safe, reliable way. Smart people have been giving this a lot of thought.
Dave Durkee, CEO of cloud provider ENKI, says cloud security has to start with fundamentals and good hardware defenses -- emphasis on hardware. “I’m a big believer in hardware firewalls,” he says. “It’s not really enough to know the signature of an attack. You need to know where the traffic is coming from in order to block it. If they fill your upstream connections, you can be shut down.”
ENKI builds its system around AppLogic, 3Tera’s grid engine, which “allows us to manage the connections inside the virtual datacenter, so you can specify the interconnections between each server,” Durkee says. “You can go from server A to B but you can’t get to C. We’re able to build this layer of security into the architecture. But you also need to go the traditional route and have hardware-based intrusion detection and a firewall sitting in front of all that grid stuff. You need multiple layers of protection, just like in the middle ages with the moat and outer walls keeping invaders from getting inside the castle.”
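The interconnection rules Durkee describes can be pictured as a simple default-deny policy: links between servers are declared explicitly, and anything undeclared is blocked. The sketch below is illustrative only (the server names and the `connection_permitted` helper are assumptions, not 3Tera's actual API):

```python
# Minimal sketch of declared server interconnections in a virtual
# datacenter: only explicitly listed links are permitted, everything
# else is denied by default.

ALLOWED_LINKS = {
    ("web", "app"),  # server A may reach server B
    ("app", "db"),   # B may reach C
    # ("web", "db") is deliberately absent: A cannot reach C directly
}

def connection_permitted(src: str, dst: str) -> bool:
    """Default-deny: a connection is allowed only if declared."""
    return (src, dst) in ALLOWED_LINKS

print(connection_permitted("web", "app"))  # True
print(connection_permitted("web", "db"))   # False
```

As Durkee notes, this architectural layer complements, rather than replaces, the hardware firewall and intrusion detection sitting in front of it.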
While virtualization technology might introduce security risks of its own, it also allows system designers to build security in from step one. “The key is to build security in at the planning stage, when you’re designing your virtual machines,” says Tamar Newberger, vice president of marketing for Catbird, which provides security monitoring tools for virtual and physical networks. “You have to design in policies like ‘No financial machines can leave the country,’ for example. If you don’t want employees being able to send certain types of virtual machines to, say, Tokyo, then you build that into your policies and then into your virtual infrastructure. You can have monitoring tools that alert you if someone tries to do something that violates security policy.”
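Newberger’s example policy can be sketched as a pre-migration check: a VM’s tags are tested against per-tag location rules before any move is executed. The rule set, tag names, and `check_migration` helper below are all hypothetical, chosen only to illustrate the idea:

```python
# Hypothetical policy check for the rule "no financial machines can
# leave the country": tagged VMs may only migrate to allowed countries,
# and any violation is reported so monitoring can raise an alert.

POLICIES = {
    "financial": {"allowed_countries": {"US"}},  # assumed US-only rule
}

def check_migration(vm_tags: set, destination_country: str) -> list:
    """Return the list of policy violations for a proposed migration."""
    violations = []
    for tag in sorted(vm_tags):
        rule = POLICIES.get(tag)
        if rule and destination_country not in rule["allowed_countries"]:
            violations.append(f"{tag} VM may not move to {destination_country}")
    return violations

# Moving a financial VM to Tokyo trips the policy; a web VM does not.
print(check_migration({"financial"}, "JP"))  # one violation
print(check_migration({"web"}, "JP"))        # []
```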
IBM takes advantage of virtualization capabilities within the servers it uses in cloud centers to implement security measures, says Dennis Quan, chief technology officer for High-Performance On-Demand Solutions at IBM. Most important, he says, are isolation techniques to keep customer data and resources separate.
“A lot of work still needs to be done to secure the channels that reach the cloud outside your enterprise,” Quan says, but the company has developed solutions around its current technology. “We build isolation into the hardware, but network-based isolation is also necessary. This can be provided as part of a virtual LAN or we can use different routing technologies. In the cloud we set up for the city of Wuxi in China, we had to implement a lot of different forms of security. They have multiple software companies making use of that facility, and those companies have clients that are large enterprises around the world. So they need to have isolation. We implemented a VPN to make sure all the traffic going into the cloud is authenticated. We use virtual LANs and virtualization technologies to keep virtual machines completely isolated between different tenants in the cloud. There’s a lot more we need to do to strengthen authentication as the cloud evolves, and that’s part of what we’re learning as we build clouds around the world. The security products have had to improve to satisfy the demands of customers using these clouds.”
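The tenant isolation Quan describes — each customer’s virtual machines pinned to their own VLAN so cross-tenant traffic is blocked at the network layer — can be sketched as follows. This is an illustrative model, not IBM’s implementation; the class and VLAN numbering are assumptions:

```python
# Illustrative model of per-tenant VLAN isolation: each tenant gets its
# own VLAN, VMs are placed into their tenant's VLAN, and frames are
# only forwarded within a single VLAN.

from collections import defaultdict

class VlanIsolation:
    def __init__(self):
        self._vlan_of = {}             # tenant -> VLAN id
        self._next_vlan = 100          # assumed starting VLAN id
        self.members = defaultdict(set)

    def place_vm(self, tenant: str, vm: str) -> int:
        """Assign the VM to its tenant's VLAN, allocating one if needed."""
        if tenant not in self._vlan_of:
            self._vlan_of[tenant] = self._next_vlan
            self._next_vlan += 1
        vlan = self._vlan_of[tenant]
        self.members[vlan].add(vm)
        return vlan

    def same_segment(self, vlan_a: int, vlan_b: int) -> bool:
        """Traffic crosses only within one VLAN, never between two."""
        return vlan_a == vlan_b

net = VlanIsolation()
a = net.place_vm("tenant-a", "vm1")
b = net.place_vm("tenant-b", "vm2")
print(net.same_segment(a, b))  # False: tenants cannot see each other
```

In the Wuxi deployment this network-layer isolation sits alongside the VPN, which authenticates all traffic entering the cloud in the first place.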
Security tools will have to adapt to the cloud’s pay-for-what-you-use model, says Craig Balding, technical security lead for a Fortune 500 company and proprietor of cloudsecurity.org. Real cloud security will require “dynamic provisioning and configuration of firewalls and network security monitoring devices to watch traffic from virtual compute instances spun up on demand, perhaps across multiple continents,” Balding says. “What happens when the situation suddenly changes due to demand? This will only get solved by smart security autonomics.”
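The “security autonomics” Balding calls for amount to event-driven provisioning: when an instance launches, a baseline firewall rule set and a monitoring hook are attached automatically, and both are torn down at termination. The sketch below is a toy model under those assumptions; the class, hooks, and rule format are all illustrative, not a real provider API:

```python
# Toy sketch of automatic security provisioning tied to the instance
# lifecycle: launch attaches baseline firewall rules and monitoring,
# termination cleans both up.

BASELINE_RULES = [
    ("deny", "0.0.0.0/0", "*"),       # default-deny all inbound
    ("allow", "10.0.0.0/8", "443"),   # assumed internal HTTPS exception
]

class AutoSecurity:
    def __init__(self):
        self.firewalls = {}   # instance id -> firewall rules
        self.monitored = set()

    def on_instance_launch(self, instance_id: str):
        """Provision firewall rules and monitoring for a new instance."""
        self.firewalls[instance_id] = list(BASELINE_RULES)
        self.monitored.add(instance_id)

    def on_instance_terminate(self, instance_id: str):
        """Tear down the instance's security resources."""
        self.firewalls.pop(instance_id, None)
        self.monitored.discard(instance_id)

sec = AutoSecurity()
sec.on_instance_launch("i-123")
print("i-123" in sec.monitored)   # True
sec.on_instance_terminate("i-123")
print(sec.firewalls)              # {}
```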
Needed: A Standard Stick
Some sort of formal agreement on who is responsible for what will help improve security for cloud customers. Dominique Levin of network security provider LogLogic suggests something like the standard developed by credit card companies for data security, known as PCI (Payment Card Industry). “PCI defines a set of minimum measures that all organizations should implement to protect sensitive information. These include things like using a firewall, limiting unnecessary risky services on your network, and user activity monitoring through log data.”
“The problem with the cloud-based services is the lack of a strong and centralized ‘stick’ to force providers to comply with any standard. Visa and MasterCard can force adoption of the standard by levying steep fines or even refuse to process credit card transactions in the case of non-compliance. Customers of cloud providers could vote with their feet and refuse to do business until a security standard is adopted, but there is no organized ‘cloud providers customer group’ to drive and enforce a standard. The best we can hope for is that some enlightened services providers will adopt a standard to differentiate themselves. Once one provider is successful with such a strategy, others will follow and we will end up with a de-facto standard.”
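The minimum measures Levin lists — a firewall, limiting unnecessary risky services, and user activity monitoring through log data — lend themselves to a simple compliance audit. The control names and `audit` helper below are illustrative assumptions, not the PCI DSS itself:

```python
# Sketch of a PCI-style minimum-measures audit: a provider's
# configuration is checked against a short list of required controls,
# and any missing controls are reported.

REQUIRED = {
    "firewall_enabled": True,
    "risky_services_disabled": True,  # e.g. telnet, open relays
    "activity_logging": True,         # user activity via log data
}

def audit(config: dict) -> list:
    """Return the required controls a provider's config fails to meet."""
    return [k for k, v in REQUIRED.items() if config.get(k) != v]

provider = {"firewall_enabled": True, "activity_logging": False}
print(audit(provider))  # ['risky_services_disabled', 'activity_logging']
```

As Levin notes, the hard part is not the checklist but the stick: without an enforcer levying fines, such an audit only has teeth if customers demand it.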
One word seems to come up in virtually every conversation about strengthening cloud security: Transparency. (See also: lack thereof.)
“The lack of transparency is a big issue in cloud computing,” says John Engates, chief technology officer for Rackspace, the large IT systems hosting company with its own cloud division, Mosso. “As cloud computing is a fairly new concept, most of the current players are holding many details of their platform close to the vest. This includes details like size of the cloud, technology powering the cloud, security practices around the operations of the cloud, locations of data centers, personnel background, etc. Security-by-obscurity does not work. Large corporate IT buyers will demand security audits and assurances around controls. ... Most existing clouds don't provide this kind of transparency. Before we see widespread adoption by big enterprises and government agencies, we'll need quite a bit more transparency around these sorts of details. Secrecy is not a long-term competitive advantage and companies that are willing to be upfront and tell their customers more about the cloud infrastructure that's hosting their data will win in the end.”
Engates also recommends testing a provider’s transparency directly, by asking pointed questions about these details before signing on.
Not everyone says they’re worried about security issues in the cloud. “A lot of this is more about irrational psychological barriers. It’s like saying I don’t trust the bank to keep my money,” says Geva Perry, chief marketing officer at GigaSpaces Technologies, whose products enable companies to run and scale high-performance applications on grids. “There’s no reason to think that any corporate datacenter is more secure than the datacenter of a serious cloud provider like an Amazon or a Google or any of the others. These companies specialize in running massive datacenters.”
“I would not shy away from using cloud-based services,” says Levin of LogLogic, “but I would demand that my cloud provider doubles down on security and compliance measures. I would ask simple questions such as ‘Will you know at any given time who is accessing my data?’”
As the cloud evolves and more people store data “there,” the most important security tool might be a really simple one: “You may have the best security people, the smartest technology, but are you ready to have the security conversation?” asks Balding. “Are you willing to engage at a meaningful level? The biggest problem today from outside the cloud is that security is definitely cloudy -- when it needs to be transparent.”