July 07, 2008
Unless you hold totalitarian views, you probably agree that part of the appeal of on-demand computing is that you don’t have to be one of a select few to do some demanding. That’s part of the idea behind Q-layer’s upcoming Delegation Manager product.
The first significant thing about Delegation Manager is that, working with other Q-layer technologies, it essentially converts a virtual datacenter into an on-demand compute utility, or cloud. Second, it can give the power to build virtual datacenters to even “non-technical end-users.” “Delegation Manager is a tool that turns a virtual infrastructure that you have already into a cloud computing platform,” says Paul Speciale, vice president of product management at Q-layer. “Anyone from administrators to helpdesk technicians or even authorized end users can easily provision a datacenter.”
“The value that we’re bringing to users and to datacenter operators is to make the datacenter more agile,” Speciale says. “It’s the whole idea of utility computing, that you have a service pool that’s easily provisioned by either IT people or even by non-technical end-users. The thing we enable is that you can come up to the portal that we provide and very quickly define the elements you need in a small datacenter.”
Q-layer’s browser-based portal presents users with a pane of available resources: Linux, Windows and Solaris servers; NAS and other storage devices; network connections; security appliances; or whatever IT components are required. To provision your datacenter, you drag and drop components into a canvas pane, link them to the network and storage, and when ready hit the “Deploy” button. “Once you say ‘Deploy,’ our software makes the calls to translate your virtual reality into the actual underpinnings of your datacenter,” Speciale says. “Within a few minutes, typically, all these things will be running and you can go in and manage them.”
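The article doesn’t describe Q-layer’s actual interfaces, but the drag-drop-deploy flow it sketches can be modeled in a few lines. The class and method names below (`VirtualDatacenter`, `add`, `link`, `deploy`) are invented for illustration, not Q-layer’s API; `deploy` simply returns a plan where the real product would call down to the infrastructure.

```python
from dataclasses import dataclass, field

@dataclass
class Component:
    """One resource dragged onto the canvas (server, storage, appliance)."""
    name: str
    kind: str                      # e.g. "linux-server", "nas", "firewall"
    links: list = field(default_factory=list)

@dataclass
class VirtualDatacenter:
    components: dict = field(default_factory=dict)

    def add(self, name, kind):
        self.components[name] = Component(name, kind)

    def link(self, a, b):
        # Wire two canvas components together (network/storage links).
        self.components[a].links.append(b)
        self.components[b].links.append(a)

    def deploy(self):
        # In the real product, "Deploy" is where the software makes the
        # calls that turn the drawing into running infrastructure; here
        # we just emit the deployment plan.
        return [(c.name, c.kind, sorted(c.links))
                for c in self.components.values()]

vdc = VirtualDatacenter()
vdc.add("web1", "linux-server")
vdc.add("files", "nas")
vdc.link("web1", "files")
print(vdc.deploy())
```

The point of the sketch is the separation the article emphasizes: the user only manipulates the model; translation to real resources happens entirely behind `deploy`.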
Delegation Manager works with two other Q-layer technologies to make all this happen. The Virtual Private Data Center is the top layer that provides the easy interface to provisioning and control capabilities. The Data Abstraction Layer “works under the covers to deploy the resources you’ve specified,” Speciale says. “When you say ‘Create a virtual machine,’ it understands how to make that happen underneath by calling into VMware or whatever hypervisor’s appropriate. The Delegation Manager works with these other layers so that you can create a datacenter as an end-user in minutes, and you don’t have to be technical, and you never have to touch the hardware.”
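The Data Abstraction Layer as Speciale describes it is a dispatch layer: a generic “create a virtual machine” request is routed to whichever hypervisor backs the host. A minimal sketch of that pattern, with invented class names and a stubbed driver in place of real VMware API calls:

```python
class HypervisorDriver:
    """Common interface the abstraction layer programs against."""
    def create_vm(self, name, cpus, mem_gb):
        raise NotImplementedError

class VMwareDriver(HypervisorDriver):
    def create_vm(self, name, cpus, mem_gb):
        # A real driver would call the VMware management API here.
        return f"vmware:{name}:{cpus}cpu:{mem_gb}gb"

class DataAbstractionLayer:
    """Routes generic provisioning requests to the right hypervisor,
    so callers above never touch the hardware or the hypervisor API."""
    def __init__(self):
        self._drivers = {}

    def register(self, platform, driver):
        self._drivers[platform] = driver

    def create_vm(self, platform, name, cpus=1, mem_gb=2):
        return self._drivers[platform].create_vm(name, cpus, mem_gb)

dal = DataAbstractionLayer()
dal.register("vmware", VMwareDriver())
print(dal.create_vm("vmware", "web1", cpus=2, mem_gb=4))
```

Adding Xen or VirtualBox support in this scheme is just another driver registered under a new platform key, which matches the roadmap the company describes below.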
The first version of Delegation Manager works with VMware. Support for Xen and Sun’s VirtualBox is coming soon, and for Microsoft’s Hyper-V by the end of the year, the company says.
Q-layer provides administrators with controls to manipulate certain resource details. “If you have a server that isn’t meeting demands, for example, we can provide interfaces that let you grow the memory or increase the amount of processing capacity that a machine has associated with it,” Speciale says. “The DAL is the engine that knows how to build and allocate things behind the scene. The DAL framework is all about workflows, scripted workflows, so it knows how to schedule things, how to automate processes, and so on. You could do a backup of all your virtual machines, for example, through our scheduler built into the workflow engine.”
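Speciale’s description of the DAL as a scripted-workflow engine — named workflows of automated steps, such as backing up every virtual machine — can be sketched as follows. Everything here (the `WorkflowEngine` class, the step functions, the workflow name) is hypothetical; the article gives no detail of Q-layer’s actual workflow language.

```python
class WorkflowEngine:
    """Toy scripted-workflow engine: a named workflow is an ordered
    list of steps, each applied to every VM in the inventory."""
    def __init__(self, vms):
        self.vms = vms
        self.workflows = {}
        self.log = []

    def define(self, name, steps):
        self.workflows[name] = steps

    def run(self, name):
        # Run each step across the whole inventory before the next one,
        # logging what a scheduler would otherwise automate overnight.
        for step in self.workflows[name]:
            for vm in self.vms:
                self.log.append(step(vm))
        return self.log

def snapshot(vm):
    return f"snapshot taken: {vm}"

def copy_offsite(vm):
    return f"copied offsite: {vm}"

engine = WorkflowEngine(["web1", "db1"])
engine.define("nightly-backup", [snapshot, copy_offsite])
engine.run("nightly-backup")
print(engine.log)
```

Hooking `run` up to a scheduler would give the “backup of all your virtual machines through our scheduler” scenario from the quote.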
Keeping a Tab
Part of the notion of cloud computing is that you pay for what you use. Q-layer provides for the paying part. Before you can start designing your datacenter, you have to acquire credits. The third pane of the interface shows a tally of money spent. When you drag a resource into your datacenter, credits are deducted. “This pane represents the billing aspect of what you’re provisioning,” Speciale says. “If I’m a service provider, I’ll charge you for each of the assets you want to model and deploy. In an enterprise, this could be the basis of a charge-back capability. You can see your costs and credits in real time.”
Q-layer’s credit system also acts as a check against users grabbing resources willy-nilly. “People can’t consume more than they’ve bought,” Speciale says, “so there’s a built-in threshold level that provides further control. It protects that backend from being over-provisioned. Businesses have to have some kind of capacity-planning capability.” The Virtual Private Data Center can also be configured as “read-only”: users can see what their cloud looks like, but they can’t touch it.
Q-layer, founded in Belgium by a team with a heavy datacenter background, has been testing its technology with hosting providers in Europe for the past year. “We are now actively involved in conversations with service providers and enterprises in the US,” Speciale says. “We’re in the throes of starting proofs of concept here.” The company has an office in Mountain View, Calif.
“This is a product that people can install on top of existing infrastructures and really enable the agility you need to do utility computing,” Speciale says. “You don’t have to have a brand new environment. With Q-layer, you can have a running VMware environment or other hypervisor environment and we’ll install on top of it. We’ll find out what you’re running, and we’ll enable that capability on top of it.”
“We think our technology really enables the concept of cloud computing,” Speciale says. “Real cloud computing requires that services such as servers and even applications be available on demand to any user, but with chargeback. In our view, we’re helping datacenters become services, available to just about anyone on demand.”