November 17, 2008
It’s funny how life works inside large software companies. For example, CA today announced a new push around virtualization management and cloud computing enablement, but this wasn’t necessarily a long-term strategy.
In fact, says Stephen Elliot, vice president of strategy for CA’s Infrastructure Management and Data Center Automation business unit, when the company decided to get aggressive around virtualization management, people were surprised by the capabilities that had been developed across the product portfolio. From backup/disaster recovery with XOsoft (and a related VMware partnership) to mainframe support with Mainframe VM Manager to security with AccessControl, CA’s individual divisions had built in virtualization capabilities that, when taken as a whole, enable advanced management of the virtual layer. “We have some new products, but then we have additional capability where some of these key products have added, quite quietly, the ability to manage virtual machines and virtual infrastructure,” he explained to me.
What spurred CA’s movement to get its “arrows pointed in the same direction” is the transformational effect virtualization is having on customers, and their subsequent inquiries into how CA fits in with the individual virtualization vendors’ management platforms. Elliot says the answer is easy: “It’s really the difference between how do you manage the platform … and how does it really move from the business management of the enterprise.” As customers hit virtualization tipping points, he told me, they need features like deep performance visibility and detailed views of virtual and physical infrastructure.
CA believes its solution set offers end-to-end management of and visibility into enterprise infrastructures, with a prime example being the “great triumvirate” of CA, VMware and SAP. According to Elliot, “SAP is increasingly one of the top types of application workloads that customers want to virtualize,” and the business model whereby SAP is reselling CA Wily Introscope is facilitating this move. Customers don’t want six different vendor relationships just to manage the virtualization layer, Elliot told me, and CA (along with partner VMware) can help create a unified interface.
Of course, expanding your virtualization footprint isn’t all roses, so CA also helps customers adopt best practices, manage complexity and maximize ROI. The company also plans to expand multiple-hypervisor support into more products, and Elliot says CA is carefully watching the inflection points where customers are choosing to utilize multiple virtualization platforms.
These services and future directions probably are a good thing, as Elliot doesn’t see any sign of a virtualization slow-down. “We haven’t talked to any customers who said, ‘Geez, I’m pulling back. I don’t want to put more on my virtual machine infrastructure,’” he told me. “It’s been all the opposite … where the question is now becoming ‘What shouldn’t we put on virtual machines?’”
One of the new products Elliot mentioned is Data Center Automation Manager (DCAM), which also happens to be the focal point of CA’s new cloud computing strategy. DCAM helps companies compress change management, automate provisioning and allocate resources on demand, but beyond that, CA’s cloud strategy isn’t too clear. To hear Elliot tell it, CA’s main goal right now seems to be getting feedback so it can produce the best-possible cloud computing offering, whenever that might be. “The last two or three quarters … even some of the larger enterprises are asking us what are we going to do for the cloud, how do we plan to manage clouds, how do we plan to consider what are the key requirements for the so-called cloud.”
Some of this interest probably relates to the world’s current economic woes, Elliot added. “I think this particular cycle, what we’re seeing here, is IT really looking at the different types of ways they can get their functionality to the customers -- whether it’s a cloud or those types of services, whether it’s software as a service, or whether they’ve got their internal IT staff that is really thinking more like an internal cloud or service provider.”
Finally, Elliot noted, CA is very cognizant of the bottom line as it moves into the clouds. Aside from getting customer feedback, he says, “It’s gonna be just as important for us to recognize what we should be doing as it will be to put development dollars into what we will be doing.”
Posted by Derrick Harris - November 17, 2008 @ 1:24 PM, Pacific Standard Time
Derrick Harris is the Editor of On-Demand Enterprise