December 02, 2008
MILAN, Italy, Dec. 2 -- Neptuny, a leading solution provider in IT performance optimization and datacenter management, today launched Caplan 3.0, its business-aware capacity management solution for large datacenters and networks. The new version of Caplan facilitates the successful implementation of virtualization and consolidation initiatives, as well as the production of future scenarios to align IT resources with business initiatives.
Rationalizing IT resources while still supporting the business is imperative for all corporations. In a slumping economy, it becomes even more important to optimize the use of resources and to reduce operating and capital costs. Caplan 3.0 reduces overcapacity by determining the amount of capacity required at any given time, leading to a more cost-effective and greener IT department. Caplan 3.0 can also predict capacity shortages, reducing the likelihood of performance or capacity-related incidents and ensuring that the minimum capacity required to keep the business running remains available in the event of an IT failure.
Neptuny's CEO Fabio Violante comments: "By using Caplan 3.0, enterprises can uniquely analyse business data and system data to forecast capacity requirements in near real time. IT managers are not only able to plan capacity but are also alerted by Caplan before additional resources need to be implemented and, more importantly, Caplan can identify underutilized capacity and determine areas for consolidation. A unique feature in Caplan is the ability to forecast what resources will be needed and whether the IT infrastructure can cope when new services or business functions are implemented."
Through "what-if" analysis, Caplan reduces the time required to analyse the impact of infrastructure changes while increasing its accuracy. This analysis allows a quick comparison of different scenarios and hence a reduction of unforeseen critical situations, such as service disruptions, missed SLAs or even application crashes. Caplan enables the IT manager to identify critical situations that may occur if changes are made to the infrastructure and to reallocate underutilised resources. In this way, Caplan facilitates communication between the IT department and the rest of the business by making it possible to objectively justify investments against business objectives.
Caplan 3.0 can also be leveraged for virtualized environments. The new release supports several virtualization technologies, including VMware ESX Server, AIX micro-partitions, HP nPartitions/vPartitions, Solaris Dynamic System Domains, and Microsoft Virtual Server 2005. For all of these technologies, Caplan can identify systems to be consolidated and safely guide consolidation and virtualization initiatives, as it can simulate both physical consolidation and virtualization. Once an initiative has been carried out, Caplan automatically monitors the infrastructure and raises a notification if its behaviour deviates from the detected baseline. Caplan thereby helps customers reduce the risk of overloading the virtualized infrastructure in their datacenters.
Additionally, Caplan 3.0 provides specific features that allow capacity management to be easily integrated with ITSM tools and workflow platforms: external third-party applications can invoke Caplan services (via SOAP web services and HTTP/XML calls) and asynchronously push data into the Caplan Capacity Database (via J2EE JMS messages). This allows Caplan to promote an improved IT culture through the introduction of a structured capacity management process that complies with the ITIL v3 best practices for Capacity Management.
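As a rough illustration of the HTTP/XML integration style described above, the sketch below builds an XML capacity sample that a third-party monitoring tool might push into a capacity database. The element names (capacitySample, host, metric, value, timestamp) and the endpoint URL are illustrative assumptions, not Caplan's actual schema or API.

```python
# Hypothetical sketch of an HTTP/XML capacity-data push.
# Schema and endpoint are invented for illustration only.
import xml.etree.ElementTree as ET

def build_capacity_sample(host, metric, value, timestamp):
    """Serialize one capacity measurement as an XML fragment."""
    root = ET.Element("capacitySample")
    ET.SubElement(root, "host").text = host
    ET.SubElement(root, "metric").text = metric
    ET.SubElement(root, "value").text = str(value)
    ET.SubElement(root, "timestamp").text = timestamp
    return ET.tostring(root, encoding="unicode")

payload = build_capacity_sample("db01", "cpu_util_pct", 72.5,
                                "2008-12-02T10:00:00Z")
print(payload)

# An integration would then POST this payload to the (hypothetical)
# capacity-database endpoint, e.g. with urllib.request:
#   req = urllib.request.Request("https://caplan.example/api/push",
#                                data=payload.encode(),
#                                headers={"Content-Type": "application/xml"})
#   urllib.request.urlopen(req)
```

The same payload could equally be wrapped in a JMS message for the asynchronous path the release mentions; the key design point is that the producer stays decoupled from the capacity database's internals.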
Neptuny is a leading solution provider in performance optimization for both IT and digital media. For more than a decade, Neptuny's expertise and technologies have been crucial in helping customers across industries (telco, banking, insurance, etc.) to improve the business outcome of their infrastructures and services. Neptuny's solutions have been proven to deliver significant ROI by means of optimization and capacity management initiatives. www.neptuny.com.
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
Financial institutions are the private industry least likely to adopt public cloud services for data storage. Holding one of the most sensitive and heavily regulated data types, personal financial information, banks and similar institutions are mostly moving towards private cloud services – and doing so at great cost.
In this week's hand-picked assortment, researchers explore the path to more energy-efficient cloud datacenters, investigate new frameworks and runtime environments that are compatible with Windows Azure, and design a unified programming model for diverse data-intensive cloud computing paradigms.
May 16, 2013
When it comes to cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of running CFD modeling in the cloud by benchmarking a common CFD code on Amazon EC2 HPC instance types with both CPU and GPU cores.
May 10, 2013
Australian visual effects company, Animal Logic, is considering a move to the public cloud.
May 10, 2013
Program provides cash awards of up to $10,000 for the best open-source end-user applications deployed on a 100G network.
May 08, 2013
For engineers looking to leverage high-performance computing, the accessibility of a cloud-based approach is a powerful draw, but there are costs that may not be readily apparent.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.