December 10, 2007
PORTLAND, Ore., Dec. 3 -- The Distributed Management Task Force Inc. (DMTF), the industry organization leading the development, adoption and promotion of interoperable management standards and initiatives, today celebrates the tenth anniversary of the launch of its Common Information Model (CIM) standard. Initially developed in 1997 as a conceptual model to describe the components of managed computing and networking environments, CIM (pronounced “sim”) has expanded into new markets and evolved to become one of the most widely implemented system and network management information models to date.
In 1997, the DMTF CIM Sub-Committee, with participants from CA, Compaq (now HP), HP, Intel, Microsoft, Novell, Sun Microsystems and Tivoli Systems (now IBM), made CIM Version 1.0 available. Over the next decade, CIM gained broad industry adoption.
The standard has been implemented in every major operating system since Windows 98 and serves as the fabric for server and desktop management. CIM has also moved into the virtual world, serving as the basis for DMTF’s virtualization management technology, and has expanded to provide definitions for storage management, peripherals, network components and applications. Today, CIM is implemented in many products offered by major corporations.
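In practice, a management application queries a CIM-enabled server (a CIM Object Manager) for instances of the standard schema classes mentioned above. The minimal sketch below uses the open source pywbem WBEM client for Python to illustrate the idea; the server address and credentials are placeholders invented for this example, not details from this release.

```python
# Minimal sketch: querying a CIM-enabled server with the open source
# pywbem WBEM client. The server URL and credentials are hypothetical
# placeholders used for illustration only.
import pywbem

# Connect to a CIM Object Manager over WBEM. 'root/cimv2' is the
# conventional namespace for the core CIM Schema classes.
conn = pywbem.WBEMConnection(
    "https://cim-server.example.com",   # hypothetical host
    creds=("operator", "secret"),       # hypothetical credentials
    default_namespace="root/cimv2",
)

# Enumerate instances of the standard CIM_ComputerSystem class and
# print two of its key properties.
for system in conn.EnumerateInstances("CIM_ComputerSystem"):
    print(system["Name"], system["CreationClassName"])
```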
“The tenth anniversary of the CIM standard is a significant milestone for the industry,” said Winston Bumpus, DMTF president and an active driver of CIM development since its inception in 1997. “Ten years ago, DMTF created a ubiquitous standard to help IT managers streamline increasingly heterogeneous environments. Since then, we’ve expanded CIM into new market segments. Looking ahead, we’re excited to pursue new innovations and further adoption of the CIM standard.”
A number of DMTF Alliance Partners and member companies have collaborated with DMTF to expand CIM into new areas during the last 10 years.
A number of open source projects have also been developed around the CIM standard.
DMTF has also worked with its Alliance Partners the Open Grid Forum and the Printer Working Group to bring CIM to grids and printers, respectively. DMTF is currently working to extend CIM adoption into networking, power management and new areas of virtualization.
To learn more about the CIM standard, CIM-related tools, and all CIM Schemas and specifications to date, see www.dmtf.org/standards/cim/.
With more than 4,000 active participants representing 44 countries and nearly 200 organizations, the Distributed Management Task Force Inc. (DMTF) is the industry organization leading the development, adoption and promotion of interoperable management standards and initiatives. Over its 15-year history, DMTF management technologies have become critical to enabling management interoperability among multi-vendor systems, tools and solutions within the enterprise. By deploying solutions that support DMTF standards, IT managers can choose a mix of systems and solutions that best meets their users’ needs while reducing management complexity and total cost of ownership. Information about DMTF technologies and activities can be found at www.dmtf.org.
Frank Ding, engineering analysis and technical computing manager at Simpson Strong-Tie, discussed the advantages of using the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds for some of those obstacles.
Financial institutions are the private industry least likely to adopt public cloud services for data storage. Because they hold the most sensitive and heavily regulated type of data, personal financial information, banks and similar institutions are mostly moving toward private cloud services, and doing so at great cost.
In this week's hand-picked assortment, researchers explore the path to more energy-efficient cloud datacenters, investigate new frameworks and runtime environments that are compatible with Windows Azure, and design a unified programming model for diverse data-intensive cloud computing paradigms.
May 16, 2013
When it comes to cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of CFD modeling in the cloud, running a common CFD code on Amazon EC2 HPC instance types with both CPU and GPU cores.
May 10, 2013
Australian visual effects company Animal Logic is considering a move to the public cloud.
May 10, 2013
The program provides cash awards of up to $10,000 for the best open source end-user applications deployed on a 100G network.
May 08, 2013
For engineers looking to leverage high-performance computing, the accessibility of a cloud-based approach is a powerful draw, but there are costs that may not be readily apparent.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, and from drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges, and opportunities, afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, using technologies that emphasize affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.