December 11, 2006
The Distributed Management Task Force
Inc. (DMTF) has announced new milestones and highlighted details of its
participation in the Management Developers Conference (MDC), held
December 4-7, 2006, in Santa Clara, Calif. Continuing its work in
delivering the standards that enable vendor-independent, end-to-end
system and network management, the DMTF is unveiling its latest
systems, server and Web services management standards; hosting
interoperability events; and providing numerous educational sessions to
the management industry.
The DMTF announced several new technology releases, including the Systems Management Architecture for Server Hardware (SMASH) 1.0 specification, which completes the DMTF's total server management solution, as well as the WS-CIM specification, which enables management via today's Web services infrastructure. The DMTF also announced the release of SMBIOS 2.5, which addresses how motherboard and system vendors present management information about their products in a standard format. These latest releases, along with the organization's other critical standards -- including the Common Information Model (CIM) and Web-Based Enterprise Management (WBEM) -- streamline integration and reduce costs by enabling multi-vendor interoperability in management systems for the data center and beyond.
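To make the SMBIOS "standard format" concrete: every SMBIOS structure begins with a 4-byte header (a one-byte type, a one-byte length of the formatted area, and a two-byte little-endian handle), followed by a NUL-separated string set terminated by a double NUL. The sketch below parses that header and string set from raw bytes; the sample structure is hand-crafted for illustration and is far shorter than a real Type 1 (System Information) record, and the function names are our own, not part of any DMTF tooling.

```python
import struct

def parse_smbios_header(data: bytes, offset: int = 0) -> dict:
    """Parse the 4-byte header that begins every SMBIOS structure:
    Type (1 byte), Length of formatted area (1 byte), Handle (2 bytes, LE)."""
    stype, length, handle = struct.unpack_from("<BBH", data, offset)
    return {"type": stype, "length": length, "handle": handle}

def read_strings(data: bytes, offset: int, formatted_len: int) -> list:
    """The string set follows the formatted area: NUL-separated ASCII
    strings, with a double NUL marking the end of the structure."""
    pos = offset + formatted_len
    end = data.index(b"\x00\x00", pos)
    return [s.decode("ascii", "replace")
            for s in data[pos:end].split(b"\x00") if s]

# Hand-crafted, minimal example: type 1, formatted length 8, handle 0x0100,
# then two strings ("Acme Corp", "Model X") and the double-NUL terminator.
sample = bytes([1, 8, 0x00, 0x01, 1, 2, 0, 0]) + b"Acme Corp\x00Model X\x00\x00"
hdr = parse_smbios_header(sample)
print(hdr["type"], read_strings(sample, 0, hdr["length"]))
# → 1 ['Acme Corp', 'Model X']
```

On a real system the raw table comes from firmware (for example via `/sys/firmware/dmi/tables/DMI` on Linux), and tools such as dmidecode walk it exactly this way, structure by structure.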
As part of the MDC agenda, the DMTF's Common Diagnostic Model (CDM) and SMASH Forums will host plugfests to further the interoperability of these standards. These in-depth, private technical events will help refine and validate the tools to be used in the groups' forthcoming conformance testing and certification programs.
In addition, the DMTF's standards are playing a lead role in numerous educational sessions, with developer-oriented sessions from the DMTF featured in the following MDC conference tracks: the Common Diagnostic Model (CDM), Web Services, Server Management, Storage Management, Developer, Developer Tools, and Grid and Networking.
"The DMTF's significant progress is evident at MDC this week, where key new technology advances are on display," said Winston Bumpus, president, DMTF. "By providing the key standards that form the foundation for managing everything from servers to storage to Web services, the DMTF continues to aggressively meet the evolving needs of the industry, and we're pleased to highlight these advances to the developer community at this event."
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
The private-sector industry least likely to adopt public cloud services for data storage is financial services. Because banks and similar institutions hold one of the most sensitive and heavily regulated data types, personal financial information, they are mostly moving toward private cloud services – and doing so at great cost.
In this week's hand-picked assortment, researchers explore the path to more energy-efficient cloud datacenters, investigate new frameworks and runtime environments that are compatible with Windows Azure, and design a unified programming model for diverse data-intensive cloud computing paradigms.
May 16, 2013
When it comes to cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of doing CFD modeling in the cloud by running a common CFD application on Amazon EC2 HPC instance types with both CPU and GPU cores.
May 10, 2013
Australian visual effects company, Animal Logic, is considering a move to the public cloud.
May 10, 2013
Program provides cash awards of up to $10,000 for the best open-source end-user applications deployed on a 100G network.
May 08, 2013
For engineers looking to leverage high-performance computing, the accessibility of a cloud-based approach is a powerful draw, but there are costs that may not be readily apparent.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore the potential of heterogeneous computing, but the potential for this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.