May 23, 2012
Developed with EMC, open source 'Razor' enables dynamic management of the entire IT lifecycle, from bare metal to fully operational cloud applications
LAS VEGAS, May 23 — Today at EMC World, Puppet Labs, the leading provider of IT automation software for system administrators, announced that it has open sourced Razor, a next-generation provisioning solution. Developed with EMC, Razor is the first solution to dynamically provision hardware using auto-discovered, real-time inventory data, eliminating inefficient, error-prone manual processes and speeding the delivery of deployed applications in DevOps environments.
Building on Puppet Labs' foundational enterprise message bus and node inventory technologies, MCollective and Facter respectively, Puppet Labs and EMC collaboratively developed Razor to meet the needs of system administrators managing infrastructure at the scale and with the agility the cloud demands. In addition, open sourcing Razor enables the large and active Puppet community to take advantage of its modular architecture and develop plug-ins that extend support to any operating system and any boot sequence.
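To make the provisioning model concrete, the sketch below (Python, purely illustrative) shows how auto-discovered inventory facts, of the kind Facter reports, might be matched against tag rules to select a provisioning policy. The fact keys, tag rules, and policy fields are hypothetical illustrations of the approach, not Razor's actual API or data model.

    # Illustrative sketch only (not Razor's actual API): match auto-discovered
    # node facts against tag rules, then pick the provisioning policy bound to
    # the first matching tag. All names and rule formats here are hypothetical.

    # Facts auto-discovered from a node, as an inventory tool like Facter might report them.
    node_facts = {
        "macaddress": "00:50:56:aa:bb:cc",
        "processorcount": 24,
        "memorysize_mb": 98304,
        "virtual": "physical",
    }

    # Hypothetical tag rules: a tag applies when its condition holds for the node's facts.
    tag_rules = {
        "big-metal": lambda f: f["virtual"] == "physical" and f["memorysize_mb"] >= 65536,
        "small-vm":  lambda f: f["virtual"] != "physical",
    }

    # Hypothetical policies: each binds a tag to an OS install and follow-on configuration.
    policies = [
        {"tag": "big-metal", "os": "esxi-5.0",     "broker": "puppet"},
        {"tag": "small-vm",  "os": "ubuntu-12.04", "broker": "puppet"},
    ]

    def select_policy(facts):
        """Return the first policy whose tag matches the node's discovered facts."""
        matched = {tag for tag, rule in tag_rules.items() if rule(facts)}
        for policy in policies:
            if policy["tag"] in matched:
                return policy
        return None

    print(select_policy(node_facts))  # -> the 'big-metal' policy for this physical node

The point of the sketch is the workflow the press release describes: the node reports its own inventory, and provisioning decisions follow from that data rather than from barcodes, spreadsheets, or golden images.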
"Barcodes, scanners, manually updated spreadsheets, and golden images were good enough – in the 1990s," said Luke Kanies, founder and CEO of Puppet Labs. "Working together with EMC, we're proud to open source Razor, an operating system-agnostic provisioning solution that enables system administrators to programmatically inventory, provision, and configure their hardware infrastructure as productively and rapidly as they do their virtual and cloud infrastructures."
"Virtualization and cloud computing have forever changed the way IT environments are built and managed. By contributing to the Razor project, EMC is delivering critical provisioning tools required for the success of this open source community – thereby accelerating cloud adoption," said Dan Hushon, Office of the CTO, EMC. "Puppet Labs' users will reap the benefits of this cutting-edge project within their DevOps environments."
"Virtualization and cloud computing have raised the bar on what businesses expect from their IT organizations in terms of agility," said Ronni Colville, VP & Distinguished Analyst, Gartner. "Solutions that deliver flexible, dynamic provisioning and configuration of hardware infrastructure will facilitate further adoption of DevOps practices and be enablers for system administrators."
"Last year we deployed Puppet Enterprise to support our move to DevOps," said Mark Schena, Manager of Systems at Constant Contact, the leading provider of email marketing solutions for small businesses. "Razor enables us to 'puppetize' our server hardware and gives us a complete end-to-end IT automation solution with which we can provision and deploy applications even faster."
Pricing and Availability
Razor is open sourced under the Apache 2.0 license and is immediately available for free download from forge.puppetlabs.com.
About Puppet Labs
Puppet Labs, Inc. (www.puppetlabs.com) was founded in 2005 and shipped the first release of the open source Puppet Project later that same year. Puppet has since grown to manage millions of nodes across thousands of companies and organizations, both on-premise and in the cloud, including Zynga, Citrix, Shopzilla, Match.com, and Oracle/Sun, to name a few. In 2011, Puppet Labs shipped its first commercial software product, Puppet Enterprise. Now numbering seventy employees and based in Portland, Oregon, Puppet Labs is backed by investors Kleiner Perkins Caufield & Byers, Google Ventures, VMware, Cisco, True Ventures, Radar Partners, and Emerson Street Partners.
Source: Puppet Labs, Inc.