December 10, 2012
SEATTLE, Dec. 10 – Opscode, the leader in cloud infrastructure automation, today announced its second annual user conference, #ChefConf 2013, taking place at the Hyatt Embarcadero in San Francisco, April 24-26, 2013. #ChefConf 2013 will deliver three days of technical sessions, training, and keynote presentations designed to help guide businesses on their path to infrastructure automation. To register and take advantage of ultra-early bird launch pricing for #ChefConf 2013, please visit http://chefconf.opscode.com/register.
"With nearly two million downloads and tens of thousands of users around the world, Opscode Chef has become a critical part of automating and managing large-scale infrastructure and an enabler of the DevOps movement," said Adam Jacob, Chief Customer Officer, Opscode. "We're now at an inflection point in the next wave of IT infrastructure, as companies of all types and sizes look to speed application release cycles and reduce business risk with an integrated approach to development and operations. #ChefConf 2013 represents a key gathering place during a period of rapid change in IT, bringing together hundreds of IT innovators to establish shared best practices in DevOps for improving IT agility and efficiency worldwide."
"Our research is showing that IT complexity continues to be further exacerbated by the lack of collaboration between development and operations in enterprise organizations," said Ronni Colville, VP of Research, Gartner. "Well-executed DevOps strategies can solve both these issues, and we expect at least 20 percent of Global 2000 organizations to adopt a DevOps methodology to do just that by 2015. Innovative solutions will help drive DevOps adoption by building shared knowledge, skills and strategies for leveraging this integrated approach to application delivery for maximum benefit."
Last year's #ChefConf sold out, attracting nearly 500 system administrators, application developers, IT operations professionals, and enterprise architects to collaborate on shared knowledge and best practices for executing DevOps strategies in the data center and in public and private clouds. #ChefConf 2013 will feature presentations from IT leaders representing some of the most innovative consumer and technology companies in the world. Already confirmed speakers include:
For individuals and companies interested in submitting a speaking proposal for #ChefConf, please see the Call for Proposals at https://chefconf2013.busyconf.com/proposals/new. For organizations interested in sponsoring and exhibiting at #ChefConf 2013, please visit http://chefconf.opscode.com/sponsors.
Ultra-early bird registration for the April 24-26 conference is $800, runs through February 3, and includes plenary sessions and three full tracks of deeply technical content. Registration for the Chef workshops on Wednesday, April 24, costs $450. Rooms are available at the Hyatt Embarcadero for a discounted rate.
About Opscode Chef
Opscode Chef is an open source systems integration framework built for automating infrastructure in the data center and in private or public clouds. It allows software developers, engineers, and architects to easily deploy thousands of servers and scale applications throughout an entire infrastructure. Through a combination of configuration management and service-oriented architectures, Chef, Hosted Chef and Private Chef make it easy to create an elegant, fully automated infrastructure while simplifying systems management.
About Chef Community
In just four years, Opscode's Chef Community has grown to tens of thousands of users and 700 community cookbooks, supporting everything from Amazon EC2 and HP Cloud to Windows, Linux and VMware. More than 1,000 individuals and 160 organizations are contributing code to Chef, which has been downloaded nearly 2 million times and is growing by more than 100,000 downloads per month.
About Opscode
Opscode is the leader in cloud infrastructure automation. Opscode helps companies of all sizes develop fully automated server infrastructures that scale easily and predictably; can be quickly rebuilt in any environment; and save developers and systems engineers time and money. Opscode's team is comprised of web infrastructure experts responsible for building and operating some of the world's largest websites and cloud computing platforms.
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. Thus, we present a novel federation model that enables end-users to aggregate heterogeneous resources for large-scale problems. The feasibility of this federation model has been demonstrated, in the context of the UberCloud HPC Experiment, by gathering the most comprehensive information to date on the effects of pillars on microfluidic channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to absorb peak computational loads that cannot be handled by their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab's Sergey Panitkin discussed the role of the Google Compute Engine in providing computational support to ATLAS, a detector of high-energy particles at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
May 16, 2013 |
When it comes to cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of running CFD modeling in the cloud, benchmarking a common CFD code on Amazon EC2 HPC instance types with both CPU and GPU cores.
May 10, 2013 |
Australian visual effects company, Animal Logic, is considering a move to the public cloud.
May 10, 2013 |
Program provides cash awards up to $10,000 for the best open-source end-user applications deployed on 100G network.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.