December 07, 2010
LAS VEGAS, Nev., December 7, 2010 - At the Gartner Data Center Conference, Nimbula, the Cloud Operating System Company, today announced the availability of the public beta of its flagship Nimbula Director product. The software is now available for free download at http://nimbula.com.
Based on Nimbula's Cloud Operating System technology, Nimbula Director delivers Amazon EC2-like services behind the firewall. Allowing customers to efficiently manage both on- and off-premises resources, Nimbula Director quickly and cost-effectively transforms inflexible, inefficient and under-utilized private data centers into muscular, easily configurable compute capacity while supporting controlled access to off-premises clouds.
Nimbula Director provides powerful utility-grade cloud features such as policy-based authorization enabling secure multi-tenancy, topology-independent distributed network security, and monitoring and metering. Nimbula Director also uniquely provides highly automated deployment and cloud management at scale, and facilitates easy migration of existing applications into the cloud by supporting multi-platform environments and flexible networking and storage.
"Mid-size and large organizations have a growing interest in cloud computing as they see how private cloud computing can help them gain more control over their infrastructure," said Tom Bittman, Vice President and Distinguished Analyst at Gartner. "This interest is turning into real activity and near-term spending on private cloud solutions."
"We are fortunate to have a strong community of IT professionals who have helped us shape Nimbula Director over the past few months," said Willem van Biljon, Co-Founder and VP of Products at Nimbula. "We look forward to a greater number trying out the public beta of Nimbula Director and providing valuable input so we can deliver the best final product possible. Nimbula Director is the first example of Nimbula delivering on our mission to provide an industry leading Cloud Operating System."
"We have been working with the Nimbula team over the past few months as they marched towards the availability of the public beta of Nimbula Director," said Jim Harding, Chief Technology Officer at Sabey Corporation, which has built and operates some of the largest data centers in the country. "Nimbula Director is delivering on the team's vision and will help us reach new levels of agility with our infrastructure. As an IT team, we will be able to focus on services delivery and innovation."
"More and more frequently, we hear from companies eager to realize the proven benefits of the public cloud infrastructure in their own data centers. To do so, they need a fundamentally new approach to how they manage their private infrastructure," said Dave Bartoletti, Senior Analyst at Taneja Group. "As data centers continue to scale, virtualization alone is no longer enough to deliver higher levels of automation, efficiency and agility -- what's needed is the right Cloud Operating System. Nimbula Director has an innovative architecture which delivers higher service-levels from existing infrastructures and also provides a solid platform on which to develop new cloud-based services to extract even more value from IT investments, both on- and off-premises."
For the latest on Nimbula, go to: http://nimbula.com/news/.
Founded by the team that developed the industry-leading Amazon EC2, Nimbula delivers a comprehensive cloud operating system that uniquely combines the scalability and operational efficiencies of the public cloud with the control, security and trust of today's most advanced data centers. Nimbula was named one of the most promising startups in The Wall Street Journal and was dubbed "one of three cloud properties ready to burst" in Fortune. Nimbula is headquartered in Mountain View, California and has an office in South Africa. For more information, visit http://nimbula.com.
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. We therefore present a novel federation model that enables end-users to aggregate heterogeneous resources for large-scale problems. The feasibility of this federation model was demonstrated in the context of the UberCloud HPC Experiment, where it gathered the most comprehensive information to date on the effects of pillars on microfluidic channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to absorb computational demand at peak times that cannot be met by their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab's Sergey Panitkin discussed the role of Google Compute Engine in providing computational support to ATLAS, a high-energy particle detector at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
May 23, 2013 |
The study of climate change is one of those scientific problems where it is almost essential to model the entire Earth to attain accurate results and make worthwhile predictions. In an attempt to make climate science more accessible to smaller research facilities, NASA introduced what it calls 'Climate in a Box,' a system it describes as a desktop supercomputer.
May 16, 2013 |
When it comes to the cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of doing CFD modeling in the cloud by running a common CFD code on Amazon EC2 HPC instance types with both CPU and GPU cores.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.