BOSTON, May 5, 2011 -- Red Hat, Inc. (NYSE: RHT), the world's leading provider of open source solutions, today announced that it has expanded its technology partnership with the University of Wisconsin-Madison (UW-Madison) to establish the Center for High Throughput Computing as the first Red Hat Center of Excellence Development Partner. In addition, Red Hat announced that it has recognized the UW-Madison CHTC as the first recipient of its Red Hat Cloud Leadership Award for its advancements in cloud computing based on the open source Condor project and Red Hat technologies.
Red Hat and UW-Madison first partnered around the Center for High Throughput Computing in 2007, when the organizations signed a strategic agreement to co-develop the Condor technologies and bring innovation from the research community to the enterprise. The goal of the CHTC has been to advance the state of the art and promote the adoption of technologies that support High Throughput Computing on large collections of distributively owned computing resources.
Building on the partnership with the UW-Madison, Red Hat also includes hardened Condor technology in its Red Hat Enterprise MRG product, a next-generation IT infrastructure incorporating Messaging, Realtime and Grid functionality that offers increased performance, reliability, interoperability and faster computing for enterprise customers. Together, Red Hat and the CHTC have driven continued innovation extending into cloud computing through Condor technologies, initially with support for public cloud scheduling and further extending Condor's capabilities to run cloud infrastructure. Red Hat introduced its CloudForms Infrastructure-as-a-Service (IaaS) product this week at the Red Hat Summit in Boston. The Red Hat CloudForms Cloud Engine solution leverages Condor technology to enable cloud scheduling and to support High Throughput Computing applications.
As part of the Wisconsin Institutes for Discovery, the Center for High Throughput Computing has been utilizing Condor and Red Hat-based technologies to support a broad range of scientific computing including the following projects and organizations:
Human Genome Project: To scale research in a cloud infrastructure so scientists can utilize more nodes and continue to map the human genome.
University of Notre Dame: To design software and applications to harness the power of the entire University’s computing resources to conduct research on biometrics and other sciences; also established the Green Cloud at the Center for Research Computing to provide computing power at a lower price and lower environmental impact.
UW-Madison Department of Botany: To understand how plants grow and develop by statistically modeling data to map genetics of distinct plants.
Morgridge Institute for Research: To accelerate the movement of science from the laboratories of researchers to public use around the world as treatments and cures.
UW-Madison Physics Department work on the Large Hadron Collider (LHC): To map the huge volumes of data produced by the Large Hadron Collider, which generates 40 million proton collisions per second, and to scale easily to process that data.
University of Nebraska: To run a cloud leveraging idle student lab machines at night, enabling physicists to conduct research at lower cost and with improved load-balancing.
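Workloads like those above are typically expressed to Condor as submit descriptions, which tell the scheduler how to queue many independent jobs across distributively owned machines. As a brief illustration only (the executable and file names below are hypothetical and not taken from any of the projects listed):

```
# Hypothetical Condor submit description: queue 100 independent
# analysis jobs in the vanilla universe.
universe   = vanilla
executable = analyze.sh
arguments  = $(Process)
output     = out.$(Process)
error      = err.$(Process)
log        = jobs.log
queue 100
```

Submitting this file with condor_submit places all 100 jobs in the queue; Condor then matches them to available machines, such as the idle nighttime lab machines in the University of Nebraska example.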
“Red Hat and the Center for High Throughput Computing have long held a strategic technology partnership around distributed computing,” said Carl Trieloff, technical director of Red Hat’s Cloud Computing Initiatives. “The CHTC has achieved fantastic results dealing with the scale and complexity of self-service, resource-based computing – extending Condor technologies into cloud computing is a natural evolution and has been done in the science world for years. With its strength in grid and cloud computing, we’re excited not only to expand our partnership with UW-Madison to establish the CHTC as the first Center of Excellence Development Partner, but we’re also thrilled to grant them the first-ever Red Hat Cloud Leadership award for their innovative advancements around the cloud and Condor.”
“We began developing and deploying grid technology in today’s cloud computing models many years before the cloud became such a compelling industry trend,” said Miron Livny, professor of computer science at the University of Wisconsin-Madison, director of the Center for High Throughput Computing and CTO of the Wisconsin Institutes for Discovery. “We see a natural partnership between what we do with Condor technologies on the university and national laboratory level and with what Red Hat does with these technologies in the commercial software industry. In both cases, we are committed to the open source model of moving innovation into the marketplace and bringing the benefits of grid and cloud technology to the masses.”
For more information about the Center for High Throughput Computing, visit http://chtc.cs.wisc.edu/.
For more information about Red Hat Cloud, visit www.redhat.com/cloud.
For more information about Red Hat, visit www.redhat.com. For more news, more often, visit www.press.redhat.com.
About Red Hat, Inc.
Red Hat, the world's leading provider of open source solutions and an S&P 500 company, is headquartered in Raleigh, NC, with over 65 offices spanning the globe. CIOs ranked Red Hat as one of the top vendors delivering value in Enterprise Software for seven consecutive years in the CIO Insight Magazine Vendor Value survey. Red Hat provides high-quality, affordable technology with its operating system platform, Red Hat Enterprise Linux, together with virtualization, applications, management and Service-Oriented Architecture (SOA) solutions, including Red Hat Enterprise Virtualization and JBoss Enterprise Middleware. Red Hat also offers support, training and consulting services to its customers worldwide. Learn more: http://www.redhat.com.
Source: Red Hat, Inc.