September 24, 2012
Partnership provides cost-effective cloud storage and preservation services
Sept. 24 — DuraSpace and the San Diego Supercomputer Center (SDSC) at the University of California, San Diego, have announced a partnership under which DuraSpace's DuraCloud organization and SDSC's cloud storage services have become fully integrated to provide additional preservation services and storage for researchers, students, academics, and industry users.
Customers now have the option of choosing the world's largest academic cloud storage system through DuraCloud. Access to commercial cloud providers Amazon and Rackspace along with SDSC Cloud Storage are available simultaneously through DuraCloud, a single, Web-based platform.
"SDSC is an important strategic partner for us in that they are the first academic-based production cloud offering and we envision together building an academic-based open cloud infrastructure with a layer of services provided by DuraCloud," said DuraSpace CEO Michele Kimpton.
"SDSC is pleased to deploy its OpenStack-based cloud storage service to support essential data preservation services for academic institutions and other users," said Ron Joyce, SDSC's associate director of IT services. "This new partnership is a win-win for both of us. It provides SDSC with a new avenue of potential opportunities in the academic community as we look to fill a critical gap in academic data management services for institutions across the country, and gives DuraCloud expanded resources with which to grow their own customer base."
Integrating DuraCloud with SDSC's cloud storage services gives customers the following benefits:
About SDSC Cloud Storage
SDSC's cloud storage services provide academic and research partners with a cost-effective way to store, share, and archive data, including extremely large data sets. Built on OpenStack's Swift object storage software, the system writes content to multiple physical storage arrays simultaneously, ensuring that at least two verified copies exist on different servers at all times. Once objects are in the SDSC Cloud, they are immediately available over the web to other users, the public, or just the content originator. By assigning a URL to each file or container created, the SDSC Cloud provides a simple way to expose and share any amount of data.
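The URL-per-object model described above can be illustrated with a short sketch. OpenStack Swift addresses every object under a `/v1/<account>/<container>/<object>` path; the endpoint, account, container, and object names below are hypothetical examples, not actual SDSC values.

```python
# Sketch: composing a Swift-style object URL, following OpenStack Swift's
# /v1/<account>/<container>/<object> addressing scheme.
# All names and the endpoint are hypothetical.
from urllib.parse import quote

def swift_object_url(endpoint: str, account: str, container: str, obj: str) -> str:
    """Build the public URL for an object in a Swift-style object store."""
    # quote() percent-encodes characters that are unsafe in a URL path;
    # the object name may itself contain "/" pseudo-directories, so keep those.
    return "{}/v1/{}/{}/{}".format(
        endpoint.rstrip("/"), quote(account), quote(container), quote(obj, safe="/")
    )

url = swift_object_url(
    "https://cloud.example.org", "AUTH_researcher", "climate-data", "run1/output.nc"
)
print(url)
# https://cloud.example.org/v1/AUTH_researcher/climate-data/run1/output.nc
```

Sharing "any amount of data" then reduces to handing out such a URL (or the container-level URL) once the container's access control permits public reads.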
DuraCloud is a platform that enables users to manage, preserve, and access content in the cloud, as well as utilize multiple cloud providers through a single web based dashboard. DuraCloud simplifies the task of moving copies of content into the cloud and storing them with several different providers. As a result, it eliminates the risk of storing content with a single cloud provider.
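The multi-provider idea can be sketched as a thin replication layer: write each object to several independent stores, then read from whichever store still holds it. The provider names and in-memory dicts below are stand-ins for real cloud backends, not DuraCloud's actual API.

```python
# Sketch of multi-provider replication: every write goes to every provider,
# so losing one provider does not lose the content. In-memory dicts stand
# in for real cloud stores; this is an illustration, not DuraCloud's API.

class ReplicatedStore:
    def __init__(self, providers):
        # providers: mapping of provider name -> dict acting as its store
        self.providers = providers

    def put(self, key, data):
        # Write a copy of the object to every provider.
        for store in self.providers.values():
            store[key] = data

    def get(self, key):
        # Read from the first provider that still holds the object.
        for store in self.providers.values():
            if key in store:
                return store[key]
        raise KeyError(key)

stores = {"amazon": {}, "rackspace": {}, "sdsc": {}}
repl = ReplicatedStore(stores)
repl.put("report.pdf", b"...contents...")
stores["amazon"].clear()          # simulate losing one provider
print(repl.get("report.pdf"))     # still retrievable from the others
```

The design choice is simple redundancy: because every provider holds a full copy, any single provider's outage or shutdown leaves the content intact elsewhere, which is the risk DuraCloud's multi-provider model is meant to eliminate.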
Source: San Diego Supercomputer Center
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. To address them, we present a novel federation model that enables end-users to aggregate heterogeneous resources for large-scale problems. The feasibility of this federation model has been demonstrated, in the context of the UberCloud HPC Experiment, by gathering the most comprehensive information to date on the effects of pillars on microfluidic channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to handle computational loads at peak times that cannot be contained within their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab’s Sergey Panitkin discussed the role of Google Compute Engine in providing computational support to ATLAS, a detector of high-energy particles at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
May 23, 2013 |
The study of climate change is one of those scientific problems where it is almost essential to model the entire Earth to attain accurate results and make worthwhile predictions. In an attempt to make climate science more accessible to smaller research facilities, NASA introduced what they call ‘Climate in a Box,’ a system they note acts as a desktop supercomputer.
May 16, 2013 |
When it comes to the cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of doing CFD modeling in the cloud by running a common CFD code on Amazon EC2 HPC instance types with both CPU and GPU cores.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore the potential of heterogeneous computing, but the potential for this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.