November 05, 2012
LAS VEGAS, SEG, Nov. 5 — At the 82nd annual Society of Exploration Geophysicists Meeting and Exhibition, Aspera, Inc., creator of next-generation technologies that move the world’s data at maximum speed, and OvationData, a global leader in Exploration and Production (E&P) data management services and solutions, announced a global partnership to deliver complete end-to-end solutions for full lifecycle management of massive volumes of seismic data: from the field to the data center, to the desktop, and on to archive storage.
Aspera’s patented fasp software brings breakthrough cost savings and efficiency gains to organizations that need to move large volumes of data over public and private IP networks. Easy to deploy, fully cross-platform, and agnostic to file and network type, Aspera solutions deliver unprecedented levels of transfer performance, including maximum speed, security and bandwidth efficiency. A standard in industries such as Media and Entertainment, Software and Gaming, Telecommunications and Life Sciences, Aspera’s transport technology and solutions have seen growing adoption within the E&P industry, as both large and small oil and gas companies grapple with the challenges of managing ever-growing volumes of seismic data.
Under the arrangement, OvationData will add Aspera’s software to its broad portfolio of lifecycle data management solutions covering the transfer, conversion, management, storing, migration and preservation of geophysical data.
“Gaining access to both legacy and current seismic data inevitably leads to vast amounts of data that must be moved from remote sites and locations into and between geographically dispersed data centers,” said Gregory Servos, president and COO, OvationData. “Aspera’s high-speed data transport solutions are a perfect complement to our broad portfolio of specialized data management and delivery services.”
“Physical media shipments and traditional WAN transfer technologies are inefficient, slow, do not scale and create artificial bottlenecks within business processes,” said Richard Heitmann, vice president of marketing, Aspera. “OvationData’s long-standing leadership in the oil and gas segment will help companies leverage our software to move large seismic data sets from the field to the data center, lowering costs and reducing the turnaround times for data processing and distribution.”
Since 1976, Ovation Data Services (OvationData) and its subsidiaries, including DPTS, have helped thousands of customers worldwide preserve the integrity of their vital information assets. With technology and service centers in the United States and United Kingdom, and operating globally, OvationData has provided services and solutions to over 1,500 companies within the past three years alone. OvationData supports over 230 different digital media technologies from the past 50-plus years, along with various analog tape formats dating back even further. Today, OvationData is recognized as a leader in understanding the science and technology of digital data storage hardware and software solutions.
Aspera is the creator of next-generation transport technologies that move the world’s data at maximum speed regardless of file size, transfer distance and network conditions. Based on its patented fasp protocol, Aspera software fully utilizes existing infrastructure to deliver the fastest, most predictable file-transfer experience. Aspera’s core technology delivers unrivaled control over bandwidth utilization, complete security and uncompromising reliability. As organizations turn to the cloud for improved efficiency and unprecedented scalability, Aspera enables data- and processing-intensive workflows with high-speed transfer available on-demand, as well as maximum speed ingest and distribution of big data to and from cloud storage. More than 1,800 organizations across a variety of industries on six continents rely on Aspera software for the business-critical transport of their digital assets.