November 28, 2012
BURLINGTON, MA, Nov. 28 – Attunity Ltd., a leading provider of information availability software solutions, announced today the official launch and commercial availability of Attunity CloudBeam, a fully-managed data transfer SaaS platform for Amazon Web Services (AWS) Simple Storage Service (S3). With its beta successfully completed, the high-performance, secure data transfer solution will be unveiled and demonstrated live at the AWS re:Invent Customer and Partner Conference, November 27-29 in Las Vegas, NV.
Attunity CloudBeam is designed to move data to, from and between cloud environments quickly, reliably and affordably for AWS customers. Commercially available services include:
In addition to the above commercially available services, Attunity CloudBeam provides the following services in beta:
"A few years ago, enterprises were skeptical about adopting cloud technology due to security concerns," explained Jeffrey M. Kaplan, Managing Director of THINKstrategies Inc., a cloud strategy consultancy, and founder of the Cloud Computing Showplace. "Since then, vendors have addressed these concerns head-on by including state-of-the-art security technology as part of the SaaS platform. There are now legions of successful enterprise cloud customers worldwide, and the market continues to expand. Increasingly, companies are learning that platforms like Attunity CloudBeam can help extend and significantly scale their data center operations to meet Big Data demands more efficiently and affordably in the cloud."
“Attunity is honored to be a part of the Amazon Partner Network, and we are excited to help customers deliver on the promise of Big Data,” explained Matt Benati, VP Global Marketing at Attunity. “Attunity CloudBeam is designed to support many important initiatives, including BI/analytics, content availability, disaster recovery, backup and archive. One of the most interesting use cases we have seen so far during beta involves a customer using our solution for content distribution. This content distributor has been testing Attunity CloudBeam to load videos multiple times per day on a global scale. Data volumes for this service are expected to reach 20 terabytes per month and are managed in combination with AWS S3 so that the company’s local offices can easily access and deliver the content. Simply put, Attunity moves the data that moves today’s business.”
Attunity is a leading provider of information availability software solutions that enable access, sharing and distribution of data, including Big Data, across heterogeneous enterprise platforms, organizations, and the cloud. Our software solutions include data replication, Change Data Capture (CDC), data connectivity, enterprise file replication (EFR) and managed file transfer (MFT). Using Attunity’s software solutions, our customers enjoy significant business benefits from real-time access and availability of data and files where and when needed, across the maze of heterogeneous systems that makes up today’s IT environment.
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. Thus, we present a novel federation model that enables end-users to aggregate heterogeneous resources for large-scale problems. The feasibility of this federation model has been proven, in the context of the UberCloud HPC Experiment, by gathering the most comprehensive information to date on the effects of pillars on microfluidic channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to handle peak computational loads that cannot be contained within their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab’s Sergey Panitkin discussed the role of Google Compute Engine in providing computational support to ATLAS, a high-energy particle detector at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
May 16, 2013
When it comes to the cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of doing CFD modeling in the cloud by running a common CFD code on Amazon EC2 HPC instance types with both CPU and GPU cores.
May 10, 2013
Australian visual effects company, Animal Logic, is considering a move to the public cloud.
May 10, 2013
Program provides cash awards of up to $10,000 for the best open-source end-user applications deployed on a 100G network.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, from drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so using technologies built for affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.