January 29, 2013
SEATTLE, Wash., Jan. 29 – Amazon Web Services, Inc. (AWS), an Amazon.com company, today announced Amazon Elastic Transcoder, a highly scalable service for transcoding video files between different digital media formats. For example, customers can use Amazon Elastic Transcoder to convert their large high-resolution “master” video files into smaller versions that are optimized for playback on websites, mobile devices, connected TVs and other video platforms. Amazon Elastic Transcoder removes the need to manage infrastructure and transcoding software, providing scalability and performance by leveraging AWS services. The service manages all aspects of the transcoding process transparently and automatically. It also supports pre-defined transcoding presets that make it easy to transcode video for smartphones, tablets, web browsers and other devices. With Amazon Elastic Transcoder, customers can create enterprise, training, user-generated, broadcast, or other video content for their applications or websites.
Traditionally, transcoding has been complex for customers in three significant ways. First, customers need to buy and manage transcoding software, which can be expensive and requires substantial configuration and management. Second, customers must create and test audio and video settings for each device they want to support. This is often a trial-and-error process, which can be wasteful because compute resources are consumed each time a new combination of settings is tried. Third, customers need to provision enough transcoding capacity to accommodate peak workloads. This can be expensive because, most of the time, that capacity sits underutilized.
With Amazon Elastic Transcoder, these complexities are eliminated. There is no need to buy, configure or manage the underlying transcoding software. In addition, Amazon Elastic Transcoder provides pre-defined presets for popular devices that remove the trial and error in finding the right settings and output formats for different devices. The service also supports custom presets (pre-defined settings made by the customer), making it easy for customers to create reusable transcoding settings for their unique requirements, such as a specific video size or bitrate. Finally, Amazon Elastic Transcoder automatically scales up and down to handle customers’ workloads, eliminating wasted capacity and minimizing time spent waiting for jobs to complete. The service also enables customers to process multiple files in parallel and organize their transcoding workflow using a feature called transcoding pipelines. Using transcoding pipelines, customers can configure Amazon Elastic Transcoder to transcode their files when and how they want, so they can efficiently and seamlessly scale for spiky workloads. For example, a news organization may want a “high priority” transcoding pipeline for breaking news stories, or a user-generated content website may want separate pipelines for low, medium, and high resolution outputs to target different devices.
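To make the pipeline and preset concepts concrete, the sketch below assembles the kind of create-job request an application would submit for one input file and several output renditions. The pipeline ID, object keys, and preset IDs are illustrative placeholders, not values from this announcement; in practice the request would be sent through an AWS SDK call against the Elastic Transcoder API.

```python
# Sketch: submitting jobs to a "high priority" transcoding pipeline.
# All identifiers below are hypothetical placeholders for illustration.

def build_job_request(pipeline_id, input_key, output_key, preset_id):
    """Assemble create-job parameters for one input file and one output rendition."""
    return {
        "PipelineId": pipeline_id,            # which pipeline processes the job
        "Input": {"Key": input_key},          # source object in the pipeline's input bucket
        "Output": {
            "Key": output_key,                # rendition written to the output bucket
            "PresetId": preset_id,            # a pre-defined or customer-created preset
        },
    }

# One job per target device class; the service can process these in parallel.
requests = [
    build_job_request("1111111111111-abcde1", "masters/breaking-news.mov",
                      "web/breaking-news-720p.mp4", "preset-web-720p"),
    build_job_request("1111111111111-abcde1", "masters/breaking-news.mov",
                      "mobile/breaking-news-480p.mp4", "preset-mobile-480p"),
]
```

Keeping the pipeline ID separate from the job parameters mirrors the pipeline feature described above: the same request-building code can target a “high priority” pipeline or a lower-priority one simply by switching IDs.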
“Our customers told us that it was difficult and expensive to transcode video due to the explosion in the number of devices they need to support,” said Charlie Bell, Vice President of Utility Computing Services. “They had to be both experts in the intricacies of video support on different devices and manage the software required to run the transcoding jobs. None of this work had anything to do with their goal: getting a high quality video that would look great on the devices they wanted. We built Amazon Elastic Transcoder to give our customers an easy, cost effective way to solve these problems.”
“The Language Learning Center offers hundreds of hours of video content in over 50 languages to students and faculty, with a growing library of video assets,” said Bob Majors, Senior Computing Specialist, University of Washington. “With Amazon Elastic Transcoder, we’ve been very impressed with how easy it is to convert our content into versions that work well on the web and on mobile devices.”
“Zuffa encodes videos of the Ultimate Fighting Championship and prides itself on delivering the best and most up-to-date content to its fans. Amazon Elastic Transcoder has the horsepower to scale for very large videos, even hundreds of gigabytes in size, and this will help us continue to be leaders in the digital space,” said Christy King, VP Technology R&D, Zuffa, LLC.
OneScreen is a technology solutions provider that connects video producers, publishers, and advertisers across all screens through its Media Graph platform. “At OneScreen, we are excited to add Amazon Elastic Transcoder to our broad utilization of the AWS cloud,” said Patrick Ting, CTO, OneScreen, Inc. “With the scalability and cost-efficiency the service offers, we’re able to pass those benefits on to our producer and aggregator partners, making it easier for them to bring high quality content to the market.”
Amazon Elastic Transcoder offers simple pay-as-you-go pricing. Customers are charged based on the number of minutes they transcode and the selected resolution. There are no upfront fees or minimum commitments. To help customers understand how the service can be used with their applications, AWS provides a free tier, under which the first 20 minutes of content transcoded each month are free of charge. Amazon Elastic Transcoder is available in six regions: US East (N. Virginia), US West (Oregon), US West (N. California), EU (Ireland), Asia Pacific (Singapore) and Asia Pacific (Tokyo).
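The pay-as-you-go model above can be sketched as simple arithmetic: charge only the minutes beyond the monthly free tier, at a per-minute rate that depends on resolution. The rates used below are assumed purely for illustration; actual per-minute pricing varies by resolution and region.

```python
# Illustrative monthly cost under per-minute pricing with a 20-minute free tier.
# The rates passed in are assumed example values, not published prices.

FREE_TIER_MINUTES = 20  # first 20 minutes of content transcoded each month are free

def monthly_cost(minutes_transcoded, rate_per_minute):
    """Bill only the minutes beyond the free tier; no upfront fees or minimums."""
    billable = max(0, minutes_transcoded - FREE_TIER_MINUTES)
    return round(billable * rate_per_minute, 2)

print(monthly_cost(100, 0.030))  # 80 billable minutes at an assumed rate -> 2.4
print(monthly_cost(15, 0.015))   # entirely within the free tier -> 0.0
```

Because there is no minimum commitment, a customer transcoding nothing in a given month pays nothing; the cost function simply scales with usage.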
About Amazon Web Services
Launched in 2006, Amazon Web Services (AWS) began exposing key infrastructure services to businesses in the form of web services, now widely known as cloud computing. The ultimate benefit of cloud computing, and AWS, is the ability to leverage a new business model and turn capital infrastructure expenses into variable costs. Businesses no longer need to plan and procure servers and other IT resources weeks or months in advance. Using AWS, businesses can take advantage of Amazon's expertise and economies of scale to access resources when their business needs them, delivering results faster and at a lower cost. Today, Amazon Web Services provides a highly reliable, scalable, low-cost infrastructure platform in the cloud that powers hundreds of thousands of enterprise, government and startup customers' businesses in 190 countries around the world. AWS offers over 30 different services, including Amazon Elastic Compute Cloud (Amazon EC2), Amazon Simple Storage Service (Amazon S3) and Amazon Relational Database Service (Amazon RDS). AWS services are available to customers from data center locations in the U.S., Brazil, Europe, Japan, Singapore and Australia.