February 15, 2013
SEATTLE, Wash., Feb. 15 – Amazon Web Services, Inc., an Amazon.com company, today announced that Amazon Redshift, a fast, powerful, fully managed, petabyte-scale data warehouse service in the cloud, is now broadly available. With a few clicks in the AWS Management Console, customers can launch an Amazon Redshift cluster, starting with a few hundred gigabytes and scaling to a petabyte or more, for under $1,000 per terabyte per year. Since Amazon Redshift was announced at the AWS re:Invent conference in November 2012, customers using the service during the limited preview have ranged from startups to global enterprises, with datasets from terabytes to petabytes, across industries including social, gaming, mobile, advertising, manufacturing, healthcare, e-commerce, and financial services.
Traditional data warehouses require significant time and resources to administer. In addition, the financial cost of building, maintaining, and growing self-managed, on-premises data warehouses is very high. Amazon Redshift not only significantly lowers the cost of a data warehouse, but also makes it easy to analyze large amounts of data very quickly. With Amazon Redshift, customers can dramatically increase query performance when analyzing data sets of virtually any size, using the same SQL-based business intelligence tools they use today. Amazon Redshift uses a number of techniques, including columnar data storage, advanced compression, and high-performance I/O and networking, to achieve significantly higher performance than traditional databases for data warehousing and analytics workloads. Amazon Redshift is fully managed, automating the common tasks associated with provisioning, configuring, monitoring, backing up, scaling, and securing a data warehouse. Amazon Redshift is currently available in the US East (N. Virginia) Region and will roll out to other AWS Regions in the coming months.
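The benefit of columnar storage for analytics can be seen in a toy sketch. This is an illustration of the general technique only, not Redshift's internal format: storing all values of a column together exposes the repetition that compressors exploit, which is why column stores pair well with advanced compression.

```python
import zlib

# Toy table: (id, country, amount). Analytic tables repeat column values heavily.
rows = [(i, "US" if i % 2 else "DE", (i % 5) * 100) for i in range(1000)]

# Row-oriented layout: all fields of each record are stored together.
row_layout = ";".join(",".join(map(str, r)) for r in rows).encode()

# Column-oriented layout: all values of each column are stored together.
col_layout = ";".join(",".join(map(str, c)) for c in zip(*rows)).encode()

# The same information compresses far better in columnar form, because
# similar values (e.g. the country codes) sit next to each other.
row_size = len(zlib.compress(row_layout))
col_size = len(zlib.compress(col_layout))
print(col_size < row_size)  # True
```

The effect grows with wider tables and lower-cardinality columns, which is typical of warehouse fact tables.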
“When we set out to build Amazon Redshift, we wanted to leverage the massive scale of AWS to deliver ten times the performance at one-tenth the cost of the on-premises data warehouses in use today,” said Raju Gulabani, Vice President of Database Services, Amazon Web Services. “With order-of-magnitude improvements in price/performance, Amazon Redshift makes big data analytics accessible to more people, allowing large organizations to analyze more of their data and smaller ones to afford fast, scalable data warehousing technology. We are delighted by the excitement from our preview customers as they’ve experienced the performance improvement and lower costs that Amazon Redshift delivers.”
Hundreds of customers participated in the Amazon Redshift limited preview, and the benefits they cited most frequently were significantly lower cost, substantially improved performance, and freedom from the undifferentiated heavy lifting of operating an on-premises data warehouse.
HasOffers records and reports billions of desktop and mobile interactions for performance marketers. "Amazon Redshift introduces a major opportunity to improve the performance of our real-time reporting, allowing us to run queries up to 50 times faster than our current OLAP solution," said Nick Sanders, VP of Engineering at HasOffers. "On top of that, Amazon Redshift reduces the overhead of managing our own stack and drastically simplifies our sharded schema design through distribution and sort keys. Fundamentally, we believe the combination of price and performance provided by Amazon Redshift will shake up the data warehousing world."
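The schema simplification Sanders describes comes from key-based data placement: when two tables are distributed on the same key, matching rows land on the same node slice, so joins on that key need no cross-node data movement. A toy sketch of hash distribution follows; it is my own illustration of the general idea, not Redshift code, and the slice count and table names are invented.

```python
N_SLICES = 4  # hypothetical number of cluster slices

def slice_for(dist_key: str, n_slices: int = N_SLICES) -> int:
    """Assign a row to a slice by hashing its distribution key."""
    return hash(dist_key) % n_slices

# Two tables distributed on the same key: user_id.
users = [(f"user-{i}", f"name-{i}") for i in range(50)]
clicks = [(f"user-{i % 50}", f"page-{i}") for i in range(500)]

users_by_slice = {u_id: slice_for(u_id) for u_id, _ in users}

# Every click row lands on the same slice as its matching user row,
# so a join on user_id can be executed entirely slice-locally.
co_located = all(slice_for(u_id) == users_by_slice[u_id] for u_id, _ in clicks)
print(co_located)  # True
```

Sort keys play a complementary role: keeping each slice's data ordered on a commonly filtered column lets range predicates skip large blocks of storage.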
Photobox is one of Europe’s leading online photo service providers. “We started using Amazon Redshift immediately after it was announced and we obtained crazy performance, especially in loading data," said Maxime Mézin, Data Scientist at Photobox. “It took just 5 minutes to load a dataset that previously took days to extract on our side.”
Kongregate’s gaming portal provides thousands of free online and mobile games to social gamers. “Kongregate was able to realize exponential gains in performance by rolling one of our largest data tables off our main database and onto the Amazon Redshift platform,” said Jim Greer, CEO & Co-Founder, Kongregate. “Amazon Redshift has enabled us to perform traffic analysis at a scale that was previously impossible."
Accordant Media processes over twenty billion impressions per day for digital marketing campaigns. “After testing many relational and non-relational database and data warehouse options, Amazon Redshift is our clear winner," said James Rooney, Vice President of Platforms at Accordant Media. "Accordant requires both speed and efficiency when handling the massive data generated by our advertising campaigns, and we have been tremendously pleased with the performance, price, and ease of use of Amazon Redshift.”
Scribol recommends content to publishers based on their readers’ interests. "Amazon Redshift enables us to rapidly analyze and identify data points across our audience of millions of users and billions of actions,” said Tomek Klas, CTO, Scribol. “We’ve been pleased thus far with our experience in the preview and are very excited about the performance and scalability of Amazon Redshift. The fact that this comes with an attractive price tag has us sold on the technology and makes Redshift an essential part of our technology stack."
In the ten weeks since Amazon Redshift was announced, AWS technology software partners including SAP, IBM, Informatica, Tableau, Attunity, Actuate, Pentaho, Talend, Birst, Roambi and Pervasive have joined MicroStrategy and Jaspersoft in enabling customers to continue using the tools they use today. A growing number of these partners’ solutions that leverage Amazon Redshift are available from the AWS Marketplace for 1-Click deployment with pay-as-you-go pricing. Technology consulting companies including Capgemini, Cognizant, and Full360 have consultants ready to help customers with their Amazon Redshift implementations.
"Having implemented many large-scale data management solutions, we are excited to use Amazon Redshift to deliver high-performance, large-scale, and extremely low-cost managed information systems to our clients,” said Karthik Krishnamurthy, Global Head of Business Intelligence and Data Warehousing at Cognizant. “By combining Cognizant's expertise with the performance, price, and ease of deployment and operation of Amazon Redshift, we will empower our customers to spend more on growing their businesses and less on their IT."
"Digital marketers have seen an explosion in the amount and variety of customer data," said StrongMail CTO Jeremy Sterns. "Working with AWS to leverage Amazon Redshift with our next-generation data model will give StrongMail's enterprise marketers enormous cost, performance, and ease-of-use advantages for executing better-targeted, more effective, insight-driven marketing campaigns. These technologies open up possibilities unavailable from legacy email service providers with siloed databases."
About Amazon Web Services
Launched in 2006, Amazon Web Services, Inc. began exposing key infrastructure services to businesses in the form of web services -- now widely known as cloud computing. The ultimate benefit of cloud computing, and AWS, is the ability to leverage a new business model and turn capital infrastructure expenses into variable costs. Businesses no longer need to plan and procure servers and other IT resources weeks or months in advance. Using AWS, businesses can take advantage of Amazon's expertise and economies of scale to access resources when their business needs them, delivering results faster and at a lower cost. Today, Amazon Web Services provides a highly reliable, scalable, low-cost infrastructure platform in the cloud that powers hundreds of thousands of enterprise, government, and startup businesses in 190 countries around the world. Amazon Web Services offers over 30 different services, including Amazon Elastic Compute Cloud (Amazon EC2), Amazon Simple Storage Service (Amazon S3), and Amazon Relational Database Service (Amazon RDS). AWS services are available to customers from data center locations in the U.S., Brazil, Europe, Japan, Singapore, and Australia.