February 07, 2012
Amazon has cut prices on S3, its cloud-based storage service, effective Feb. 1. The news comes a week after the company announced that S3 had achieved an impressive year-over-year growth rate of 192%. At the end of 2011, there were 762 billion objects stored in Amazon S3, with 500,000 requests per second coming in during peak times.
Here's what Amazon Web Services Evangelist Jeff Barr had to say when he broke the news on the company blog:
We continue to innovate on your behalf to drive down storage costs and pass along the resultant savings to you at every possible opportunity. We are now happy (some would even say excited) to announce another in a series of price reductions.
Barr also talks up the benefits that AWS's scale and focus create for its customers. "Our ability to lower prices again now is an example of this principle at work," he writes.
Amazon S3 pricing is based on a tiered system, so the actual discount depends on the amount of data stored. For example, a customer storing 50 TB will see roughly a 12% price reduction, while a customer storing an average of 500 TB will see costs drop by about 13.5%. When S3 launched in 2006, it cost $0.15 per GB per month for the first TB; that same first TB now costs $0.125 per GB per month.
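Tiered pricing means each rate applies only to the gigabytes that fall within its tier, not to the whole total. A minimal sketch of that calculation is below; the first-TB rates are the figures quoted above ($0.150 at launch vs. $0.125 after the cut), while the second-tier boundary and rate are purely illustrative assumptions, since the article does not list the full price table.

```python
def monthly_cost(total_gb, tiers):
    """Compute a tiered monthly bill.

    tiers: list of (tier_size_gb, rate_per_gb_month) pairs, in order;
    a tier_size of None means the tier is unbounded (the last tier).
    """
    cost = 0.0
    remaining = total_gb
    for size, rate in tiers:
        if remaining <= 0:
            break
        # Only the GBs that fall inside this tier are billed at this rate.
        in_tier = remaining if size is None else min(remaining, size)
        cost += in_tier * rate
        remaining -= in_tier
    return cost

# First TB uses the article's quoted rates; the second tier is hypothetical.
new_tiers = [(1024, 0.125), (None, 0.110)]
old_tiers = [(1024, 0.150), (None, 0.135)]

print(monthly_cost(1024, new_tiers))  # 1024 GB * $0.125 = $128.00
```

Under this scheme a price cut to any tier lowers the bill for all existing data in that tier, which is the "added advantage" Barr describes below.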
Barr reminds readers that an "added advantage of using a cloud storage service such as Amazon S3 over using your own on-premise storage is that with cloud storage, the price reductions that we regularly roll out apply not only to any new storage that you might add but also to the existing storage that you have."
Amazon's storage rates vary depending on where the data is stored. Prices for standard storage customers are listed below, while pricing for other regions can be found on the Amazon S3 Pricing page. Users of the AWS GovCloud region will also see lower prices, according to Barr.
| Storage | Old (GB / Month) | New (GB / Month) |
| --- | --- | --- |
| Next 4000 TB | $0.080 | $0.080 (no change) |
| Over 5000 TB | $0.055 | $0.055 (no change) |