November 27, 2012
Watching the competition among public cloud providers is like following a multi-player game of ping pong – there's a lot of back and forth. On Monday, Google delivered a counter-hit to rival Amazon, unveiling upgrades to its infrastructure-as-a-service (IaaS) offering, Google Compute Engine, along with reduced storage pricing and expanded European datacenter support.
When Google Compute Engine debuted in June, it supported just four standard instance types. In the coming weeks, Google will be rolling out 36 additional instance types, and pricing of the four original instances will be cut by 5 percent.
Google Product Management Director Jessie Jiang outlined the new instance categories in the company's announcement.
Google is also decreasing the cost of its standard storage offering by over 20 percent, from $0.12 per GB down to $0.095 per GB (for the first terabyte). And for customers who are willing to trade data availability for a lower price point, Google is announcing Durable Reduced Availability (DRA) storage, at a cost of $0.07 per GB for the first TB.
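The "over 20 percent" figure checks out with a quick back-of-the-envelope calculation. The sketch below is purely illustrative, using the per-GB rates quoted above (which apply only to the first terabyte), and compares a month of storing 1 TB under the old standard rate, the new standard rate, and DRA:

```python
# Illustrative cost comparison using the quoted first-terabyte rates.
OLD_STANDARD = 0.12   # $/GB/month, previous standard storage price
NEW_STANDARD = 0.095  # $/GB/month, reduced standard storage price
DRA = 0.07            # $/GB/month, Durable Reduced Availability price

gb = 1000  # roughly one terabyte, stored for one month

print(f"Old standard: ${OLD_STANDARD * gb:.2f}")  # $120.00
print(f"New standard: ${NEW_STANDARD * gb:.2f}")  # $95.00
print(f"DRA:          ${DRA * gb:.2f}")           # $70.00

# Relative size of the standard-storage price cut
cut = (OLD_STANDARD - NEW_STANDARD) / OLD_STANDARD
print(f"Standard price cut: {cut:.1%}")  # about 20.8%
```

At this volume, DRA shaves another $25 a month off the already-reduced standard rate, which is the availability trade-off Google is asking customers to weigh.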
Yet another new service, Object Versioning, is designed to help protect against accidental overwriting or deletion. And Persistent Disk Snapshotting, which lets users create backups that they can transfer around Google datacenters, is also in the works.
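As a rough illustration of the idea behind object versioning – this is a toy model, not Google's actual API – an overwrite of a named object retains the prior version rather than destroying it, so an accidental overwrite can be rolled back:

```python
class VersionedBucket:
    """Toy model of versioned object storage: writes never destroy old data."""

    def __init__(self):
        self._versions = {}  # object name -> list of historical contents

    def put(self, name, data):
        # An overwrite appends a new version instead of replacing the old one.
        self._versions.setdefault(name, []).append(data)

    def get(self, name, generation=-1):
        # By default return the latest version; older generations stay reachable.
        return self._versions[name][generation]


bucket = VersionedBucket()
bucket.put("report.csv", "v1 contents")
bucket.put("report.csv", "v2 contents")  # accidental overwrite
print(bucket.get("report.csv"))          # latest: "v2 contents"
print(bucket.get("report.csv", 0))       # still recoverable: "v1 contents"
```

The real service works at the storage layer rather than in application code, but the recovery property is the same: older generations of an object remain addressable after an overwrite.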
Google is actively seeking to expand its European presence. Google App Engine, Google Cloud Storage and Google Cloud SQL will be accessible from Europe-based datacenters, with Google Compute Engine to follow soon.
Two weeks ago, the search giant announced enhancements to its MySQL database, Google Cloud SQL, including faster performance, larger databases (100GB), and EU availability.
The latest upgrades to Google's cloud portfolio were unveiled the day before Amazon kicked off its first annual user conference, AWS re:Invent, in Las Vegas. Google Compute Engine is still in preview mode and no official launch date has been released by the company.