December 10, 2008
SEATTLE, Dec. 10 -- Amazon Web Services LLC (AWS), a subsidiary of Amazon.com, Inc., today announced that it has extended Amazon Elastic Compute Cloud (Amazon EC2) to Europe.
European developers and businesses can now run their Amazon EC2 instances in multiple Availability Zones in the EU to help achieve lower latency, operate closer to other resources like Amazon S3 in the EU, and meet EU data storage requirements when applicable. The new European Region for Amazon EC2 contains two Availability Zones, enabling developers to easily and cost-effectively run fault-tolerant applications with the same scalability, reliability, and cost efficiency achieved with Amazon EC2 in the U.S. Amazon EC2 reduces the time required to obtain and boot new virtual server instances to minutes, allowing developers to scale capacity, both up and down, as computing requirements change. This enables businesses to reduce capital expense and pay for resources as they are consumed. Developers worldwide can begin using Amazon EC2 and other AWS infrastructure services by going to http://aws.amazon.com.
With today's launch, European developers and businesses with European customers can take advantage of the latest features for Amazon EC2, including multiple Availability Zones, Elastic IP addresses, and Amazon Elastic Block Store (Amazon EBS). In the near future, Amazon EC2 will also add support in the EU for Windows Server and SQL Server, a feature recently introduced on Amazon EC2 in the United States.
"We are very excited to fulfill one of our top user requests by launching Amazon EC2 in Europe. In conjunction with Amazon S3 and Amazon CloudFront in the EU, Amazon Web Services customers can now locate their storage, distribution, and compute in Europe, better allowing them to achieve low latency to their European customers and to meet EU data storage requirements when applicable," said Peter De Santis, general manager of Amazon EC2. "Starting today, businesses needing globally distributed infrastructure can easily manage and deploy their storage and compute with a simple, powerful set of Web service APIs and tools."
"The ease of getting up and running on Amazon EC2 has made us rethink the way we currently operate our distributed computing on the Grid," said Stefan Kluth, local project lead in Munich for computing for the ATLAS experiment at the LHC accelerator at CERN. "We're excited by the potential to move faster than previously possible because we can focus on the experiment as opposed to the technology to run it."
"For the last two years CohesiveFT has been working to help European customers leverage the power of Amazon Web Services," said Alexis Richardson, co-founder and director of business development at CohesiveFT. "The launch of Amazon EC2 services in Europe sets the stage for an exciting new wave of production deployments with reduced latency, expanded regulatory compliance options, and the ability to deploy scale-up/scale-out architectures within the region."
Amazon EC2 lets developers pay only for what they consume and charges no up-front or minimum fee. The following are U.S. dollar prices for Amazon EC2 in Europe:
Standard (per instance hour consumed)
$0.11 for small instances
$0.44 for large instances
$0.88 for x-large instances
High CPU (per instance hour consumed)
$0.22 for medium instances
$0.88 for x-large instances
Data Transfer
$0.10 per GB - all data transfer in
$0.17 per GB - first 10 TB / month data transfer out
$0.13 per GB - next 40 TB / month data transfer out
$0.11 per GB - next 100 TB / month data transfer out
$0.10 per GB - over 150 TB / month data transfer out
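To make the tiered charges above concrete, here is a sketch of a monthly bill estimator. The function name is ours, and it assumes for simplicity that the tier boundaries use 1 TB = 1,000 GB:

```python
def monthly_ec2_cost_eu(hours_small=0, hours_large=0, hours_xlarge=0,
                        gb_in=0.0, gb_out=0.0):
    """Estimate a monthly EU Amazon EC2 bill in USD from the prices above.

    Instance hours are billed flat; data transfer out is tiered:
    first 10 TB at $0.17/GB, next 40 TB at $0.13/GB, next 100 TB at
    $0.11/GB, and everything over 150 TB at $0.10/GB.
    """
    cost = 0.11 * hours_small + 0.44 * hours_large + 0.88 * hours_xlarge
    cost += 0.10 * gb_in
    # Walk the data-transfer-out tiers (sizes in GB, assuming 1 TB = 1,000 GB).
    remaining = gb_out
    for tier_size, rate in [(10_000, 0.17), (40_000, 0.13), (100_000, 0.11)]:
        chunk = min(remaining, tier_size)
        cost += chunk * rate
        remaining -= chunk
    cost += remaining * 0.10  # anything over 150 TB
    return round(cost, 2)
```

For example, one small instance running around the clock for a 30-day month (720 hours) costs 720 × $0.11 = $79.20 before data transfer.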
About Amazon EC2
Amazon Elastic Compute Cloud (http://aws.amazon.com/ec2) is a Web service that provides resizable compute capacity in the cloud. Amazon EC2's simple Web service interface allows businesses to obtain and configure capacity with minimal friction. It provides complete control of your computing resources and lets you run on Amazon's proven computing environment. Amazon EC2 reduces the time required to obtain and boot new server instances to minutes, allowing you to quickly scale capacity, both up and down, as your computing requirements change. Amazon EC2 changes the economics of computing by allowing you to pay only for capacity that you actually use.
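As a minimal illustration of the "obtain and configure capacity" workflow described above, the sketch below builds the parameters for launching an instance pinned to an EU Availability Zone. It uses boto3, a modern AWS SDK, purely for illustration (the 2008-era interfaces were the SOAP/Query APIs and command-line tools); the function name is ours and the AMI ID is a placeholder, not a real image:

```python
def eu_run_instances_params(ami_id, zone="eu-west-1a", count=1):
    """Build keyword arguments for an EC2 RunInstances call pinned to
    a specific EU Availability Zone."""
    return {
        "ImageId": ami_id,
        "InstanceType": "m1.small",   # the $0.11/hour EU small instance
        "MinCount": count,
        "MaxCount": count,
        "Placement": {"AvailabilityZone": zone},
    }

# Usage (requires AWS credentials, so it is commented out here):
# import boto3
# ec2 = boto3.client("ec2", region_name="eu-west-1")
# ec2.run_instances(**eu_run_instances_params("ami-xxxxxxxx"))
```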
About Amazon.com
Amazon.com, Inc., a Fortune 500 company based in Seattle, opened on the World Wide Web in July 1995 and today offers Earth's Biggest Selection. Amazon.com, Inc. seeks to be Earth's most customer-centric company, where customers can find and discover anything they might want to buy online, and endeavors to offer its customers the lowest possible prices. Amazon.com and other sellers offer millions of unique new, refurbished and used items in categories such as books, movies, music & games, digital downloads, electronics & computers, home & garden, toys, kids & baby, grocery, apparel, shoes & jewelry, health & beauty, sports & outdoors, and tools, auto & industrial.
About Amazon Web Services
Amazon Web Services provides Amazon's developer customers with access to in-the-cloud infrastructure services based on Amazon's own back-end technology platform, which developers can use to enable virtually any type of business. Examples of the services offered by Amazon Web Services are Amazon Elastic Compute Cloud (Amazon EC2), Amazon Simple Storage Service (Amazon S3), Amazon SimpleDB, Amazon Simple Queue Service (Amazon SQS), and Amazon Flexible Payments Service (Amazon FPS). Amazon and its affiliates operate Websites, including www.amazon.com, www.amazon.co.uk, www.amazon.de, www.amazon.co.jp, www.amazon.fr, www.amazon.ca, and the Joyo Amazon Websites at www.joyo.cn and www.amazon.cn.
Experimental scientific HPC applications continue to move to the cloud, as covered here several times over the last couple of weeks. Among those stories, CloudSigma co-founder and CEO Robert Jenkins penned an article for HPC in the Cloud discussing the emergence of cloud technologies to supplement the research capabilities of big scientific initiatives like CERN and the European Space Agency (ESA)...
When considering moving excess or experimental HPC applications to a cloud environment, there will always be obstacles. Were that not the case, the cost effectiveness of cloud-based HPC would rule the high performance landscape. Jonathan Stewart Ward and Adam Barker of the University of St. Andrews produced an intriguing report on the state of cloud computing, paying a significant amount of attention to the problems facing cloud computing.
Cloud computing has become mainstream in today’s HPC world. To enable HPC researchers who currently work with large distributed computing systems to bring their expertise to cloud computing, it is essential to provide them with easier means of applying their knowledge.
Jun 17, 2013 | With that in mind, Datapipe hopes to establish itself as a green-savvy HPC cloud provider with its recently announced Stratosphere platform. Datapipe markets Stratosphere as a green HPC cloud service, and in doing so has partnered with Verne Global and its Icelandic datacenter, which is known for its green computing credentials.
Jun 12, 2013 | Cloud computing is gaining ground among mid-sized institutions looking to expand their experimental high performance computing resources. Accordingly, IBM released a set of its Redbooks publications, in part to assist institutions in moving high performance computing applications to the cloud.
Jun 06, 2013 | The San Diego Supercomputer Center launched a public cloud system for universities in the area, designed specifically to run on commodity hardware with high performance solid-state drives. The center, which currently holds 5.5 PB of raw storage, is open to educational and research users across the University of California.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.