April 25, 2012
500% increase in managed servers lets company offer insights from customer feedback and data collection
MOUNTAIN VIEW, Calif., April 25 — Cloudyn today unveiled data collected by its heuristics-based solution and from one-on-one conversations with key customers about trends in cloud usage and spending. The information supports current market thinking about cloud utilization and provides insights into how cloud customers are managing their costs. Most revealing is that despite recent changes to Amazon's AWS reservation system, most users still rely on the more expensive On-Demand pricing, illustrating the need for a cloud cost management tool. Cloudyn customers' cloud usage is also increasing consistently by 7% per month, with weekly usage spikes of up to 70%. This average monthly increase is in line with market data showing that the typical cloud application is changed more than twice a month on average to add more data, power or storage.
Since launching in February, Cloudyn has quadrupled the customer base of its cloud cost management solution and is now actively monitoring approximately 250,000 virtual servers daily, a 500% increase. The average Cloudyn customer spends approximately $100K a year on cloud resources. The company has also extended its support to additional AWS services and now provides insights, control mechanisms and actionable recommendations for Amazon Elastic Compute Cloud (Amazon EC2), Amazon Relational Database Service (Amazon RDS) and Amazon Elastic Block Store (Amazon EBS).
Cloudyn provides IT managers and CIOs with personalized, timely analytics to identify unnecessary spending, unused resources and over-provisioned services; ongoing notifications for proactive control of their cloud expenses; and comprehensive prescriptive actions for continuously optimizing frequently changing pricing plans and deployment configuration while maintaining operational efficiency.
Data collected by Cloudyn shows agility continues to be a critical factor for managing the cloud. Cloud providers introduce pricing plan revisions and reductions almost monthly, making it challenging to keep track of the latest best-price/model updates. For example, recent changes to AWS reservation offerings and price reductions made Light Utilization Reserved Instances the most cost-effective option for any instance running 23% of the year or more. However, Cloudyn is seeing that customers have yet to take full advantage of this.
"The system data and customer feedback show that cloud costs can be confusing for users who try to keep up with cost management on their own," said Sharon Wagner, CEO of Cloudyn. "For example, we observed that more than 75% of all launched instances run for less than 12 hours. Of these short-running instances, about half perform recurring tasks and could have run under a Light Utilization plan that would save money for the organization, but their owners are not aware of that."
Cloudyn discovered that most of its customers remain unaware of the cost implications of the new reservation models and are therefore overpaying for resources. Company data also shows that cloud users tend to default to the most expensive On-Demand pricing plan: only 30% of all instances that could use the Heavy Utilization reservation plan are actually reserved, and less than 1% of the instances that could benefit from the Light Utilization reservation plan use this option.
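The break-even logic behind reserved versus on-demand pricing can be sketched numerically. The following Python snippet compares the annual cost of an on-demand instance against a one-year Light Utilization reservation and solves for the utilization level at which the reservation becomes cheaper. All prices here are illustrative placeholders, not actual AWS rates, so the computed break-even point will differ from the 23% figure cited above.

```python
HOURS_PER_YEAR = 8766  # average hours per year, including leap years

# Illustrative prices only -- NOT actual AWS rates.
on_demand_hourly = 0.08   # $/hour, pay-as-you-go
light_upfront = 69.0      # $ one-time fee, 1-year Light Utilization reservation
light_hourly = 0.039      # $/hour while the reserved instance actually runs

def annual_cost_on_demand(utilization):
    """Cost of running an instance `utilization` fraction of the year on demand."""
    return on_demand_hourly * HOURS_PER_YEAR * utilization

def annual_cost_light_reserved(utilization):
    """Upfront fee plus the discounted hourly rate for hours actually used."""
    return light_upfront + light_hourly * HOURS_PER_YEAR * utilization

def break_even_utilization():
    """Utilization fraction above which the Light reservation is cheaper.

    Solves: on_demand_hourly * H * u == light_upfront + light_hourly * H * u
    """
    return light_upfront / ((on_demand_hourly - light_hourly) * HOURS_PER_YEAR)

u = break_even_utilization()
print(f"Break-even utilization: {u:.1%}")
for util in (0.10, u, 0.50):
    od = annual_cost_on_demand(util)
    lr = annual_cost_light_reserved(util)
    print(f"{util:.0%} utilization: on-demand ${od:,.2f} vs light reserved ${lr:,.2f}")
```

With these sample numbers, any workload running above roughly one-fifth of the year is cheaper under the reservation, which is the kind of comparison short recurring tasks routinely fail to receive when owners default to on-demand.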
Nevertheless, Cloudyn is seeing an increase in the use of the AWS reservation model, especially in the past few months, rising from less than 20% in January 2012 to 45% today. This trend appears to stem from a combination of recent market buzz around cloud cost efficiency and customers' growing awareness of the high costs often incurred by cloud usage.
The nature of cloud applications is also variable. Most are dynamic, tend to scale, and change on a monthly basis. According to information from Cloudyn's customers, in addition to the average 7% monthly growth, weekly usage spikes of up to 70% often occur due to bugs or system malfunctions, such as instances that were not terminated on time or storage aggregation errors that over-utilize storage space. A tool that provides insight into what, how, and when resources are used, and alerts on violations, gives Cloudyn customers the ongoing cost controls they need to resolve such issues as soon as they become apparent.
Founded in 2011, Cloudyn offers organizations clarity and control of cloud costs, actionable recommendations for cost savings and maximized utilization of resources in a dynamic cloud environment. Already actively managing hundreds of thousands of cloud resources today, Cloudyn delivers an average of 40% cloud-related cost reductions to organizations with significant cloud investments in various industries worldwide. Privately held and backed by Elron Electronic Industries Ltd. (TASE: ELRN), Cloudyn's team of software veterans combines extensive experience in cloud computing and network optimization with a proven track record in transforming innovative ideas into viable and valuable products. For more information on the company, visit http://www.cloudyn.com.
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. We therefore present a novel federation model that enables end-users to aggregate heterogeneous resources for large-scale problems. The feasibility of this federation model has been proven, in the context of the UberCloud HPC Experiment, by gathering the most comprehensive information to date on the effects of pillars on microfluidic channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to manage computational loads at peak times that cannot be contained within their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab’s Sergey Panitkin discussed the role of the Google Compute Engine in providing computational support to ATLAS, a detector of high-energy particles at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
May 23, 2013
The study of climate change is one of those scientific problems where it is almost essential to model the entire Earth to attain accurate results and make worthwhile predictions. In an attempt to make climate science more accessible to smaller research facilities, NASA introduced what they call ‘Climate in a Box,’ a system they note acts as a desktop supercomputer.
May 16, 2013
When it comes to cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of doing CFD modeling in the cloud by running a common CFD code on Amazon EC2 HPC instance types with both CPU and GPU cores.
May 10, 2013
Australian visual effects company, Animal Logic, is considering a move to the public cloud.
May 10, 2013
Program provides cash awards up to $10,000 for the best open-source end-user applications deployed on 100G network.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore the potential of heterogeneous computing, but the potential for this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.