November 29, 2012
SAN FRANCISCO, Calif., Nov. 29 – Boundary, the cloud application monitoring leader, has partnered with Locaweb, the leading provider of infrastructure hosting services to companies operating in Brazil and Latin America. Through the agreement, Locaweb customers, including many of the region's largest enterprise organizations, will have access to Boundary's real-time cloud failure detection system, the first solution capable of alerting application providers to cloud outages before consumers suffer the consequences.
Locaweb was the first data center operator in Brazil to provide cloud computing solutions. Today, nearly 24 percent of Brazilian Internet traffic runs through its data centers. The hosting provider employs over 900 people, has more than 250,000 customers, and owns three data centers (two in Brazil and one in Miami) with upwards of 25,000 servers.
By partnering with Boundary, Locaweb customers in Latin America can more confidently migrate their applications to cloud and hybrid IT environments. Unlike traditional IT monitoring tools that assume a static infrastructure, Boundary is ideal for dynamic environments that experience change on a massive and continuous scale, including public and private clouds, fixed networks, data centers, highly-agile application development environments, SQL or NoSQL clusters and Big Data application stacks.
"Many of our customers are interested in capitalizing on the benefits of hosting applications in the cloud, yet legacy monitoring applications don't provide the visibility of performance that is necessary to ensure stability in these dynamic environments," said Gilberto Mautner, CEO at Locaweb. "Only Boundary can tell our customers when a cloud outage is happening and exactly where the problem is, so they can work around the problem and maintain service availability. Boundary provides the real-time insight, tools and services that our customers need to be successful."
"Locaweb is the largest cloud infrastructure as a service (IaaS) provider in Latin America, and we are thrilled to enter this exciting market with them as a partner," said Gary Read, CEO at Boundary. "There is a tremendous appetite among companies in Latin America and elsewhere for cloud applications, but providers are understandably concerned about their ability to guarantee high levels of service. Boundary is their insurance policy. Our solution monitors the performance of cloud applications every single second, so users can navigate around problems before customers are impacted."
Locaweb is the leader in hosting and infrastructure for Brazil and Latin America, according to IDC (2011). With 14 years of expertise and partnerships with more than 21,000 developers, the company provides hosting for more than 250,000 customers and operates a data center with capacity for 25,000 servers, delivering solutions across three service models: infrastructure as a service (IaaS), with on-demand provisioning of cloud and dedicated servers with managed services; platform as a service (PaaS), with hosting; and software as a service (SaaS), with hosted software solutions such as Email, Email Marketing, Online Store, Virtual PABX, Webchat and WebDesk. Locaweb believes that innovation, quality of service and a highly qualified team are the key factors for success in the market.
Boundary provides a new kind of application monitoring for new IT architectures: one-second app visualization, cloud-compatible, and only a few minutes from setup to results. Boundary is a privately-held company based in San Francisco, California, with venture funding from Lightspeed Venture Partners and Scale Venture Partners.
Researchers from the Suddhananda Engineering and Research Centre in Bhubaneswar, India, developed a job scheduling system they call Service Level Agreement (SLA) scheduling, designed to deliver resource provisioning comparable to that of an equivalent in-house system. They combined it with an on-demand resource provisioner to optimize the utilization of virtual machines.
Experimental scientific HPC applications are continually being moved to the cloud, as covered here in several capacities over the last couple of weeks. Included in that rundown, Co-founder and CEO of CloudSigma Robert Jenkins penned an article for HPC in the Cloud where he discussed the emergence of cloud technologies to supplement research capabilities of big scientific initiatives like CERN and ESA (the European Space Agency)...
When considering moving excess or experimental HPC applications to a cloud environment, there will always be obstacles. Were that not the case, the cost effectiveness of cloud-based HPC would rule the high performance landscape. Jonathan Stewart Ward and Adam Barker of the University of St Andrews produced an intriguing report on the state of cloud computing, devoting significant attention to the problems the field faces.
Jun 17, 2013 |
With that in mind, Datapipe hopes to establish itself as a green-savvy HPC cloud provider with its recently announced Stratosphere platform. Datapipe markets Stratosphere as a green HPC cloud service, and to that end has partnered with Verne Global, whose Icelandic datacenter is known for its focus on green computing.
Jun 12, 2013 |
Cloud computing is gaining ground among mid-sized institutions looking to expand their experimental high performance computing resources. To that end, IBM has released Redbooks publications aimed in part at helping institutions move high performance computing applications to the cloud.
Jun 06, 2013 |
The San Diego Supercomputer Center launched a public cloud system for universities in the area, designed specifically to run on commodity hardware with high performance solid-state drives. The center, which currently holds 5.5 PB of raw storage, is open to educational and research users across the University of California system.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.