May 04, 2010
NEW YORK, May 4, 2010 -- NTT America, Inc., a wholly owned U.S. subsidiary of NTT Communications Corporation, today announced that it is leveraging OpSource’s cloud capabilities, NTT Communications’ Global IP Network and NTT America’s highly secure data center infrastructure to deliver an enterprise-grade, multi-tenant public cloud solution to its customers in the U.S. market.
The NTT America Cloud is an extension of the company’s suite of virtualization services, based on the best-in-class VMware virtualization platform. Satisfying the immediate need for a public cloud offering for U.S. enterprise customers, this local service implementation complements the global virtualization service announced earlier by NTT Communications. The NTT America Cloud offers enterprises the ability to rapidly build, deploy and manage their own networks and servers in a highly secure environment. It supports multiple operating systems, including Windows, Red Hat and CentOS, and is backed by an industry-leading SLA that includes a 100% uptime guarantee.
Customers will have the ability to rapidly provision dedicated VLANs, configure firewalls, deploy server environments and manage those environments from anywhere in the world. Customers can configure and turn on resources as needed, and shut them down when they are no longer required, making the solution highly scalable and cost-effective. All of this can be controlled via a self-service portal or automated through an API.
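As a rough illustration of what such API-driven automation might look like — the endpoint fields and field names below are hypothetical, not the documented OpSource/NTT America API schema — a server-deployment request could be assembled like this:

```python
import json

def build_deploy_request(name, os_image, cpu_count, ram_gb, vlan_id):
    # Hypothetical payload for deploying a server into a dedicated VLAN;
    # the field names are illustrative, not the actual OpSource Cloud API.
    return json.dumps({
        "server": {
            "name": name,
            "image": os_image,    # e.g. a CentOS, Red Hat or Windows image
            "cpu": cpu_count,
            "ramGb": ram_gb,
            "networkId": vlan_id, # the dedicated VLAN to deploy into
        }
    })

# Sketch: request a 2-CPU / 4 GB CentOS server on a customer VLAN.
payload = build_deploy_request("web-01", "CENTOS_5_64", 2, 4, "vlan-1001")
print(payload)
```

The same payload could be submitted through the self-service portal’s underlying API or scripted for repeatable environment builds; the point is only that provisioning reduces to a structured request rather than a manual ticket.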
The NTT America Cloud is built to run enterprise-class software as a service (SaaS) and web applications, and is ideal for customers who are developing applications and/or seeking flexible solutions to deploy computing power on demand. The service enables customers to create as many private networks as needed, each with its own firewall, to keep test and development environments separate. It also allows them to develop and test their applications in the same VMware environment the applications will run in, in the exact configuration that future users will see.
“We are excited to offer the NTT America Cloud to our customers, as it demonstrates our commitment to providing them with a complete and highly competitive portfolio of hosting options and solutions,” said Doug McMaster, Vice President, Product & Solutions. “With NTT America, our customers can be assured we have the products and services to meet their hosting needs, and are continuing to invest in ‘best of breed’ platforms and technologies to deliver the quality and comprehensiveness they have come to rely on.”
“The OpSource Cloud capabilities combine the flexibility, availability and community of the public cloud with the security, performance and control the enterprise needs,” said Treb Ryan, CEO, OpSource. “By pairing OpSource Cloud’s technology with NTT America’s secure data center infrastructure, companies can take advantage of a solution that is not only powerful and completely reliable but scalable, removing any remaining barriers to enterprises adopting cloud computing today.”
Components of NTT America Cloud include: CPU hours; RAM hours; storage hours; networks (per hour per additional network); inbound bandwidth (per GB); and sub-administrators (per hour per additional admin). Services are available on a month-to-month, pay-as-you-go basis with no annual commitment. Customers may also select pre-paid monthly plans at reduced rates, with overages billed on a pay-as-you-go basis.
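To make the metered model concrete, the sketch below sums usage of those components against per-unit rates. The rates are invented for illustration only and are not NTT America’s actual pricing:

```python
# Illustrative pay-as-you-go cost model. The rates below are made up
# for this example and do not reflect NTT America's real price list.
RATES = {
    "cpu_hour": 0.05,         # USD per CPU-hour
    "ram_gb_hour": 0.02,      # USD per GB-hour of RAM
    "storage_gb_hour": 0.001, # USD per GB-hour of storage
    "inbound_gb": 0.10,       # USD per GB of inbound bandwidth
}

def monthly_cost(usage):
    """Sum metered usage (hours or GB per component) against the rates."""
    return round(sum(units * RATES[key] for key, units in usage.items()), 2)

# One 2-CPU / 4 GB server running a full 720-hour month,
# with 50 GB of storage and 10 GB of inbound traffic:
cost = monthly_cost({
    "cpu_hour": 2 * 720,
    "ram_gb_hour": 4 * 720,
    "storage_gb_hour": 50 * 720,
    "inbound_gb": 10,
})
print(cost)
```

Because every component is metered hourly, shutting a test environment down for nights and weekends directly reduces the bill — the scalability claim in the release, expressed as arithmetic.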
About NTT America
NTT America is North America’s natural gateway to the Asia-Pacific region, with strong capabilities in the U.S. market. NTT America is the U.S. subsidiary of NTT Communications Corporation, the global data and IP services arm of a Fortune Global 500 telecom leader: Nippon Telegraph & Telephone Corporation (NTT). NTT America provides world-class Enterprise Hosting, managed network, and IP networking services for enterprise customers and service providers worldwide.
About NTT Communications Corporation
NTT Com delivers high-quality voice, data and IP services to customers around the world. The company is renowned for its diverse information and communication services, expertise in managed networks, hosting and IP networking services, and industry leadership in IPv6 transit technology. The company’s extensive global infrastructure includes Arcstar private networks and a Tier 1 IP backbone (connected with major ISPs worldwide), both reaching more than 150 countries, as well as secure data centers in Asia, North America and Europe. NTT Com is the wholly owned subsidiary of Nippon Telegraph and Telephone Corporation, one of the world's largest telecoms with listings on the Tokyo, London and New York stock exchanges.
Source: NTT Communications
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. To address them, we present a novel federation model that enables end-users to aggregate heterogeneous resources for large-scale problems. The feasibility of this federation model has been demonstrated, in the context of the UberCloud HPC Experiment, by gathering the most comprehensive information to date on the effects of pillars on microfluidic channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to absorb computational demand at peak times that cannot be met by their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab’s Sergey Panitkin discussed the role of Google Compute Engine in providing computational support to ATLAS, a detector of high-energy particles at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of using the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds for some of those obstacles.
May 23, 2013
The study of climate change is one of those scientific problems where it is almost essential to model the entire Earth to attain accurate results and make worthwhile predictions. In an attempt to make climate science more accessible to smaller research facilities, NASA introduced what it calls ‘Climate in a Box,’ a system it describes as a desktop supercomputer.
May 16, 2013
When it comes to the cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of running CFD modeling in the cloud by benchmarking a common CFD code on Amazon EC2 HPC instance types with both CPU and GPU cores.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore the potential of heterogeneous computing, but the potential for this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.