April 11, 2011
NEW YORK, April 11, 2011 -- NTT America, a global infrastructure services provider and wholly owned subsidiary of NTT Communications Corporation, today announced the first commercial availability of its enterprise private cloud hosting environment, built using VMware vSphere and VMware vCloud Director virtualization and cloud infrastructure software. The new utility-based computing solution, the NTT America Enterprise Cloud, is designed to appeal not only to mid-sized customers, but also to large enterprises seeking a flexible approach to managing their IT computing resources through cloud computing.
NTT America, which first announced its integration with the VMware platform at VMworld in August 2010, is one of the first providers globally to launch cloud services based on VMware vCloud Director that give customers additional computing capacity on demand. With the new NTT America Enterprise Cloud, customers now have scalable access to a pool of dedicated and/or shared computing resources that deliver private cloud services at NTT America’s premier data centers, without having to purchase their own physical servers.
The NTT America Enterprise Cloud creates customized computing resources, or virtual datacenters, for each customer. Through NTT America’s Customer Portal, end-users can serve themselves by creating, using, and managing virtual machines and virtual applications within those resources. While giving customers full control of their infrastructure resources, NTT America also provides multiple layers of physical and logical security as well as guaranteed-availability SLAs for each environment.
“The NTT America Enterprise Cloud, built on VMware vSphere® and VMware vCloud® Director, will enable customers to leverage the full benefits of cloud computing in a highly secure and cost-effective manner, delivering IT as a service to users,” said Dan Chu, vice president, cloud infrastructure and services, VMware. “VMware continues to work with partners such as NTT America to provide an evolutionary path to cloud computing, one that helps organizations harness the benefits of both their own IT infrastructure and a hosting provider’s cloud computing offering, enabling greater agility to respond to changing IT and business demands.”
The NTT America Enterprise Cloud complements the existing NTT America Cloud; together they provide both public and private cloud computing resources. Enterprise Cloud computing resources give customers flexibility: those that require fewer resources can opt for a multi-tenant environment, while those with greater needs can choose completely dedicated computing resources, all within a private, secure environment.
With NTT America Enterprise Cloud, customers will be able to purchase the minimum amount of resources they require to design their computing environment, including CPU, RAM, storage and network access. They will then be billed a defined monthly rate based on the amount of these resources. Customers will also have access to an additional pool of computing resources beyond that which they have purchased, which will allow for temporary capacity and is ideal for end users with seasonal needs or variances.
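As a rough illustration of the pricing model described above (a fixed monthly rate for reserved CPU, RAM, and storage, plus metered charges for temporary burst capacity), a bill might be computed along these lines. All unit rates and resource names here are hypothetical, invented for the sketch; they are not NTT America’s actual pricing:

```python
# Hypothetical reserved-plus-burst billing sketch. The rates below are
# made up for illustration only and do not reflect any real price list.

RATES = {
    "cpu_ghz": 20.0,     # $ per reserved GHz of CPU per month
    "ram_gb": 10.0,      # $ per reserved GB of RAM per month
    "storage_gb": 0.50,  # $ per reserved GB of storage per month
}
BURST_RATE_PER_CPU_HOUR = 0.08  # $ per CPU-hour drawn from the burst pool

def monthly_bill(reserved, burst_cpu_hours=0.0):
    """Fixed charge for the reserved resources, plus metered burst usage."""
    fixed = sum(RATES[name] * qty for name, qty in reserved.items())
    return fixed + BURST_RATE_PER_CPU_HOUR * burst_cpu_hours

# A customer reserving 8 GHz CPU, 16 GB RAM, and 500 GB storage,
# with 100 CPU-hours of seasonal burst usage:
bill = monthly_bill({"cpu_ghz": 8, "ram_gb": 16, "storage_gb": 500},
                    burst_cpu_hours=100)
print(bill)  # 160 + 160 + 250 + 8 = 578.0
```

The point of the model is that the fixed reservation sets a predictable monthly floor, while seasonal spikes only cost what is actually consumed from the shared burst pool.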
“As companies are still transforming their IT, private clouds continue to be a preferred deployment model for CTOs/CIOs to entrust with their computing resources,” said Agatha Poon, Research Manager with Tier1 Research. “Aside from mitigating risks associated with data security and regulatory compliance, a highly automated private cloud that simplifies the complexity of IT management without compromising performance and productivity is instrumental in bringing enterprise cloud computing into the mainstream.”
“Traditional Managed Hosting can be the right choice for certain applications, but the Enterprise Cloud provides the best alternative for customers that only require a few virtual machines or need scalable, cost-effective, efficient and secure computing resources that can be managed anytime and from any location,” said Doug McMaster, NTT America’s vice president for data center and cloud solutions.
About NTT America
NTT America is North America’s natural gateway to the Asia-Pacific region, with strong capabilities in the U.S. market. NTT America is the U.S. subsidiary of NTT Communications Corporation, the global data and IP services arm of a Fortune Global 500 telecom leader: Nippon Telegraph & Telephone Corporation (NTT). NTT America provides world-class Enterprise Hosting, managed network, and IP networking services for enterprise customers and service providers worldwide. For additional information on NTT America, visit us on the Web at www.us.ntt.com.
Join us on Facebook at http://www.facebook.com/NTTAMERICA, follow NTT America on Twitter at @NTT_America or LinkedIn at http://www.linkedin.com/companies/ntt-america for up-to-date news and announcements.
About NTT Communications Corporation
NTT Communications provides a broad range of global networks, management solutions and IT services to customers worldwide. The company is renowned for reliable, high-quality security, hosting, voice, data and IP services, as well as expertise in managed networks and leadership in IPv6 transit technology. NTT Communications’ extensive infrastructure includes Arcstar Global IP-VPN and Global e-VLAN, as well as a Tier 1 IP backbone reaching more than 150 countries in partnership with major Internet service providers, and secure data centers in Asia, North America and Europe. NTT Communications is the wholly-owned subsidiary of Nippon Telegraph and Telephone Corporation, one of the world’s largest telecoms with listings on the Tokyo, London and New York stock exchanges.
Source: NTT Communications Corporation
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. To address them, we present a novel federation model that enables end-users to aggregate heterogeneous resources for large-scale problems. The feasibility of this federation model has been demonstrated, in the context of the UberCloud HPC Experiment, by gathering the most comprehensive information to date on the effects of pillars on microfluidic channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate their efforts and to absorb computational peaks that cannot be contained within their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab’s Sergey Panitkin discussed the role of the Google Compute Engine in providing computational support to ATLAS, a detector of high-energy particles at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
May 23, 2013
The study of climate change is one of those scientific problems where it is almost essential to model the entire Earth to attain accurate results and make worthwhile predictions. In an attempt to make climate science more accessible to smaller research facilities, NASA introduced what it calls ‘Climate in a Box,’ a system it describes as a desktop supercomputer.
May 16, 2013
When it comes to the cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of doing CFD modeling in the cloud by running a common CFD code on Amazon EC2 HPC instance types, including both CPU and GPU cores.
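A back-of-the-envelope model (not taken from the Bonn study; the numbers below are illustrative assumptions) shows why latency matters so much for iterative solvers like CFD: every iteration performs local compute and then a synchronizing exchange, so a fixed per-iteration latency directly caps parallel efficiency:

```python
# Crude model of latency's effect on an iterative, tightly coupled solver:
# each iteration = (compute, split across nodes) + (one fixed network latency).
# All numbers below are illustrative assumptions, not measured results.

def parallel_efficiency(compute_ms_per_iter, latency_ms, nodes):
    """Speedup on `nodes` nodes relative to one node, divided by `nodes`.

    Assumes compute divides perfectly across nodes and every iteration
    pays one fixed communication latency -- deliberately simplistic.
    """
    t_serial = compute_ms_per_iter
    t_parallel = compute_ms_per_iter / nodes + latency_ms
    return (t_serial / t_parallel) / nodes

# 100 ms of compute per iteration, spread over 8 nodes:
print(parallel_efficiency(100, 0.005, 8))  # local cluster (~5 us):   ~0.9996
print(parallel_efficiency(100, 0.5, 8))    # cloud, same region:      ~0.96
print(parallel_efficiency(100, 50, 8))     # cloud, long distance:    ~0.2
```

Even this toy model reproduces the article’s point: with wide-area latencies, most of the cluster’s capacity is spent waiting on the network rather than computing.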
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore the potential of heterogeneous computing, but the potential for this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.