December 04, 2006
Ciena Corporation is partnering with MIT Lincoln Laboratory and the Naval Research Laboratory (NRL) to upgrade the Boston South Network (BoSSNET) -- an applications and network test bed for researching high-bandwidth (tens of Gbps per wavelength), long-reach (approximately 1,000 km) all-optical transmission -- using the company's standards-based, all-optical DWDM technology.
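For a sense of scale, here is a back-of-envelope sketch of what those figures imply for aggregate capacity and propagation delay; the channel count and fiber parameters are illustrative assumptions, not BoSSNET's actual configuration.

```python
# Back-of-envelope DWDM capacity and latency estimate.
# CHANNELS and FIBER_US_PER_KM are assumed values for illustration,
# not BoSSNET's actual configuration.

CHANNELS = 80           # typical C-band DWDM channel count (assumed)
RATE_GBPS = 40          # per-wavelength line rate cited in the release
SPAN_KM = 1000          # approximate Boston-to-Washington reach
FIBER_US_PER_KM = 4.9   # light in fiber covers ~1 km every 4.9 microseconds

aggregate_tbps = CHANNELS * RATE_GBPS / 1000
one_way_ms = SPAN_KM * FIBER_US_PER_KM / 1000

print(f"Aggregate fiber capacity: {aggregate_tbps:.1f} Tbps")   # 3.2 Tbps
print(f"One-way propagation delay: {one_way_ms:.1f} ms")        # 4.9 ms
```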
Designed and managed by MIT Lincoln Laboratory in its role as a Federally Funded Research and Development Center for the U.S. Department of Defense, BoSSNET is the research network for the Lab's Advanced Networks Group, which architects, designs and builds prototypes of next-generation optical, wireless and satellite communication networks to address critical problems of national security, while also transferring those technical advances to commercial applications.
BoSSNET spans from Boston to Washington, D.C., with local metro-area connections to Lincoln Lab, New York and the D.C. area. Qwest Communications International Inc. has been a partner in this leading-edge network since 1998, providing fiber, facilities, and operations support.
BoSSNET will be upgraded with Ciena's CoreStream Agility Optical Transport System, which can route wavelength services at speeds of up to 40 Gbps over long-haul and ultra-long-haul network routes. The standards-based, all-optical DWDM platform simplifies networks by eliminating costly optical-electrical-optical (OEO) signal conversions and transceivers, and lowers ongoing operational costs by automating end-to-end service provisioning and management, inventory and resource tracking, and dynamic power management. BoSSNET will also leverage CoreStream's software-defined ROADM technology to enable automated wavelength activation and reconfiguration in capacity increments of 10G and 40G today, with a clear migration path to 100G. CoreStream is also the core optical transport and switching foundation of the Department of Defense's GIG-BE network, ensuring full compatibility for BoSSNET.
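To make the ROADM concept concrete, the following minimal sketch models how an automated lightpath might be activated end to end, with each node adding, dropping, or optically expressing a wavelength without OEO conversion. The node names and API are hypothetical illustrations, not CoreStream's actual management interface.

```python
# Minimal sketch of ROADM wavelength provisioning: each node can add,
# drop, or express a wavelength without OEO conversion. Node names and
# this API are hypothetical, not Ciena's management interface.

class RoadmNode:
    def __init__(self, name):
        self.name = name
        self.config = {}   # wavelength (nm) -> "add" | "drop" | "express"

    def set_channel(self, wavelength_nm, action):
        assert action in ("add", "drop", "express")
        self.config[wavelength_nm] = action

def provision_lightpath(nodes, wavelength_nm):
    """Activate a wavelength end to end: add at the head node,
    express optically through intermediates, drop at the tail node."""
    nodes[0].set_channel(wavelength_nm, "add")
    for node in nodes[1:-1]:
        node.set_channel(wavelength_nm, "express")   # stays optical, no OEO
    nodes[-1].set_channel(wavelength_nm, "drop")

# Example: a 40G service from Boston to Washington, D.C.
path = [RoadmNode(n) for n in ("boston", "new-york", "washington")]
provision_lightpath(path, wavelength_nm=1550.12)
for node in path:
    print(node.name, node.config)
```

Because intermediate nodes pass the signal through optically, adding a service becomes a remote configuration change rather than a site visit to install transceivers, which is the operational saving the release describes.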
"Higher line side bit rates with fewer lambdas to manage not only lowers operational expenditures, but provides a path to 100G," said Eve Griliches, research manager at IDC. "Systems that cost-effectively deliver 40G today with a future-proof vision are well suited for the advanced networking needs of R&E environments."
A key partner in the BoSSNET upgrade is NRL, the corporate laboratory for the Navy and Marine Corps, which conducts a broad program of scientific research and advanced technology development. NRL is also a Principal Federal Agency Member of the Advanced Technology Demonstration Network (ATDnet), a high-performance networking test bed in the Washington, D.C., area.
"BoSSnet and ATDnet are unique all-optical assets where researchers can deploy next-generation technologies that enable dynamic peer-to-peer resource sharing between typical and high-end users," stated Dr. Henry Dardy, DoD/Navy Senior Technologist. "The networks provide researchers the capability to test low-latency, high-bandwidth streaming of data at 40G and 100G, providing the next steps toward the DoD's goal of scaling to terabit streams riding on a wavelength as technology progresses in future years."
Plans call for an initial upgrade of BoSSNET to 40G in early 2007, followed by an aggressive deployment schedule across the rest of the network. Once the network is fully upgraded, MIT Lincoln Laboratory, NRL and Ciena will begin planning the migration to 100G.
"As Ethernet increasingly becomes the foundation for high-capacity networks, a highly-resilient core optical network scalable to beyond 100 Gb/s per channel will be critical to accommodate the growth in bandwidth-intensive applications carried across BoSSNET and other advanced networks," said Steve Alexander, chief technology officer for Ciena. "To maximize BoSSNET's capacity and operational efficiency, CoreStream delivers an open standards line system with 40G capabilities and a path to 100G, allowing operators to increase capacity without forklift upgrades of proprietary technology, additional network elements or costly optical-to-electronic conversions."