October 01, 2012
PHILADELPHIA, Oct. 1 — The nation's first 100G open, national-scale, software-defined network – built to support advanced services and cloud applications – is now available to spur new waves of innovation in education, research and industry.
Internet2, operator of the nation's fastest coast-to-coast research and education network, announced today at the Fall 2012 Internet2 Member Meeting in Philadelphia that its new 100G-enabled, 8.8 terabit-per-second optical network is operational for member institutions.
"We are excited about officially launching the new capabilities of the nation's first 100G open, national-scale, software-defined network through massive collaboration with our partners in government and business that will be used by Internet2 members to help solve practical, far-reaching problems that benefit society," said Internet2 CEO and President H. David Lambert. "We look forward to seeing the collaboration of our community using this dynamic network to advance education, transform university business models, and accelerate global Big Data collaborative research outcomes. When we equip the research and education community with great technology and no barriers to innovation, that's when they start creating the future."
The new capabilities now available on the Internet2 Network include 100 gigabit-per-second connectivity, 8.8 terabits per second of optical capacity, and an open, software-defined networking architecture.
Internet2's members can also leverage the new Internet2 Network to deliver Internet2 NET+ Services and keep pace with the exponential growth in Big Data science driven by the nation's collaborative researchers in labs and universities. Internet2 NET+ currently offers 29 cloud services to college campuses nationwide that are cost-effective, easy to access, simple to administer, and tailored to the unique needs of the research and education community. The new network and cloud services enable transformational solutions for education delivery and more effective solutions for university business functions – helping higher education institutions remain competitive nationally and globally.
Working with its regional network partners, Internet2's U.S. UCAN project will use the newly upgraded 100G-enabled Internet2 Network to bring advanced networking features to more than 200,000 of the country's community anchor institutions, including libraries, hospitals, K-12 schools, community colleges, and public safety organizations. The network infrastructure will support advanced applications such as HD and multicast video for distance learning and telemedicine. The upgraded network was predominantly funded through the U.S. Department of Commerce's Broadband Technology Opportunities Program.
Internet2 is a member-owned advanced technology community founded by the nation's leading higher education institutions in 1996. Internet2 provides a collaborative environment for U.S. research and education organizations to solve common technology challenges, and to develop innovative solutions in support of their educational, research, and community service missions. For more information, visit www.internet2.edu.
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. We therefore present a novel federation model that enables end-users to aggregate heterogeneous resources in order to tackle large-scale problems. The feasibility of this federation model has been demonstrated, in the context of the UberCloud HPC Experiment, by gathering the most comprehensive information to date on the effects of pillars on microfluidic channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to absorb peak computational loads that cannot be contained within their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab's Sergey Panitkin discussed the role of Google Compute Engine in providing computational support to ATLAS, a high-energy particle detector at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
May 23, 2013 | The study of climate change is one of those scientific problems where it is almost essential to model the entire Earth to attain accurate results and make worthwhile predictions. In an attempt to make climate science more accessible to smaller research facilities, NASA introduced what it calls ‘Climate in a Box,’ a system it describes as a desktop supercomputer.
May 16, 2013 | When it comes to the cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of running CFD modeling in the cloud by benchmarking a common CFD code on Amazon EC2 HPC instance types with both CPU and GPU cores.
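As a rough, hypothetical illustration of the kind of point-to-point latency measurement such studies depend on (not the Bonn team's actual benchmark), the Python sketch below times TCP ping-pong round trips between two hosts; the port, round count, and payload size are arbitrary placeholders.

```python
# Minimal round-trip latency microbenchmark sketch (illustrative only).
# Run "python latency.py server" on one node and
# "python latency.py client <host>" on another.
import socket
import sys
import time

PORT = 5201      # arbitrary port chosen for this sketch
ROUNDS = 1000    # number of ping-pong round trips to average over
MSG = b"x"       # 1-byte payload isolates latency from bandwidth

def server():
    with socket.create_server(("", PORT)) as srv:
        conn, _ = srv.accept()
        with conn:
            for _ in range(ROUNDS):
                data = conn.recv(1)
                if not data:
                    break
                conn.sendall(data)   # echo the byte straight back

def client(host):
    with socket.create_connection((host, PORT)) as sock:
        sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
        start = time.perf_counter()
        for _ in range(ROUNDS):
            sock.sendall(MSG)
            sock.recv(1)
        elapsed = time.perf_counter() - start
        print(f"mean round-trip latency: {elapsed / ROUNDS * 1e6:.1f} us")

if __name__ == "__main__":
    if sys.argv[1] == "server":
        server()
    else:
        client(sys.argv[2])
```

On a local cluster interconnect this kind of test typically reports latencies in the tens to hundreds of microseconds; over wide-area links to a cloud region it can be orders of magnitude higher, which is the sort of penalty tightly coupled CFD workloads are sensitive to.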
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can exploit these extraordinary data repositories, however, they must first harness and manage their data stores, using technologies that emphasize affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might affect a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.
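As a concrete starting point for the heterogeneous-computing model the article describes, here is a minimal sketch using the pyopencl Python bindings (an assumption for illustration; the article itself contains no code): a vector-add kernel is compiled at runtime and dispatched to whichever OpenCL device, CPU or GPU, is available.

```python
# Minimal heterogeneous-computing sketch using OpenCL via pyopencl.
# The kernel, array size, and device selection are illustrative only.
import numpy as np
import pyopencl as cl

# Pick any available OpenCL device (GPU if present, otherwise CPU).
ctx = cl.create_some_context(interactive=False)
queue = cl.CommandQueue(ctx)

n = 1 << 20
a = np.random.rand(n).astype(np.float32)
b = np.random.rand(n).astype(np.float32)

mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

# The OpenCL C kernel is compiled at runtime for whatever device was chosen.
program = cl.Program(ctx, """
__kernel void vadd(__global const float *a,
                   __global const float *b,
                   __global float *out)
{
    int gid = get_global_id(0);
    out[gid] = a[gid] + b[gid];
}
""").build()

program.vadd(queue, a.shape, None, a_buf, b_buf, out_buf)

out = np.empty_like(a)
cl.enqueue_copy(queue, out, out_buf)
assert np.allclose(out, a + b)
```

The same kernel source runs unmodified on CPUs, GPUs, and other accelerators that expose an OpenCL driver, which is the portability argument behind the heterogeneous-computing paradigm the article highlights.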