December 19, 2005
"This course brings state-of-the-art computing into the undergraduate curriculum," said Barry Wilkinson from the University of North Carolina at Charlotte. "In this class the NCREN wasn't just used to transmit lectures; it was also used to create a Grid infrastructure for the students."
Wilkinson, along with UNC Wilmington faculty member Clayton Ferner and a team of six faculty and students, led development for the course, which is funded by the UNC Office of the President and the National Science Foundation. Students learn about grids from the bottom up, studying technology first and applications later. Lectures are transmitted from UNC Charlotte to 12 institutions, and students complete a series of five assignments to learn how to set up and use a grid.
"We start them off with an assignment in Web services, next they generate a Grid service and contact it, then they submit jobs to the Globus Resource Allocation Manager (GRAM), use a scheduler and a workflow editor," explained Ferner. "The goal is to train students who will one day support or implement research on grids."
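The first assignment centers on Web services, the request/response building block that Grid services extend. As a language-neutral illustration of that pattern, here is a minimal sketch of a client/server round trip using Python's standard-library XML-RPC; this is purely illustrative of remote service invocation, since the course itself uses Java-based Globus SOAP tooling, and the `add` service shown is hypothetical.

```python
# Minimal sketch of invoking a remote service over HTTP, using
# Python's stdlib XML-RPC. Illustrative only: the course's actual
# stack is Java/Globus SOAP services, and this "add" service is
# a stand-in example, not part of the assignments.
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

def add(a, b):
    # The remotely callable operation exposed by the service.
    return a + b

# Bind to an ephemeral local port and expose the function.
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
server.register_function(add, "add")
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# The client marshals the call into an HTTP request, the server
# executes it and marshals the result back.
client = ServerProxy(f"http://127.0.0.1:{port}")
result = client.add(2, 3)
server.shutdown()
print(result)  # prints 5
```

The same call-a-described-interface pattern carries through the later assignments, where students contact Grid services and submit jobs through GRAM rather than calling a local toy server.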
Wilkinson and Ferner found developing the new course a challenge. The summer of 2004 was spent setting up the class's first Grid, based on Globus Toolkit 3.2, developing the lengthy course assignments, and creating hundreds of PowerPoint slides from which to lecture. This past summer was spent updating the Grid and assignments to GT4. They also integrated some of their own Grid research into the curriculum, using the GridNexus workflow editor developed at UNC Wilmington.
"This is not a course regularly taught at any level," said Wilkinson. "There was no textbook to follow, and so we've created all the material from scratch. The class has been an enabler for the students -- several of the students from 2004 and 2005 have continued on in Grid research after the end of class."
Learn more at the course Web site, www.cs.uncc.edu/~abw/ITCS4010F05/.

This article originally appeared in Science Grid This Week (www.interactions.org/sgtw/). SGTW is produced at Fermilab and jointly funded by the National Science Foundation and the Department of Energy's Office of Science.