April 04, 2011
HUBzero's latest open source release enhances the platform for creating powerful websites supporting computation, research, education and collaboration in science, engineering and almost any other field.
HUBzero production release 1.0, announced at the HUBbub 2011 workshop in Indianapolis Tuesday, April 5, includes several improvements that help users form interest groups within a hub to exchange data, share models and publish results. The new version also includes additional features for building and deploying Web-ready computational research tools.
HUBzero makes posting and using computational tools about as easy as posting and viewing a YouTube video and brings access to high-performance computing, cloud and grid resources as close as your Web browser. Built-in social networking features akin to Facebook create communities of researchers, practitioners, educators and students and facilitate virtual research partnerships and education.
Groups in HUBzero 1.0 have been redesigned to provide more ways to share information. A group can now be, in effect, a customizable mini website within a hub, complete with calendaring, messaging, wikis, discussion forums, and blogs, as well as multimedia materials such as embedded videos and slideshows.
The additional features come in modules that can be turned on or off as desired, says Michael McLennan, senior research scientist and hub technology architect at Purdue University, which developed HUBzero. Permissions in groups also are controllable, allowing content to be private, available to anyone or only to group members and invitees.
HUBzero 1.0 also includes major improvements to Rappture, the platform's key toolkit for turning research modeling and simulation codes into interactive, Web-enabled programs. The enhancements make building online computational research tools simpler and make it easier to validate their results.
Rappture's new "builder" tool lets researchers drag and drop objects from a graphical palette in preparing codes for hub deployment, preview the project as it's constructed and ultimately generate XML and other programming code that is compiled to make the production version.
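The XML that the builder generates describes a tool's inputs and outputs declaratively. As a rough illustration of the form such a definition takes (the tool name, quantities, and values here are invented, and element details may differ from any particular Rappture version):

```xml
<?xml version="1.0"?>
<run>
    <tool>
        <title>Heat Diffusion</title>
        <about>Hypothetical example: simulates 1D heat diffusion.</about>
        <command>python @tool/diffuse.py @driver</command>
    </tool>
    <input>
        <number id="temp">
            <about><label>Ambient temperature</label></about>
            <units>K</units>
            <default>300K</default>
        </number>
    </input>
    <output>
        <curve id="profile">
            <about><label>Temperature profile</label></about>
        </curve>
    </output>
</run>
```

Because the interface lives in a declarative file like this rather than in GUI code, the hub can render the same tool definition as a Web form and wire it to the underlying simulation program automatically.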
Rappture also adds a regression tester that lets researchers exercise their code against a large collection of input values, capture the resulting outputs as a test suite, verify that the software is functioning correctly and validate it against experimental results. The regression tester can be rerun whenever the software is updated to guard against newly introduced errors.
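The idea behind regression testing can be sketched in a few lines of ordinary Python. This is a generic illustration of the technique, not Rappture's actual implementation: run the tool over a set of saved inputs, compare each result against the stored "golden" output within a tolerance, and report any drift.

```python
import math

def regression_check(model, cases, rel_tol=1e-6):
    """Run `model` on each saved input tuple and compare against the
    stored expected output. Returns a list of (inputs, got, expected)
    tuples for every case that no longer matches."""
    failures = []
    for inputs, expected in cases:
        got = model(*inputs)
        if not math.isclose(got, expected, rel_tol=rel_tol):
            failures.append((inputs, got, expected))
    return failures

# Hypothetical tool under test: resistance of a wire, R = rho * L / A.
def resistance(rho, length, area):
    return rho * length / area

# "Golden" results captured from a previous, trusted run of the tool.
golden = [
    ((1.68e-8, 10.0, 1e-6), 0.168),
    ((1.68e-8, 2.5, 5e-7), 0.084),
]

failures = regression_check(resistance, golden)  # empty list if nothing drifted
```

Rerunning a check like this after every code change catches cases where an edit silently alters previously validated results.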
Among the other new features in Version 1.0:
* Faster, broader search covering content in question-and-answer sections, wish lists, blogs and other areas, as well as tags attached to resources that a keyword search might otherwise miss.
* A blog module for personal profiles, in addition to blogs for user groups.
* Improved analytics and per-contributor reports for impact and outreach activities.
* A Twitter feed module to embed Twitter feeds within a hub.
"Social media is a powerful new paradigm for communication," McLennan says. "The latest release of HUBzero will help people harness that power and focus it on science, engineering and other problems."
The open source package is available on the HUBzero website at http://hubzero.org/getstarted. Purdue offers a hub hosting service, but the open source release also allows users to download the software and host a hub themselves.
HUBzero beta 0.9, originally released as open source in 2010, helped get the Ethics Core Digital Library off the ground quickly. The hub supports the National Center for Professional and Research Ethics, which received NSF funding in October.
HUBzero's flexible, extensible and scalable architecture is allowing collaborators from business, law, engineering, education and library sciences to build a gateway to research and professional ethics training and materials for researchers, professionals, business and campus administrators, teachers and students.
"If we had to do this all from scratch, we wouldn't be anywhere near where we are," says William Mischo, one of the hub's leaders and head of the Grainger Engineering Library Information Center at the University of Illinois.
HUBzero is supported by a consortium of universities including Purdue, Indiana, Clemson and Wisconsin. The platform now powers more than 30 hubs, a number that continues to grow. These hubs enable a wide spectrum of projects in science and engineering, health care, social science and education. They deliver hundreds of research tools, seminars and other materials to nearly a half million visitors annually, while helping to satisfy cyberinfrastructure requirements from the National Science Foundation and other grant funders.
HUBzero was originally developed at Purdue to power nanoHUB.org, an international resource for nanotechnology science and engineering. But the underlying technology proved to be so attractive that Information Technology at Purdue (ITaP), Purdue's Central IT organization, extracted and tailored it for application in other fields. Hubs now link researchers transforming laboratory discoveries into new medical treatments; working to revolutionize cancer prevention, detection, treatment and care delivery; improving manufacturing processes for pharmaceuticals; studying volcanoes; modeling environmental pollution; and engineering earthquake-resistant buildings, bridges and related structures, among other things.