August 09, 2012
We rely implicitly on things such as our cars and our electricity supply, but how can we encourage the same level of trust in and use of IT utility services?
As part of a major £1.5 million investment by the Research Councils UK (RCUK) Digital Economy Programme, a consortium of UK universities, led by the University of Southampton, is to establish a research network focusing on the challenge of 'IT as a Utility' (ITaaU).
IT as a Utility is about the provision of information and technology in a transparent and highly usable manner.
In our increasingly connected, digitally driven society, many people are accustomed to broadband access to applications ranging from email accounts with unlimited storage, to social networking on Facebook, Twitter and LinkedIn, to web-based services for managing and sharing documents, music and photographs. Increasingly, commerce and industry use the same technologies to support their staff, market their products, service their customer base and manage their supply chains.
Many of these services are provided through digital content, multi-functional sensors and other connected devices; as their use increases, user communities and urban and rural infrastructure will become ever more integrated with the Internet and the Web.
The three-year ITaaU Network+ project will work towards simple, usable and safe IT provision from smart services, surroundings and information stores. It will also examine the perceived barriers that inhibit new users of these services.
The consortium is led by Professor Jeremy Frey from the University of Southampton, with co-investigators Professor Mark Sandler (Queen Mary, University of London), Professor Gerard Parr (University of Ulster), Dr Michael Surridge (University of Southampton) and Dr Richard Mortier (University of Nottingham).
Professor Jeremy Frey comments: "IT as a Utility is closely related to Grid and Cloud Computing, with its emphasis on making IT resources effortlessly and almost invisibly available to the end user. Cloud models for access to applications and infrastructure are now well established, and are changing the way users interact with applications, especially where an application is accessible from multiple devices and by multiple users.
"Users access not just utility storage but rich and complex utility content, including live sensor data, user participation and 'just in time' personalised media. This will be managed and exploited by a wide range of interacting applications and autonomous agents interacting with consumer-usable end-points that aren't always computers or mobile phones."
In the public arena smart phone apps have shown that there is a demand for such easy-to-use IT functions, yet the potential scope is much greater.
Professor Jeremy Frey adds: "Information Technology is now crucially important to almost everyone in the UK. For many of us it will pervade the way we follow the Olympic events. We are already used to the Hawk-Eye system at Wimbledon, rapid access to all the statistical data and up-to-the-minute news via the web, and the move to deliver governmental services over the Web, even while we are on the move, via our smart phones."
The ITaaU Network+ will bring together researchers from universities and industry, alongside users and providers and processors of content and data, from the many disciplines needed to understand the provision, uptake, usability, management, robustness, security, trustworthiness and sustainability of applications and services delivered over the future Internet.
The network will arrange a variety of workshops, including smaller focussed discussions as well as larger-scale community meetings, initiate new collaborative research, and provide academic and industrial secondments.
Source: University of Southampton
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. Thus, we present a novel federation model that enables end-users to aggregate heterogeneous resources and tackle large-scale problems. The feasibility of this federation model has been demonstrated, in the context of the UberCloud HPC Experiment, by gathering the most comprehensive information to date on the effects of pillars on microfluidic channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to manage computational loads at peak times that cannot be handled by their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab’s Sergey Panitkin discussed the role of Google Compute Engine in providing computational support to ATLAS, a high-energy particle detector at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
May 23, 2013
The study of climate change is one of those scientific problems where it is almost essential to model the entire Earth to attain accurate results and make worthwhile predictions. In an attempt to make climate science more accessible to smaller research facilities, NASA introduced what they call ‘Climate in a Box,’ a system they note acts as a desktop supercomputer.
May 16, 2013
When it comes to the cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of running CFD modeling in the cloud by benchmarking a common CFD code on Amazon EC2 HPC instance types, including both CPU and GPU cores.
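To illustrate why distance alone caps cloud responsiveness, the sketch below computes a physical lower bound on round-trip time over optical fiber. The 6,200 km distance and the fiber propagation factor are illustrative assumptions, not figures from the Bonn study:

```python
# Illustrative lower bound on network round-trip time over fiber.
# Real RTTs are higher: routing, queuing and processing add delay.
C_VACUUM_KM_S = 299_792.458   # speed of light in vacuum, km/s
FIBER_FACTOR = 0.67           # light in fiber travels at roughly 2/3 c (assumed)

def min_rtt_ms(distance_km: float) -> float:
    """Best-case round-trip time in milliseconds for a straight fiber path."""
    one_way_s = distance_km / (C_VACUUM_KM_S * FIBER_FACTOR)
    return 2 * one_way_s * 1000

# Bonn to a US East Coast data centre is roughly 6,200 km (assumed distance),
# so even an ideal fiber path imposes tens of milliseconds per round trip.
print(f"{min_rtt_ms(6200):.1f} ms")
```

Even this best case is far above intra-datacentre latencies, which is why tightly coupled workloads such as CFD suffer when compute is placed far from the user or the data.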
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges, and opportunities, afforded by Big Data. Before anyone can exploit these extraordinary data repositories, however, they must first harness and manage their data stores, using technologies that deliver affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.