June 13, 2011
Next week in Brussels, the First Digital Agenda Assembly will gather to discuss elements of the Digital Agenda for Europe roadmap. As a central component of Europe’s digital future, cloud computing will be on the table as a key technological movement—albeit one that requires policy interventions to be productive.
The two-day workshop is titled “Towards a Cloud Computing Strategy for Europe: Matching Supply and Demand” and will identify the primary components of the larger European cloud strategy and ways that policy interventions could be implemented to further goals under the Digital Agenda for Europe roadmap.
The organizers of the event see the role that cloud computing could play in Europe’s economic future but they note it “has to be ensured that there will be sufficient supply of cloud computing facilities and services so that European companies of all sizes, government institutions and citizens can use these to develop innovative services.” They go on to claim that this supply of cloud services must be aligned with broader European legislation, especially in the realm of data protection and emphasize the need for greater standards to ensure interoperability.
Dr. Ignacio Llorente will be among the presenters at the event. Previewing his talk this week, he noted that he plans to describe how European Commission funding can address current gaps in the cloud computing strategy. He will also examine key technology challenges for clouds, focusing on the research issues that must be addressed to create more secure, robust cloud services.
This reflects follow-up efforts after the formal launch of the European Cloud Computing Strategy, announced in early 2011, which aims to make Europe a more “cloud friendly” place through policy and standardization action.
Full story at CloudPlan.org
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. Thus, we present a novel federation model that enables end-users to aggregate heterogeneous resources for large-scale problems. The feasibility of this federation model has been demonstrated, in the context of the UberCloud HPC Experiment, by gathering the most comprehensive information to date on the effects of pillars on microfluidic channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to absorb peak computational demand that cannot be met by their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab’s Sergey Panitkin discussed the role of Google Compute Engine in providing computational support to ATLAS, a high-energy particle detector at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can exploit these extraordinary data repositories, however, they must first harness and manage their data stores, and do so with technologies that deliver affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.