May 11, 2011
The e-Infrastructure Reflection Group (e-IRG), which comprises well over one hundred members from large European institutions, has been vocal over the last couple of years about the infrastructure challenges ahead, publishing its assessments in yearly public whitepapers.
The group uses the concept of e-Infrastructure as an umbrella term to refer to the “new research environment in which all researchers—whether working in the context of their home institutions or in national or multinational scientific initiatives—have shared access to unique or distributed scientific facilities (including data instruments, computing and communications), regardless of their type and location in the world.”
While the membership is European in focus, many of the topics the group addresses have universal relevance, and its core issues are similar to those tackled by other global institutions with resources to share. These include questions about how cloud computing services from commercial players might enter the existing paradigm and how the use of Internet-delivered tools for researchers can be extended.
The e-IRG has released its 2011 white paper covering a number of topics relevant to e-infrastructure and e-science. This report includes in-depth analysis of issues related to international data and infrastructure management, legal and financial complications, the future of research networking, green IT, exascale computing and its software challenges, as well as matters related to authentication and accounting.
One consistent thread throughout this year’s report is the question of how infrastructure as a service plays into the current grid and other distributed computing setups at various institutions. The arrival of cloud computing offers many new ways of considering distributed resources for research and science, but from an operational point of view, there are complications, especially in terms of funding and finding ways to integrate these newer commercial possibilities into existing infrastructure.
The group hopes to “refine the long-term financial strategies for e-Infrastructure aiming at a sustainable operation of e-RI services in a flexible and open environment, including cloud offerings.” They also note in their report that in 2011 they will be placing special emphasis on exploitation of e-Infrastructure such as HPC and clouds as well as greater focus on Internet-delivered applications (SaaS).
The authors provide some background to cover why this cloud angle causes some complexity. They note that, “In many countries, high performance computing is funded via the resource owners, in some others partly via the user budgets. In some projects, central funding is used to compensate for users in disadvantaged regions but in order to operate e-infrastructure in a mid- to long-term perspective in a sustainable manner, users will need to be given a choice for the best services available regardless of boundaries, including the choice between the current e-Infrastructure and the commercial market for commodity services or clouds.”
One of the key points in this year’s assessment of the e-Science landscape is that research infrastructures will need to make a shift toward a user-driven approach. The authors argue that there is no “one size fits all” approach for the future e-Science infrastructures. They claim that “different technical, political and commercial development such as the virtualization of services, the emergence of cloud computing, the ambition of establishing an ERA and the ever increasing need of leading edge user communities for services far beyond what the commercial market can offer, will drive the process.”
While cloud computing is taking off in the commercial sector, this is quite a step for groups whose distributed computing for research has been grid-oriented and less commercially integrated. The 2011 report, which is open to public comments and suggestions, reveals that while there are possibilities for clouds and research, a number of complex issues must be untangled before practical integration can occur.