May 24, 2011
The Australian government has undertaken a number of measures over the last two years to bring cloud computing into its data management, consolidation, and efficiency efforts. This week another agency within the government moved toward considering the possibility, particularly because it handles a wide variety of data from a number of sources, all for different sub-agencies and groups.
Geoscience Australia is a government agency responsible for resources, energy, and tourism matters in Australia; it also provides geoscientific information for government and civic uses. The agency monitors resource exploitation issues, environmental management and protection, sustainable energy infrastructure, and a host of other projects.
As the leading public sector source for environmental planning, mapping and data mining for energy, resource and other matters, the agency has a great demand for data management infrastructure, including massive storage, database and compute resources.
The Australian Department of Finance has suggested that the agency consider a move to a single system that would streamline data management and improve its current deduplication efforts. According to the Finance Department, a move to cloud computing might allow the agency to combine its ICT efforts into a more unified way of computing, leading to better organization.
According to a report this week, the agency is looking for ways to simplify its complex data management systems, which currently include roughly 3,000 applications serving a variety of sub-organizations under Geoscience Australia’s umbrella. Reports indicate that the majority of these applications are off-the-shelf software.
Full story at Experian Australia
Experimental scientific HPC applications are continually being moved to the cloud, as covered here in several capacities over the last couple of weeks. Included in that rundown, Co-founder and CEO of CloudSigma Robert Jenkins penned an article for HPC in the Cloud where he discussed the emergence of cloud technologies to supplement research capabilities of big scientific initiatives like CERN and ESA (the European Space Agency)...
When considering moving excess or experimental HPC applications to a cloud environment, there will always be obstacles. Were that not the case, the cost effectiveness of cloud-based HPC would rule the high performance landscape. Jonathan Stewart Ward and Adam Barker of the University of St Andrews produced an intriguing report on the state of cloud computing, paying significant attention to the problems it faces.
Cloud computing has become mainstream in today’s HPC world. To enable the HPC researchers who currently work with large distributed computing systems to bring their expertise to cloud computing, it is essential to provide them with easier means of applying their knowledge.