February 11, 2011
A study released today by research group INPUT notes that “demand for vendor-furnished IT services by the U.S. government will increase from $38.1 billion in 2010 to $51.7 billion in 2015,” creating strategic openings for contractors following the release of this year’s fiscal budget request.
The analyst firm notes that while a ten percent reduction is planned this year for federal professional and technical service contracts, many of the cuts will be centered on “cost benefit analysis, policy review, program evaluation and management services” rather than standard IT services.
In 2010, Federal CIO Vivek Kundra called for massive enhancements to existing government IT infrastructure, with the central goal of consolidating data centers and eliminating waste and inefficient IT practices across agencies. Despite the projected ten percent decrease, which INPUT claims will not affect standard IT contracts, Kundra’s directives will continue to open opportunities for federal IT contractors as the consolidation process continues.
John Slye, principal analyst for INPUT’s study, noted, “Due to contemporary demands including data center consolidation, enhancements in cyber security and national trends toward cloud computing, the IT service industry will be equipped to bear the force of federal cuts better than others.”
INPUT’s report provides insights into the tough decisions the Cloud First policy will spark, how agencies are weighing questions of outsourcing, and broader matters of security, automation, SOA, and massive data demands.
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. Thus, we present a novel federation model that gives end-users the ability to aggregate heterogeneous resources for large-scale problems. The feasibility of this federation model has been proven, in the context of the UberCloud HPC Experiment, by gathering the most comprehensive information to date on the effects of pillars on microfluidic channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to absorb peak computational loads that cannot be contained within their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab’s Sergey Panitkin discussed the role of the Google Compute Engine in providing computational support to ATLAS, a detector of high-energy particles at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
May 23, 2013
The study of climate change is one of those scientific problems where it is almost essential to model the entire Earth to attain accurate results and make worthwhile predictions. In an attempt to make climate science more accessible to smaller research facilities, NASA introduced what they call ‘Climate in a Box,’ a system they note acts as a desktop supercomputer.
May 16, 2013
When it comes to cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of doing CFD modeling in the cloud by running a common CFD code on Amazon EC2 HPC instance types with both CPU and GPU cores.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can exploit these extraordinary data repositories, however, they must first harness and manage their data stores, using technologies that emphasize affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.