January 11, 2011
ALEXANDRIA, Va., January 11, 2011 -- Four out of five C-level Federal IT executives said mission-critical government capabilities are at risk if agencies do not modernize legacy applications, according to a study by MeriTalk and the Unisys Center for Innovation in Government. The new study, “Federal Application Modernization Road Trip: Express Lane or Detour Ahead?,” examines the Federal government’s progress in modernizing its business applications and the cost and risk of maintaining outdated and redundant systems.
The study, based on a survey of C-level and senior IT managers, revealed that agencies spend almost half of the annual Federal IT budget – $35.7 billion – maintaining and supporting legacy applications, and that nearly half (47 percent) of all existing IT applications are based on legacy technology in need of modernization. Yet only one-third of the survey’s 166 respondents said that application modernization is a top priority at their agency, and half of those considering modernization reported that their agencies remain in the discovery or planning phases of implementation.
“Federal IT leaders see application modernization as vital to their agencies’ ability to successfully meet current and emerging needs,” said Mark Cohn, chief technology officer, Unisys Federal Systems. “In an age of tightening budgets, application modernization can free vital resources and budget currently allocated to maintaining legacy systems that are often duplicated across an agency. While many agencies have launched discovery and planning activities, pressure will grow to implement modernized applications for functionality and security benefits and to retire redundant systems to reduce cost.”
Federal IT leaders view application modernization as a long-term project, with 60 percent of respondents saying that modernization initiatives will take three years or longer to implement. Application redundancy – running multiple systems to perform the same tasks and processes – is an issue in Federal agencies, according to the study, particularly in the areas of IT governance and risk management, enterprise document and content management, and business process management systems. While Federal IT leaders see these systems as top targets for return on investment from modernization, progress lags in all but one of these areas: business process management. Respondents reported that their agencies are making the most progress in federated identity management, business process management, geographic information systems, and service-oriented architecture.
A lack of communication about, and understanding of, application modernization initiatives is contributing to delays. Just over half (56 percent) of all respondents said their department fully understands their agency’s application modernization goals. Aside from additional budget and staff, respondents most often identified stronger leadership support and prioritization of modernization initiatives, along with a better understanding of the modernization process, as factors that would help them surmount challenges and accelerate application modernization in their agencies.
“We’ve heard the word about cloud and data center consolidation from OMB leaders Jeffrey Zients and Vivek Kundra,” said Steve O’Keeffe, founder, MeriTalk. “Here’s evidence from Federal IT operators and agency executives that we need a radical shift in Federal IT modernization direction.”
“Application modernization can and must be a direct route to greater productivity and efficiency, but current approaches to modernization require years to produce results,” said Cohn. “We believe Federal agencies should adopt commercial best practices that will produce results more quickly and without extensive upfront capital expenditures. To jumpstart progress, it is imperative for agencies to develop a clear roadmap for modernization and to focus on initiatives with the greatest potential for return, such as cloud computing-based models.”
The “Federal Application Modernization Road Trip: Express Lane or Detour Ahead?” study is based on an online survey of 166 Federal IT leaders in October and November 2010. To download the full study results, please visit www.meritalk.com/fedappmod and www.govtinnovation.com.
The voice of tomorrow’s government today, MeriTalk is an online community that combines professional networking and thought leadership to drive the government IT community dialogue. Developed as a partnership among the Federal Business Council, Federal Employee Defense Services, Federal Managers Association, GovLoop, National Treasury Employees Union, USO, and WTOP/WFED radio, MeriTalk is a community network. For more information, visit www.meritalk.com or follow us on Twitter, @meritalk.
Unisys is a worldwide information technology company. We provide a portfolio of IT services, software, and technology that solves critical problems for clients. We specialize in helping clients secure their operations, increase the efficiency and utilization of their data centers, enhance support to their end users and constituents, and modernize their enterprise applications. To provide these services and solutions, we bring together offerings and capabilities in outsourcing services, systems integration and consulting services, infrastructure services, maintenance services, and high-end server technology. With approximately 23,000 employees, Unisys serves commercial organizations and government agencies throughout the world. For more information, visit www.unisys.com.