May 18, 2011
REDWOOD CITY, Calif., May 18, 2011 --
Informatica Corporation (NASDAQ: INFA), the world’s number one independent provider of data integration software, today announced the availability of a financial services industry research report by the Ponemon Institute entitled Financial Data at Risk in Development: A Call for Data Masking.
The Informatica-sponsored report builds on previous research by Ponemon where 31% of surveyed customers indicated they would switch financial institutions should their personal information become compromised by a data breach.
Ponemon’s report reveals pervasive use by financial institutions of real data in their application development and testing activities. It details how this practice exposes them to the twin risks of customer churn and costly non-compliance with regulations such as the Gramm-Leach-Bliley Act (GLBA) and the Massachusetts Data Privacy Act (201 CMR 17), and it provides guidelines for reducing this exposure, including the vital practice of masking and securing live data.
Key research findings, based on a survey of more than 430 financial services IT professionals, include:
Widespread exposure of sensitive data - 84% of respondents’ organizations use real customer information during software development and test, 70% use consumer data, and 51% use credit, debit or other payment information.
Data protection is far from pervasive - Despite the data’s sensitivity, 45% do not protect real data used in development and testing.
Breaches are commonplace - 38% have had a breach involving real data in a development and test environment and 12% are unsure if they have had a breach or not.
Consequences are high - 54% of those experiencing a breach said it resulted in disruption of operations, 39% experienced customer churn, and 35% lost revenues.
Most organizations wouldn’t know if data was lost or stolen - 75% are not confident, or are undecided, as to whether their organization could even detect the theft or accidental loss of real data in development or test.
Outsourcing and cloud computing increase the security risk - Outsourcing development and test activities and/or using cloud-computing resources introduce additional risk factors, which frequently prevent financial organizations from turning to these potentially advantageous resources. Of those that outsource development or test, 51% share real data with third-parties, while 35% do not outsource due to security concerns. Meanwhile, 41% use cloud resources for development and test, but only 25% are confident or very confident about security in a cloud environment.
Given the high rates of real data used in financial industry development and test environments, Ponemon Institute recommends immediate actions to ensure customer privacy including:
Centralized executive oversight - Create a single point of executive-level responsibility coupled with policies and procedures for safeguarding your organization’s real data in non-production environments.
Data masking - Invest in key technologies including tools to "transform or mask sensitive or confidential data without diminishing the richness of the data necessary for successful testing and development."
Data masking helps safeguard sensitive, private or confidential data such as personally identifiable information (PII) or payment card information detailed in the Payment Card Industry Data Security Standards (PCI DSS) by masking it in-flight or in-place. As a result, fully functional, real data sets can be used safely in development, test and other non-production environments, as well as in outsourcing, offshore or cloud computing environments.
With Informatica Data Masking, private data is de-identified using techniques and algorithms that obfuscate the original data, and the masked data retains its original format and properties so that applications function properly during development and test activities.
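To make the de-identification idea concrete, here is a minimal, hypothetical sketch of format-preserving substitution in Python. It is not Informatica's actual algorithm (which is proprietary); it simply illustrates the property the release describes: each letter or digit is deterministically replaced by another of the same class, so length, case, and separators survive and test applications still see validly formatted data.

```python
import hashlib
import string

def mask_value(value: str, secret: str = "demo-key") -> str:
    """Deterministically replace each letter/digit with another of the
    same character class, preserving length, case, and punctuation.
    Deterministic masking keeps referential integrity across tables:
    the same input always maps to the same masked output."""
    digest = hashlib.sha256((secret + value).encode()).digest()
    out = []
    for i, ch in enumerate(value):
        b = digest[i % len(digest)]  # pseudo-random byte for this position
        if ch.isdigit():
            out.append(string.digits[b % 10])
        elif ch.isupper():
            out.append(string.ascii_uppercase[b % 26])
        elif ch.islower():
            out.append(string.ascii_lowercase[b % 26])
        else:
            out.append(ch)  # keep separators like '-' and '@' intact
    return "".join(out)

# A card-like string keeps its shape but loses its real digits.
masked = mask_value("4111-1111-1111-1111")
```

A production data masking tool would add features this sketch omits, such as dictionary-based name substitution and preservation of checksums (e.g. Luhn digits) where test code validates them.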
Tweet this: Financial Data at Serious Risk during Application Development Reveals Research http://bit.ly/mO2PcV #security #risk
"Financial services organizations are among the most highly regulated and risk-aware enterprises in the world, yet a mere 34 percent of respondents believe that their organization is successful at protecting customer privacy in application development and test environments," said Dr. Larry Ponemon, Chairman and founder of the Ponemon Institute. "It is our hope that Financial Data at Risk in Development: A Call for Data Masking will help alert the industry to the risks they are facing in using unprotected data in development, while pointing the way towards a set of fairly easily implemented procedural and technical solutions."
"Financial institutions are not alone in inadequately protecting sensitive data during their development and test activities, but they certainly face greater risks and tighter regulation than most types of organizations," said Adam Wilson, general manager, Application Information Lifecycle Management, Informatica. "To manage this risk and ensure on-going compliance, some of the largest financial services companies in the world have standardized on Informatica Data Masking to extend their data security program beyond their production applications. This ensures the ever-exploding number of copies that are kept for development, testing, training, and for regulatory reporting are appropriately de-identified."
Informatica Corporation (NASDAQ: INFA) is the world’s number one independent provider of data integration software. Organizations around the world turn to Informatica to gain a competitive advantage in today’s global information economy with timely, relevant and trustworthy data for their top business imperatives. Worldwide, over 4,350 enterprises rely on Informatica for data integration and data quality solutions to access, integrate and trust their information assets held in the traditional enterprise, off premise and in the Cloud.
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. Thus, we present a novel federation model that enables end-users to aggregate heterogeneous resources for large-scale problems. The feasibility of this federation model has been proven, in the context of the UberCloud HPC Experiment, by gathering the most comprehensive information to date on the effects of pillars on microfluid channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to manage computational workloads at peak times that cannot be handled by their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab’s Sergey Panitkin discussed the role of the Google Compute Engine in providing computational support to ATLAS, a detector of high-energy particles at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
May 23, 2013 |
The study of climate change is one of those scientific problems where it is almost essential to model the entire Earth to attain accurate results and make worthwhile predictions. In an attempt to make climate science more accessible to smaller research facilities, NASA introduced what they call ‘Climate in a Box,’ a system they note acts as a desktop supercomputer.
May 16, 2013 |
When it comes to cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of doing CFD modeling in the cloud by running a common CFD code on Amazon EC2 HPC instance types with both CPU and GPU cores.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore the potential of heterogeneous computing, but the potential for this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.