2.3. Data Identification and Management
Effective from: 8/9/2021

LFIs should have in place adequate processes to ensure that customer and transactional data feeding into their TM program (whether using manual or automated processes, or both) meets established data quality standards, that data is subject to testing and validation at risk-based intervals, and that identified data quality and completeness issues are remediated in a timely manner.
As an initial matter, LFIs should identify and document all data sources that serve as inputs into their TM program. TM data sources may include internal sources, such as customer databases, core banking or other transaction processing systems, and applicable “flat-file” databases, as well as external sources such as Society for Worldwide Interbank Financial Telecommunication (“SWIFT”) message data. Source system documentation should identify a system owner or primary party responsible for overseeing the quality of source data and addressing identified data issues. Where automated TM systems are used, LFIs should institute data extraction and loading processes that ensure a complete, accurate, and fully traceable transfer of data from its source to TM systems. LFIs should also ensure that staff access rights to both source systems and TM systems are commensurate with their roles and responsibilities, so that relevant staff can perform their duties effectively and access is not extended to unauthorized persons or to those no longer requiring it.
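For illustration only, the following is a minimal sketch of a source-to-TM reconciliation of the kind described above. The extract file names (core_banking_extract.csv, tm_system_load.csv) and field names (transaction_id, amount, transaction_code) are assumptions for the example; an LFI’s actual feeds, schemas, and tooling will differ.

```python
# Minimal sketch of a source-to-TM reconciliation check. File and field names
# are hypothetical; real extracts and schemas will vary by institution.
import csv

def load_records(path, key_field="transaction_id"):
    """Read a delimited extract and index rows by their transaction identifier."""
    with open(path, newline="") as f:
        return {row[key_field]: row for row in csv.DictReader(f)}

def reconcile(source_path, tm_path):
    """Compare a core-banking extract against what the TM system ingested."""
    source = load_records(source_path)
    tm = load_records(tm_path)

    missing_in_tm = source.keys() - tm.keys()      # records dropped during extraction/loading
    unexpected_in_tm = tm.keys() - source.keys()   # records with no known source

    # Field-level comparison on records present in both systems (traceability).
    mismatched = [
        key for key in source.keys() & tm.keys()
        if source[key]["amount"] != tm[key]["amount"]
        or source[key]["transaction_code"] != tm[key]["transaction_code"]
    ]

    return {
        "source_count": len(source),
        "tm_count": len(tm),
        "missing_in_tm": sorted(missing_in_tm),
        "unexpected_in_tm": sorted(unexpected_in_tm),
        "field_mismatches": mismatched,
    }

if __name__ == "__main__":
    print(reconcile("core_banking_extract.csv", "tm_system_load.csv"))
```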
Both prior to the initial deployment of a TM system or process and at risk-based intervals thereafter, LFIs should test and validate the integrity, accuracy, and quality of data to ensure that accurate and complete data is flowing into their TM program. Data testing and validation should typically occur at minimum every 12 to 18 months, as appropriate based on the LFI’s risk profile, and the frequency of such activities should be clearly mandated and documented in the LFI’s policies and procedures. Such testing can include data integrity checks to ensure that data is being completely and accurately captured in source systems and transmitted to TM systems, as well as the reconciliation of transaction codes across core banking and TM systems. Testing may also utilize quantitative data quality standards or benchmarks to track data quality over time and specify a threshold or range beyond which data irregularities or other data quality issues shall require corrective action.
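As an illustration of the quantitative data quality benchmarks described above, the sketch below measures blank-value rates for a few critical fields and flags any breach of a tolerance level. The field names and thresholds are assumptions chosen for the example, not prescribed values.

```python
# Illustrative sketch of threshold-based data quality benchmarks.
# Field names and tolerance levels are assumed for the example only.
import csv

# Maximum acceptable share of blank values per critical field (assumed benchmarks).
QUALITY_THRESHOLDS = {
    "customer_id": 0.0,         # must always be populated
    "transaction_code": 0.001,  # up to 0.1% blanks tolerated before corrective action
    "counterparty_country": 0.01,
}

def benchmark_field_completeness(path):
    """Measure blank-value rates per field and flag breaches of the benchmarks."""
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))

    findings = []
    for field, threshold in QUALITY_THRESHOLDS.items():
        blanks = sum(1 for row in rows if not (row.get(field) or "").strip())
        rate = blanks / len(rows) if rows else 0.0
        if rate > threshold:
            findings.append(
                f"{field}: blank rate {rate:.2%} exceeds benchmark {threshold:.2%}"
                " - corrective action required"
            )
    return findings

if __name__ == "__main__":
    for finding in benchmark_field_completeness("tm_system_load.csv"):
        print(finding)
```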
In addition, LFIs should put in place appropriate detection controls, such as the analysis of trends observable through management information system (“MIS”) data and the generation of exception reports, to identify abnormally functioning TM rules or scenarios and ensure that any such irregularities caused by data integrity or other data quality issues are appropriately diagnosed and remediated. Where appropriate, a root cause analysis should be performed, and any findings and recommended remedial actions should be escalated to senior management to address the underlying issue in a timely manner.
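A simple illustration of such a detection control, using assumed rule names, alert counts, and deviation band, might compare each rule’s latest alert volume against its historical baseline and flag outliers for root cause analysis:

```python
# Hedged sketch of an MIS-style exception report: compare each TM rule's latest
# alert volume against its historical average and flag large deviations.
# Rule names, counts, and the tolerance band are illustrative assumptions.
from statistics import mean

# Hypothetical monthly alert counts per TM rule (most recent month last).
alert_history = {
    "RULE_CASH_STRUCTURING": [118, 125, 130, 122, 4],      # sudden drop
    "RULE_WIRE_HIGH_RISK_GEO": [60, 58, 63, 61, 59],        # stable
    "RULE_DORMANT_ACCOUNT_ACTIVITY": [15, 14, 16, 13, 55],  # sudden spike
}

def exception_report(history, tolerance=0.5):
    """Flag rules whose latest volume deviates from baseline by more than the band."""
    exceptions = []
    for rule, counts in history.items():
        baseline = mean(counts[:-1])
        latest = counts[-1]
        if baseline and abs(latest - baseline) / baseline > tolerance:
            exceptions.append(
                f"{rule}: latest volume {latest} vs baseline {baseline:.0f}"
                " - investigate for data feed or scenario issues"
            )
    return exceptions

if __name__ == "__main__":
    for line in exception_report(alert_history):
        print(line)
```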