3.2.1
Accurate and representative historical data is the backbone of financial models. Institutions must implement a rigorous and formal data management framework (“DMF”) to support the development and validation of accurate models.
3.2.2
The management of data sets used for modelling should not be confused with the management of data used to produce regular risk analysis and reporting. While these two data sets may overlap, they are governed by different processes and driven by different priorities. The construction of modelling data focuses on consistency over long time periods, whereas risk analysis and reporting rely more on point-in-time data. In addition, numerous data points needed for modelling often fall outside the scope of reporting.
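For illustration only, the sketch below shows this distinction on a hypothetical records table; the column names and the pandas representation are assumptions, not part of these Standards. The reporting view keeps only the latest observation per obligor, while the modelling view preserves the full, consistently defined history.

```python
import pandas as pd

# Hypothetical example: one row per obligor per reporting date.
records = pd.DataFrame({
    "obligor_id": [101, 101, 202, 202],
    "as_of_date": pd.to_datetime(
        ["2022-12-31", "2023-12-31", "2022-12-31", "2023-12-31"]),
    "rating": ["BB", "B", "A", "A"],
})

# Reporting view: point-in-time, i.e. the latest observation per obligor.
reporting_view = records.loc[
    records.groupby("obligor_id")["as_of_date"].idxmax()]

# Modelling view: the full history, kept consistent across all dates.
modelling_view = records.sort_values(["obligor_id", "as_of_date"])
```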
3.2.3
The DMF must be constructed to fully support each step of the model life-cycle process. It must not be the responsibility of the model development or validation teams; instead, it must be organised by a separate, dedicated function / team within the institution, with its own set of policies and procedures.
3.2.4
The DMF must be comprehensive enough to adequately support the scope of models employed by the institution, and commensurate with the breadth and depth of models used in production. In particular, sophisticated models with numerous parameters and complex calibration requirements must be supported by an equally sophisticated DMF.
3.2.5
At a minimum, the DMF must include the following components: (i) systematic identification of sources, (ii) regular and frequent collection, (iii) rigorous data quality review and control, (iv) secure storage and controlled access, and (v) robust system infrastructure.
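As a purely illustrative sketch, the five components could be expressed as explicitly owned stages of a data pipeline configuration; all names, owners and controls below are hypothetical and are not prescribed by these Standards.

```python
from dataclasses import dataclass, field

# Hypothetical illustration: the five DMF components as owned pipeline
# stages. Names and controls are assumptions, not prescribed requirements.

@dataclass
class DMFStage:
    name: str
    owner: str                                  # the dedicated DMF function / team
    controls: list[str] = field(default_factory=list)

DMF_PIPELINE = [
    DMFStage("source_identification", "data_management_team",
             ["documented source inventory", "approval of new sources"]),
    DMFStage("collection", "data_management_team",
             ["scheduled runs", "failure alerts"]),
    DMFStage("quality_review", "data_management_team",
             ["completeness", "accuracy", "timeliness",
              "uniqueness", "traceability"]),
    DMFStage("storage_and_access", "data_management_team",
             ["encryption at rest", "role-based access control"]),
    DMFStage("system_infrastructure", "data_management_team",
             ["capacity monitoring", "disaster recovery"]),
]
```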
3.2.6
The data quality review is a key component of the DMF. It must incorporate standard checks to assess the completeness, accuracy, timeliness, uniqueness and traceability of the data.
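A minimal sketch of how such standard checks might be automated, assuming the modelling data set is held in a pandas DataFrame with hypothetical columns obligor_id, as_of_date and exposure; the column names and the staleness threshold are assumptions, not prescriptions of these Standards.

```python
import pandas as pd

def run_standard_checks(df: pd.DataFrame, max_staleness_days: int = 31) -> dict:
    """Illustrative data quality checks over a hypothetical modelling data
    set with columns obligor_id, as_of_date and exposure (assumed names)."""
    results = {}

    # Completeness: share of missing values per column.
    results["completeness"] = df.isna().mean().to_dict()

    # Accuracy (basic plausibility): exposures must be non-negative.
    results["accuracy_negative_exposures"] = int((df["exposure"] < 0).sum())

    # Timeliness: age, in days, of the most recent observation.
    staleness = (pd.Timestamp.today() - df["as_of_date"].max()).days
    results["timeliness_ok"] = staleness <= max_staleness_days

    # Uniqueness: duplicate obligor / reporting-date pairs.
    results["duplicate_rows"] = int(
        df.duplicated(["obligor_id", "as_of_date"]).sum())

    # Traceability is typically assessed against lineage metadata (source
    # system, load timestamps) rather than the data values, so it is not
    # shown in this sketch.
    return results
```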