9.5 Reporting and Decision-Making

9.5.1
 
Institutions must implement a regular process to report the results of model monitoring to the Model Oversight Committee, the CRO, and the model users.
 
9.5.2
 
Reports must be clear and consistent through time. For each model, monitoring metrics must be included along with their respective limits. Time series of the metrics should be provided in order to show their volatility and/or stability through time and thereby help form a view on the severity of limit breaches. Explanations of the nature and meaning of each metric must be provided, in such a way that the report can be understood by the members of the Model Oversight Committee and by auditors.
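As an illustration only (the standard does not prescribe any implementation), a metric tracked for such a report can be represented as a value series with its agreed limit, so that breaches are identified consistently from period to period. The metric name, description, limit value, and data below are hypothetical examples:

```python
from dataclasses import dataclass, field

@dataclass
class MonitoringMetric:
    """One monitored metric with its limit and its time series of observations."""
    name: str
    description: str     # plain-language meaning, for non-specialist readers
    limit: float         # breach threshold agreed at model approval
    history: list = field(default_factory=list)

    def record(self, value: float) -> None:
        """Append the latest periodic observation."""
        self.history.append(value)

    def breaches(self) -> list:
        """Indices of observations that exceed the agreed limit."""
        return [i for i, v in enumerate(self.history) if v > self.limit]

# Hypothetical example: population stability index tracked quarterly
psi = MonitoringMetric(
    name="PSI",
    description="Population Stability Index: shift between development and current populations",
    limit=0.25,
)
for value in [0.08, 0.11, 0.19, 0.31]:
    psi.record(value)

print(psi.breaches())  # -> [3]: only the last observation breaches the 0.25 limit
```

Keeping the full history alongside the limit, rather than only the latest value, supports the requirement that readers can judge the stability of a metric before assessing the severity of a breach.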
 
9.5.3
 
Regardless of the party responsible for model monitoring, all reports must be circulated to both the development team and the independent validation team, as soon as they are produced. For some models, monitoring reports can also be shared with the model users.
 
9.5.4
 
In each report, explanations of the significance of limit breaches must be provided. Sudden material deterioration of model performance must be discussed promptly between the development team and the validation team. If necessary, such deterioration must be escalated to the Model Oversight Committee and the CRO outside of the scheduled steps of the model life-cycle. The Committee and/or the CRO may decide to suspend the usage of a model or accelerate the model review based on the results of the monitoring process.
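A possible operational reading of "sudden material deterioration" is a period-over-period fall in a performance metric beyond a pre-agreed threshold. The rule and threshold below are purely illustrative assumptions, not requirements of the standard:

```python
def needs_escalation(history: list, drop_threshold: float = 0.10) -> bool:
    """Flag a sudden material deterioration: the latest observation falls
    more than drop_threshold below the prior one (illustrative rule only)."""
    if len(history) < 2:
        return False
    return history[-2] - history[-1] > drop_threshold

# Hypothetical Gini coefficient series: the drop from 0.61 to 0.45 exceeds 0.10
print(needs_escalation([0.62, 0.61, 0.45]))  # -> True
```

In practice the threshold would be set per model and per metric at approval time, so that escalation outside the scheduled life-cycle steps is triggered by an objective rule rather than ad hoc judgement.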
 
9.5.5
 
Institutions must define the boundaries of model usage. These are the limits and conditions under which a model is immediately subject to adjustments, increased margins of conservatism, exceptional validation and/or suspension. Specific triggers must be clearly defined to identify abnormalities in model outputs.
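Such triggers can be recorded as a simple table mapping each metric to graduated limits, each tier implying a different action on model usage. The metric names, thresholds, and action labels below are hypothetical illustrations under the assumption of a two-tier (warning/suspension) scheme:

```python
# Hypothetical trigger table: metric -> (warning limit, suspension limit)
TRIGGERS = {
    "PSI": (0.10, 0.25),        # population stability index
    "gini_drop": (0.05, 0.15),  # fall in Gini versus development sample
}

def usage_action(metric: str, value: float) -> str:
    """Return the usage boundary action implied by the current metric value."""
    warning, suspension = TRIGGERS[metric]
    if value >= suspension:
        return "suspend_or_exceptional_validation"
    if value >= warning:
        return "apply_margin_of_conservatism"
    return "continue_use"

print(usage_action("PSI", 0.30))  # -> suspend_or_exceptional_validation
print(usage_action("PSI", 0.12))  # -> apply_margin_of_conservatism
```

Codifying the triggers this way makes the boundary conditions unambiguous: the same observed value always maps to the same action, which is what "clearly defined" requires.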