  • 9 Model Performance Monitoring

    • 9.1 Objective

      9.1.1
       
      Institutions must implement a process to monitor the performance of their models on a regular basis, as part of their model life-cycle management. The relationship between model performance and usage is asymmetric: a robust model does not guarantee relevant usage, but improper usage is likely to degrade model performance. Consequently, institutions must ensure that models are used appropriately before engaging in performance monitoring.
       
      9.1.2
       
      The objective of the monitoring process is to assess whether changes in the economic environment, market conditions and/or business environment have impacted the performance, stability, key assumptions and/or reliability of models.
       
      9.1.3
       
      Institutions must implement a documented process with defined responsibilities, metrics, limits and reports in order to assess, on an ongoing basis, whether models are fit for purpose. Based on this assessment, a clear decision-making process must lead to one of two outcomes: (i) continue monitoring or (ii) escalate for further action.
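 
      As a minimal sketch of such a decision step, the example below compares each monitoring metric against its limit and escalates on any breach. The metric names, limit values and the `Decision`/`MetricResult` types are hypothetical illustrations, not terms prescribed by the MMS.

      ```python
      from dataclasses import dataclass
      from enum import Enum

      class Decision(Enum):
          CONTINUE_MONITORING = "continue monitoring"
          ESCALATE = "escalate for further action"

      @dataclass
      class MetricResult:
          name: str                     # hypothetical metric name, e.g. "psi" or "gini"
          value: float                  # observed value for the current monitoring period
          limit: float                  # limit set by the institution
          breach_if_above: bool = True  # direction of the breach test

      def monitoring_decision(results: list[MetricResult]) -> Decision:
          """Escalate as soon as any metric breaches its limit."""
          for r in results:
              breached = r.value > r.limit if r.breach_if_above else r.value < r.limit
              if breached:
                  return Decision.ESCALATE
          return Decision.CONTINUE_MONITORING

      # Example: stability within limit, discriminatory power below its floor.
      results = [
          MetricResult("psi", value=0.08, limit=0.10),
          MetricResult("gini", value=0.42, limit=0.45, breach_if_above=False),
      ]
      print(monitoring_decision(results))  # Decision.ESCALATE
      ```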
       
    • 9.2 Responsibility

      9.2.1
       
      The responsibility for the execution of model monitoring must be clearly defined. Institutions have the flexibility to assign this task to the development team, the validation team or a third party. If model monitoring is assigned to the development team, the monitoring reports must be included in the scope of review of the independent validation process. If model monitoring is assigned to a third party, institutions remain the owners of the monitoring reports and remain responsible for taking appropriate actions upon the issuance of these reports. Institutions are expected to fully understand and control the content of monitoring reports produced by third-party providers.
       
      9.2.2
       
      Monitoring reports must be presented regularly to the Model Oversight Committee. All reports containing breaches of monitoring-metric limits must be discussed by the Committee.
       
      9.2.3
       
      The internal audit function must verify that model monitoring is performed appropriately by the assigned party. In particular, the internal audit function must review the relevance, frequency and usability of the monitoring reports.
       
    • 9.3 Frequency

      9.3.1
       
      Model monitoring must be undertaken on a frequent basis and documented as part of the model life-cycle management. Institutions must demonstrate that the monitoring frequency is appropriate for each model. The minimum frequency is indicated in Article (10) of the MMS, which covers the independent validation process.
       
    • 9.4 Metrics and Limits

      9.4.1
       
      Institutions must develop metrics and limits to appropriately track model performance. The metrics must be carefully designed to capture model performance based on the model's specific characteristics and its implementation. At a minimum, the monitoring metrics must capture model accuracy and stability, as explained in Article 10.4.3 pertaining to the scope of the post-implementation validation. In addition, the monitoring metrics must track model usage to assess whether the model is used as intended.
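 
      For a binary classification model such as a rating or scoring model, accuracy is commonly tracked through discriminatory power (Gini, derived from the AUC) and stability through the Population Stability Index (PSI) of the score distribution. The sketch below is one possible implementation under those assumptions, using NumPy and scikit-learn; the bin count and the synthetic data are purely illustrative.

      ```python
      import numpy as np
      from sklearn.metrics import roc_auc_score

      def gini(y_true, y_score):
          """Discriminatory power (accuracy metric): Gini = 2 * AUC - 1."""
          return 2.0 * roc_auc_score(y_true, y_score) - 1.0

      def psi(expected, actual, n_bins=10):
          """Population Stability Index between the development sample
          ('expected') and the current portfolio ('actual')."""
          edges = np.quantile(expected, np.linspace(0, 1, n_bins + 1))
          # Clip both samples into the development range so every value lands in a bin.
          e_frac = np.histogram(np.clip(expected, edges[0], edges[-1]), bins=edges)[0] / len(expected)
          a_frac = np.histogram(np.clip(actual, edges[0], edges[-1]), bins=edges)[0] / len(actual)
          e_frac = np.clip(e_frac, 1e-6, None)  # guard against empty bins
          a_frac = np.clip(a_frac, 1e-6, None)
          return float(np.sum((a_frac - e_frac) * np.log(a_frac / e_frac)))

      # Illustrative check on synthetic scores: a drifted current portfolio.
      rng = np.random.default_rng(0)
      dev_scores = rng.normal(0.0, 1.0, 5000)  # development sample
      cur_scores = rng.normal(0.3, 1.0, 5000)  # current portfolio, shifted
      print(f"PSI: {psi(dev_scores, cur_scores):.3f}")
      ```

      A common industry convention, not a regulatory threshold, reads a PSI below 0.10 as stable, between 0.10 and 0.25 as warranting attention, and above 0.25 as a material population shift.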
       
    • 9.5 Reporting and Decision-Making

      9.5.1
       
      Institutions must implement a regular process to report the results of model monitoring to the Model Oversight Committee, the CRO and the model users.
       
      9.5.2
       
      Reports must be clear and consistent through time. For each model, monitoring metrics must be included along with their respective limits. Time series of the metrics should be provided in order to appreciate their volatility and/or stability through time and therefore help form a view on the severity of limit breaches. Explanations of the nature and meaning of each metric must be provided, in such a way that the report can be understood by the members of the Model Oversight Committee and by auditors.
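 
      One possible report layout, sketched below with pandas, keeps the history of each metric next to its limit and a breach flag, so volatility and breach severity can be judged at a glance. The dates, metric values and limits are hypothetical.

      ```python
      import pandas as pd

      # Hypothetical monitoring history for one model: one row per monitoring run.
      history = pd.DataFrame({
          "run_date": pd.to_datetime(["2023-03-31", "2023-06-30", "2023-09-30", "2023-12-31"]),
          "psi": [0.05, 0.07, 0.12, 0.19],
          "gini": [0.51, 0.49, 0.46, 0.43],
      })
      limits = {"psi": ("max", 0.10), "gini": ("min", 0.45)}

      # Flag limit breaches per metric so trend and severity show in one table.
      for metric, (direction, limit) in limits.items():
          breached = history[metric] > limit if direction == "max" else history[metric] < limit
          history[f"{metric}_limit"] = limit
          history[f"{metric}_breach"] = breached

      print(history.to_string(index=False))
      ```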
       
      9.5.3
       
      Regardless of the party responsible for model monitoring, all reports must be circulated to both the development team and the independent validation team, as soon as they are produced. For some models, monitoring reports can also be shared with the model users.
       
      9.5.4
       
      In each report, explanations of the significance of limit breaches must be provided. Sudden material deterioration of model performance must be discussed promptly between the development team and the validation team. If necessary, such deterioration must be escalated to the Model Oversight Committee and the CRO outside of the scheduled steps of the model life-cycle. The Committee and/or the CRO may decide to suspend the usage of a model or accelerate the model review based on the results of the monitoring process.
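 
      A simple way to operationalise "sudden material deterioration" is to test the latest observation of a metric against its own recent history, for instance flagging moves beyond a chosen number of standard deviations. The rule below is an illustrative assumption; the window length and threshold are not prescribed by the MMS.

      ```python
      import numpy as np

      def sudden_deterioration(series, n_sigma=3.0, window=8):
          """Flag the latest value if it deviates from the mean of the
          preceding window by more than n_sigma standard deviations
          (illustrative rule, not a regulatory definition)."""
          hist = np.asarray(series[-(window + 1):-1], dtype=float)
          latest = float(series[-1])
          if len(hist) < 2 or hist.std(ddof=1) == 0.0:
              return False  # not enough history to judge
          return abs(latest - hist.mean()) > n_sigma * hist.std(ddof=1)

      # Example: a stable Gini history followed by a sharp drop.
      gini_history = [0.51, 0.50, 0.52, 0.51, 0.50, 0.49, 0.51, 0.50, 0.38]
      if sudden_deterioration(gini_history):
          print("Escalate to the Model Oversight Committee / CRO outside the scheduled cycle")
      ```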
       
      9.5.5
       
      Institutions must define the boundaries of model usage. These are the limits and conditions upon which a model is immediately subject to adjustments, increased margins of conservatism, exceptional validation and/or suspension. Specific triggers must be clearly defined to identify abnormalities in model outputs.
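 
      As an illustration of such boundaries, the sketch below maps hypothetical triggers on model outputs to the actions listed above. The trigger predicates, threshold values and action names are assumptions made for the example, not regulatory values.

      ```python
      from enum import Enum

      class Action(Enum):
          ADJUST = "apply adjustments"
          ADD_CONSERVATISM = "increase margin of conservatism"
          EXCEPTIONAL_VALIDATION = "trigger exceptional validation"
          SUSPEND = "suspend model usage"

      # Hypothetical usage boundaries: (trigger predicate, resulting action).
      BOUNDARIES = [
          (lambda out: out["psi"] > 0.25, Action.SUSPEND),
          (lambda out: out["psi"] > 0.10, Action.EXCEPTIONAL_VALIDATION),
          (lambda out: out["override_rate"] > 0.20, Action.ADD_CONSERVATISM),
      ]

      def check_boundaries(outputs: dict) -> list[Action]:
          """Return every action whose trigger fires on the current outputs."""
          return [action for trigger, action in BOUNDARIES if trigger(outputs)]

      print(check_boundaries({"psi": 0.27, "override_rate": 0.05}))
      # [Action.SUSPEND, Action.EXCEPTIONAL_VALIDATION]
      ```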