  • 3 General Standards

    This Part outlines the general principles of the MMS, that is, the key components of the Standards. Part I must be read in conjunction with Part II, which explains how these principles must be applied. Both Part I and Part II must be regarded as minimum requirements. The key components of model management are as follows: (i) model governance, (ii) data management, (iii) model development, (iv) model implementation, (v) model usage, (vi) performance monitoring and (vii) independent validation. The timeframes and minimum frequencies of model review are addressed in Part II.
     
    • 3.1 Model Governance

      3.1.1
       
      Model governance must reinforce the continuous improvement of modelling practices in order for institutions to comply with the requirements of the MMS. Institutions must establish a clear plan to comply.
       
      3.1.2
       
      Institutions must define a comprehensive model management framework to ensure that models are used effectively for decision-making and that Model Risk is appropriately understood and mitigated. The scope of the model governance must cover all models used to make decisions within the institution.
       
      3.1.3
       
      Model Risk must be incorporated in institutions’ risk framework alongside other key risks faced by institutions, as inherent consequences of conducting their activities. Consequently, Model Risk must be managed through a formal process incorporating the institution’s appetite for model uncertainty. The framework must be designed to identify, measure, monitor, report and mitigate this risk. A large appetite for Model Risk should be mitigated by counter-measures such as conservative buffers on model results, additional provisions and/or potentially a Pillar II capital add-on.
       
      3.1.4
       
      The model management framework must be structured around key components to be effective. First, the responsibilities of the stakeholders must be clearly defined with a transparent process for modelling decisions, oversight, escalation and for managing relationships with third parties. Second, a limit framework must be established to control and mitigate Model Risk. Third, the nature, objective and priorities of the modelling tasks must be defined. Fourth, appropriate systems, tools and data must be established to support model management. Fifth, the framework must include a granular reporting process to support pro-active management of Model Risk.
       
      3.1.5
       
      Institutions must manage each model according to a defined life-cycle composed of specific steps, from model design to re-development. The roles and responsibilities of stakeholders must be defined for each step of the life cycle. To facilitate model management and prioritisation, models must be grouped according to their associated Model Risk, or at least based on their associated materiality, as defined in the MMS.
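
      By way of illustration only, the sketch below (in Python) shows one way an entry in a model inventory could record its life-cycle stage and its Model Risk grouping; the stage names, tier labels and example models are assumptions made for the sketch, not prescriptions of the MMS.

```python
# A minimal sketch of a model inventory entry, assuming illustrative
# life-cycle stages and risk tiers; the MMS does not prescribe these names.
from dataclasses import dataclass
from enum import Enum


class LifecycleStage(Enum):
    DESIGN = "design"
    DEVELOPMENT = "development"
    PRE_IMPLEMENTATION_VALIDATION = "pre-implementation validation"
    IMPLEMENTATION = "implementation"
    USAGE = "usage"
    MONITORING = "monitoring"
    INDEPENDENT_VALIDATION = "independent validation"
    REDEVELOPMENT = "re-development"


class RiskTier(Enum):
    HIGH = 1    # highest Model Risk / materiality: reviewed most frequently
    MEDIUM = 2
    LOW = 3


@dataclass
class ModelRecord:
    """One entry in the model inventory, tracked through its life-cycle."""
    model_id: str
    owner: str              # first line of defence
    stage: LifecycleStage
    tier: RiskTier          # drives prioritisation and review frequency


inventory = [
    ModelRecord("PD-RETAIL-01", "Retail Credit Risk", LifecycleStage.USAGE, RiskTier.HIGH),
    ModelRecord("LGD-SME-02", "SME Credit Risk", LifecycleStage.MONITORING, RiskTier.MEDIUM),
]

# Group models by tier so that higher-risk models are managed first.
for tier in RiskTier:
    print(tier.name, [m.model_id for m in inventory if m.tier is tier])
```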
       
      3.1.6
       
      Institutions must establish a Model Oversight Committee which must be accountable for all significant modelling decisions related to each step of the model life-cycle. The committee must ensure that these decisions are transparent, justified and documented. The committee’s main objective is to optimise the ability of models to support decision-making throughout the institution, covering all model types. The Model Oversight Committee is accountable to Senior Management and to the Board, who must ensure that the Model Oversight Committee manages Model Risk appropriately and meets the requirements articulated in the MMS.
       
      3.1.7
       
      The Chief Risk Officer (“CRO”) must ensure that the design and usage of models is appropriate to support decision-making throughout the institution, in order to minimise Model Risk. Therefore, the scope of the CRO’s responsibility in this matter must cover the whole institution and must not be limited to the risk function. The CRO must ensure that Model Risk is fully managed with suitable identification, measurement, monitoring, reporting and mitigation.
       
      3.1.8
       
      In accordance with Article 2.2 of the Risk Management Regulation 153/2018, the Board bears the responsibility for the suitability of the risk management framework. In addition, Article 4.3 states that the Board is ultimately accountable for the appropriate usage and management of models, whether the approval for the use of models is provided directly by the Board or through authorities delegated to Senior Management. Consequently:
       
       (i)
       
      The Board bears responsibility for all modelling decisions with material implications for the institution, and it must define the institution's appetite for Model Risk. Consequently, the Model Oversight Committee must refer decisions with material consequences to the Board (or the Board Risk Committee). If a Board decision is not deemed necessary, the Board (or the Board Risk Committee) must nonetheless be informed of key decisions taken by the Model Oversight Committee, with appropriate rationale.
       (ii)
       
      To support the appropriate management of models, the Board must ensure that institutions have a sufficient number of internal employees with robust technical expertise. The Board must also ensure that Senior Management possesses an adequate level of technical knowledge to form a judgement on the suitability of material modelling decisions.
       
      3.1.9
       
      The internal audit function is also a stakeholder in model governance. It must assess the regulatory compliance and the overall effectiveness of the model management framework as part of its regular auditing process. For this purpose, the internal audit function must be familiar with the requirements articulated in the MMS and review the model management framework against these requirements. The internal audit function must not be involved in the validation of specific models.
       
      3.1.10
       
      Institutions can use third parties to support the design, implementation and management of models. However, institutions must take responsibility for all modelling decisions, model outputs and related financial consequences, even if third parties are involved.
       
      3.1.11
       
      To achieve and maintain the quality of models, institutions must ensure that a sufficient number of internal technical resources are hired, trained and retained. Each institution's designated human resources function is responsible for supporting this requirement, both operationally and strategically.
       
      3.1.12
       
      One of the key elements to manage Model Risk is a robust process for model review and challenge. Such review must be independent to be effective. Consequently, institutions must clearly define the roles and responsibilities of the development and the validation teams to ensure this independence. The validation team must communicate its findings to Senior Management and the Board on a yearly basis. The management and reporting of Model Risk must also be independent from the development teams.
       
      3.1.13
       
      Dedicated and consistent documentation must be produced for each step of the model life-cycle. Institutions must therefore develop model documentation standards. The documentation must be sufficiently comprehensive to ensure that any independent party has all the necessary information to assess the suitability of the modelling decisions.
       
      3.1.14
       
      The management of models must be supported by a comprehensive reporting framework reviewed and analysed at several levels of the organisation, from the development and validation teams, up to the Board. This reporting must be designed to support the management of Model Risk, covering the identification, measurement, monitoring and mitigation of this risk. Reporting must be clear, comprehensive, specific and actionable.
       
    • 3.2 Data Management

      3.2.1
       
      Accurate and representative historical data is the backbone of financial models. Institutions must implement a rigorous and formal data management framework (“DMF”) to support the development and validation of accurate models.
       
      3.2.2
       
      The management of data sets used for modelling should not be confused with the management of data used for the production of regular risk analysis and reporting. While these two data sets may overlap, they are governed by two different processes and priorities. The construction of data for modelling focuses on consistency through long time periods, while risk analysis and reporting relies more on point-in-time data. In addition, numerous data points needed for modelling are often not included in the scope of reporting.
       
      3.2.3
       
      The DMF must be constructed to fully support each step of the model life-cycle process. The DMF must not be the responsibility of the model development or validation teams. Instead, it must be organised by a separate, dedicated function or team within the institution, with its own set of policies and procedures.
       
      3.2.4
       
      The DMF must be comprehensive to adequately support the scope of models employed by the institution. It must be coherent with the breadth and depth of models used in production. In particular, sophisticated models with numerous parameters and complex calibration requirements must be supported by an equally sophisticated DMF.
       
      3.2.5
       
      At a minimum, the DMF must include the following components: (i) systematic identification of sources, (ii) regular and frequent collection, (iii) rigorous data quality review and control, (iv) secure storage and controlled access and (v) robust system infrastructure.
       
      3.2.6
       
      The data quality review is a key component of the DMF. It must incorporate standard checks to assess data completeness, accuracy, timeliness, uniqueness and traceability.
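
      For illustration, the sketch below implements three of the checks named above (completeness, uniqueness and timeliness) over a toy data set; the field names and the 95% acceptance threshold are assumptions made for the example.

```python
# A minimal sketch of standard data quality checks; field names and the
# acceptance threshold are illustrative assumptions, not MMS requirements.
from datetime import date

records = [
    {"loan_id": "A1", "exposure": 100_000.0, "as_of": date(2024, 3, 31)},
    {"loan_id": "A2", "exposure": None,      "as_of": date(2024, 3, 31)},
    {"loan_id": "A2", "exposure": 50_000.0,  "as_of": date(2023, 12, 31)},
]


def completeness(rows, field):
    """Share of records with a populated value for `field`."""
    return sum(r[field] is not None for r in rows) / len(rows)


def uniqueness(rows, key):
    """Share of records carrying a non-duplicated key."""
    keys = [r[key] for r in rows]
    return len(set(keys)) / len(keys)


def timeliness(rows, field, cutoff):
    """Share of records at least as recent as the reporting cutoff."""
    return sum(r[field] >= cutoff for r in rows) / len(rows)


checks = {
    "completeness(exposure)": completeness(records, "exposure"),
    "uniqueness(loan_id)": uniqueness(records, "loan_id"),
    "timeliness(as_of)": timeliness(records, "as_of", date(2024, 3, 31)),
}
for name, score in checks.items():
    print(f"{name}: {score:.0%}", "OK" if score >= 0.95 else "FLAG FOR REVIEW")
```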
       
    • 3.3 Model Development

      3.3.1
       
      The development process must support the construction of the most appropriate models in order to meet the objectives assigned to these models.
       
      3.3.2
       
      The development process must be structured with sequential logical steps that take into consideration multiple factors, including but not limited to, the business and economic context, the data available, the development techniques, the implementation options and the future usage. Consequently, institutions are expected to employ judgement and critical thinking in the execution of this process, rather than run it in a mechanistic fashion.
       
      3.3.3
       
      Model development requires human judgement at each step of the process to ensure that the assumptions, design and data meet the objective of the model. Judgement is also required to ensure that the development methodology is adequate, given the data available. Therefore, institutions must identify where judgement is needed in the development process. Suitable governance must be implemented to support a balanced and controlled usage of human judgement.
       
      3.3.4
       
      Each of these components must be regarded as an essential part of the whole process, because each step involves key modelling decisions that can materially impact the model outcome and the financial decisions that follow. The process must be iterative: if one step is not satisfactory, some prior steps must be repeated.
       
      3.3.5
       
      The development process must incorporate a degree of conservatism to mitigate Model Risk. Any material degree of uncertainty associated with the development steps, in particular related to data, must be compensated by conservative choices. For instance, conservatism can be reflected during the model selection process or by the usage of buffers at any point during the development process. However, conservatism should not be employed to hide defects and deprioritise remediation. When conservatism is applied, institutions must justify the reasons for it, identify the uncertainty being addressed and define the conditions for model improvement.
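
      As one illustration of how a conservative buffer might be recorded alongside its justification and its condition for removal, consider the minimal sketch below; the buffer size, figures and field names are assumptions made for the example.

```python
# A minimal sketch of a documented conservative buffer on a model output;
# the buffer, rationale and remediation condition are illustrative.
from dataclasses import dataclass


@dataclass
class ConservatismAdjustment:
    buffer: float               # multiplicative add-on, e.g. 0.10 = +10%
    rationale: str              # the uncertainty being compensated
    remediation_condition: str  # condition under which the buffer is removed

    def apply(self, model_output: float) -> float:
        return model_output * (1.0 + self.buffer)


adj = ConservatismAdjustment(
    buffer=0.10,
    rationale="Short default history for the SME segment (< 5 years of data).",
    remediation_condition="Recalibrate once two further years of data are collected.",
)
raw_pd = 0.023
print(f"raw PD: {raw_pd:.3f} -> conservative PD: {adj.apply(raw_pd):.4f}")
```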
       
      3.3.6
       
      The choice of methodology for model development must be the result of a concerted, structured process. This choice should be made by comparing several options derived from common industry practice and/or relevant academic literature. Methodologies must be consistent across the organisation, transparent and manageable.
       
      3.3.7
       
      Institutions must pay particular attention to the model selection process for all types of models. Where several candidate models are available, institutions must put in place a documented process for selecting amongst them.
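
      A minimal sketch of such a documented selection step is shown below, comparing two candidate models on the same out-of-sample data; the candidates, the metric (mean absolute error) and the data are invented for the example.

```python
# A minimal sketch of a documented model selection step: candidates are
# compared on the same holdout data and the choice is recorded.
def mae(predict, data):
    """Mean absolute error of `predict` over (x, y) pairs."""
    return sum(abs(predict(x) - y) for x, y in data) / len(data)


candidates = {
    "linear": lambda x: 0.5 * x,
    "flat":   lambda x: 1.0,
}
holdout = [(1.0, 0.6), (2.0, 1.1), (3.0, 1.4)]

scores = {name: mae(fn, holdout) for name, fn in candidates.items()}
selected = min(scores, key=scores.get)

# The comparison and rationale form part of the selection documentation.
selection_record = {"scores": scores, "selected": selected,
                    "rationale": "Lowest out-of-sample MAE on the holdout set."}
print(selection_record)
```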
       
      3.3.8
       
      The pre-implementation validation must be considered an integral part of the development process. This step must ensure that the model is consistent, fit for purpose and generates results that can be explained and support decision-making appropriately. The depth of the pre-implementation validation should be defined based on model materiality.
       
    • 3.4 Model Implementation

      3.4.1
       
      Institutions must consider model implementation as a separate phase of the model life-cycle process, with its own set of principles.
       
      3.4.2
       
      The implementation of a model must be treated as a project with clear governance, planning, funding and timing. It must include comprehensive user acceptance testing with record keeping and associated documentation. Upon request, these records shall be made available to the CBUAE, other regulators and auditors to assess whether a particular model has been implemented successfully.
       
      3.4.3
       
      The system infrastructure supporting the ongoing usage of models must be carefully designed and assessed before the model implementation phase, to adequately address the needs of model usage. It must cope with the demands of model sophistication and the volume of regular production.
       
      3.4.4
       
      After the model implementation, institutions must regularly assess the suitability of their system infrastructure for their current and future usage of models. This assessment must be made in light of (i) evolving model design and methodologies, (ii) rapid technology developments and (iii) growing volume of transactions to be processed.
       
      3.4.5
       
      Institutions should avoid spreadsheets for the implementation of large and complex models. Where this is unavoidable, and preferably on a temporary basis, institutions must implement rules and rigorous validation to mitigate the risks posed by spreadsheet tools, which are highly susceptible to operational errors. Institutions must implement internal policies and guidelines for the development of spreadsheet tools used in production.
       
    • 3.5 Model Usage

      3.5.1
       
      The conditions for using models must be defined, monitored and managed. Model usage must be treated as an integral part of model management, because the appropriate usage of a model is independent of the quality of the model itself.
       
      3.5.2
       
      Institutions must develop policies to manage model usage. At a minimum, the following must be included: (i) the definition of the expected usage, (ii) the process to control this usage, (iii) the governance surrounding the override of model inputs and outputs, and (iv) the management of user feedback.
       
      3.5.3
       
      Institutions must pay particular attention to circumstances under which model results are overridden. They must establish a clear, approved and controlled policy to govern overrides. This requirement is applicable to all models.
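
      As an illustration of a controlled override, the sketch below records an override together with its justification and an independent approval before it takes effect; the field names and the approval rule are assumptions made for the example.

```python
# A minimal sketch of a controlled override record: every override carries
# a justification and a named, independent approver before it applies.
from dataclasses import dataclass
from datetime import date
from typing import Optional


@dataclass
class OverrideRecord:
    model_id: str
    original_output: float
    overridden_output: float
    justification: str
    requested_by: str
    approved_by: Optional[str] = None   # must differ from the requester
    approval_date: Optional[date] = None

    def is_effective(self) -> bool:
        """An override applies only once independently approved."""
        return self.approved_by is not None and self.approved_by != self.requested_by


override = OverrideRecord(
    model_id="PD-RETAIL-01",
    original_output=0.023,
    overridden_output=0.035,
    justification="Recent restructuring not yet reflected in model inputs.",
    requested_by="credit.analyst",
)
override.approved_by, override.approval_date = "model.oversight.chair", date(2024, 4, 2)
print("override effective:", override.is_effective())
```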
       
    • 3.6 Model Performance Monitoring

      3.6.1
       
      Institutions must implement a process to monitor the performance of their models on a regular basis, as part of their model life-cycle management. The monitoring frequency must depend on model types. The required minimum frequencies are set in Part II of the MMS.
       
      3.6.2
       
      Prior to engaging in performance monitoring, institutions must ensure that models are used appropriately. This means that the analysis of model usage must have been completed successfully.
       
      3.6.3
       
      The objective of performance monitoring is to assess whether exogenous changes in the economic and business environment have impacted the assumptions of the model and therefore its performance. The monitoring process must be organised with specific responsibilities, monitoring metrics, limits associated with these metrics and required reporting for each model and/or model type. The process must incorporate a clear decision-making and escalation mechanism.
       
      3.6.4
       
      The responsibility for the execution of model monitoring must be clearly defined. This can be assigned to the development team, the validation team or any independent third party. If model monitoring is not performed by the validation team, then the validation team must review the quality and relevance of the monitoring reports during the validation cycle. Monitoring reports must be presented to the Model Oversight Committee on a regular basis, at least every quarter.
       
      3.6.5
       
      Metrics and limits must be designed to appropriately track the performance of each model based on its specific characteristics and its implementation.
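
      For illustration, the sketch below implements one commonly used monitoring metric, the population stability index (PSI), with heuristic amber/red limits; the score-band shares and the limit values are assumptions made for the example, not MMS requirements.

```python
# A minimal sketch of one monitoring metric with limits: the population
# stability index (PSI), a common industry measure of drift between the
# development sample and recent data.
import math


def psi(expected_shares, actual_shares):
    """PSI across matching bins: sum of (a - e) * ln(a / e)."""
    return sum((a - e) * math.log(a / e)
               for e, a in zip(expected_shares, actual_shares))


# Score-band shares at development time vs. the latest monitoring window.
expected = [0.25, 0.35, 0.25, 0.15]
actual   = [0.20, 0.30, 0.28, 0.22]

value = psi(expected, actual)
AMBER, RED = 0.10, 0.25  # commonly used heuristic limits, assumed here
status = "RED" if value > RED else "AMBER" if value > AMBER else "GREEN"
print(f"PSI = {value:.3f} -> {status}")  # amber/red breaches are escalated
```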
       
      3.6.6
       
      Monitoring reports must be comprehensive, transparent and contain explanations regarding the nature of the metrics, their acceptable ranges and their respective interpretation. These reports must be designed in such a way that non-technical readers can understand the implications of the observations. Each monitoring report must contain an explicit conclusion on the model performance. The report should also include suggestions for defect remediation, when deemed appropriate.
       
      3.6.7
       
      Upon the production of monitoring reports, a clear process must be followed to decide whether to continue using a model (with further monitoring) or to suspend it and work on remediation. This decision must be made by the Model Oversight Committee.
       
      3.6.8
       
      The monitoring process is a key preceding step towards the validation process. The results of the monitoring process must be used as inputs to the validation process (when available), if the monitoring reports are deemed of sufficient quality and relevance by the validator.
       
    • 3.7 Independent Validation

      3.7.1
       
      Independent validation must be established as a key step of model life-cycle management and is the basis upon which Model Risk can be assessed and managed. Institutions must implement a process to independently validate all their models on a regular basis, with frequencies depending on model type, as part of their model life-cycle management. Minimum frequencies are set out in Part II of the MMS.
       
      3.7.2
       
      In the context of model management, the model owner acts as the first line of defence, the independent validator acts as the second line of defence and the internal audit function acts as the third line of defence.
       
      3.7.3
       
      The validation process must be organised with specific responsibilities, metrics, limits and reporting requirements for each model type. The validation process must be constructed to ensure an effective identification and remediation of model defects to manage Model Risk appropriately. This is referred to as the Effective Challenge principle.
       
      3.7.4
       
      Model validation can be performed either by an internal independent team or by a third party. In all cases, the validation process must remain independent from the development process. If model validation is assigned to a third party, institutions remain the owners of validation reports and remain responsible for taking appropriate actions upon the issuance of these reports. If the institution has an internal validation team and also uses third party validators, the internal validation team must maintain oversight of all validation exercises conducted by third parties. If the institution does not have an internal validation team, all validation reports produced by third parties should be owned by an appropriate internal control function separate from the model owner.
       
      3.7.5
       
      The validation must be independent by virtue of excluding the development team from involvement in the assessment of the model. The development team may be involved in the validation process once a set of observations has been produced, in particular for the remediation of these observations. Institutions must be able to demonstrate to the Central Bank the appropriate arm's-length independence of the validator. Consequently, if a third party provides a methodology to develop a model for an institution, any subsequent validation exercise must be performed by a party different from the original provider. Validation teams must not report to the business lines.
       
      3.7.6
       
      The validation team must possess sufficient technical skills and maturity to formally express its opinion without interference from the development team or the business lines. The business lines may be consulted during the validation process, but the conclusion of such process must be formed independently of business line interests.
       
      3.7.7
       
      The validation scope must cover both a qualitative validation and a quantitative validation. A qualitative validation alone is not sufficient to be considered as a complete validation. If insufficient data is available to perform the quantitative validation of a model, the validation process must be flagged as incomplete and the institution must recognise and account for the uncertainty and thus the Model Risk related to such model.
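
      As one illustration of a quantitative validation check (the MMS does not prescribe specific tests), the sketch below compares a model's predicted default rate with the observed rate using a normal approximation to the binomial test; the portfolio figures are invented for the example.

```python
# A minimal sketch of one quantitative calibration check: observed vs.
# predicted default rate, with a normal approximation to the binomial test.
import math

n, defaults, predicted_pd = 5_000, 140, 0.023  # illustrative figures

observed = defaults / n
se = math.sqrt(predicted_pd * (1 - predicted_pd) / n)  # std. error under H0
z = (observed - predicted_pd) / se
print(f"observed {observed:.4f} vs predicted {predicted_pd:.4f}, z = {z:.2f}")
# |z| > 1.96 rejects calibration at the 5% level (two-sided).
print("calibration rejected" if abs(z) > 1.96 else "calibration not rejected")
```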
       
      3.7.8
       
      A validation exercise must result in a fully articulated judgement regarding the suitability of the model to support decision-making. The analyses and tests performed during the validation of a model must be rigorously documented in a validation report, such that (i) management is able to form a view on the performance of the model, and (ii) an independent party is able to repeat the process on the basis of the report.
       
      3.7.9
       
      Institutions must put in place an effective process to manage and remedy findings arising from validation exercises. Observations and findings across all models must be documented, recorded, tracked and reported to Senior Management and the Board at least once a year. Findings must be classified into groups based on their associated severity, in order to drive the prioritisation of remediation.
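
      A minimal sketch of such a findings register is given below; the severity labels and example findings are illustrative assumptions made for the sketch.

```python
# A minimal sketch of a validation findings register: findings are
# classified by severity and sorted so remediation is prioritised.
from dataclasses import dataclass
from enum import IntEnum


class Severity(IntEnum):
    HIGH = 1     # must be remedied promptly
    MEDIUM = 2
    LOW = 3      # the number of such findings should be minimised


@dataclass
class Finding:
    model_id: str
    severity: Severity
    description: str
    status: str = "open"


register = [
    Finding("LGD-SME-02", Severity.LOW, "Documentation gap in calibration section."),
    Finding("PD-RETAIL-01", Severity.HIGH, "Discriminatory power below agreed limit."),
    Finding("PD-RETAIL-01", Severity.MEDIUM, "Input data timeliness below threshold."),
]

# Remediation queue: highest severity first, for reporting to Senior
# Management and the Board.
for f in sorted(register, key=lambda f: f.severity):
    print(f.severity.name, f.model_id, "-", f.description)
```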
       
      3.7.10
       
      Institutions must ensure that model defects are understood and remedied within an appropriate time-frame. They must implement an effective process to prioritise and address model defects based on their materiality and/or associated Model Risk. High severity findings must be remedied promptly. If necessary, such remediation may rely on temporary adjustments and/or manual override. Such adjustments and overrides must not become regular practice, in that they must have an expiry horizon and must be coupled with a plan to implement more robust remediation. Further requirements and minimum remediation timings are mentioned in Part II.
       
      3.7.11
       
      Models employed by institutions must be fit for purpose to support decision-making. Therefore, institutions must aim to resolve all model defects associated with high and medium severity and aim to minimise the number of defects with low severity. If an institution decides not to address some model defects, it must identify, assess and report the associated Model Risk to Senior Management and the Board. Such decision may result in additional provisions and/or capital buffers and will be subject to review by the CBUAE.
       
      3.7.12
       
      The internal audit function is responsible for verifying that the model validation process is performed appropriately and meets the MMS requirements. This review must be planned as part of regular audit cycles. The audit team must comment on the degree of independence of the internal validation process. For technical matters, the audit team may decide to be assisted by third party experts. Where third party assistance is utilised, the internal audit function remains the sole owner of the conclusions of the audit report.