“Progress in adopting the principles for effective risk data aggregation and risk reporting” – December 2013

The Basel Committee have defined a set of 11 ‘principles’ that set a demanding standard for the management of data used in risk aggregation in banking. Global Systemically Important Banks (G-SIBs) must comply with this standard by 2016. The principles span three major areas: (1) Governance and Infrastructure, (2) Risk data aggregation capabilities, and (3) Risk reporting practices.

Some key compliance points include:

  • Data scope applies to risk management data and to data used in key internal risk models, including any new data required for ‘forward looking’ reporting, and including group level data as well as data from “all material business units or entities within the group.”1
  • Compliance with all principles relies on a “strong governance framework, risk data architecture and IT infrastructure” 2
  • The establishment of “integrated data taxonomies and architecture across the banking group, which includes information on the characteristics of the data (metadata), as well as use of single identifiers and/or unified naming conventions for data including legal entities, counterparties, customers and accounts.” 3
  • Risks to data quality (not just known issues) must be assessed and managed
  • Data quality controls must apply “through the lifecycle of data” 4, and must be as robust as accounting controls
  • Banks need the capability to produce risk data rapidly in stress situations 5
  • Data should be accurate, complete, timely, and adaptable. In particular, ‘adaptable’ data means that a dataset and relevant processes can be easily changed or extended to accommodate quick decision making, ad-hoc reporting, and regulatory requirements.
  • “Data should be aggregated on a largely automated basis” 6
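The taxonomy and identifier requirement above can be illustrated with a minimal sketch. The class, field names and example values below are illustrative assumptions, not drawn from the Basel text: a data-dictionary entry that carries metadata (owner, meaning, authoritative source) alongside a single unified identifier, such as a Legal Entity Identifier (LEI) for counterparties.

```python
from dataclasses import dataclass

# Hypothetical sketch of a group-wide data dictionary entry: one record per
# data element, carrying metadata and a single unified identifier.
@dataclass(frozen=True)
class DataElement:
    name: str            # unified name across the banking group
    identifier: str      # single identifier, e.g. an LEI for legal entities
    owner: str           # accountable data owner
    description: str     # business meaning (metadata)
    source_system: str   # authoritative source

counterparty = DataElement(
    name="counterparty_legal_entity",
    identifier="5493001KJTIIGC8Y1R12",  # 20-character LEI-style example
    owner="Group Risk Data Office",
    description="Legal entity with which the bank holds exposures",
    source_system="group_counterparty_master",
)

print(counterparty.identifier)  # → 5493001KJTIIGC8Y1R12
```

Freezing the record (frozen=True) reflects the idea that a dictionary entry is controlled reference data: changes go through governance, not ad-hoc edits.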

The principles were first published in a consultation document in 2012; comments were taken over six months, and a final version was published in January 2013. The Independent Data Professionals Group (IDPG) provided detailed comments on the document, and these comments were published, along with comments from other bodies, on the Basel website.

In December 2013 the Basel Committee published a compliance progress review, based on a self-assessment exercise conducted by 30 G-SIBs. This progress report reveals a number of important concerns that might lead an observer to conclude that banks are further behind in their compliance schedule than they think. In particular:

  • Some banks have not appreciated that compliance in Risk Reporting Practices depends on the other areas (governance, infrastructure and aggregation capabilities); their self-assessed scores for Reporting are therefore too high
  • Some banks have assessed only group-level data – they also need to assess relevant data belonging to material business units
  • Some banks have not yet assessed data for all types of risk
  • Bank definitions of ‘materiality’ and tolerance levels are not clear, so the data in scope may be wider than banks currently assume.

Overall, the fact that some banks have not yet assessed all data in scope leads to the risk that these banks are underestimating the effort required to develop their data architecture, quality controls, and governance frameworks.

1. Para 2, “Progress in adopting the principles for effective risk data aggregation and risk reporting” (December 2013)
2. Para 26, “Principles for effective risk data aggregation and risk reporting” (January 2013)
3. Para 33, “Principles for effective risk data aggregation and risk reporting” (January 2013)
4. Para 11c, “Progress in adopting the principles for effective risk data aggregation and risk reporting” (December 2013)
5. Annex2/47, “Progress in adopting the principles for effective risk data aggregation and risk reporting” (December 2013)
6. Annex2, “Principles for effective risk data aggregation and risk reporting” (January 2013)

Questions on the Risk Aggregation progress report from RISK.NET (possibly to be published)

1. Why are banks struggling with their data architecture and IT infrastructure? What is so hard? Why is work on this taking so long?

DT – Existing data architecture is, in many cases, old and creaking, and often the situation is a patchwork of different architectures and infrastructures. New strategies are being developed but will take time to implement. For example, simply consolidating disparate sources of risk data into a single source may impact many business functions, each with competing priorities, leading to a fairly complex project.

2. Likewise with accuracy and integrity – What is so hard? Why is work on this taking so long?

DT – There are significant challenges in upgrading data quality controls to the standard applied to accounting data, and in achieving single authoritative sources of risk data. The right data has to be identified, lineage mapped out, data owners established, risk assessments made, quality rules implemented, and controls put in place. Where data quality issues are found they must be addressed, possibly using root-cause analysis or similar techniques. Where disparate and conflicting data sources exist, they must be rationalised.
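The “quality rules implemented” step can be sketched as simple rule checks. The record layout, field names and thresholds below are illustrative assumptions; the point is that each rule returns the failing records so issues can be routed to root-cause analysis rather than silently dropped.

```python
# Illustrative risk dataset with deliberate quality issues (assumed layout).
records = [
    {"trade_id": "T1", "counterparty": "CP001", "exposure": 1_000_000.0},
    {"trade_id": "T2", "counterparty": None,    "exposure": 250_000.0},
    {"trade_id": "T3", "counterparty": "CP002", "exposure": -50.0},
]

def check_completeness(rows, field):
    """Completeness rule: the field must be populated."""
    return [r for r in rows if r.get(field) is None]

def check_validity(rows, field, predicate):
    """Validity rule: a populated field must satisfy a business predicate."""
    return [r for r in rows if r.get(field) is not None and not predicate(r[field])]

missing_cpty = check_completeness(records, "counterparty")
bad_exposure = check_validity(records, "exposure", lambda x: x >= 0)

print(len(missing_cpty), len(bad_exposure))  # → 1 1
```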

Banks may decide that their level of automation needs to be improved. This means replacing manual processes with automated processes, and integrating the new processes with existing systems. This is fairly time-consuming work.

Monitoring data quality at appropriate points in the data lifecycle is a significant challenge. The lifecycle may not be fully understood, and mapping it out may not be straightforward. Some banks are deploying data tools to help in this area, but there are significant challenges in getting these tools fully implemented and working.
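Monitoring at lifecycle points can be sketched as control checkpoints between processing stages. The stage names and rules below are assumed for illustration (ingest, then aggregate); the idea is that each checkpoint records its outcome so a reported figure carries an auditable control trail.

```python
# Sketch of quality controls at assumed points in the data lifecycle.
audit_log = []

def control_point(stage, dataset, rule, rule_name):
    """Run a rule over a dataset at a lifecycle stage and log the outcome."""
    passed = all(rule(row) for row in dataset)
    audit_log.append({"stage": stage, "rule": rule_name, "passed": passed})
    return passed

exposures = [{"entity": "BU1", "value": 100.0}, {"entity": "BU2", "value": 40.0}]

control_point("ingest", exposures, lambda r: r["value"] is not None, "no_missing_values")
total = sum(r["value"] for r in exposures)  # automated aggregation step
control_point("aggregate", [{"total": total}], lambda r: r["total"] >= 0, "non_negative_total")

print(total, len(audit_log))  # → 140.0 2
```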

Generally the work is ‘hard’ because in many areas of a bank the concept of managing data, at least to the degree now required, is new, and this means a change to the culture of data management in these areas.

3. Likewise with adaptability – What is so hard? Why is work on this taking so long?

DT – Data sources are usually used by more than one business function. Banks have to approach changes to data sources carefully so as not to impact any of the data’s consumers.

Some reporting systems, and their data feeds, are not able to accommodate changes to data easily. Data structures are ‘hard-coded’, and even a small change requires extensive development and testing. These systems are best replaced as they do not properly support the new adaptability requirements.
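The contrast can be sketched in miniature (the record layouts are hypothetical, not from any named bank system): a hard-coded row fixes its fields at design time, so adding a field means code changes for every consumer, whereas a schema-driven row can absorb a new regulatory field as data.

```python
# Hard-coded report row: the fields are fixed at design time, so adding a
# field means changing code, tests, and every downstream consumer.
hard_coded_row = ("CP001", 1_000_000.0)  # (counterparty, exposure) only

# Schema-driven row: structure is carried as data, so a new field needed for
# ad-hoc or regulatory reporting can be added without a code change.
schema = ["counterparty", "exposure"]
row = dict(zip(schema, ["CP001", 1_000_000.0]))

schema.append("stress_scenario")          # new reporting requirement
row["stress_scenario"] = "adverse"        # existing consumers are unaffected

print(sorted(row))  # → ['counterparty', 'exposure', 'stress_scenario']
```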

4. Can banks really be so advanced in their compliance with risk reporting practices (Principles 7-11) as they say in their self-assessment, yet be struggling with their data architecture and IT infrastructure (Principle 2) and with data quality issues (Principles 3-6)?

DT – What banks seem to have scored (incorrectly) is the effort required to comply with the principles, rather than the current level of compliance. By scoring Reporting as compliant whilst some other principles are still not compliant, banks are saying that no further effort will be required in the Reporting area to achieve compliance.

5. According to the report, 20 of the 30 globally systemically important banks (G-SIBs) that participated in the survey expect to comply with all principles by the deadline. Is this realistic given the issues highlighted above?

DT – It is hard to say what constitutes a compliant architecture/infrastructure. Banks should be discussing this with their regulators so that there are no surprises. The IDPG have called for an objective reference architecture/infrastructure to be established, which would greatly help banks understand what they need to do. But it is up to each bank to judge what compliant means, and to have that agreed with the regulator.

6. Supervisory authorities say they are committed to achieving full compliance of their G-SIBs by the deadline. The report claims that “supervisory authorities have a broad range of tools and remedial actions to enforce the Principles and have the expertise/resources to monitor banks’ progress towards implementation.”

Do the supervisors really have the tools, remedial actions, expertise and resources to enforce this compliance?

DT – Hard for me to say, but I would expect the Supervisors to be developing benchmark compliance models, and to be training people to assess banks properly against them. Monitoring is likely to involve progress reports and regular meetings, with a focus on banks that have declared low compliance.

