
BCBS 239 - an opportunity for efficient business intelligence in the banking sector

In analyzing the causes of the 2008 financial crisis, the European Central Bank (ECB) and many other regulatory authorities discovered, inter alia, major deficiencies in institutions' IT and data architectures. As a result of these deficiencies, risk assessments could not be performed sufficiently quickly, accurately or completely. In short: risks could neither be reported nor controlled to an appropriate extent. In response, the Basel Committee on Banking Supervision (BCBS) published Standard 239, better known as the "Principles for effective risk data aggregation and risk reporting" and reflected in AT 4.3.4 of MaRisk.

These institutions are impacted by standard BCBS 239

BCBS 239 is a regulatory requirement for financial institutions. Its implementation is graded according to the systemic relevance of the institution, with three general gradations:

  • Global systemically important banks (= G-SIBs)
  • Domestic systemically important banks (= D-SIBs, or national SIFIs)
  • Other, e.g. regional systemically important banks (= R-SIBs)

Implementation of BCBS 239 for G-SIBs and D-SIBs was to be completed no later than April 2018, as indicated by the timeline below. Implementation for all other institutions is not mandatory, but is viewed by regulatory authorities as important and highly urgent.


However, the ECB's status report on the implementation of BCBS 239, dated May 2018, clearly shows that implementation at most institutions has either not begun or remains incomplete. There is therefore still much to be done!

Even though BCBS 239 is mandatory only for financial institutions, the transparency, assured data quality and data availability it demands, together with the infrastructure and organization needed to achieve them, offer clear added value to companies across all sectors at a time when "data is the gold of the future". It is therefore advisable for other companies to incorporate the basic values of BCBS 239, at least in part, into their BI strategies.

The four pillars of BCBS 239

BCBS 239 describes the changes necessary in the banking sector's handling of data. Four fundamental and closely interrelated groups of topics have been identified:


Establishing a BCBS 239-compliant data governance organization and IT architecture, and from it a definition of how risk-related data are collected and processed that applies across the entire group and its business units, is an important building block on the way to enterprise-wide BCBS 239 conformance. Further action is also needed in risk-data aggregation and risk reporting, owing to the high demands on the quality and consistency of risk data and reports.

For internationally active institutions in particular, the described fields of action must not be viewed in isolation, country by country. Regulatory authorities examine not only local implementations but also the interactions between individual countries and headquarters. Inconsistencies arising from isolated approaches must be avoided!

Provided below is a complete overview of all fields of action to be taken into account during the transformation into a BCBS 239-compliant institution, accompanied by specific recommendations.

The group of topics titled "Regulation" refers to recommendations and guidelines for regulatory authorities and is therefore not considered further here.

Overview of fields of action

Governance & processes

Data governance 
  • Establishment/adaptation of data governance organization with definition of roles/processes.
  • Creation of clear, universally known and applicable escalation processes for data governance topics.
Data quality 
  • Establishment/adaptation of data quality management with roles/processes within the entire bank.
  • Establishment of bank-wide KPIs to measure data quality.
  • Control of data quality by implementing automated, technical quality measurements along the risk process chain.
  • Follow-up processes for handling DQ problems.
  • Documentation/reporting along the DQ processes, made available to all relevant risk contacts and senior management.
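The automated data-quality measurements and follow-up processes listed above can be sketched in code. The following is a minimal illustration, not an implementation prescribed by BCBS 239: the rule names, records and the escalation threshold are illustrative assumptions.

```python
# Minimal sketch of automated data-quality KPIs along a risk process chain.
# Rule names, records and the 95% escalation threshold are illustrative
# assumptions, not BCBS 239 prescriptions.
from dataclasses import dataclass

@dataclass
class DqResult:
    rule: str
    passed: int
    failed: int

    @property
    def score(self) -> float:
        """Share of records passing the rule (1.0 for an empty input)."""
        total = self.passed + self.failed
        return self.passed / total if total else 1.0

def check(records, rule_name, predicate):
    """Apply one DQ rule to all records and return a KPI result."""
    passed = sum(1 for r in records if predicate(r))
    return DqResult(rule_name, passed, len(records) - passed)

# Illustrative risk-position records as they might arrive from a source system.
positions = [
    {"id": "P1", "exposure": 1_000_000, "rating": "AA"},
    {"id": "P2", "exposure": -500, "rating": "BBB"},    # implausible negative exposure
    {"id": "P3", "exposure": 250_000, "rating": None},  # missing rating
]

results = [
    check(positions, "exposure_non_negative", lambda r: r["exposure"] >= 0),
    check(positions, "rating_present", lambda r: r["rating"] is not None),
]

for res in results:
    # Scores below the threshold would trigger the follow-up process.
    status = "OK" if res.score >= 0.95 else "FOLLOW-UP"
    print(f"{res.rule}: {res.score:.0%} {status}")
```

In practice such KPI results would be persisted and reported along the process chain, so that risk contacts and senior management can track quality over time.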
Metadata management 
  • Establishment/adaptation of metadata management with roles/processes within the entire bank (e.g. responsible persons / owners of data on the IT and business sides).
  • Initial registration of data, logic, systems and processes related to risk-relevant data.
  • Assurance that registered metadata are maintained regularly and after any changes, and incorporated into the regular change process.
  • Provision of metadata; creation of technical and professional transparency within the bank.
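The metadata-management bullets above (registration of data with responsible persons on the IT and business sides, regular maintenance) can likewise be sketched as a simple registry. All field names, system names and roles here are illustrative assumptions.

```python
# Minimal sketch of a metadata registry recording owners on the business and
# IT sides for each risk-relevant data element. Field names, system names and
# roles are illustrative assumptions.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class MetadataEntry:
    name: str              # data element, e.g. a risk key figure
    definition: str        # business definition
    source_system: str
    business_owner: str
    it_owner: str
    last_reviewed: date = field(default_factory=date.today)

class MetadataRegistry:
    def __init__(self):
        self._entries: dict[str, MetadataEntry] = {}

    def register(self, entry: MetadataEntry) -> None:
        """Add or update an entry; updating refreshes ownership information."""
        self._entries[entry.name] = entry

    def owners(self, name: str) -> tuple[str, str]:
        """Return (business owner, IT owner) for a registered data element."""
        e = self._entries[name]
        return e.business_owner, e.it_owner

registry = MetadataRegistry()
registry.register(MetadataEntry(
    name="credit_exposure",
    definition="Gross credit exposure before collateral",
    source_system="CORE-LOANS",        # assumed source-system name
    business_owner="Risk Controlling",
    it_owner="DWH Team",
))
print(registry.owners("credit_exposure"))
```

The `last_reviewed` field hints at the maintenance requirement: entries would be refreshed as part of the regular change process, and stale entries flagged.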

IT infrastructure & organization

  • Provision of an adequate hardware and software infrastructure which can ensure fulfilment of requirements concerning data management and evaluation capability.
  • Avoidance of redundant/contradictory systems, in particular, as a source of risk data.
  • Centralized and neutral data availability in the form of "one" DWH in IT.
  • Automation of manual processes.
  • Abolition of data sources constituting "IT outside of IT". Where this is not possible in exceptional cases, assurance that they are fully documented and procedurally secure.
  • Avoidance / dissolution of data silos.
  • Standardization of risk indicators and nomenclature.
  • Planning & provision of sufficient personnel for meeting BCBS239 requirements.

Data management

Accuracy & integrity

  • The accuracy of the provided data should be guaranteed at all times in terms of both granularity and aggregation.
  • Accuracy must be ensured by regular inspections as well as adequate measures and processes.
  • The consistency, unambiguousness and integrity of data is to be established by means of governance processes and measures.
  • All data needed to describe bank risk must be available.
  • Completeness of provided data is to be guaranteed by processes, and its status made transparent to consumers.
  • Risk-relevant data must be fully available during the period required for risk control.
  • Availability depends on business criticality as well as audit and test requirements, but should have a duration of at least one month.
  • The data architecture must be chosen to allow simple adjustment in terms of complexity and response time for:
    • New data
    • Additional key figures
    • Additional aggregation
    • Depiction of new/changed business processes
  • It should be possible to respond simply and promptly to individual, ad-hoc requirements of risk management as well as to stress/crisis scenarios posed by regulators.
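The requirement above that accuracy be guaranteed "in terms of both granularity and aggregation" implies routine reconciliation: an aggregated figure must match the sum of the granular positions it was built from. A minimal sketch, with purely illustrative figures and tolerance:

```python
# Minimal sketch of an aggregation-consistency check: the aggregated figure
# in a risk report must reconcile with the sum of granular positions within
# a tolerance. Figures and the 1% tolerance are illustrative assumptions.

granular = {"P1": 1_000_000.0, "P2": 250_000.0, "P3": 750_000.0}
reported_aggregate = 2_000_000.0  # figure from the aggregated risk report

def reconciles(granular_positions: dict, aggregate: float,
               tolerance: float = 0.01) -> bool:
    """True if the aggregate matches the granular sum within a relative tolerance."""
    total = sum(granular_positions.values())
    return abs(total - aggregate) <= tolerance * abs(aggregate)

print(reconciles(granular, reported_aggregate))  # True: sum is exactly 2,000,000
```

Failed reconciliations would feed the data-quality follow-up processes described under "Governance & processes".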

Risk reporting

  • The accuracy and precision of risk-relevant evaluations must be ensured.
  • Risk evaluations must include key figures for all risk-relevant business areas.
  • Evaluations must be possible from all perspectives of relevance to risk.
  • Evaluations must consider the future in addition to the past, and should be anticipatory both in terms of key figures and business fields.


  • It is necessary to ensure that the contents of reports meet technical requirements and are understandable.
  • Transparency of the employed information, logic and sources is to be ensured.
  • All evaluations are to be catalogued, together with their detailed data, technical requirements, recipients/owners and criticality.
  • A process is to be established which regularly reviews and scrutinizes the scope, content and meaningfulness of reports.


  • Risk evaluations and related data must be available in the period and granularity defined as necessary from the technical standpoint.
  • Availability in terms of time and granularity must undergo regular checks.
  • During critical periods (e.g. crisis times), risk evaluations must be provided much faster, especially in terms of significant key figures. If required, availability must be ensured within one day.
  • Recipients of risk evaluations are to be planned according to the "need to know" principle. Technical requirements must be fulfilled while taking confidentiality into account.
  • Availability of risk evaluations for the technically necessary persons must not only be ensured, but also verified.
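The last two bullets, "need to know" distribution and verified availability, can be sketched as access control plus a delivery log that makes distribution auditable. Report and recipient names are illustrative assumptions.

```python
# Minimal sketch of "need to know" distribution: each report lists its
# authorized recipients, and every delivery attempt is logged so that
# availability can later be verified. Names are illustrative assumptions.

authorized = {
    "daily_credit_risk": {"CRO", "Risk Controlling"},
}
delivery_log: list[tuple[str, str, bool]] = []

def deliver(report: str, recipient: str) -> bool:
    """Deliver only to authorized recipients; log every attempt for audit."""
    allowed = recipient in authorized.get(report, set())
    delivery_log.append((report, recipient, allowed))
    return allowed

print(deliver("daily_credit_risk", "CRO"))        # True: on the recipient list
print(deliver("daily_credit_risk", "Marketing"))  # False: not authorized
```

The log satisfies the verification requirement: it records both that authorized recipients actually received the evaluation and that unauthorized requests were rejected.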

Take these steps to achieve an organization compliant with BCBS 239

BCBS 239 is not a one-time effort which, once implemented, is finished. Rather, it is a cycle which begins with initial implementation and must be regularly reviewed and adapted in a constantly changing world.

b.telligent's many years of project experience have led to development of an efficient approach comprising structured and organized planning of measures in the following groups of topics:

  • Data strategy
  • Data governance
  • Data architecture / data integration
  • Reporting


Specialists from all four areas should regularly check compliance with the principles of BCBS 239 in order to derive a sound action strategy as well as concrete measures, both for the processes to be implemented and for their technical realization.

1. Data strategy

Every BCBS 239 project should ideally start with quick checks in the areas of governance & infrastructure, risk-data aggregation and risk reporting (see above). These cover business as well as technical aspects, and should be repeated in further cycles as part of continuous control.

2. Data governance

Data governance encompasses the definition of sovereignty and responsibility over systems, data, key figures and business processes. Documenting these roles and duties, and adhering to them, forms the organizational foundation for the regulatory requirements. The necessary fields of action in this area are derived from the data-strategy analyses.

3. Data integration and architecture

From a technical standpoint, support in DWH and reporting is required in relation to the following BCBS239 topics:

  • Governance & infrastructure
  • Risk-data aggregation
  • Risk reporting

Most of the tasks for this point emerge from analyses of data strategy, definitions of data governance and the technical fields of action identified from executed processes.

For the areas of data integration, databases and reporting, the tool expertise of b.telligent's partners is available. As a vendor-independent consultancy, b.telligent gladly provides support in tool evaluations. For data governance and data-quality measurement, the use of tools commonly available on the market is recommended.

At the end of each development cycle, an examination should be performed by a neutral body, for example, by the internal audit department. The result of such an examination can serve as input for the next development cycle.

Do you have further questions about BCBS 239 and how your organization, too, can become compliant? Do not hesitate to contact me!


Basel Committee on Banking Supervision: Principles for effective risk data aggregation and risk reporting, January 2013

BaFin: Rundschreiben 09/2017 (BA) – Mindestanforderungen an das Risikomanagement – MaRisk, 27.10.2017

ECB: Report on the Thematic Review on effective risk data aggregation and risk reporting, May 2018

Your Contact
Stefan Steinhaus
Principal Consultant
Stefan is the person to talk to about data quality management, data strategy and data governance. He is of the opinion that regulatory requirements will become stricter and more detailed over the coming years.