Prioritizing Data Based on Criticality: Critical Data Elements (CDEs) in Context

Originally Published: November 2018; Revised: March 2020

Best Practice Scribes
Gareth Isaac, Director and Principal Consultant, Ortecha, a DCAM Authorized Partner
Mark McQueen, EDM Council, Senior Advisor-DCAM

Executive Summary

Objective

One of the most significant regulatory directives following the 2008 financial crisis has been the introduction of the Principles for Effective Risk Data Aggregation and Risk Reporting, or BCBS 239. The Principles outlined in this directive require banks to establish sound information infrastructures to support their risk and risk reporting functions. As part of creating the required control environment, a common practice in the financial services industry is the establishment of Critical Data Elements (CDEs).

Despite this focus on CDEs by the financial services industry, the 2017 Data Management Industry Benchmark Study conducted by the EDM Council identified the management of CDEs as a top challenge universally across the industry. Members report uncertainty regarding the exact definition of a CDE, how it is designated, and how it should be used to satisfy the control requirement.

Subsequently, the EDM Council conducted 14 in-depth interviews with member firms to frame the issue. The research was organized to gain insight into how organizations define critical data, the process they use to identify critical data, and the implications of heightened levels of control on critical data. The research revealed that both the purpose of and the approach to CDE management remained fragmented and siloed within each organization. To address this, the EDM Council formed a CDE Best Practice workgroup, charged with establishing a Best Practice for the identification and management of critical data covering these three objectives.

The Best Practice provides processes, procedures, and tools for identifying critical data, all aligned and integrated with the EDM Council Data Management Capability Assessment Model (DCAM) Framework.

This document covers Objective 1 and Objective 2. Objective 3 will be included in a later publication.

Key Observations

  • The purpose of a CDE is to prioritize data based on criticality – this allows you to identify the scope of the most important data to bring under a heightened level of control and accountability
  • Criticality is determined from the business process perspective of the Data Consumer – and thus at the conceptual level – the physical-level data elements aligned to the conceptual level inherit the criticality
  • Organizations that attempted to use a precise calculation to identify criticality did not achieve adoption and ultimately abandoned the science for a more artful analysis, which included a negotiation between the Data Producer and Data Consumer to agree upon prioritized data based on criticality
  • Derived data can be deemed critical; however, the implications of criticality must be applied to the atomic data that is an input to the derived value
  • Granular data used in a derivation should be independently evaluated for criticality based on the material impact each has on the derived value
  • The implication of designating criticality requires a heightened level of control; these controls include governance, metadata, data flow/lineage, data quality, transformation, and movement controls

Issue

The concept of a Critical Data Element (CDE) originated within the Financial Services Industry. In 2013, the BCBS 239 Principles were published. Principle 1 introduced the phrase ‘identify data critical to risk data aggregation.’ This phrase is now commonly referred to as identifying CDEs, even though BCBS 239 does not use the term.

Since that time, financial organizations have struggled to execute a process to identify CDEs. There is confusion and variation in the methods used and in the resulting CDEs.

To further understand the issue of managing CDEs, the Best Practice Work Group developed a set of questions to understand the current state and establish the intended scope for the Best Practice.

  • What is a CDE?
  • How are CDEs identified?
  • What makes a Data Element critical?
  • What are the criteria for determining criticality?
  • Who determines criticality?
  • What is the impact of a data element identified as critical?
  • What is an appropriate volume of CDEs?
  • Can CDEs have different levels of importance?
  • Are CDEs atomic elements, or derived?
  • If a CDE is a derived element, does this imply that the composite elements used to create the derived element are also CDEs?
  • Is it possible to identify industry-standard CDEs, or is it an organization-specific exercise?

Industry Current State

The framing questions listed above were used to learn more about banks’ actual experiences implementing CDEs. Fourteen EDM Council member organizations were interviewed. What was learned about the current state of CDE management is summarized in the following section; the full report, titled CDE Member Research Interim Report, was published in November 2017 and is available to EDM Council members.

The member organizations selected for interviews were those whose efforts to manage CDEs the EDM Council was aware of. A high degree of engagement was validated, but with significant variation across the organizations in CDE definition, volume, identification process, and the level of data management rigor applied to managing CDEs. Even in mature efforts, there was confusion and a lack of confidence in how to manage CDEs, with little to no consistency across the organization.

Current State Finding 1: No Consistent Definition of a CDE

One of the most significant challenges is the lack of consistency in distinguishing a granular data attribute from a derived or calculated business measure. Many firms use the same terminology to describe logical concepts, business objectives, calculation processes, derived elements, and physical expressions. These are all real and essential things – but they are not the same thing – and calling them all critical data elements leads to significant confusion.

Current State Finding 2: Inconsistent Process for Designating CDEs

General agreement that the business process defines criticality existed across the organizations. Still, there was a lack of acknowledgment of all the business processes that may be consuming the same data; the approaches did not include the concept of a data supply chain. Also, the full range of stakeholders of the data was often not included in determining criticality. As a bright spot, some organizations recognized the identification of criticality as a negotiation between the data producer and the data consumer.

Some organizations had attempted to quantify criticality by applying a matrix formula to calculate an objective criticality measurement. Without exception, this absolute measurement was abandoned for a more subjective analysis. (See section: Measuring Criticality: Art or Science?)

Current State Finding 3: Undefined Guidelines for Managing Criticality

The designation of a data element as critical means it is covered by the organizational policy and standards, resulting in increased data management rigor to achieve a heightened level of control. The following were common themes across the firms involved in the initial interviews and the subsequent analysis by the Best Practice Work Group. However, while the issues were consistent, the execution of each theme varied highly across the organizations.

  • Definition and Meaning – the number one issue is the challenge of locking down a precise meaning and harmonizing language.
  • Lineage – minimally, an understanding of data flow is required. Still, the difficulty, cost, and inability to adequately maintain data lineage have raised questions as to the role and value of data lineage across all CDEs.
  • Data Quality – difficulty in negotiating agreement across multiple stakeholders to set criteria for fit-for-purpose, quality tolerance ranges and thresholds, business rules, testing requirements, and measurements.
  • Governance – managing the relationships between the data producer and one or more data consumers is the most intensive part of the governance challenge because it requires collaboration across multiple stakeholders – business process subject matter experts who often do not have the framework, skills, or time.
  • Metadata – inconsistencies within individual organizations in the execution of standards for metadata capture have made it difficult to stitch together the different approaches to metadata collection into an enterprise view. This has become more apparent with the higher rigor of metadata required for CDEs.

Best Practice

Stakeholders

Stakeholders of the Data Management function include:

  • Executive Leadership
  • Business Executives
  • Data Producer
  • Data Consumer
  • Data Management Practitioners (Reference: Data Management Functional Construct)
    • Chief Data Officer
    • Data Governance Executive
    • Data Officer
    • Executive Data Steward
    • Data Architect
    • Data Domain Manager
    • Business Data Steward
    • Technical Data Steward
    • Business Process Subject Matter Expert
    • Data Custodian

The remainder of the best practice focuses on the activities of the Data Management Practitioners.

Scope

The scope of this best practice comprises three aspects:

  1. Defining a Critical Data Element
  2. Prioritizing Data Based on Criticality
  3. Implications of Criticality

Description

What is a Critical Data Element?


Objective: Create an agreed-upon understanding
of the purpose and definition of a CDE


To accurately define a CDE, it is necessary to place it in the context of the other things in its neighborhood.

The Data Neighborhood

The following is a construct that defines and creates relationships between all the things in the data neighborhood. It is a business-friendly representation of the data architecture that presents an understanding of the components and their relationships. With an understanding of the components, a process to identify criticality can be designed.

Business Element / Data Element Construct

Diagram 1: Business Element / Data Element Construct

Legend:

  • Construct Components: Business Term, Business Element, Data Element, Business Metadata, Technical Metadata, Physical Metadata
  • Business Element Types: Atomic, Derived, Determined
  • Critical Designation: Critical Business Element (CBE), Critical Data Element (CDE)

The construct contains a business view and a technical view in relationship to each other. As depicted in the diagram, the players in the neighborhood include business- and technology-oriented resources. Separating the two views creates clearly defined accountability: the business manages the Business Element, and technology manages the Data Element.

The business view defines the business process requirements for the data produced by the process. The business that owns the process is accountable for defining the requirements for the data, including the data’s criticality. The requirements are defined as Business Terms and Business Elements with all the appropriate business metadata. The Best Practice workgroup determined there was sufficient difference between a Business Term and a Business Element that they warranted clear separation, and that both were distinct from a Data Element.

Similarly, the technical view is an interpretation of the business process requirements for data, transformed into technical data requirements. The data requirements are defined as a Data Element with all the appropriate technical metadata, including the physical metadata. The term Data Element is aligned to the ISO Data Element standard to ensure architectural consistency with other standards.

The Business Element is conceptual, and the Data Element is the technological execution of the Business Element. The ISO standards body has defined a “Data Element”, and the EDMC Data Neighborhood reflects the ISO definition and clearly distinguishes it from a “Business Element”.

Whether data is critical is determined from the perspective of the business process that consumes the data. Criticality is a business designation based on an assessment of the material impact the data has on the outcome of a business process. It is this principle that places accountability on the business to identify critical data as part of the requirements for data, so the identification is part of the requirements for the Business Element. Therefore, a Business Element that is critical is a Critical Business Element (CBE) and will also have a corresponding Critical Data Element (CDE).

This will be presented more fully in the section titled Data Producer / Data Consumer Relationship.
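
To make the construct concrete, here is a minimal sketch, assuming Python and hypothetical field names (the construct itself does not prescribe an implementation), of how criticality designated on a Business Element is inherited by its Data Element:

```python
from dataclasses import dataclass
from enum import Enum

class BusinessElementType(Enum):
    ATOMIC = "atomic"
    DERIVED = "derived"
    DETERMINED = "determined"

@dataclass
class BusinessElement:
    """Conceptual business view: the business-defined requirement for data."""
    name: str
    definition: str                      # business metadata: precise meaning
    element_type: BusinessElementType
    is_critical: bool = False            # CBE designation, set by the business

@dataclass
class DataElement:
    """Technical view: the technological execution of a Business Element."""
    physical_name: str                   # physical metadata, e.g., a column name
    data_type: str                       # technical metadata
    business_element: BusinessElement    # alignment to the conceptual view

    @property
    def is_critical(self) -> bool:
        # A Data Element inherits criticality from its Business Element.
        return self.business_element.is_critical

# A Business Element designated critical (a CBE) makes its Data Element a CDE.
notional = BusinessElement("Trade Notional Amount", "Face value of the trade",
                           BusinessElementType.ATOMIC, is_critical=True)
cde = DataElement("TRD_NTNL_AMT", "DECIMAL(18,2)", notional)
assert cde.is_critical
```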

Validation

Two approaches were used to validate that the Business Element / Data Element Construct worked in real-life examples and was consistent with other architectural viewpoints.

  1. Use Case – apply the construct to actual data with different types and levels of complexity
  2. Data Architecture & Modeling – align the construct with traditional data architecture and modeling standards

For a full review of the validation analysis, please see the Best Practice: Business Element / Data Element Construct.

CDE Purpose – Prioritizing Data

The objective of prioritizing the data is to identify which data is critical to the business processes consuming the data and thus requires heightened levels of control to ensure the data is fit-for-purpose.

The Basel Committee on Banking Supervision’s standard titled Principles for Effective Risk Data Aggregation and Risk Reporting (more commonly referred to as BCBS 239) is often cited as requiring the identification of “Critical Data Elements” (CDEs), when actually the language is “data that is critical”. The related BCBS 239 citations follow:

  • BCBS 239: Paragraph 16 – The Principles and supervisory expectations contained in this paper apply to a bank’s risk management data. This includes data that is critical to enabling the bank to manage the risks it faces. Risk data and reports should provide management with the ability to monitor and track risks relative to the bank’s risk tolerance/appetite.
  • BCBS 239: Paragraph 30 – Senior management should also identify data critical to risk data aggregation and IT infrastructure initiatives through its strategic IT planning process.
  • BCBS 239: Paragraph 43 – Supervisors expect banks to produce aggregated risk data that is complete and to measure and monitor the completeness of their risk data. Where risk data is not entirely complete, the impact should not be critical to the bank’s ability to manage its risks effectively.

BCBS 239 targets the bank’s Risk Management data, but the concept applies to all data for all business processes, not just the Risk or Finance organizations.

Prioritizing Data Based on Criticality


Objective: Develop a best practice process
and tools for the identification of CDEs


Overview

Since not all data has the same significance or impact, the highest-risk data needs to be addressed first. This section covers an approach using criticality to enable the organization to prioritize its data assets and bring them under control. Prioritization is a key part of a successful data management program; without appropriate prioritization, the program risks being overwhelmed with more data to manage than the organization can deliver.

Drivers of Criticality with Materiality Overlay

To identify criticality, one must introduce the concept of materiality, which aligns with the standard notion of a risk-based approach.

Some organizations have introduced levels of materiality with the highest level identified as Critical.

Diagram 2: Drivers of Criticality with Materiality Overlay. Materiality levels, from highest to lowest:

  • Critical
  • Important
  • Significant
  • Unimportant
  • Not Reviewed

What makes data critical is the material impact it has on the outcome of a business process.

  • Criticality comes from the perspective of the business process that is consuming the data (the Data Consumer)
  • Determining materiality is a negotiation between the Data Producer and the Data Consumer
  • It is common for a Consumer to assume that all data are critical
  • There are drivers associated with criticality (see the next section for more details) that can guide the negotiation
  • Criticality has not been successfully quantified using well-defined rules, leaving it to be determined through an artful analysis rather than a hard science

Measuring Criticality: Art or Science?

Using the drivers of criticality as outlined above, a Criticality Evaluation Matrix is a valuable tool to support the decision-making process.

Many organizations initially attempted to identify criticality with a quantified calculation, represented in the Objective Rating row in the matrix below. This may include weighting the relative importance of any of the criteria. The rating scale needs to be driven by the risk appetite of the organization.

However, member organizations that tried to quantitatively measure data criticality experienced poor adoption and ultimately switched to applying the spirit of the measurement as art rather than science, as represented in the Subjective Rating row of the matrix.

Essentially, the subjective approach uses the matrix as a guideline to identify potential critical data and assess the scope of the impact of the critical data (Global, Regional, Local). This is done to inform a negotiation between the Data Producer and Data Consumer to reach an agreement on the data that is truly critical.

The matrix covers eight drivers of criticality, each with a sample objective rating scale and the subjective rating that replaced it in practice.

  • Regulatory – Indicates whether the data attribute is mandated by regulatory entities to be used and managed in a particular way; the data is provided to regulators and is subject to audit or traceability. Objective Rating (sample scale): 4-Global, 3-Regional, 2-Local.
  • Legal – Indicates whether the data attribute requires additional safeguarding according to the organization’s corporate standards, i.e., it is confidential, highly confidential, proprietary, and/or non-public information. Objective Rating: 4-Global, 3-Regional, 2-Local.
  • Reporting Level – Indicates the level at which the data is consumed in the organization; Board or Management Team exposure carries higher risk. Objective Rating: 4-Global, 3-Regional, 2-Local.
  • Reputational Impact – Indicates whether a data attribute is relied upon by clients or third parties; if it is wrong, reputation and trust are placed at risk. Objective Rating: 4-Global, 3-Regional, 2-Local.
  • Financial Impact – Indicates a quantifiable financial result of poor data quality. Objective Rating: 4 - >$X, 3 - >$X, 2 - <=$X.
  • Enterprise Performance – Indicates whether the data attribute is essential for the management of the business, helping manage the firm’s ability to meet strategic or operational objectives. Objective Rating: 4-Global, 3-Regional, 2-Local.
  • Operational Risk – Indicates whether the data attribute helps manage operational outcomes; operational, upstream, or downstream processes would fail if the data were missing or incorrect. Objective Rating: 4-Global, 3-Regional, 2-Local.
  • Risk Management – Indicates whether the data attribute enables the organization to anticipate or manage potential risk to the business. Objective Rating: 4-Global, 3-Regional, 2-Local.

Subjective Rating (all drivers): High / Medium / Low

Diagram 3: Criticality Evaluation Matrix
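
For illustration only, the following sketch (hypothetical driver names, weights, and threshold; Python assumed) shows how an objective rating of the kind described above might be computed. As the findings note, firms that relied on such formulas saw poor adoption and fell back to a negotiated High/Medium/Low judgment.

```python
# Hypothetical objective criticality score: a weighted sum of driver scopes.
# Weights and the decision threshold would be set by the organization's
# risk appetite; none of these values come from the published matrix.
SCALE = {"global": 4, "regional": 3, "local": 2, "none": 0}

WEIGHTS = {
    "regulatory": 1.5, "legal": 1.0, "reporting_level": 1.0,
    "reputational_impact": 1.25, "financial_impact": 1.5,
    "enterprise_performance": 1.0, "operational_risk": 1.0,
    "risk_management": 1.25,
}

def objective_rating(assessment: dict) -> float:
    """Weighted sum of per-driver scores; higher means more critical."""
    return sum(WEIGHTS[driver] * SCALE[scope]
               for driver, scope in assessment.items())

score = objective_rating({
    "regulatory": "global", "legal": "local", "reporting_level": "regional",
    "reputational_impact": "global", "financial_impact": "regional",
    "enterprise_performance": "local", "operational_risk": "local",
    "risk_management": "regional",
})
print(score >= 25)  # hypothetical threshold; True would designate criticality
```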

Considerations

  • Private or sensitive data is not identified as a criterion of criticality. While private or sensitive data can be designated as critical, it is due to the material impact of the data on the business process outcome – this is different from information security or privacy concerns. The materiality of poor-quality private data is measured using the criteria in the construct (e.g., Regulatory, Reporting Level, Reputational Impact, etc.). More background is available in an EDM Council Best Practice published in May 2018 titled GDPR (General Data Protection Regulation): The Role of Data Management.
  • In practice, the harder job is identifying data that is not critical.
  • An alternative to artfully measuring the drivers of criticality across all data may be to first prioritize based on the level of Data Consumer (e.g., Enterprise level first, External level second, two or more business processes third, etc.). The data consumed at the highest level would be evaluated for criticality first.

Prioritizing Criticality

Distinct from the objective of prioritizing data based on criticality, the objective of prioritizing the critical data itself is to rank the most important data based on business objectives. Those ranked priorities are then used to apply a heightened level of control within the time and resource constraints of the organization. There are three primary approaches to setting the scope of critical data.

  • Everything: Identifying all critical data across the organization
  • Scoping Strategy: Prioritizing a subset of data by prioritized Use Case
    • Regulatory oriented – high-risk reports or programs (e.g. BCBS 239)
    • Business Process oriented criticality (business problem oriented)
    • Application oriented criticality
    • Project oriented (Fix forward – remediate backward)
  • Hybrid: Set a volume or percentage of data that can be critical (set a target %)

Regardless of the approach to setting the scope, if the volume of identified elements exceeds the current capacity of the organization, it will need to prioritize the sequence of the work further.

The pros and cons of each prioritization approach:

  • Subset – Pro: prioritizing a subset of data aligned to a priority use case is quick and easy and can gain the attention and support of senior management. Con: it risks creating a false impression that this is the full scope of critical data.
  • Comprehensive – Pro: a comprehensive inventory of all critical data sets accurate expectations for the scope of work. Con: the volume of in-scope data can be overwhelming and dilute the attention and support of senior management.
  • Hybrid – Pro: manages the expectations of the management team. Con: the effort and timelines involved often cannot be predicted.

Data Producer / Data Consumer Relationship

Data Supply Chain

Understanding how data exists in the Data Supply Chain is key to understanding the relationship between Data Producers and Data Consumers.

A domain consumes data from upstream producers, produces and consumes data within the domain, and produces data for downstream consumers. Domain management includes reconciling all requirements for data across the data supply chain.

Diagram 4: Data Supply Chain Construct
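
As a minimal illustration (hypothetical names; the construct itself is technology-neutral), a domain in the supply chain can be modeled with its upstream producers and downstream consumers:

```python
from dataclasses import dataclass, field

@dataclass
class DataDomain:
    """A node in the data supply chain."""
    name: str
    consumes_from: list = field(default_factory=list)  # upstream producer domains
    produces_for: list = field(default_factory=list)   # downstream consumer domains

# A Finance domain consumes from upstream, uses its own data, and produces
# data for downstream consumers; domain management reconciles requirements
# in both directions.
finance = DataDomain("Finance",
                     consumes_from=["Trading", "Reference Data"],
                     produces_for=["Regulatory Reporting", "Risk"])
```
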
Business Process Perspective

Accountability for data rests with the owner of the business process that creates the data – the Data Producer. This adds another perspective to understanding the relationship between Data Producer and Data Consumer. The Data Consumer is responsible for ensuring that the data is appropriately used and fit for purpose for their business process.

  • A business process has requirements for data as inputs and outputs of the process.
  • The “owner” of the business process that creates data is the Data Producer.
  • A business process that consumes data from another data domain is a Data Consumer and is responsible for defining requirements for data and holding the Data Producer accountable.
  • A Data Producer is responsible for meeting the data requirements of the Data Consumer. These requirements include precision of meaning, data quality dimensions, access and authorization of use, monitoring, measuring, etc.
  • A Data Producer is also usually a Data Consumer. Every Data Producer consumes their data to support their process, but often they also consume data from upstream of their operation.
  • Criticality of a Business Element is proposed by the Data Consumer and validated and accepted or rejected by the Data Producer.
  • The Data Producer must first determine that the requested Business Element is within the scope of their data domain before assessing the proposed criticality of the Business Element.
  • The entire process between the Data Consumer and Data Producer is based on the requirements for data and use of the data in the consumer’s business process.
  • The heightened level of control applied to a Critical Business Element is what permits Data Producers and Data Consumers to agree to the fit-for-purpose of the data consumed.
The Negotiation

When the Data Consumer proposes criticality, the natural inclination is to declare all data consumed as critical to their business process. If the organization’s funding model places the accountability for funding solely on the Data Producer, there is no financial consequence to the Data Consumer for the cost of the enhanced control applied to Critical Data Elements. This strains the negotiation process between the Data Producer and Data Consumer; the two will have to agree to operate within mutually defined resource constraints. The negotiation is further complicated when a Data Producer is managing priorities from multiple Data Consumers, at which point the Data Governance framework must provide an escalation protocol for mediating priorities that exceed the resource capacity of the Data Producer. The reality of resource constraints, even at the Enterprise level, requires the organization to define the level of resources that can be applied to meeting the implications of managing criticality.

Setting resource constraints aside, even when an assessment of Criticality Dimensions is used to inform the criticality designation, there will be differences of opinion that will need to be negotiated. The Data Producer needs to understand the Data Consumer’s actual use of the data to reach an agreement on criticality and to ensure the data are fit for the Data Consumer’s purpose.

As stated above, the negotiation process is compounded because it is not one-to-one but a one-to-many negotiation (multiple consumers who may have variation in their requirements). One of the roles of the Data Producer is to align and manage Data Consumer requirements to develop as simple a data set as possible.

Governance of the process of agreeing to criticality needs to include an opportunity for escalation when the Data Producer and Data Consumer data domains cannot reach an agreement on criticality or prioritize criticality within the resource constraints.
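
A minimal sketch of this propose/validate/escalate flow, with hypothetical states and function names (the Best Practice does not prescribe an implementation):

```python
from enum import Enum

class CriticalityStatus(Enum):
    PROPOSED = "proposed"      # raised by the Data Consumer
    ACCEPTED = "accepted"      # validated and accepted by the Data Producer
    REJECTED = "rejected"      # producer disputes materiality, consumer concedes
    ESCALATED = "escalated"    # sent to data governance for mediation

def review_proposal(element: str, in_domain_scope: bool,
                    producer_agrees: bool,
                    consumer_concedes: bool) -> CriticalityStatus:
    """Producer-side review of a consumer's criticality proposal."""
    if not in_domain_scope:
        # The Business Element must be within the producer's data domain
        # before its proposed criticality can even be assessed.
        return CriticalityStatus.REJECTED
    if producer_agrees:
        return CriticalityStatus.ACCEPTED
    # Disagreement: either the consumer concedes or governance mediates.
    return (CriticalityStatus.REJECTED if consumer_concedes
            else CriticalityStatus.ESCALATED)

# The consumer proposes an element as critical; the producer disagrees and
# the consumer stands firm, so the decision escalates to governance.
print(review_proposal("Counterparty LEI", in_domain_scope=True,
                      producer_agrees=False, consumer_concedes=False))
```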


Considerations

  • For the negotiation to be effective it must be fact-based with transparency between the Data Consumer requirements and the Data Producer assessment of the requirements. This transparency is even more critical when the level of subject matter expertise about the business process and the data exists on only one side of the negotiation.
  • Alignment to the organization’s funding model is critical.
  • Alignment to the organization’s governance model is critical.

Derived Data

As part of the exploration of the Data Neighborhood, questions often arise as to whether Derived Data can be considered critical, or even whether it should be managed at all. This section defines Derived Data and explains how it fits into criticality and broader data management.

Diagram 5: Derived Data Example Model

While a derived data value can be a Critical Business Element, criticality is managed at the atomic level. In the case of a derived Critical Business Element, the Data Producer should evaluate the inputs to determine whether they are also critical. Each input value should be judged separately for its material effect on the fit-for-purpose quality of the derived value. Do not assume that all inputs will have a material impact on the quality of the derived output and, ultimately, on the business process outcome.

Furthermore, if the inputs to the derived CBE are themselves derived Business Elements, then first determine the material impact of poor quality in each of the derived inputs. If it is deemed material, then the Business Element should be designated as critical, and its inputs will in turn need to be evaluated for material impact. This deconstruction process must continue until all inputs are at an atomic level (or back to a point where adequate control of the element is demonstrated); this also helps inform the Data Lineage effort often associated with Critical Elements. The quality of a derived Critical Business Element is managed at the atomic level of its “critical” inputs.
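
A minimal sketch of this deconstruction, assuming Python and a materiality test supplied by the business (hypothetical structure and names): each input is judged independently, and derived inputs are recursed into until an atomic level, or an adequately controlled point, is reached.

```python
from dataclasses import dataclass, field

@dataclass
class Element:
    name: str
    inputs: list = field(default_factory=list)   # empty list => atomic element
    adequately_controlled: bool = False          # stop recursion if demonstrated

def critical_inputs(derived, is_material) -> set:
    """Names of all inputs, at any depth, whose poor quality would
    materially impact the derived value."""
    found = set()
    for inp in derived.inputs:
        if not is_material(derived, inp):
            continue                              # judge each input independently
        found.add(inp.name)
        if inp.inputs and not inp.adequately_controlled:
            found |= critical_inputs(inp, is_material)  # recurse into derived inputs
    return found

# Risk-Weighted Assets derived from an atomic Exposure and a derived
# Risk Weight, which itself depends on an atomic Credit Rating.
rating = Element("Credit Rating")
risk_weight = Element("Risk Weight", inputs=[rating])
exposure = Element("Exposure at Default")
rwa = Element("Risk-Weighted Assets", inputs=[exposure, risk_weight])

# A trivially permissive materiality test, for demonstration only.
print(critical_inputs(rwa, is_material=lambda parent, inp: True))
```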

The quality of the inputs and the execution of the business logic of the derivation is a Data Management accountability. The actual business logic of the derivation is a business process accountability.

When derivations create new Business Elements, they should be recorded and an accountability review performed on them. If the subject matter expertise for the new data lies with the Data Consumer, or the data is derived from multiple Data Producers, then accountability should shift from the original Data Producer(s) to the producer of the new derived data. Regardless of who has accountability, the principle of the Authoritative Provisioning Point should be maintained so that all consumers of the Business Element obtain it from a single point of provisioning.


Considerations

  • How far back do you go to get to and manage the inputs?

CDE Implications


Objective: Develop a best practice process, procedures
and tools for managing the implications of criticality


The final objective of the Best Practice project is a work in progress: articulating the implications of criticality as the heightened level of control requirements for Critical Data Elements. As the work is completed, a subsequent report will be published. The target areas for analysis include the following.

Governance – Engaged Governance – executive owners and Business and Technical stewards in place for every CDE with collaboration among producers, consumers, IT, and operations

Definition – Precise Meaning – for all front-to-back applications, for all business processes and for all derived formulas

Metadata – Documentation and Metadata – names, definitions, aliases, business rules, provisioning points, authorized data sources, source of data, transformation processes, logical-to-physical mapping, etc.

Data Lineage (vs Data Flow) – End-to-End Lineage – may be required to complete the data forensics needed to root-cause and fix poor-quality data (capturing all transformations and calculations across the full business lifecycle)

Data Quality – Fit-for-Purpose – quality measurements, quality thresholds, defect management, root cause analysis and remediation
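
To show how these implication areas might be tracked per element, here is a minimal sketch with hypothetical fields (the formal control requirements are the subject of the forthcoming report):

```python
from dataclasses import dataclass, field

@dataclass
class CdeControlRecord:
    """Heightened-control checklist for one Critical Data Element."""
    cde_name: str
    # Governance: executive owner plus business and technical stewards.
    executive_owner: str = ""
    business_steward: str = ""
    technical_steward: str = ""
    # Definition: one precise, harmonized meaning front-to-back.
    definition: str = ""
    # Metadata: aliases, business rules, authorized sources, mappings, etc.
    aliases: list = field(default_factory=list)
    authorized_source: str = ""
    # Data lineage: end-to-end transformations and calculations captured.
    lineage_documented: bool = False
    # Data quality: fit-for-purpose measurement against agreed thresholds.
    quality_threshold: float = 0.99      # e.g., minimum completeness ratio
    last_quality_score: float = 0.0

    def quality_breached(self) -> bool:
        # A breach triggers defect management and root cause analysis.
        return self.last_quality_score < self.quality_threshold

record = CdeControlRecord("Counterparty LEI",
                          executive_owner="CRO Office",
                          definition="Legal Entity Identifier of the counterparty",
                          last_quality_score=0.97)
print(record.quality_breached())  # True
```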

DCAM Framework Alignment

All best practices align with the Capabilities and Sub-capabilities defined in the DCAM Framework. DCAM is the “What”: a description of the Capabilities needed for a comprehensive and successful Data Management initiative. The best practice article describes the industry-standard “How”: the execution of the Capabilities in an organization. For more insight, see Using A Best Practice Article.

Capability Alignment

The capabilities required for the process of prioritizing data based on criticality align to four key Components of the DCAM Framework.

The alignment of each Component to the capability requirement:

  • 7.0 Data Control Environment – The process of data domain management is in the Data Control Environment Component. The process of prioritizing data based on criticality resides within the data domain management process.
  • 3.0 Business & Data Architecture – The process of prioritizing data based on criticality is dependent on defining requirements for data, identifying data, defining data, and profiling data.
  • 5.0 Data Quality Management – The process of prioritizing data based on criticality does not require the Data Quality Management Component. Data Quality Management is part of the overall Data Domain Management and will be integral to the process of managing the implications of criticality.
  • 6.0 Data Governance – The process of prioritizing data based on criticality leverages the Data Governance Component for approving the metadata and the criticality designation.

Diagram 6: Capability Requirement

Design Requirements

The Work Group determined the execution of these capabilities would take place in the Data Domain Management process, which brings together Data Architecture, Data Quality Management, and Data Governance activities and applies them collaboratively to a specific data domain.

Data Management Processes

A version of the Data Domain Management process had not been developed by the EDM Council, so the Work Group developed a Level 2 process design and then took that design to Level 3 for those process steps required to prioritize data based on criticality.

The full presentation of the Data Domain Management process and how it supports the process of prioritizing data based on criticality is available as a separate best practice process design titled: Data Domain Management Process.

Data Management Tools

Within the Level 3 process designs, there are tools identified that support the execution of the process. Sample designs of these tools are not yet available. Users of this best practice are invited to provide comments or upload examples at the end of this article.

The tools identified are as follows:

  • Data Sharing Agreement (sketched below) – an agreement that sets out a common set of rules between the data producer and data consumer, establishing the terms and restrictions of the use of consumed data.
  • Service Level Agreement – an agreement between a service provider and a service consumer, minimally covering quality, availability, and responsibilities.
  • Business Element Request Form – the form includes a standard set of required and optional (if known) attributes necessary to accurately communicate the request to the Data Producer.
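
As a minimal sketch (hypothetical fields and identifiers; sample designs of the tools are not yet available), a Data Sharing Agreement might capture terms such as the following:

```python
from dataclasses import dataclass, field

@dataclass
class DataSharingAgreement:
    """Common rules agreed between a Data Producer and a Data Consumer."""
    producer_domain: str
    consumer_process: str
    business_elements: list = field(default_factory=list)  # data covered
    permitted_use: str = ""          # terms and restrictions on use
    quality_criteria: str = ""       # fit-for-purpose expectations
    redistribution_allowed: bool = False

dsa = DataSharingAgreement(
    producer_domain="Reference Data",
    consumer_process="Credit Risk Reporting",
    business_elements=["Counterparty LEI", "Country of Risk"],
    permitted_use="Internal risk aggregation only",
    quality_criteria="Completeness >= 99%, refreshed daily",
)
```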

Requirements for Data

Every business process, including the data management processes, includes requirements for data as an input and output of the process design.

The requirements for data defined for prioritizing data based on criticality are the following classification attributes (a sketch follows the list):

  • Critical Business Element flag
  • Critical Data Element flag
  • Critical Prioritization flag
  • Data Consumer Name
  • Data Sharing Agreement criteria
  • Service Consumer Name
  • Service Level Agreement criteria
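
A minimal sketch (hypothetical names) of how these classification attributes could be captured for each Business Element:

```python
from dataclasses import dataclass

@dataclass
class CriticalityClassification:
    """Classification data captured when prioritizing data based on criticality."""
    critical_business_element: bool      # CBE flag
    critical_data_element: bool          # CDE flag, inherited from the CBE
    critical_prioritization: bool        # in scope for the current work queue
    data_consumer_name: str
    data_sharing_agreement: str          # reference to the DSA and its criteria
    service_consumer_name: str
    service_level_agreement: str         # reference to the SLA and its criteria

classification = CriticalityClassification(
    critical_business_element=True,
    critical_data_element=True,
    critical_prioritization=True,
    data_consumer_name="Credit Risk Reporting",
    data_sharing_agreement="DSA-2020-014",
    service_consumer_name="Risk Aggregation Service",
    service_level_agreement="SLA-RISK-007",
)
```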

Appendix

About the Work Group

One of the most significant regulatory directives since the 2008 financial crisis has been the introduction of the “Principles for Effective Risk Data Aggregation and Risk Reporting” or BCBS 239. The Principles outlined in this directive require banks to establish sound information infrastructures to support their risk and risk reporting functions. As part of creating the required control environment, a common practice in the financial services industry is the establishment of CDEs or “critical data elements”.

In the 2017 Industry Benchmark Study, the management of CDEs was identified as a top challenge universally across the industry. Members report that there is uncertainty regarding the exact definition of a CDE, how it is designated, and how it should be used to satisfy the control requirement.

In August of 2017, the Council held a CDE webinar briefing for all members to propose a work effort to develop a best practice recommendation for identifying and managing CDEs. The forum was an open invitation for representatives from member organizations to join a Work Group. The Work Group was then formed and today contains approximately 60 members representing all aspects of the industry (GSIBs, SIFIs, buy-side, sell-side, geographic, consultants, vendors).

The project objective was to create an agreed-upon understanding of the purpose and definition of a CDE and then, based on that purpose and definition, to develop a best practice process, procedures, and tools for the identification of CDEs and for managing the implications of criticality. The execution of the processes, procedures, and tools will be aligned with the DCAM Framework and the Data Management capabilities it defines. The output of this effort will be shared with banks and regulatory bodies alike.

Work Group Members – organization affiliation as of November 2018

Arzaga, Raymund – Scotiabank
Atkin, Mike – EDMC
Bala, Sathya – Deutsche Bank
Bersie, Bret* – US Bank
Bland, Karen* – Moody’s Corporation
Brophy, Doris – Societe Generale
Deligiannis, Greg – S&P Global Ratings
Dewsbury, Jeff – DTCC
Dimitrion, Genevey – State Street
Doyle, Martin* – DQ Global
Farenci, Susan – MUFG Union Bank
Finnen, Michael – Mitsubishi UFJ Financial Group
Fruhstuck, Mary – BNY Mellon | Pershing
Giardin, Christopher – IBM Hybrid Cloud
Gordon, Andrew – Deutsche Bank
Hawkins, Matthew* – Goldman Sachs
Isaac, Gareth* – Ortecha
Jeffries, Denise          
Keslick, Rob – BMO
Klaentschi, Kathryn    
Lawson, Andrew – Brickendon
Liu, Irene – PWC
McAdams, Curtis       
McQueen, Mark* – EDMC / Ortecha
Nham, Annie – Macquarie Group Limited
Pandya, Hiten* – HSBC Bank
Robeen, Erica – Mastercard
Rolles, Daniel – EXL Service Holdings, Inc.
Roper, Michael           
Sondhi, Alok – DTCC
Tang, Alec – ADIA
Townsend, Millie – Charles Schwab
Zlat, Olga – Vanguard

* Data Architecture Subgroup Member

About the Authors

Gareth Isaac is a Partner in Ortecha, an EDM Council DCAM Authorized Partner data consultancy located in the UK and the US. He is a professional data practitioner who works with stakeholders – both leadership and subject matter experts – to understand the complex challenges involved with improving processes and data throughout the end-to-end information lifecycle. Gareth has worked with multiple GSIBs over the years to help improve their data management practices, specializing in data lineage, control frameworks, and governance functions.

gareth.isaac@ortecha.com
+44 20 3239 3823

Mark McQueen is the Senior Advisor, DCAM to the EDM Council. He joined the Council in 2016 and now leads the Best Practice Program to develop Data Management industry-standard processes for executing the DCAM Framework. Mark has over 20 years with a Fortune 25 GSIB where he was the business Data Management Executive for the Wholesale Bank. In addition to Best Practice Program facilitation, he provides training and EDMC Advisory Services related to the adoption and execution of the DCAM Framework in member organizations.

Mark is a DCAM Certified Trainer, Six Sigma Black Belt Certified, and Strategic Foresight accredited – University of Houston.

Mark is a Partner in Ortecha, an EDM Council DCAM Authorized Partner data consultancy located in the UK and the US.

mmcqueen@edmcouncil.org
+1 615.308.6465


Revision History

  • November 2018 – Gareth Isaac, Mark McQueen – Initial Publication
  • March 2020 – Mark McQueen – Knowledge Portal Release; broke out the BE/DE Construct and Data Domain Management Process into separate articles
