Information Risk and Data Quality Management

Firms today rely heavily on information to run their operations and meet business objectives; this dependency introduces risks that affect the achievability of the organization’s goals. For this reason, no organization is complete without a process that measures, reports, reacts to, and controls the risk of poor data quality.

Organization Risk, Business Impacts, and Data Quality

For a business that relies heavily on high-quality data, flawed data delays or obstructs the successful completion of its business processes. Determining the specific impact of different data flaws can be difficult; assessing the impact can, however, be simplified by characterizing impacts within a taxonomy (a classification scheme).

Categories in this taxonomy relate to the organization’s financial, confidence, and compliance activities. In categorizing the impact, there are two ways of looking at the information:

  1. How flawed information impacts organizational risk; and
  2. The types of data failures that create the exposure.

Business Impact of Poor Data Quality

Data quality issues occur within different business processes. A data quality analysis should, therefore, include a business impact assessment to identify and prioritize risks. Business impacts associated with data errors are categorized within a classification scheme to simplify the analysis.

The said classification scheme defines the following six primary categories:

  1. Financial impacts such as increased operating costs, decreased revenues, reduced or delayed cash flow, and missed opportunities;
  2. Confidence-based impacts such as decreased organizational trust or inconsistent operational and management reporting;
  3. Satisfaction impacts affecting employee, customer, and supplier satisfaction;
  4. Productivity impacts such as increased workload, decreased throughput, or increased processing time;
  5. Risk impacts associated with credit assessment, investment risk, and competitive risk; and
  6. Compliance impacts, where regulatory or contractual compliance is jeopardized.

In most cases, despite the focus on financial impacts, it is the risk and compliance exposures that are most aggravated by data quality issues. Though the sources of these areas of risk differ, they all mandate the use or presentation of high-quality information and require means of demonstrating the adequacy of the internal controls overseeing that quality to external parties such as auditors. What this means is that the organization needs to manage the quality of its data through processes that are transparent and auditable.

Information Flaws

Business impacts relate to flaws in the critical data elements that a business process depends on. Flawed data may result from data entry errors, missing data, duplicate records, inconsistent data, nonstandard formats, and complex data transformations, among other causes.

Data Quality Expectations

Articulating business users’ expectations for data quality and asserting specifications that can be used to monitor conformance is the first step toward managing the risk associated with the introduction of flawed data into the environment. The aim is to translate business user expectations into acceptability thresholds applied to quantifiable measures of data quality that correlate with the different types of business impacts.

The different dimensions of data quality used are the following (a brief measurement sketch follows the list):

  1. Accuracy: measures the degree to which data instances agree with the real-world entities they represent;
  2. Completeness: specifies the expectations regarding the population of data attributes;
  3. Consistency: measures whether values in one data set are reasonably comparable to values in another data set;
  4. Reasonableness: measures conformance to consistency expectations relevant within specific operational contexts;
  5. Uniqueness: measures the number of inadvertent duplicate records that exist within a dataset;
  6. Currency: measures the degree to which data is current with the world it models.
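
As a rough illustration, the following minimal Python sketch shows how two of these dimensions, completeness and uniqueness, can be reduced to simple measurable quantities. The record layout and field names (customer_id, email) are hypothetical, and the functions are only a sketch rather than a full data profiling implementation.

    # Hypothetical records used to illustrate two base-level measurements.
    records = [
        {"customer_id": "C001", "email": "a@example.com"},
        {"customer_id": "C002", "email": None},             # missing attribute
        {"customer_id": "C001", "email": "a@example.com"},  # inadvertent duplicate
    ]

    def completeness(records, attribute):
        """Fraction of records in which the attribute is populated."""
        populated = sum(1 for r in records if r.get(attribute) not in (None, ""))
        return populated / len(records)

    def uniqueness(records, key):
        """Fraction of records that are distinct on the given key."""
        return len({r[key] for r in records}) / len(records)

    print(f"completeness(email): {completeness(records, 'email'):.2f}")          # 0.67
    print(f"uniqueness(customer_id): {uniqueness(records, 'customer_id'):.2f}")  # 0.67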

Mapping Business Policies to Data Rules

After identifying the different dimensions of data quality, it is possible to map information policies to the dimensions to which they correspond. For example, a company policy specifying that personal data may be shared only if the user has not opted out defines an information policy. In this example, the data model must have a data attribute that clearly shows whether a user has opted out of information sharing. It should also provide a measurable metric, i.e., the count of shared records for opted-out users.

Note that this approach can be applied to any business policy. Such assertions can be expressed as rules for determining whether or not a record conforms to the expectation; when a rule yields a count of nonconforming records, the assertion becomes a quantifiable measurement, as sketched below.
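
To make this concrete, the following minimal Python sketch expresses the opt-out policy as a data rule and counts the nonconforming records. The record layout and field names (opted_out, shared) are hypothetical.

    # Hypothetical log of sharing events used to evaluate the opt-out rule.
    shared_log = [
        {"user_id": "U1", "opted_out": False, "shared": True},
        {"user_id": "U2", "opted_out": True,  "shared": True},   # violates the policy
        {"user_id": "U3", "opted_out": True,  "shared": False},
    ]

    def conforms(record):
        """A record conforms unless data was shared for an opted-out user."""
        return not (record["opted_out"] and record["shared"])

    nonconforming = [r for r in shared_log if not conforms(r)]
    print(f"Nonconforming records: {len(nonconforming)}")  # the quantifiable measurement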

Once the measurement methods have been reviewed, the next step is to interview the business users to determine acceptability thresholds. When the measured score falls below a threshold, the data does not meet the business expectation; the threshold marks the boundary at which noncompliance may lead to a material impact on downstream business functions.
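
Continuing the same idea, the short sketch below compares an observed conformance rate with an acceptability threshold; the counts and the 99% threshold are illustrative assumptions, not values from the text.

    # Hypothetical measurement and acceptability threshold.
    total_records = 1_000
    nonconforming_records = 25
    threshold = 0.99  # business users accept at most 1% nonconforming records

    conformance_rate = 1 - nonconforming_records / total_records
    if conformance_rate < threshold:
        print("Data does not meet the business expectation; "
              "noncompliance may have a material downstream impact.")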

Data Quality Inspection, Control, and Oversight: Operational Data Governance

Controlling the quality of data throughout the information processing flow enables timely assessment, prompt initiation of remediation, and an audit trail demonstrating the level of quality of the data.

Operational data governance can be defined as the manifestation of the processes and protocols that ensure an acceptable level of confidence that the data effectively satisfies the organization’s needs. It defines the roles, responsibilities, and accountabilities associated with data quality. It also combines the ability to identify data errors with the activities necessary to address those errors. To do this, data inspection processes are put in place to measure and monitor compliance with the data quality rules.

Note that data validation and data quality inspection are not the same thing. Rather, inspection is an ongoing process to:

  1. Reduce the number of errors to a reasonable and manageable level;
  2. Facilitate the identification of data flaws, along with a protocol for addressing them; and
  3. Initiate mitigation or remediation of the root cause.

Managing Information Risk via a Data Quality Scorecard

Beyond practices that measure and monitor particular aspects of organizational data quality, there is an opportunity to evaluate the relationship between the business impacts of noncompliant data, as indicated by the business clients, and the defined data quality thresholds. The standard against which the data is measured is its degree of acceptability. The measurement covers conformance to the defined standards as well as the staff’s ability to take action when the data does not conform.

The dimensions of data quality relevant within the business context provide the framework for defining metrics. The degree of reportability and controllability may differ depending on one’s role in the organization; to resolve issues properly, data stewards need to focus on continuous monitoring.

The need to present higher-level data quality scores introduces the following distinctions:

  1. Base-level metrics: simple metrics based on measurements against the defined dimensions of data quality; and
  2. Complex metrics: higher-level scores computed as a function of applying specific weights to a collection of base-level metrics (a brief rollup sketch follows this list).
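
As a small illustration of the rollup, the Python sketch below computes a complex metric as a weighted average of base-level metric scores. The metric names, scores, and weights are hypothetical and would, in practice, be set by the data governance program.

    # Hypothetical base-level scores (one per data quality dimension).
    base_metrics = {
        "completeness": 0.97,
        "uniqueness":   0.99,
        "consistency":  0.92,
    }

    # Hypothetical weights reflecting the relative business importance of each metric.
    weights = {
        "completeness": 0.5,
        "uniqueness":   0.2,
        "consistency":  0.3,
    }

    def complex_metric(scores, weights):
        """Rolled-up score: weighted average of the base-level metric scores."""
        total_weight = sum(weights.values())
        return sum(scores[name] * weights[name] for name in scores) / total_weight

    print(f"Rolled-up data quality score: {complex_metric(base_metrics, weights):.3f}")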

For reporting in a scorecard, complex data quality metrics can be accumulated in the following three ways (a rollup sketch follows the list):

  1. By issue: Evaluating the impact of a specific data quality issue across multiple business processes demonstrates the spread of trouble across the organization caused by a specific data flaw;
  2. By business process: In this view, managers can examine the risks and failures that impede the successful completion of the business process; and
  3. By business impact: Since an impact may be incurred as a result of a number of different data quality issues, this view aggregates the business impacts rolled up from the different issues across different process flows.
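
The three views can be produced from the same underlying measurements by grouping along different keys. The sketch below, based on a hypothetical issue log, rolls the same scores up by issue, by business process, and by business impact.

    from collections import defaultdict

    # Hypothetical issue log: each entry ties a data quality issue to a
    # business process, a business impact category, and a measured score.
    issue_log = [
        {"issue": "missing email", "process": "onboarding", "impact": "satisfaction", "score": 0.90},
        {"issue": "missing email", "process": "marketing",  "impact": "financial",    "score": 0.85},
        {"issue": "duplicate id",  "process": "onboarding", "impact": "compliance",   "score": 0.95},
    ]

    def rollup(entries, view):
        """Average the scores grouped by the chosen view: issue, process, or impact."""
        groups = defaultdict(list)
        for entry in entries:
            groups[entry[view]].append(entry["score"])
        return {key: sum(scores) / len(scores) for key, scores in groups.items()}

    for view in ("issue", "process", "impact"):
        print(view, rollup(issue_log, view))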

Managing Scorecard Views

Each of the views above requires the construction and management of a hierarchy of metrics related to the various levels of accountability that support the organization’s business objectives. Each is, however, supported by describing, defining, and managing both base-level and complex metrics, in that:

  1. Scorecards reflecting business relevance are driven by the hierarchical rollup of metrics;
  2. The definition of the metrics is separated from their contextual use; and
  3. The appropriate level of presentation can be materialized based on the level of expected detail for a specific data governance role and accountability.

Practice Questions

1) Which of the following is not a factor to be considered by Higher-North Bank when assessing data quality expectations?

  A. Accuracy
  B. Completeness
  C. Consistency
  D. Variability

The correct answer is D.

Accuracy, completeness, and consistency are all dimensions used to assess the quality of data. Variability is not one of the dimensions of data quality and is therefore not a factor in the assessment.

