Accurate, reliable, consistent, and readily accessible data is critical to just about every facet of modern bank operations, from customer relationship management (CRM) and portfolio management to regulatory compliance, stress testing, and financial reporting. The transition to a current expected credit loss (CECL) model for calculating impairment is adding further urgency to banks’ concerns over data challenges.
Because data quality is so crucial to bank management, effective data governance has become a strategic priority in many organizations. To begin identifying and addressing the root causes of data quality issues, banks first must establish sound data governance practices and understand how to apply them in a wide variety of use cases.
The overarching objective of effective data governance is to establish an environment in which data is treated as a business-critical asset. Data is trusted in such an environment – not only in terms of data availability, but also in terms of its reliability and consistency.
Ideally, banks will come to regard data as an asset that is comparable to a utility; that is, users are confident it is always available and functioning properly, and they do not have to think about where their critical data came from and whether they can rely on it. They also do not have to think about the background processes that make this data available and accessible to them.
Unfortunately, many banks have a considerable distance to go in achieving such an environment. There are many red flags that should alert banks to data quality issues. Some of the most common indicators include:
The effects of such shortcomings can be seen throughout an organization. In addition to diminishing overall organizational confidence and limiting competitive strategies, data governance issues also can have a negative effect on a broad array of critical functions, including business operations, financial performance, risk and regulatory compliance, and CRM.
What do you feel are the biggest contributors to data quality and governance issues at your bank?
The underlying causes of data quality and reliability problems can be numerous and wide-ranging. In many instances, the bank’s operational processes and technology systems have grown over time – often through mergers or acquisitions – and the supporting data systems have not kept pace. In addition, data system budgets often take a back seat to spending on compliance and operational issues.
These challenges often are exacerbated when business lines recognize they cannot meet critical new requirements using the existing data systems and processes. To compensate, they develop their own solutions or workarounds using spreadsheets or other generic office software. Such one-off solutions result in an even larger number of isolated, ungoverned, and uncontrolled data sources, which further erodes users’ overall data confidence.
The difficulties associated with identifying and attacking the root causes of data quality issues were made obvious in a recent online survey, conducted by Crowe Horwath LLP as part of a webinar presented to banking professionals. When webinar participants were asked to identify the biggest contributors to data quality and governance issues at their banks, their responses covered a broad range of causes, with no single issue identified as a clear leader. In fact, as Exhibit 1 illustrates, more respondents chose “all of the above” than any single issue.
The relatively even distribution of survey responses reinforces a critical point: there is no single, simple answer or approach that banks can pursue in order to improve data quality and governance.
Which initiatives or requirements make you most concerned about data quality and data governance?
As seen in Exhibit 2, 62 percent of the participants identified upcoming CECL demands as their area of greatest concern related to their organizations’ data quality and data governance practices.
Just as data quality shortcomings can stem from many diverse root causes, improved data quality and governance can have positive effects across many diverse aspects of bank operations. The upcoming CECL transition is only one of several regulatory compliance and financial reporting requirements in which data quality issues are playing an increasingly important role.
Stress testing is another area in which access to trusted data is crucial. The forecasts and models that banks use for Dodd-Frank Act stress testing (DFAST) or Comprehensive Capital Analysis and Review (CCAR) processes use data from required quarterly call reports as a starting point. Yet many organizations still rely on manual data acquisition processes, which hamper their ability to document, trace, and show the lineage of the data used to produce their stress-test results.
Compliance with the Bank Secrecy Act, USA PATRIOT Act, and other anti-money laundering requirements also is heavily dependent on data quality. These programs involve the daily movement and monitoring of massive volumes of data from customer and transactional systems. Banks must have complete trust in the data lineage, along with full auditability and reconciliation capabilities at every step.
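To make the reconciliation idea concrete, the sketch below compares record counts and control totals between a source extract and a downstream feed. It is a simplified illustration only: the system names, field names, and the `reconcile` helper are hypothetical, not part of any particular AML platform.

```python
from decimal import Decimal

def reconcile(source_rows, target_rows, amount_field="amount"):
    """Compare record counts and control totals between two extracts.

    Returns a dict describing any breaks so they can be logged
    and investigated rather than silently ignored.
    """
    source_count = len(source_rows)
    target_count = len(target_rows)
    source_total = sum(Decimal(str(r[amount_field])) for r in source_rows)
    target_total = sum(Decimal(str(r[amount_field])) for r in target_rows)
    return {
        "count_break": source_count - target_count,
        "amount_break": source_total - target_total,
        "reconciled": source_count == target_count and source_total == target_total,
    }

# Hypothetical daily extracts: a core banking system and an AML monitoring feed
core = [{"id": 1, "amount": "100.25"}, {"id": 2, "amount": "50.00"}]
aml_feed = [{"id": 1, "amount": "100.25"}]

result = reconcile(core, aml_feed)
# One record was dropped in transit, so the control reports a break
```

In practice a control like this would run at every hop in the data pipeline, with breaks routed to an exception queue, which is what gives auditors the end-to-end reconciliation trail the text describes.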
Outside of the regulatory and financial reporting arenas, data quality also has a direct impact on many operational and strategic domains in banking. For example, effective CRM initiatives require immediate access to accurate and trusted data, with a single source of truth that can provide a fully linked 360-degree view of all of a customer’s relationships.
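As a minimal sketch of what building such a linked view involves, the snippet below folds records about the same customer from several product systems into one profile, keyed on a shared customer identifier. The system names and fields are illustrative assumptions; real implementations also need entity resolution when no common key exists.

```python
from collections import defaultdict

def build_customer_view(*system_extracts):
    """Fold per-system records into one profile per customer ID.

    Each extract is a (system_name, rows) pair; rows are dicts
    carrying a shared 'customer_id' key.
    """
    view = defaultdict(dict)
    for system_name, rows in system_extracts:
        for row in rows:
            cid = row["customer_id"]
            for field, value in row.items():
                if field != "customer_id":
                    # Namespace fields by source system to preserve lineage
                    view[cid][f"{system_name}.{field}"] = value
    return dict(view)

# Hypothetical extracts from three product systems
deposits = [{"customer_id": "C001", "balance": 2500}]
loans = [{"customer_id": "C001", "outstanding": 12000}]
cards = [{"customer_id": "C001", "limit": 5000}]

profile = build_customer_view(("deposits", deposits),
                              ("loans", loans),
                              ("cards", cards))
# profile["C001"] now combines fields from all three systems
```

Prefixing each field with its source system is a deliberate choice here: it keeps the consolidated view traceable back to the systems of record, which matters as much for governance as the consolidation itself.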
In addition to having direct and immediate access to consistent and reliable information for use in real-time interactions, banks also require reliable, accurate, and consistently formatted historical data. A growing number of banks are exploring artificial intelligence capabilities, along with advanced predictive and prescriptive machine learning models that can help them address customer churn and attrition, behavioral marketing, and other CRM concerns. Quality data is the essential foundation for all such programs.
Given the critical role that data quality plays in so many aspects of bank operations, management and executive teams must have a clear understanding of how an effective data governance structure would operate. One way to accelerate this understanding is to establish an environment in which all parties recognize and validate the following seven key principles of effective data governance:
These seven broad principles can provide the foundation for more consistent and reliable data quality, but they are only the foundation. Banks must build on them and reinforce them through an organizationwide commitment, while also devoting adequate funding to maintain data quality and availability across the bank – not just within a specific project or department.
Improving data quality is not a one-time event or short-term effort. Rather, it should be viewed as an ongoing process that will remain a priority for the organization in the long term. With this long view in mind, however, there are some basic first steps that banks can take to get started and begin to build critical momentum.
Most banks begin by establishing a baseline and documenting the current state of data systems, access, quality, and governance in the organization. The goal is not to establish fault or affix blame for data quality problems, but rather to provide an accurate assessment of the current data environment.
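One concrete way to begin documenting the current state is simple data profiling: measuring completeness, duplication, and other basic quality metrics field by field across key systems. The sketch below shows the idea under assumed field names; the `profile_field` helper and the sample loan records are hypothetical.

```python
def profile_field(rows, field):
    """Compute baseline quality metrics for one field in a data extract."""
    values = [r.get(field) for r in rows]
    populated = [v for v in values if v not in (None, "")]
    return {
        "completeness": len(populated) / len(values) if values else 0.0,
        "distinct": len(set(populated)),
        "duplicates": len(populated) - len(set(populated)),
    }

# Hypothetical loan records with one missing and one duplicated tax ID
loans = [
    {"loan_id": "L1", "tax_id": "11-111"},
    {"loan_id": "L2", "tax_id": "11-111"},
    {"loan_id": "L3", "tax_id": ""},
    {"loan_id": "L4", "tax_id": "22-222"},
]

metrics = profile_field(loans, "tax_id")
# 75 percent populated, 2 distinct values, 1 duplicate
```

Numbers like these, captured per field and per system, give the bank the factual baseline the text describes, and rerunning the same profile later provides an objective measure of improvement.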
Like most improvement initiatives, improving data quality will require a coordinated effort involving three fundamental elements: people, processes, and tools. While there is a natural tendency to gravitate toward data tools or technology as the starting point, a more effective approach is to focus first on the people and processes that are involved.
It is important to establish clear ownership and responsibility for all data, with designated data stewards answering to an organizationwide data governance board. Technology will play a critical role in terms of implementation processes and controls, but establishing the organizational roles and processes should happen first.
In developing the improvement road map, the implementation effort should be carefully and thoughtfully prioritized, based on a determination of which elements offer the highest business value, as well as which elements present the most immediate issues and risks. While the CECL transition might appear to be the most urgent and challenging priority at the moment, the drive for improved data quality should not be tied solely to any single event or project.
Rather, the investment in improved data governance should be viewed strategically. This means establishing and monitoring clear metrics to measure the overall success of the effort, with regular monitoring and comprehensive tracking of the initiative’s financial return on an organizationwide basis.
Because data quality is so crucial to bank management, effective data governance has become a strategic priority for more and more of today’s banks. The objective of their improvement efforts must be to establish an environment in which data quality is trusted without question, and data access and reliability are matters of routine. Establishing such an environment can benefit nearly every facet of bank operations, including financial performance.