How does data quality underpin success in a changing and disrupted banking environment?

The banking industry is currently in a state of flux. Traditional banks have to contend with new competitors unburdened by legacy technology or business models, which makes these newcomers far more agile and responsive.
Gary Allemann, managing director of Master Data Management

These advantages enable new entrants to offer low-cost banking, new product offerings and services tailored toward changing customer needs. To remain competitive, traditional banks must be able to make intelligent decisions on how best to serve their customers – and the crux of intelligent decision-making is quality data.

Banks are looking for new ways of doing business, and this requires an in-depth understanding of their market and customers. Adding to this pressure, there are many regulations that the banking industry must comply with. All of these challenges require data, but simply having data is not enough. To be competitive and comply, banks need quality data that is accessible and can be trusted.

Poor data quality leads to errors in decision-making, which can be costly

Incorrectly marketed products, for instance, will see poor uptake at best and, at worst, can drive customer attrition. Accurate customer segmentation, by contrast, improves customer satisfaction: risk profiling becomes more precise, so customers can be offered lower interest rates and better-suited products and services.

Increasingly, experts agree that the accuracy of advanced analytics capabilities such as machine learning, artificial intelligence and big data is heavily dependent on the quality of the raw material: data. As traditional banks make choices that will allow them to compete with newer entrants, their ability to leverage quality information could be their game changer.

The quality of data is also critical for compliance reasons

For anti-money laundering (AML) purposes, for example, a bank needs to be able to verify information, trace transactions and so on, which requires accurate and accessible data. The accuracy of risk calculations, as regulated by the Basel Committee on Banking Supervision's standard number 239 (BCBS 239), must also be verifiable, and it affects the amount of capital a bank must hold in reserve.

Better-quality risk data frees up capital to give better returns to shareholders. And, of course, the Protection of Personal Information (PoPI) Act and similar regulations around the world require that customers' data be of adequate quality to ensure that decisions based on that data are not prejudicial.

In addition, there is a hidden operational cost to poor-quality data. For example, if the contact details you hold for a customer are incorrect and the customer defaults, you will be unable to reach them, and collecting on the defaulted payments will require significant additional expense.

Data needs to be captured correctly from the outset – the well-known 1-10-100 rule applies here: if it costs you R1 to capture information correctly, it will cost you R10 to correct it later and R100 in additional costs to resolve the issues caused by incorrect information. Investments in digital platforms must include data quality if the operational benefits of digital are to be realised.
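The rule's arithmetic can be sketched in a few lines. The per-record costs come from the rule of thumb itself; the record volume and error rate below are hypothetical, chosen purely to illustrate how the cost scales with where errors are handled:

```python
# Illustrative sketch of the 1-10-100 rule. Per-record costs follow the
# rule of thumb; the volume and error rate are assumed, not sourced.
COST_PER_BAD_RECORD = {
    "capture": 1,      # R1: validated and fixed at the point of capture
    "correction": 10,  # R10: corrected later in a data-cleansing exercise
    "failure": 100,    # R100: left uncorrected, causing downstream failures
}

def cost_of_bad_records(records: int, error_rate: float, stage: str) -> float:
    """Cost of the erroneous records if they are dealt with at `stage`."""
    return records * error_rate * COST_PER_BAD_RECORD[stage]

# One million records captured per year, 5% of them containing errors.
for stage in ("capture", "correction", "failure"):
    print(f"{stage}: R{cost_of_bad_records(1_000_000, 0.05, stage):,.0f}")
```

Under these assumed figures, the same 50,000 bad records cost R50,000 if caught at capture, R500,000 if corrected later, and R5,000,000 if left to cause downstream failures.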

To be competitive, control costs, optimise product offerings and enhance customer service, it is imperative to prioritise data quality. Automated data cleansing is key – not as an afterthought, but at the actual point of capture, validating data as it is entered. This will enable an accurate, real-time view of the customer.
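As a minimal sketch of what validation at the point of capture might look like – the field names and rules here are hypothetical, not any particular platform's API:

```python
import re

# Hypothetical point-of-capture validation: reject or flag bad records
# before they enter downstream systems, rather than cleansing afterwards.
SA_MOBILE = re.compile(r"^(?:\+27|0)[6-8]\d{8}$")  # rough SA mobile pattern
EMAIL = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")  # simplistic email check

def validate_customer(record: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the record is clean."""
    errors = []
    if not record.get("name", "").strip():
        errors.append("name: required")
    if not SA_MOBILE.match(record.get("mobile", "")):
        errors.append("mobile: not a valid South African mobile number")
    if not EMAIL.match(record.get("email", "")):
        errors.append("email: malformed address")
    return errors

print(validate_customer({"name": "T. Ndlovu",
                         "mobile": "0821234567",
                         "email": "t.ndlovu@example.com"}))  # []
print(validate_customer({"name": "", "mobile": "12345", "email": "x"}))
```

A real deployment would sit behind the capture form or API and block (or queue for review) any record that fails, so errors never propagate into downstream systems.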

It is also essential to remove data silos to enable a full single view of the customer across the entire enterprise, so that risk profiles, total customer value and product targeting become accurate. Data needs to be treated as a business asset, which requires strong data governance and data quality initiatives.
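A toy illustration of what removing silos achieves is merging records that describe the same customer. The silo names, fields and ID-number key here are all hypothetical; real consolidation relies on master data management tooling and fuzzy matching, not an exact key alone:

```python
from collections import defaultdict

# Hypothetical records for the same customer held in two separate silos.
loans = [{"id_number": "8001015009087", "name": "S. Dlamini", "loan_balance": 250_000}]
cards = [{"id_number": "8001015009087", "name": "Sipho Dlamini", "card_limit": 30_000}]

def single_customer_view(*silos: list) -> dict:
    """Merge silo records that share a national ID into one customer profile.

    Later silos overwrite earlier fields of the same name; a real system
    would apply survivorship rules rather than a blind overwrite.
    """
    merged = defaultdict(dict)
    for silo in silos:
        for record in silo:
            merged[record["id_number"]].update(record)
    return dict(merged)

view = single_customer_view(loans, cards)
print(view["8001015009087"])  # one profile with both loan_balance and card_limit
```

Only once the loan and card data sit in one profile can a bank compute an accurate risk profile or total customer value.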

Quality data underpins agility and the ability to make fast, accurate decisions, and is essential for moving forward in a digital world. Digital and data are intrinsically intertwined, and it is critical to automate and streamline the processes around data capture.

However, if the data you are capturing is incorrect or inaccurate then it is useless. Automation means that errors can be propagated more quickly and can easily become pervasive.

Data quality and verification are more important than ever, to ensure data can be trusted as part of an effective digital strategy. This digital strategy is what will take banking forward and enable it to remain competitive in a changing market.

About Gary Allemann

Gary Allemann is MD of Master Data Management. He is passionate about Information and Communication Technology (ICT) and, more specifically, data quality, data management and data governance.