Creating the elusive single customer view

Many organisations are investing significant sums and extensive time into implementing Customer Relationship Management (CRM) systems for their purported ability to deliver improved customer relations and client retention.
However, despite this investment, many CRM initiatives fall far short of expectations. The heart of the problem is that CRM will not magically give organisations a single view of their customers. Data quality plays a major role in the success of CRM: without quality data it is impossible to achieve that elusive 'single customer view', and without a single view of the customer, the insights delivered by the CRM solution, no matter how sophisticated, will be fundamentally flawed. Addressing customer data quality issues is therefore critical to leveraging value from CRM investments.

Data duplication

CRM systems rely on customer data in order to provide insights that will help organisations to maintain and retain customers. One of the biggest data quality issues for CRM is data duplication - the same contact or lead may be captured multiple times because of small, seemingly insignificant differences, such as a city of residence listed variously as Cape Town, Capetown and Kaapstad. Human error is the most common cause of this problem, so some level of duplication is inevitable. If duplicates are addressed quickly, they will not significantly affect the performance of CRM. However, studies have shown that once data duplication rises above just 2% of records, reporting becomes unacceptably inaccurate, leading to flawed insights and actions.
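To illustrate, the kind of near-duplicate described above can be caught by normalising records and comparing them with a fuzzy match. This is a minimal sketch, not any particular vendor's method; the records, synonym table and threshold are invented for the example:

```python
from difflib import SequenceMatcher

# Hypothetical contact records captured at different times; the city field
# varies even though all three rows describe the same person.
records = [
    {"name": "Jan Smit", "city": "Cape Town"},
    {"name": "Jan Smit", "city": "Capetown"},
    {"name": "Jan Smit", "city": "Kaapstad"},
]

# A small standardisation table maps known variants (including the
# Afrikaans name) onto one canonical form.
CITY_SYNONYMS = {"capetown": "cape town", "kaapstad": "cape town"}

def normalise(record):
    """Lower-case, trim and standardise the fields used for matching."""
    city = record["city"].strip().lower()
    city = CITY_SYNONYMS.get(city, city)
    return (record["name"].strip().lower(), city)

def find_duplicates(records, threshold=0.9):
    """Pair up records whose normalised keys are near-identical."""
    pairs = []
    for i in range(len(records)):
        for j in range(i + 1, len(records)):
            a, b = normalise(records[i]), normalise(records[j])
            score = SequenceMatcher(None, " ".join(a), " ".join(b)).ratio()
            if score >= threshold:
                pairs.append((i, j))
    return pairs

print(find_duplicates(records))  # → [(0, 1), (0, 2), (1, 2)]
```

All three rows collapse onto one normalised key, so every pair matches. Note that the pairwise comparison is quadratic in the number of records, which is one reason manual or naive approaches stop scaling as volumes grow.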


The issue here lies in the fact that managing customer data in today's world is no simple task, and it is impossible to manage using manual processes. Ultimately, a single view of the customer is not one-dimensional, and needs to integrate customer master data, product purchases, call history, credit history and more, from multiple information sources across all divisions in an organisation.

Given the sheer volume of data generated on a daily, even hourly basis, the effort involved in de-duplicating customer information overwhelms administrative staff. This complexity is only compounded when one considers the number of different data sources that exist. In addition, manual processes are always prone to human error.

Maintaining credibility

Furthermore, problems with existing data and constantly changing data can add yet another layer of complexity. Identifying related customers is also challenging with manual processes. When errors creep in and begin to affect the quality of reporting and insight, users will stop trusting the system and stop using it, resulting in a wasted investment. In order to ensure value is leveraged from CRM investments, it is vital to maintain the credibility of the system.

Automated data quality maintenance systems, which should form part of any CRM investment, will assist with correcting and standardising data, as well as consolidating multiple and duplicate records about a single person.
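Consolidation of duplicates typically follows a survivorship rule that builds one 'golden record' from the duplicate rows. The sketch below assumes a simple rule - keep the most recently updated non-empty value per field - which is only one of many possible policies; the records are invented for the example:

```python
def merge_golden_record(duplicates):
    """Consolidate duplicate rows into one golden record: keep the most
    recently updated non-empty value for each field."""
    golden = {}
    # Process oldest first so that later (newer) rows overwrite earlier ones.
    for row in sorted(duplicates, key=lambda r: r["updated"]):
        for field, value in row.items():
            if field != "updated" and value:
                golden[field] = value
    return golden

dupes = [
    {"name": "J. Smit", "email": "",
     "phone": "021 555 0100", "updated": "2014-01-05"},
    {"name": "Jan Smit", "email": "jan@example.com",
     "phone": "", "updated": "2014-06-20"},
]
print(merge_golden_record(dupes))
# The name and email survive from the newer row, while the phone number
# from the older row fills the gap the newer row left blank.
```

A real data quality tool would also track which source each surviving value came from, so the merge can be audited or reversed.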

Local knowledge is important. For example, South African client records may be captured in English or Afrikaans, while multinationals may have to cater for emerging markets across Africa, China, India and South America. A true multinational data quality solution must identify consumer households and individuals across records irrespective of the language used to capture each record.
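One way to match records across languages is to translate both into a shared, language-neutral matching key before comparison. This is a minimal sketch of that idea; the bilingual token table is invented for the example, whereas a production solution would ship curated reference data per country and language:

```python
# Hypothetical Afrikaans-to-English reference table for matching purposes.
AFRIKAANS_TO_ENGLISH = {
    "kaapstad": "cape town",
    "kerkstraat": "church street",
    "straat": "street",
}

def language_neutral_key(address):
    """Build a matching key that is the same regardless of the language
    the address was captured in."""
    tokens = address.lower().replace(",", " ").split()
    # Map known Afrikaans tokens to English, then re-split because a
    # single token may map to a multi-word phrase.
    mapped = " ".join(AFRIKAANS_TO_ENGLISH.get(t, t) for t in tokens)
    # Sorting makes the key insensitive to word order.
    return " ".join(sorted(mapped.split()))

a = language_neutral_key("12 Church Street, Cape Town")
b = language_neutral_key("12 Kerkstraat, Kaapstad")
print(a == b)  # → True
```

The same key then feeds the ordinary deduplication step, so English and Afrikaans captures of one household land in the same match group.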

Without a focus on data quality, an organisation's CRM efforts will not be successful. Learn more about managing the data complexities and risks inherent in your CRM implementation and plan to 'Create a Single Customer View' with this whitepaper.

About Gary Allemann

Gary Allemann is MD of Master Data Management. He is passionate about Information Communication Technology (ICT) and, more specifically, data quality, data management and data governance.
Tommy Land
Very well said.
Posted on 21 Jul 2014 21:08
Martin Doyle
In my opinion, finding CRM duplicates is relatively well understood. The greater challenge is how to manage them once identified and minimize the human intervention required to identify the best (golden) record, then merge 1:1 data and re-align 1:M data before retiring the duplicates. To overcome these challenges we have built deduplication and merging technology for CRMs and databases, so that businesses can capitalize on higher quality data and minimize costly human intervention.
Posted on 22 Jul 2014 08:56
Andrew Heriot
It is true that data quality is paramount. However, if there are multiple sources of data that we want to include in a single view, then we need to understand that these data sources are often products of distinct business processes, administrative regimes and projects. Stitching it all together is a tough call. There are automated approaches and administrative approaches to these sorts of problems, and both run the risk that further data artefacts or errors will be created. In real terms, the best thing that can be done is to tackle the problems at source and implement or adjust systems so that data accuracy is achieved by design. Recently I wrote a blog offering guidance for maintaining clean data - maybe this will assist. Andrew Heriot, Head of Customer
Posted on 13 Aug 2014 10:51