Moving to enterprise data quality
As data volumes continue to expand, traditional approaches to data quality can no longer ensure that organisational data is fit for purpose and managed as a corporate asset. Added to this is the challenge of increasingly complex environments, where data is often held in the cloud or by third-party providers.
The traditional approach - multiple tactical data quality projects and point solutions - is increasingly unable to deliver sustainable data quality. A new approach is required - one capable of delivering consistent data quality across all of an organisation's systems and applications.
It is estimated that by 2015 the average organisation will hold 700 times the volume of data it held at the beginning of the century. This data explosion will only get worse - by 2020, companies will be managing 7 000 times as much data as they held in 2000.
The number of data stores is also proliferating - companies must cope with diverse environments, ranging from legacy Enterprise Resource Planning (ERP), mainframe and data warehouse systems to cutting-edge cloud, Hadoop and other big data platforms.
Point solution approaches are no longer the answer
In the past, data quality was typically managed "per application", or through point solutions in which particular data quality problems were identified and addressed in isolation. As this data becomes harder to store and manage, point solution approaches are no longer the answer for either data quality or data governance.
Take, for example, an increasingly common organisational scenario today: a company may have an internally hosted ERP system from one supplier, a cloud-based Customer Relationship Management (CRM) system from another, and Extract, Transform, Load (ETL) tooling from a third, feeding a data warehouse from yet another vendor. Obtaining consistent, quality data across all of these platforms and providers is a typical challenge for large organisations today.
Cleansing specific data sets in a single system - customer lists, financial records, inventories and so on - has delivered benefits, but the resulting inconsistencies across systems cause frustration and additional cost. At best, the problem is only partially solved.
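To make the contrast concrete, the sketch below (a minimal Python illustration, with hypothetical source names, fields and rules, not any particular vendor's tooling) shows the enterprise alternative: one shared set of quality rules applied to records from different systems, rather than each application being cleansed against its own rules.

```python
# Illustrative sketch only: a single, shared rule set applied to records
# from different systems, instead of per-application cleansing.
# All source names, fields and reference values are hypothetical.

import re

# One set of quality rules, defined once for the whole enterprise.
RULES = {
    "email": lambda v: bool(re.match(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", v or "")),
    "country_code": lambda v: v in {"ZA", "GB", "US"},  # assumed reference list
    "customer_id": lambda v: bool(v and v.strip()),
}

def check_record(record, source):
    """Return the rule violations for one record, regardless of source system."""
    failures = []
    for field, rule in RULES.items():
        if not rule(record.get(field)):
            failures.append(f"{source}: '{field}' failed for record {record.get('customer_id')!r}")
    return failures

# Extracts from two different systems (hypothetical data).
erp_records = [{"customer_id": "C001", "email": "info@example.com", "country_code": "ZA"}]
crm_records = [{"customer_id": "C001", "email": "info@example", "country_code": "South Africa"}]

issues = []
for rec in erp_records:
    issues += check_record(rec, "ERP")
for rec in crm_records:
    issues += check_record(rec, "CRM")

for issue in issues:
    print(issue)
```

Because both extracts are measured against the same rules, the same customer record is flagged as inconsistent in the CRM even though it passes in the ERP - the kind of cross-system discrepancy that per-application cleansing leaves unresolved.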
The result? Data quality improvement remains a niche activity focused on a few applications, while the majority of data stays uncontrolled. Given the sheer volume of data generated daily, this approach means problems continue to escalate.
There is no such thing as a homogeneous, single-vendor environment any more, and organisations need an approach capable of delivering consistent, quality data across the various systems, applications and solutions that make up the enterprise architecture.
A strategic, enterprise-wide approach is needed to allow organisations to harness the business value of their data. Broad and sustainable data quality improvement can only be delivered through proactive, cross-organisational collaboration, with a close partnership between business and IT.
The benefits of such an approach can be seen in pioneering organisations such as British Telecom (BT). Over a number of years, the group moved from an application-level to an enterprise-wide view of data quality, with quantified business savings running into billions of rand.
The business case for enterprise data quality is established. Evolving from point data quality approaches to managing data quality at an enterprise level is critical to turning data into a strategic corporate asset. A new information quality management methodology, built on more than 20 years of data quality management experience at over 2 000 sites worldwide, is available to help guide organisations through this process.
For more information, download the white paper from www.masterdata.co.za/index.php/get-the-whitepaper-proactive-data-quality.