Implemented enterprise-wide data quality management program



What We Did: Institutionalized data quality management, making it easier for the business to trust its data and make informed business decisions.

The Impact We Made: Enabled the business to reduce operational inefficiencies, which led to a reduction in overall costs (detection costs, prevention costs, correction costs, rollback/rework costs) by 5 percent.

Summary – Dynamic data quality framework         

The Information Services Group (ISG) of the client organization is responsible for driving information usability and analytics to generate better insights and enable fact-based decision making. The organization can tap into massive amounts of data to measure and understand what consumers want in order to make sound business decisions, drive product innovation, and provide a better customer experience. The client partnered with Mu Sigma to implement a comprehensive, enterprise-wide data quality management capability.

About The Client – A leading personal insurance company

The client is one of the largest publicly held personal lines insurers. Its Information Services Group is responsible for driving the organization’s information strategy, which is also a significant source of competitive advantage. Data quality management was one of the key focus areas for the coming year, and the client had partnered with a customer marketing group to create a dashboard to track the data quality of its customer metrics.

The Challenge – No solution architecture in place

To establish sound data quality management across the organization, the client had to address three main challenges:

  • Lack of a solution architecture to address DQM
  • Identification of key metrics to measure data quality
  • Monitoring and cleansing of data

The Approach – Institutionalizing DQM across business functions

Mu Sigma adopted a structured approach to institutionalizing DQM that ensures the completeness, correctness, and consistency of data.
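
As an illustration of what these three dimensions can look like in practice, the sketch below computes completeness, correctness, and consistency scores over a small pandas DataFrame. The column names, allowed-value sets, and sample records are assumptions for illustration only, not the client's actual schema or rules.

```python
# Minimal sketch of the three data quality dimensions mentioned above,
# computed over a pandas DataFrame. All column names, valid-value sets,
# and sample rows are illustrative assumptions.
import pandas as pd

def completeness(df: pd.DataFrame, column: str) -> float:
    """Share of rows where the column is populated."""
    return df[column].notna().mean()

def correctness(df: pd.DataFrame, column: str, valid_values: set) -> float:
    """Share of populated rows whose value falls in an allowed domain."""
    populated = df[column].dropna()
    return populated.isin(valid_values).mean() if len(populated) else 1.0

def consistency(df: pd.DataFrame, col_a: str, col_b: str) -> float:
    """Share of rows where two related fields agree (e.g. start date <= end date)."""
    return (df[col_a] <= df[col_b]).mean()

# Hypothetical policy extract used only for illustration.
policies = pd.DataFrame({
    "policy_id": [1, 2, 3, 4],
    "state": ["WA", "OR", None, "ZZ"],
    "start_date": pd.to_datetime(["2018-01-01", "2018-02-01", "2018-03-01", "2018-04-01"]),
    "end_date":   pd.to_datetime(["2019-01-01", "2017-02-01", "2019-03-01", "2019-04-01"]),
})

print(completeness(policies, "state"))                     # 0.75
print(correctness(policies, "state", {"WA", "OR", "CA"}))  # ~0.67
print(consistency(policies, "start_date", "end_date"))     # 0.75
```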

Factors such as the data warehouse, data dimensions, business functional units, and key metrics were considered in the DQM process. Data quality was improved by cleansing, parsing, and standardizing data across the enterprise.
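
The cleansing, parsing, and standardization step can be sketched along the following lines; the field names, mapping table, and formatting rules are hypothetical and stand in for the enterprise-specific rules actually used.

```python
# Minimal sketch of cleansing / parsing / standardizing an assumed customer
# extract. Field names and mapping rules are illustrative, not taken from
# the engagement.
import re
import pandas as pd

STATE_MAP = {"washington": "WA", "wash.": "WA", "oregon": "OR"}  # hypothetical mapping

def standardize_state(value):
    """Map free-text state names to two-letter codes where possible."""
    if pd.isna(value):
        return None
    value = str(value).strip().lower()
    return STATE_MAP.get(value, value.upper() if len(value) == 2 else None)

def parse_phone(value):
    """Keep digits only and format 10-digit US numbers as NNN-NNN-NNNN."""
    if pd.isna(value):
        return None
    digits = re.sub(r"\D", "", str(value))
    return f"{digits[:3]}-{digits[3:6]}-{digits[6:]}" if len(digits) == 10 else None

customers = pd.DataFrame({
    "state": ["Washington", "wa", "Oregon", None],
    "phone": ["(206) 555-0100", "206.555.0101", "555-0102", None],
})
customers["state_std"] = customers["state"].apply(standardize_state)
customers["phone_std"] = customers["phone"].apply(parse_phone)
print(customers)
```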

At different stages of the data flow, all the elements of DQM, such as data validation, data auditing, and data cleansing, were implemented to monitor and improve data quality.
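
A minimal sketch of such stage-wise validation and auditing is shown below: each rule flags bad rows at a given stage of the data flow, and the results are written to an audit record. The rule names, stage label, and sample data are illustrative assumptions.

```python
# Minimal sketch of running validation rules at a given stage of the data
# flow and recording the outcome in an audit table. Rules, stage label, and
# data are hypothetical.
from datetime import datetime
import pandas as pd

def run_checks(df, rules, stage):
    """Apply each rule (name -> boolean mask of failing rows) and audit the results."""
    audit = []
    for name, rule in rules.items():
        failed = df[rule(df)]
        audit.append({
            "stage": stage,
            "rule": name,
            "failed_rows": len(failed),
            "checked_rows": len(df),
            "run_at": datetime.now(),
        })
    return pd.DataFrame(audit)

rules = {
    "missing_policy_id": lambda d: d["policy_id"].isna(),
    "negative_premium":  lambda d: d["premium"] < 0,
}

claims = pd.DataFrame({"policy_id": [1, None, 3], "premium": [100.0, 250.0, -10.0]})
print(run_checks(claims, rules, stage="post-load"))
```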

While there are various approaches to DQM such as “by functional group”, “by data dimensions”, “common data universe,” etc., the Mu Sigma team recommended implementing DQM by functional group, starting with Marketing. The team leveraged our muUniverse™ platform to understand the connections between business problems and their relative priority.

The Outcome – Improved operational effectiveness

DQM helped minimize operational inefficiencies across the organization. The major factors behind this reduction were:

  • Improved throughput for volume processing
  • Exhaustive data health checks to identify data glitches and improve decision quality
  • Predictability in project planning and completion
  • Improved productivity

The client customized the DQM process for all descriptive and modelling projects. This gave them an in-depth understanding of, and control over, their data. The customization also enabled quick identification and communication of data glitches, which helped the client use the time and effort of their teams more effectively.