DQ Global's Approach to Master Data Management
Our approach to Master Data Management – Make Do and Mend.
Master data is any data that needs to be shared across systems. The aim is to correct problems at source, so that no downstream correction is required.
Master data management goes well beyond data migration and integration. It requires that all the disparate structured data sources – Accounts/ERP, CRM, Marketing and so on – and all the other unstructured data silos, including e-mails and documents, be harmonised into a single master data repository.
It requires data governance, with clearly defined data owners and users and clear guidelines regarding what is good master data and what is not.
It requires a change in corporate philosophy: moving from "Make Do and Mend", where there is a perpetual cycle of correction and corruption, to getting it right first time.
Businesses squander billions of pounds each year by relying on inaccurate or poor quality data. Quality data provides a common reference for all users and avoids the waste associated with duplicated or invalid mailings, achieving a better return on marketing investment and increasing response rates. A database containing duplicate-free customer information enables you to identify new prospects, close more business and better serve your existing customers.
An MDM strategy would benefit large organisations with diverse business functions – finance, sales, R&D and so on – which often extend over several countries, as well as companies formed by acquisition or merger. These diverse systems usually need to share important or strategic data: business intelligence, products, customers, suppliers.
The challenge is to create a common system for all users to access the information according to need, as well as maintain accurate master data. This means that everyone needs to own the problem of data quality, seeing it as a corporate asset.
Accepting poor and unproven data into an enterprise decision support system is complete folly and a recipe for disaster. Data has no value unless it can be used to make sound corporate decisions, and making decisions based upon bad data leads, with a high degree of certainty, to bad decisions.
It is well known that what you can measure you can manage. Data quality is no different, so first you must define what good data is, then measure, analyse, improve and control it so that you know where you are on the data asset improvement journey at any time.
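The "define, then measure" step can be made concrete with a handful of simple metrics. As a minimal sketch (the field names, records and thresholds here are illustrative, not drawn from any DQ Global product), completeness and duplication rates can be computed directly over a record set:

```python
# Minimal data-quality measurement sketch: completeness and duplication
# metrics over a list of customer records (field names are illustrative).

def completeness(records, field):
    """Fraction of records with a non-empty value for `field`."""
    filled = sum(1 for r in records if r.get(field, "").strip())
    return filled / len(records) if records else 0.0

def duplicate_rate(records, key_fields):
    """Fraction of records whose key repeats an earlier record's key."""
    seen, dupes = set(), 0
    for r in records:
        key = tuple(r.get(f, "").strip().lower() for f in key_fields)
        if key in seen:
            dupes += 1
        seen.add(key)
    return dupes / len(records) if records else 0.0

records = [
    {"name": "ACME Ltd", "email": "info@acme.example"},
    {"name": "Acme Ltd", "email": "info@acme.example"},  # duplicate name
    {"name": "Widgets plc", "email": ""},                # incomplete email
]
email_completeness = completeness(records, "email")   # 2 of 3 filled
name_duplication = duplicate_rate(records, ["name"])  # 1 of 3 repeats
```

Tracking numbers like these over time is what turns "improve and control" from a slogan into a measurable journey.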
Without a proper strategy and the right checks and balances, MDM is merely make do and mend!
Failure to comply with data protection legislation can result in fines up to £5,000. Not only does legislation affect areas such as data privacy, security, retention, protection and accountability, it also helps to safeguard consumer and patient information.
Two companies, ChoicePoint and DirecTV, were fined a total of $20M between them for data related security breaches, one of which led to 700 cases of identity fraud. Reed Elsevier called in the FBI when 32,000 consumer records were stolen from a hosted Lexis-Nexis database.
Beyond such breaches, there are fines for marketing to people or companies registered with the preference services, plus obligations under the Data Protection Act and, if you have a US parent, Sarbanes-Oxley.
The UK’s Information Commissioner says bosses must take the personal data of both customers and staff seriously.
Complying with legislation has an enormous bearing on the ways corporate data is managed. Data compliance procedures will only become more stringent in the future and no organisation can afford to ignore data quality.
How to introduce MDM
The main goal of MDM is to make a common pool of structured data available to unrelated new or legacy applications. Before implementing an MDM strategy, which has to include a data quality element, ask the following questions:
• What data is being held?
• Why is it being held?
• Where is it held?
• Who gathers such master data?
• Who "owns" the data (RACI: Responsible, Accountable, Consulted, Informed)?
Data cleansing simply improves the quality of business data by eliminating redundant and anomalous entries and applying standard formats. Implementing MDM includes a data cleansing step to standardise data formats from disparate sources.
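What a standardisation step looks like in practice can be sketched briefly. The field names and formatting rules below are illustrative assumptions (a UK-style postcode, a phone number stripped to digits, a title-cased name), not any particular product's cleansing rules:

```python
import re

def standardise(record):
    """Normalise common fields so records from different sources compare
    cleanly. Field names and rules are illustrative, not exhaustive."""
    out = dict(record)
    if "postcode" in out:
        # UK-style postcode: uppercase, one space before the final three chars
        pc = re.sub(r"\s+", "", out["postcode"].upper())
        out["postcode"] = f"{pc[:-3]} {pc[-3:]}" if len(pc) > 3 else pc
    if "phone" in out:
        # Keep only digits and a leading "+"
        out["phone"] = re.sub(r"[^\d+]", "", out["phone"])
    if "name" in out:
        # Collapse whitespace and apply consistent capitalisation
        out["name"] = " ".join(out["name"].split()).title()
    return out

clean = standardise({
    "name": "  acme   ltd ",
    "postcode": "po63tl",
    "phone": "+44 (0)23 9298 8303",
})
```

Once every source passes through the same normalisation, duplicates that differ only in formatting become trivially detectable.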
Reconciling and merging the records from one database with another is fraught with data quality problems. Why is it that no two data sources can agree on the number of customer records? Accounts say you have X number of customers, while the CRM system has Y number of customer IDs. Increasing automation of business processes introduces further complexity.
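The "X customers versus Y customer IDs" discrepancy can be surfaced mechanically by comparing the two systems on a normalised key. A minimal sketch, assuming each system exposes its records as dictionaries with a shared name field (the system and field names are hypothetical):

```python
def reconcile(accounts, crm, key):
    """Compare record sets from two systems on a normalised key and
    report which entries appear in both, or in only one system."""
    def norm(records):
        return {r[key].strip().lower() for r in records}
    a, c = norm(accounts), norm(crm)
    return {
        "both": a & c,            # agreed customers
        "accounts_only": a - c,   # in Accounts but missing from CRM
        "crm_only": c - a,        # in CRM but missing from Accounts
    }

accounts = [{"name": "Acme Ltd"}, {"name": "Widgets plc"}]
crm = [{"name": "ACME LTD "}, {"name": "Gadgets Ltd"}]
report = reconcile(accounts, crm, "name")
```

Real reconciliation needs fuzzier matching than an exact normalised key, but even this crude comparison makes the disagreement between systems visible and countable.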
Introducing a successful MDM implementation requires a seven-point plan:
• Get executive buy-in; data quality is a business problem, not a Marketing or IT problem
• Assess where the greatest damage is being done to identify where fixing the problems will generate the greatest return
• Look at what can be achieved tactically within each data silo to improve data quality
• Identify the defective business processes which often create a cycle of “correction and corruption”
• Be customer centric – orient the business, people and culture
• Research data quality improvement tools which will help solve the critical problems identified
• Take action and attack the problem systematically
The critical question here is ownership of the data as, without ownership, it will never be measured or managed.
Master data management system integration is a multi-disciplinary project that typically involves business process analysis, data assessment, data cleansing, data consolidation and reconciliation, data migration, and development of a master data service layer. As a result, a "system of record" is produced to store a master copy of all corporate data. A successful implementation also requires a corporate mandate to prevent renegade use of applications that produce incoherent or redundant sets of data.
The process of translating data from one format to another – data migration - poses a challenge for many companies. As organisations increasingly require the use of enterprise-wide relational databases, or attempt to merge data following an acquisition or takeover, legacy systems prove unwieldy to work with.
All too often, data migration highlights data quality problems that require a comprehensive solution to deal with inconsistent, incomplete or inaccurate data, lack of conformity, lack of integrity and duplicate records.
Using a proven data profiling and integration solution to merge information from disparate sources will ensure a successful data migration project with minimal delays and costs.
The watchwords for data quality apply throughout: identify, correct, prevent, enforce.
Master data is the foundation of information integration initiatives and is therefore equally important to both IT and the business. Poor quality master data is a major problem in data integration projects; master data management has to be the main focus for ensuring data quality.
A different approach
Perhaps there is a different approach to MDM where all the data can be left where it is and linked together so that MDM can be provided on the fly. That’s what we believe and that’s what we’re working towards delivering.
DQ Global’s DQ360 incorporates many unique features designed to ensure all-round data quality, to quickly and accurately deliver a single view of the customer, business, household or family. The software works continuously to audit and monitor data from disparate sources, carrying out various data processes and data quality tasks, returning clean, de-duplicated data to its source or to a new output location of your choice.
# # #
We have worked with over 500 businesses worldwide on a variety of data quality projects. Our deduplication software, address verification and email verification (data cleansing software suite of solutions) provide our clients with improved data quality.