The main purpose of a dashboard is to provide a comprehensive snapshot of performance, which means it should surface a large amount of detail at a glance rather than hiding it behind too many drill-downs. It uses historical data to identify trends and patterns that can help design future process improvements.
A Data Quality Dashboard is an information management tool that visually tracks, analyzes, and displays key performance indicator (KPI) metrics, highlighting key data points to monitor the health of a business, department, or specific process. It can be customized to the specific needs of a business, and it shows how much trust you can put in your data.
Improving data quality is a long-term process, and the best outcome of such initiatives is bulletproof processes that will serve you in the future, not just data that was cleaned once. If you want to be effective, get your processes in shape, monitor variation, and control it, instead of performing periodic data cleansing exercises. Correcting data is time-consuming, so think ahead when designing and implementing new processes. Investing time in quality assurance can save you a lot of work later.
If you still don’t know why you need reliable data quality, check this article:
These are the main data quality dimensions:
Completeness — doesn’t bring much value to the table on its own. It can be misleading: you can have all the attributes completed, but what matters is the content of the field you are validating, and that can still be garbage.
Compliance/Validity — This should be the focus when starting a data quality program. Does your data satisfy business usage requirements? Start by defining rules and then profile the data against them. We can split this into:
1. Format checks — these depend on company standards, locations, and markets. Examples (see the sketch after this list) are:
- Formats of postal codes (you need to define these per country)
- Minimum and maximum number of characters — so there are no one-character addresses or company names
- In the case of a global company, you can check whether local characters are used in global names
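A minimal sketch of such format checks in Python with pandas, assuming a hypothetical customer table with `country`, `postal_code`, and `name` columns; the per-country patterns are illustrative placeholders, not official standards:

```python
import re
import pandas as pd

# Illustrative per-country postal code patterns (assumptions, not official rules);
# replace them with the standards defined for your markets.
POSTAL_CODE_PATTERNS = {
    "PL": r"\d{2}-\d{3}",
    "DE": r"\d{5}",
    "GB": r"[A-Z]{1,2}\d[A-Z\d]? ?\d[A-Z]{2}",
}

def postal_code_valid(row) -> bool:
    """True if the postal code matches the pattern defined for the row's country."""
    pattern = POSTAL_CODE_PATTERNS.get(row["country"])
    if pattern is None:
        return False  # no rule defined yet -> flag for review
    return re.fullmatch(pattern, str(row["postal_code"])) is not None

customers = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "country": ["PL", "DE", "GB"],
    "postal_code": ["00-950", "1234", "SW1A 1AA"],
    "name": ["Acme Sp. z o.o.", "X", "Globex Ltd"],
})

customers["postal_code_ok"] = customers.apply(postal_code_valid, axis=1)
# Minimum-length rule: no one-character company names or addresses.
customers["name_ok"] = customers["name"].str.strip().str.len() >= 2
print(customers[["customer_id", "postal_code_ok", "name_ok"]])
```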
2. External Reference data compliance
- Checking against external standards can be very beneficial: it makes the data usable in reporting, and many of these classifications are regulatory ones that are mandatory for running the business (for example, Customs Tariff numbers)
- Country codes validated against the ISO standard (a sketch of such a check follows this list)
- Customers classified with SIC codes that are valid according to a global list
- Various product classifications are mandatory in many markets: WEEE classification, ETIM, UNSPSC, Customs Tariff codes. You can obtain the list of valid codes and validate your products against it so they can fulfill business requirements.
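A minimal sketch of a reference-data check, assuming a hypothetical `country_code` column and using only a small excerpt of the ISO 3166-1 alpha-2 code list (in practice, load the full official list):

```python
import pandas as pd

# Small excerpt of ISO 3166-1 alpha-2 codes, for illustration only.
ISO_COUNTRY_CODES = {"DE", "FR", "GB", "PL", "US"}

records = pd.DataFrame({
    "record_id": [10, 11, 12],
    "country_code": ["PL", "UK", "DE"],  # "UK" is a common mistake; ISO uses "GB"
})

records["country_code_valid"] = records["country_code"].isin(ISO_COUNTRY_CODES)
print(records.loc[~records["country_code_valid"]])
```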
3. Master data compliance — for better strategic reporting, companies implement internal segmentation or classification of customers, products, and orders. You need to create a master data reference table and then validate the classifications on your records against it. Various internal business rules can be implemented here, but this should be suited to your needs and started in the design phase to address actual issues of the organization (a sketch of such a rule check follows the examples below), like:
- Each product needs to have an assigned active profit centre
- Each Client needs to have an active owner.
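A minimal sketch of validating records against an internal master data reference table, using hypothetical `products` and `profit_centres` tables to express the "active profit centre" rule above:

```python
import pandas as pd

# Hypothetical master data reference table of profit centres and their status.
profit_centres = pd.DataFrame({
    "profit_centre": ["PC100", "PC200", "PC300"],
    "active": [True, True, False],
})

products = pd.DataFrame({
    "product_id": ["P1", "P2", "P3"],
    "profit_centre": ["PC100", "PC999", "PC300"],  # PC999 is not in the reference table
})

# Join products to the reference table and flag violations of the rule
# "each product needs to have an assigned active profit centre".
checked = products.merge(profit_centres, on="profit_centre", how="left")
checked["rule_ok"] = checked["active"].fillna(False).astype(bool)
print(checked[~checked["rule_ok"]])
```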
Consistency — is your data the same in different sources? This should be relatively easy if you have master data in place and each local record is connected to your master data source of truth with a global identifier. Just compare key attributes across sources (a sketch follows the example below). In an ideal world, data should be syndicated from the source of truth to consuming systems:
- Customer addresses in CRM against SAP against your master data
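A minimal sketch of a consistency check between two sources, assuming hypothetical CRM and SAP extracts keyed by a shared `global_id`:

```python
import pandas as pd

crm = pd.DataFrame({
    "global_id": [1, 2],
    "address": ["Main St 1, Berlin", "Elm St 5, Paris"],
})
sap = pd.DataFrame({
    "global_id": [1, 2],
    "address": ["Main St 1, Berlin", "Elm Street 5, Paris"],
})

# Compare the key attribute between sources; mismatches indicate a consistency issue.
merged = crm.merge(sap, on="global_id", suffixes=("_crm", "_sap"))
merged["address_consistent"] = (
    merged["address_crm"].str.strip().str.lower()
    == merged["address_sap"].str.strip().str.lower()
)
print(merged[~merged["address_consistent"]])
```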
Timeliness — is your data up to date?
- Schedule and monitor the data review process. Each record with a defined last-edit date should be reviewed by its owner
- Is your data provided on time? — Creating a new customer is a workflow, and you can have different requirements at different steps of the sales process; always consider the business process when monitoring such aspects.
- For example, newly locally created customer records should be cleared and assigned a Global Customer Id within 2 working days (see the sketch below)
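A minimal sketch of monitoring that rule, assuming hypothetical `created_locally` and `global_id_assigned` timestamps on the local customer record:

```python
import pandas as pd

today = pd.Timestamp("2024-01-15")

customers = pd.DataFrame({
    "customer_id": [1, 2],
    "created_locally": pd.to_datetime(["2024-01-10", "2024-01-08"]),
    "global_id_assigned": pd.to_datetime(["2024-01-11", pd.NaT]),
})

# Rule from the example above: a Global Customer Id should be assigned
# within 2 working days of local creation.
deadline = customers["created_locally"] + pd.offsets.BusinessDay(2)
customers["overdue"] = customers["global_id_assigned"].isna() & (today > deadline)
print(customers[["customer_id", "overdue"]])
```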
Correctness/Accuracy — This is the most difficult one, as there is no easy way to check it from the IS point of view. How can you check whether the product weight is 10 kg and not 15 kg? Ideally, it’s established through primary research. In practice:
- Manual auditing of a sample
- Use third-party reference data from sources which are deemed trustworthy
- Data Profiling and finding coherence patterns
- Manual validation by owner — akin to crowdsourcing
It is not possible to set up a simple validation rule here. Instead, focus on creating logic that will discover suspicious patterns in your data that vary from what you have seen in the past, as in the sketch below.
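A minimal sketch of that idea using a standard interquartile-range outlier rule on hypothetical product weights; it does not prove a value is wrong, it only flags suspects for manual review:

```python
import pandas as pd

weights = pd.DataFrame({
    "product_id": ["P1", "P2", "P3", "P4", "P5", "P6"],
    "weight_kg": [9.8, 10.1, 10.0, 9.9, 10.2, 55.0],
})

# Flag values far outside the pattern seen so far: anything beyond
# [Q1 - 1.5*IQR, Q3 + 1.5*IQR] is treated as suspicious.
q1 = weights["weight_kg"].quantile(0.25)
q3 = weights["weight_kg"].quantile(0.75)
iqr = q3 - q1
lower, upper = q1 - 1.5 * iqr, q3 + 1.5 * iqr
weights["suspicious"] = ~weights["weight_kg"].between(lower, upper)
print(weights[weights["suspicious"]])
```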
What you should measure:
Start by talking to data users; define your top elements and attributes, and then the purpose of each.
Look into user problems; try a design thinking approach to discover opportunities.
Define the most crucial ones and start measuring them.
You can group your KPIs into business domains and accountable parties, then calculate quality indexes.
Customer Data Quality can be built as a weighted average of attribute quality (Address, VAT, Owner), but it can also include process measures, such as whether these records are updated on time or are consistent across sources. Calibrate the weights over measuring iterations based on business feedback; a sketch of such an index follows.
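A minimal sketch of such a weighted index, with made-up pass rates and weights purely for illustration:

```python
# Hypothetical per-attribute pass rates (share of records passing each check).
attribute_scores = {"address": 0.92, "vat": 0.98, "owner": 0.85, "timeliness": 0.75}

# Weights to calibrate over measuring iterations based on business feedback.
weights = {"address": 0.4, "vat": 0.2, "owner": 0.2, "timeliness": 0.2}

customer_quality_index = sum(
    attribute_scores[name] * weights[name] for name in attribute_scores
)
print(f"Customer Data Quality index: {customer_quality_index:.1%}")
```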
Make them SMART
Specific — specifically designed to fulfill certain criteria
Measurable — status is constantly monitored and available on the dashboard
Accountable — Clearly defined responsible party
Relevant — They have an impact on business
Time-based — there is a time scope within which they should be reached
My KPI checklist:
- The object/attribute is defined in the metadata tool, rule-book, or instruction
- There is a clear business “WHY” — so people know why they are doing it; this is important and builds data quality awareness across the organization
- Clearly accountable function/person/organization
- Communication plan
- Reviewed with reference team and stakeholders
To recap: to improve and manage your data quality, you need to know where you are now and where you want to be within a defined period, so:
- Describe your data with clear definitions and rules
- Implement a continuous monitoring process and keep systems and processes up to date.
- Engage with everyone responsible for the data, tell them why, and build data quality awareness
- Increase accountability for the data by assigning responsible parties