6 Data Quality Issues Stopping You From Achieving Perfect Data (and how to fix them)


According to Deloitte’s Global CPO Surveys in recent years, including 2021, ‘poor data quality’ ranks as a top challenge, with ‘availability of data’ in eighth place. Harvard Business Review shared a 2016 estimate that poor data quality costs the USA alone $3.1 trillion a year. Another Harvard Business Review study on data credibility revealed that 50% of time is wasted locating data, identifying its errors and confirming its accuracy. A recent HICX survey revealed that data remains a low priority for procurement leaders, with 61% of them citing a perceived low ROI.

It is clear that data quality issues are prevalent in almost every organization and prevent them from reaching their objectives. This article covers:

  • The 6 most common data quality issues
  • How to know when to implement data quality measures
  • Who is involved in the data quality management process
  • Stages of the data quality management process
  • Factors to consider due to evolving data quality needs
  • How Supplier Experience Management can help

The 6 most common data quality issues

Below are the most common data quality issues that organizations experience, which prevent them from getting the most value out of their information; a short sketch after the list shows how some of them could be detected automatically:

  1. Incompleteness: Crucial pieces of information are missing
  2. Inaccuracy: Data fields are filled in, but values may sit in the wrong field, be spelled incorrectly, or simply be wrong
  3. Inconsistency: Data is not presented in the same format or value (e.g., using different currencies instead of the same one throughout)
  4. Invalidity: Data fields are complete, but the values cannot be correct in context (e.g., ‘units available’ showing a negative value)
  5. Redundancy: The same data is entered multiple times in different ways (e.g., a company’s or individual’s name entered in several variations)
  6. Non-standard data: Data is displayed in non-standard formats or formats the system cannot process (e.g., the word ‘percentage’ written out rather than the % symbol)
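
As a rough illustration, here is a minimal Python sketch of how some of these issues could be flagged in a set of supplier records. The record structure and field names (“name”, “units_available”, “discount”) are hypothetical, not any particular system’s schema:

```python
# Minimal sketch: flag incompleteness, invalidity, non-standard data and
# redundancy in supplier records. Field names are hypothetical examples.

def find_issues(records):
    issues = []
    seen = {}  # normalized name -> first record index (for duplicates)
    for i, rec in enumerate(records):
        # Incompleteness: a crucial field is missing or empty
        if not rec.get("name"):
            issues.append((i, "incomplete: missing supplier name"))
        # Invalidity: the value cannot be correct in this context
        if rec.get("units_available", 0) < 0:
            issues.append((i, "invalid: negative units_available"))
        # Non-standard data: the word written out instead of the % symbol
        discount = rec.get("discount", "")
        if isinstance(discount, str) and "percent" in discount.lower():
            issues.append((i, "non-standard: spelled-out percentage"))
        # Redundancy: the same name entered in different ways
        key = (rec.get("name") or "").lower().replace(".", "").strip()
        if key:
            if key in seen:
                issues.append((i, f"redundant: possible duplicate of record {seen[key]}"))
            else:
                seen[key] = i
    return issues

records = [
    {"name": "Acme Ltd.", "units_available": 10, "discount": "5%"},
    {"name": "acme ltd", "units_available": -3, "discount": "5 percent"},
    {"name": "", "units_available": 7, "discount": "2%"},
]
for idx, msg in find_issues(records):
    print(f"record {idx}: {msg}")
```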

While these data quality issues are far from ideal, when do they become a big enough problem to justify changing how the organization manages its data? If a business can function ‘adequately’ despite them, when, if ever, should data quality processes be implemented?

When is it time to implement data quality processes?

In general, data quality controls and measures should be implemented once the business needs them to solve a specific issue. That said, if you are asking yourself this question, it is probably time to make those changes.

Common reasons for improving the quality of data include:

  1. Accurate, usable data is a major strategic asset that provides competitive advantage
  2. You want to centralize data into one source by extracting it from disparate systems, which is extremely difficult if the information is unstandardized or inconsistent (a sketch after this list illustrates such a consolidation)
  3. You want to manage master data more effectively
  4. You are implementing a new system or carrying out a migration, such as from a legacy system or ERP to a cloud-based system
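
To illustrate point 2, below is a minimal sketch of mapping records from two disparate sources into one standardized format before centralizing them. The source layouts, field names and exchange rate are assumptions for illustration only:

```python
# Sketch: consolidate two hypothetical sources into one canonical format.

RATE_USD_PER_EUR = 1.10  # assumed fixed rate, for illustration only

def from_legacy_erp(row):
    # The legacy ERP stores names in upper case and spend in euros
    return {
        "name": row["VENDOR_NM"].strip().title(),
        "annual_spend_usd": round(row["SPEND_EUR"] * RATE_USD_PER_EUR, 2),
    }

def from_crm(row):
    # The CRM already stores spend in US dollars
    return {
        "name": row["company"].strip().title(),
        "annual_spend_usd": round(row["spend_usd"], 2),
    }

erp_rows = [{"VENDOR_NM": "ACME LTD", "SPEND_EUR": 90_000}]
crm_rows = [{"company": "acme ltd", "spend_usd": 120_000}]

# Every record now shares one name format and one currency, so the two
# sources can be matched and deduplicated in a single central store
centralized = [from_legacy_erp(r) for r in erp_rows] + [from_crm(r) for r in crm_rows]
print(centralized)
```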

Once the business case and objectives for implementing quality measures have been identified, the overall process needs to be defined. Responsibilities are outlined below.

Who is involved in the data quality management process?

Two roles that are essential for the success of the data quality process are:

  • Data stewards, who profile the data and create rules for standardizing and cleansing it
  • Developers, who collaborate with data stewards and play a critical role in designing data quality rules and in the development process

Throughout the implementation process, both of these roles work together, after which the data stewards are responsible for monitoring the data quality.

Stages of the data quality management process

Data quality analysts, data stewards and developers are required to complete certain stages of the overall project, including:

  • Data profiling: The data must be explored to gain an in-depth understanding of it and to identify issues such as incompleteness and inaccuracy, before compiling a list of problems to be resolved
  • Defining metrics: To gauge the severity of data issues, quality benchmarks must be established, recording metrics such as completeness (% complete), consistency (% consistent) and validity (% valid); a sketch after this list shows one way to compute them
  • Fixing the data: Once the problems have been identified and quality benchmarks established, cleansing the data and fixing the issues can begin
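
As an illustration of the metrics stage, here is a minimal sketch that scores a record set for completeness, validity and consistency. The fields and rules are hypothetical examples of such benchmarks, not a prescribed standard:

```python
# Sketch: compute % complete, % valid and % consistent over a record set.

def pct(part, whole):
    return round(100 * part / whole, 1) if whole else 100.0

def quality_metrics(records):
    total = len(records)
    # Completeness: both crucial fields are present and non-empty
    complete = sum(1 for r in records if r.get("name") and r.get("country"))
    # Validity: stock levels cannot be negative
    valid = sum(1 for r in records if r.get("units_available", 0) >= 0)
    # Consistency (example rule): every country uses a 2-letter code
    consistent = sum(1 for r in records
                     if isinstance(r.get("country"), str) and len(r["country"]) == 2)
    return {"complete_pct": pct(complete, total),
            "valid_pct": pct(valid, total),
            "consistent_pct": pct(consistent, total)}

records = [
    {"name": "Acme Ltd", "country": "GB", "units_available": 10},
    {"name": "Foo GmbH", "country": "Germany", "units_available": -1},
    {"name": "", "country": "US", "units_available": 4},
]
print(quality_metrics(records))
# {'complete_pct': 66.7, 'valid_pct': 66.7, 'consistent_pct': 66.7}
```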

Note that changes should not be made directly to the database, as this poses a risk if the changes themselves are incorrect. Instead, changes should be documented, reviewed, and then approved or rejected by the data stewards, as in the sketch below.
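
One way such a review loop could be structured is sketched below; every name and structure here is assumed for illustration, not a reference implementation:

```python
# Sketch: document each fix as a proposal and apply it only once a
# data steward has approved it, rather than writing to the database directly.

from dataclasses import dataclass

@dataclass
class ChangeProposal:
    record_id: int
    field_name: str
    old_value: object
    new_value: object
    status: str = "pending"  # pending -> approved / rejected

def apply_approved(database, proposals):
    # Only approved changes ever reach the database
    for p in proposals:
        if p.status == "approved":
            database[p.record_id][p.field_name] = p.new_value

database = {1: {"name": "acme ltd"}}
proposals = [ChangeProposal(1, "name", "acme ltd", "Acme Ltd")]

# A data steward reviews the documented change before anything is applied
proposals[0].status = "approved"
apply_approved(database, proposals)
print(database)  # {1: {'name': 'Acme Ltd'}}
```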

Factors to consider due to evolving data quality needs

One of the inescapable aspects of working with data, especially within data quality management, stewardship and governance, is the fact that it is never a ‘one and done’ type of project. Data is constantly changing, and with it, its quality requirements.

As the data changes, so too will the measures and rules covering it. Some issues will recur, and over time data stewards will gain a strong understanding of how to mitigate them and readjust measures accordingly.

Additionally, data itself does not stand still. Information such as company names, addresses and email addresses changes frequently, and new data sources will emerge over time, reinforcing the need for governance and stewardship.

To stay on top of the changes and data quality, organizations must be proactive rather than reactive to avoid complacency. Ask yourself:

  • Is the quality of data improving over time, and as a result is the data management process working as intended? (A simple trend check like the sketch after these questions can help.)
  • If the quality of data is not improving, which rules and measures should be changed and updated? Are they helping the organization meet its current needs and objectives?
  • Once new data sources are added, how should existing measures be applied accordingly?
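
As one simple way to answer the first question, the sketch below compares two hypothetical metric snapshots and flags any dimension whose score has declined; the values are illustrative only:

```python
# Sketch: compare quality-metric snapshots over time and flag regressions.

last_quarter = {"complete_pct": 82.0, "valid_pct": 91.0, "consistent_pct": 75.0}
this_quarter = {"complete_pct": 85.5, "valid_pct": 90.0, "consistent_pct": 78.0}

for metric, new_score in this_quarter.items():
    old_score = last_quarter[metric]
    trend = "improving" if new_score >= old_score else "DECLINING - review rules"
    print(f"{metric}: {old_score} -> {new_score} ({trend})")
```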

How Supplier Experience Management (SXM) plays a role

In a recent HICX survey on Supplier Experience, 100 senior procurement professionals were asked questions on the actual activities and aspirational objectives for their organizations.

When asked what score they would give the quality of their data, the average was 6.12/10. However, if suppliers were faced with fewer supplier-facing portals and given a better overall experience, they would be encouraged to engage more frequently, in turn improving data quality. It is often forgotten that suppliers are the real custodians of the data, so any drive towards improving its overall quality must also incorporate them. Supplier Experience Management is one mechanism for achieving this.

Supplier experience refers to all the interactions that take place between an organization and its suppliers. Supplier Experience Management (SXM) is the practice of creating the conditions in which a buying organization and all of its suppliers can achieve mutual success together. Read the Supplier Experience Survey report or find out more about our Supplier Experience Management software.

Article updated July 2021
