Many enterprises lack a data quality initiative because they perceive the cost of poor data quality as just another cost of doing business, said one analyst.
Enterprises receive and process a plethora of data, but are more concerned with the processes that handle that data than with the data itself, said Dean Wiltshire, senior product analyst for data quality with location intelligence technology provider Pitney Bowes Business Insight.
“They are more focused on processing the information and hoping it’s correct than they are on focusing on whether the information is correct before (they) process it,” said Wiltshire.
Pitney Bowes Business Insight and automated data mastering technology vendor Silver Creek Systems Inc. released the results of a co-sponsored report this week called The State of Data Quality Today. The report is based on a survey conducted by U.K.-based analyst firm The Information Difference.
The results revealed that while 70 per cent of respondents believe the quality of their product data is good or very good, only 37 per cent have implemented some manner of data quality initiative. Wiltshire was surprised by that statistic; such a low number of data quality initiatives, he said, points to a sense of “false security.”
Even then, the numbers are skewed, said Wiltshire, because the 37 per cent could include organizations whose initiatives are only in their infancy, having just put a data governance framework in place and started to understand data quality.
In fact, the survey also found that 63 per cent of respondents have not even attempted to calculate the cost of poor data quality to their business. Yet data is a core asset of the business, with modern businesses running on data more than on anything else, said Martin Boyd, vice-president of marketing with Silver Creek Systems.
“It’s an intangible,” said Boyd, referring to data quality. “It is difficult to put your finger on exactly what the overall quality of the data is in the business.”
That’s particularly problematic considering that what can’t be measured doesn’t get improved, said Boyd.
Measuring data quality is tricky, particularly when it comes to product data, he said, because product data tends to be more variable and flows through many systems and processes in a business, making a holistic view hard to come by.
For instance, validating information about a resistor is different from validating information about a handbag, explained Boyd. The vocabulary, the inferences, and the validations all change with different types of products.
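To make Boyd’s point concrete, the hypothetical sketch below checks product records against rules that vary by category; the categories, field names, and rules are illustrative assumptions, not taken from the report.

    # Hypothetical sketch of category-specific product data validation.
    # Categories, fields, and rules are illustrative only; they are not
    # drawn from the Pitney Bowes / Silver Creek survey.

    def is_positive_number(value):
        """True if the value parses as a number greater than zero."""
        try:
            return float(value) > 0
        except (TypeError, ValueError):
            return False

    # Each product category carries its own vocabulary and validation rules.
    RULES = {
        "resistor": {
            "resistance_ohms": is_positive_number,
            "tolerance_pct": lambda v: is_positive_number(v) and float(v) <= 20,
        },
        "handbag": {
            "material": lambda v: str(v).lower() in {"leather", "canvas", "nylon"},
            "color": lambda v: isinstance(v, str) and v.strip() != "",
        },
    }

    def validate(record):
        """Return a list of field-level problems for one product record."""
        rules = RULES.get(record.get("category"))
        if rules is None:
            return ["unknown product category"]
        return [
            f"invalid or missing field: {field}"
            for field, check in rules.items()
            if not check(record.get(field))
        ]

    if __name__ == "__main__":
        # A clean resistor record passes; a handbag with an unknown material does not.
        print(validate({"category": "resistor", "resistance_ohms": "4700", "tolerance_pct": "5"}))
        print(validate({"category": "handbag", "material": "plastic", "color": "red"}))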
“It’s easy to say ‘data quality’ at a high level, but not really get into the different nuances of each type,” said Boyd.
Wiltshire said that while data quality efforts often exist at the departmental level, where unit managers work to ensure their data is of a high standard, the same cannot be said for the enterprise as a whole. “This is where enterprises tend to lack their focus,” he said.
But beyond measuring data quality, the hurdles to good data quality lie in a lack of leadership support and in defining the business case, said Wiltshire.
A data quality initiative should begin with policy creation, supported by tools to enforce those policies and to measure and report success. “It’s a lifecycle, it’s a loop,” said Wiltshire.