When we talk about Data Quality, it can be hard for everyone to agree on a single technical definition. But at its most basic level, data quality is about information that is correct.
Of course, everyone wants data that is correct, regardless of department, market segment or role. And as long as most of the data is correct, the common plan of action is no plan at all: ignore the bad stuff and focus on the good stuff.
But Dr. Anthony Scriffignano, Senior Vice President and Leader of Worldwide Data and Insight at Dun & Bradstreet, suggests this may be one of the most common mistakes organizations make when it comes to managing their data.
“The biggest opportunity I see lost is what people do with the data they don’t like,” he said in a recent conversation. “They just ignore it. And ignoring it can be hugely detrimental to the company.”
That might sound counter-intuitive. After all, when your sales force is under the gun to work the next lead or reach the next contact, who wants to stop and think about why a phone number didn’t work? Or why a record didn’t match? Or why information seemed old or garbled?
But according to Scriffignano, those errors can be a valuable asset to an organization willing to put them under a microscope. That bad data can provide deeper insight into your broader data set if you begin to ask the most obvious question: Why is this wrong?
Is it the way your own organization is handling the data set? Is it a conflict between different platforms or data providers? Is the data simply dated because your organization is not updating as often as the data provider updates its own information? Is it all of the above? And is it getting better or worse over time?
This ought to be a core concept in any company’s Master Data Management plan. Taking the time to examine the bad data may not have the same immediate payoff as chasing the next sales lead using the good data. But using that bad data to drive improvement in how you manage your overall data set can offer big returns in efficiency and accuracy over the long term.
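One small, concrete way to start putting bad data under the microscope is to tag each rejected record with the reason it failed and tally those reasons over time, so patterns (stale sources, matching conflicts, garbled fields) become visible. A minimal sketch in Python; the records and reason labels here are hypothetical, for illustration only:

```python
from collections import Counter

# Hypothetical rejected records, each tagged with the reason it
# failed validation (labels are illustrative, not a standard taxonomy).
bad_records = [
    {"id": 1, "reason": "stale_phone"},
    {"id": 2, "reason": "no_match"},
    {"id": 3, "reason": "stale_phone"},
    {"id": 4, "reason": "garbled_field"},
    {"id": 5, "reason": "stale_phone"},
]

# Tally how often each failure mode occurs -- the first step toward
# asking "why is this wrong?" instead of discarding the record.
failure_counts = Counter(record["reason"] for record in bad_records)

for reason, count in failure_counts.most_common():
    print(f"{reason}: {count}")
```

Tracking these counts across refresh cycles is what turns the question “is it getting better or worse over time?” into something you can actually answer.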
Image credit: Intel Free Press
Posted in Data Quality by Lisa Petrucci. Lisa Petrucci is Vice President of Dun & Bradstreet Global Alliances and Partnerships.