Poor data quality: what is dirty data costing UK organisations?

Tracey Caldwell, Contributor

No one wants to air their dirty data issues in public, but sweeping them under the carpet seems to be a very British trait.

That could be problematic at a time when data quality is becoming more important to business success. As the use of business intelligence systems and data analytics tools becomes more pervasive, companies increasingly need high-quality data to help executives and other users make important business decisions, often in real time.

But a survey conducted in Europe and North America last year by UK-based analyst firm The Information Difference found that one-third of the respondents rated the data quality within their organisations as poor at best.

The survey, which was co-sponsored by vendors Pitney Bowes Business Insight (PBBI) and Silver Creek Systems Inc., also found that only 37% of the respondents had data quality initiatives in place, while 17% said that their organisations had no plans to launch a data quality programme. And 63% said that their companies hadn't tried to calculate the business costs of errors arising from poor data quality.

European respondents appeared especially blasé about the data quality issue; only 27% said they found the process of standardising data across an organisation to be difficult, compared with 47% of the respondents in North America. "It is tempting ... to conclude that there may well be a substantial element of wishful thinking here," the report's authors commented in response.

Jay Bourland, PBBI's group technology officer, wasn't surprised by the survey results. "It is not unusual in my experience that people dislike dirty data, but they don't really understand how to put a hard value on it in many industries," he said.

UK businesses lacking data quality strategies, report finds
Independent research commissioned by other data quality vendors fleshes out the scale of the data quality challenge in the UK. For example, Experian QAS commissions an annual global research report on data quality; last year, it found that only 45% of the surveyed UK organisations had documented data quality strategies.

Meanwhile, London-based market research firm BDRC, in a March 2009 report undertaken for SAS Institute Inc.'s DataFlux data quality subsidiary, said it found that corporate data was viewed as a strategic asset by 90% of survey respondents in the UK's financial services sector. But it added that the industry had an inconsistent approach to assigning responsibility for data quality management.

According to BDRC, about 60% of the respondents reported that responsibility for maintaining data quality was scattered across multiple departments or held by individual business units, making it hard to be fully confident about the accuracy of data reporting across the enterprise.

Poor data quality can have serious financial costs for organisations. For example, one respondent to the Information Difference survey said that problems with data quality and consistency had led to the orphaning of about £20 million worth of product stock. The goods (valued at $30.8 million at current exchange rates) were sitting in a warehouse and couldn't be sold because they had been "lost" in the company's systems.

At consulting firm Deloitte UK, each business unit is responsible for managing its own data. But with Deloitte's business reputation based on the quality of the information it provides to clients, CIO Mary Hensher is keenly aware of the need for IT to take the lead on enabling good data quality.

Hensher said her department has implemented a number of data quality tools for use by the business units, although she declined to comment specifically about the technologies that have been installed. In addition, she said, "there is a lot of focus on making it as easy as possible to enter data accurately."

That itself isn't so easy, though. Deloitte uses drop-down checklists and has tried to minimise both the number of keystrokes required of end users and the number of screens they have to navigate – all of which can be challenging to implement successfully, Hensher noted. "Mainly it involves making it difficult for people to make mistakes," she said. "But it is a work in progress."
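The principle behind drop-down lists is validation at the point of capture: if users can only submit values the system already recognises, misspelled or invented entries never reach the database. Deloitte hasn't described its actual tooling, so the short Python sketch below is purely illustrative; the field names and allowed values are assumptions made for the example.

```python
# Hypothetical sketch of constrained data entry in the spirit of drop-down
# lists: only values the system already recognises can be saved, so mistyped
# entries are caught at the point of capture. The field names and allowed
# values are illustrative assumptions, not Deloitte's configuration.
ALLOWED_VALUES = {
    "country":    {"United Kingdom", "Ireland", "France"},
    "department": {"Audit", "Tax", "Consulting"},
}

def validate_entry(entry: dict) -> list:
    """Return a list of problems; an empty list means the entry can be saved."""
    errors = []
    for field, allowed in ALLOWED_VALUES.items():
        value = entry.get(field, "").strip()
        if value not in allowed:
            errors.append(f"{field}: '{value}' is not one of {sorted(allowed)}")
    return errors

print(validate_entry({"country": "Untied Kingdom", "department": "Tax"}))
# -> ["country: 'Untied Kingdom' is not one of ['France', 'Ireland', 'United Kingdom']"]
```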

Poor data quality may dull companies' competitive edges
Data Locator Group Ltd., a UK-based direct marketing services firm, competes with rivals on the quality of the data it collects from consumers through online and printed surveys. But it was only last year that DLG brought in software specifically designed to address data quality issues.

Paul O'Callaghan, a business consultant at DLG, said that after he joined the company in 2009, he noticed a "serious lack in data quality" while manually checking the information in its systems.

The data quality problems primarily involved duplicate entries for individual consumers. For example, when DLG tried to match records against its database to weed out duplicates, the process missed people who had registered previously on its website or via a paper-based survey and then registered again.
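The underlying technique is record matching: normalise the fields that identify a person and compare them with a fuzzy similarity measure, so that small differences in spelling, case or spacing don't hide a duplicate. Neither DLG nor Datactics has published its actual matching rules; the Python sketch below is a minimal, hypothetical illustration of the idea using only the standard library, with made-up field names and a 0.9 similarity threshold chosen for the example.

```python
# Minimal, hypothetical duplicate-detection sketch; it does not represent
# DLG's or Datactics' actual matching logic.
from difflib import SequenceMatcher

def normalise(record: dict) -> str:
    """Lower-case and collapse whitespace so formatting quirks don't block a match."""
    words = (str(record.get(field, "")).lower().split()
             for field in ("name", "postcode", "email"))
    return " ".join(word for field_words in words for word in field_words)

def is_duplicate(a: dict, b: dict, threshold: float = 0.9) -> bool:
    """Treat two registrations as the same person if their normalised forms are nearly identical."""
    return SequenceMatcher(None, normalise(a), normalise(b)).ratio() >= threshold

registrations = [
    {"name": "John Smith", "postcode": "SW1A 1AA", "email": "j.smith@example.com"},
    {"name": "Jon  Smith", "postcode": "sw1a1aa",  "email": "J.Smith@example.com"},
    {"name": "Jane Doe",   "postcode": "M1 1AE",   "email": "jane@example.com"},
]

# Keep the first occurrence of each person and drop near-identical re-registrations.
unique = []
for record in registrations:
    if not any(is_duplicate(record, kept) for kept in unique):
        unique.append(record)

print(f"{len(registrations) - len(unique)} duplicate(s) removed")  # -> 1 duplicate(s) removed
```

Comparing every record against every other one, as this sketch does, would be far too slow at the scale DLG works at; production matching tools typically group records into candidate blocks (by postcode, for instance) before scoring pairs, which is what makes overnight runs on files of tens of millions of records feasible.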

O'Callaghan went to DLG's board and recommended that the company buy data quality tools from Datactics Ltd. that he had used for about five years at a previous employer. Thanks to the tools, DLG now finds virtually all of the duplicate entries in its database: "We are down to less than 1% duplicates," O'Callaghan said.

Also important to O'Callaghan is the speed at which the software from Belfast-based Datactics can manipulate large amounts of data. DLG has data sets with 30 million records, but "if we get client data in, we can match it to our files and get results overnight," he said. "It gives us a head start against our competitors."

In a survey for an upcoming report on the importance of data quality, London-based consulting firm Bloor Research asked respondents whether their companies appreciated the value of accurate business data. Philip Howard, Bloor's data management research director, said European respondents were less likely than those in other regions to feel confident that their organisations valued data accuracy.

Howard thinks the problem of poor data quality goes beyond the immediate business costs to companies. He's worried that unless data quality becomes more of a priority, Europe will lag behind North America and the Asia/Pacific region on overall economic competitiveness. Data quality and competitiveness problems appear to be "symptoms of the same disease," he said.

Tracey Caldwell is a freelance writer based in the UK.