Tip

How to improve data quality on a tight budget -- a guide

SearchDataManagement.com editorial team
 

Many organizations may be tempted to forgo data quality management during a recession, but they should first assess the return on investment (ROI) of quality data, according to industry expert David Loshin.

In this guide, you'll learn how to manage data quality efforts during an economic downturn and find out what trends are emerging in the data quality market. You'll also learn about common mistakes, how to avoid the pitfalls of poor data and how data quality tools and strategies can improve poor data quality.

Listen to a podcast and read a Q/A with author and data quality expert Arkady Maydanchik on how a data quality assessment can help identify data quality problems and find solutions. Get tips, advice and best practices information from data management experts and excerpts of data quality books.

Don't miss the other installments in this data quality management guide
Managing data quality programs during a recession
Trends in the data quality market
Avoiding data quality pitfalls and using data quality tools for discovering new opportunities
Q/A: Identifying data quality problems with a data quality assessment
FAQ: Best practices/tips for data quality

Download a PDF version of this guide -- Tactical data quality: How to improve data quality with a tight budget


Managing data quality programs during a recession
By David Loshin, SearchDataManagement.com Contributor

During uncertain economic times, a certain amount of belt-tightening is expected across the board, and the IT department is not immune. Yet before you grab the knife and start slashing the budget, consider that reducing the investment in any program or infrastructure that supports the organization's business needs will not only diminish needed agility during the downturn but also slow the organization's competitiveness when times start to get better.

Data quality management is often seen as a good practice, but most organizations do not have the discipline to integrate its value proposition holistically across the organizational value drivers, whether those drivers are focused on revenue growth or operational cost containment. A recession therefore provides an excellent opportunity to assess two aspects of the relationship between data quality and the business, directly connecting high-quality data to the organization's value drivers, weighted by the perception of current economic trends.

Determining data quality's impact on business processes

The first aspect is identifying the specific business processes that will be positively affected by high-quality data. Data quality affects different business processes in different ways, so a data quality analysis should incorporate a business impact assessment to identify and prioritize risks. The business impacts associated with bad data fall into four general categories, each covering both the negative impacts already suffered and the potential new value that improved data quality could unlock (a simple way to catalog and rank such impacts is sketched after this list):

 

  • Revenue growth, incorporating financial impacts such as decreased sales, higher costs to acquire new customers and lower customer retention.
  • Cost reduction, such as increased operating costs, reduction or delays in cash flow, and additional unnecessary charges.
  • Risk management and confidence management, covering credit assessment, investment risk, competitive risk, capital investment and development, fraud and leakage, and compliance risk, as well as decreased organizational trust, low confidence in forecasting, inconsistent operational and management reporting, delayed or improper decisions, and decreased customer, employee, supplier or market satisfaction.
  • Productivity impacts such as increased workloads, decreased throughput, increased processing time, or decreased end-product quality.
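
For teams that want to make this categorization concrete, a simple catalog of issues tagged by impact category is often enough to drive prioritization. The sketch below is illustrative only; the category names, issue descriptions and cost figures are hypothetical, and a real assessment would use the estimates supplied by the business consumers.

  # A hypothetical catalog of data quality issues tagged by business impact
  # category, ranked for use in a business impact assessment.
  from dataclasses import dataclass

  IMPACT_CATEGORIES = {"revenue", "cost", "risk and confidence", "productivity"}

  @dataclass
  class DataQualityIssue:
      description: str                # e.g., "duplicate customer records"
      category: str                   # one of IMPACT_CATEGORIES
      affected_process: str           # the business process touched by the flaw
      estimated_annual_impact: float  # rough business-side estimate, in dollars

      def __post_init__(self):
          if self.category not in IMPACT_CATEGORIES:
              raise ValueError(f"unknown impact category: {self.category}")

  def prioritize(issues):
      """Rank issues by estimated business impact, highest first."""
      return sorted(issues, key=lambda i: i.estimated_annual_impact, reverse=True)

  issues = [
      DataQualityIssue("missing contact emails", "revenue", "campaign management", 250_000),
      DataQualityIssue("duplicate customer records", "cost", "billing", 120_000),
  ]
  for issue in prioritize(issues):
      print(f"{issue.category:<20} ${issue.estimated_annual_impact:>10,.0f}  {issue.description}")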

Assessing the business impacts associated with data means working with the business consumers to understand their information needs and the corresponding data quality expectations. You can elicit this information by asking probing questions such as these:

 

  1. What importance does data have in achieving the organization's business objectives?
  2. What data is critical to your business processes?
  3. How confident are you in the accuracy of your data?
  4. What changes to the data can improve business process performance?
  5. In which aspects of data improvement should the company be investing, and in what time frame?

Any significant data issues that affect the business are likely to surface during this process, giving you a basis for researching documented business issues, tracing them to specific data flaws and connecting data quality improvement to a potential increase in value. At the same time, the exercise is an opportunity to reinforce conformance with business expectations by validating data quality rules and their corresponding thresholds for acceptability.
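
In practice, such a rule check can be quite simple. The following is a minimal sketch, assuming column-level rules evaluated with pandas; the column names, rules and thresholds shown here are hypothetical and would be negotiated with the business consumers.

  # Illustrative only: three simple column-level rules, each with an
  # acceptability threshold agreed with the business consumers.
  import pandas as pd

  customers = pd.DataFrame({
      "customer_id": [101, 102, 103, 104, None],
      "email":       ["a@x.com", None, "c@x.com", "d@x.com", "e@x.com"],
      "country":     ["US", "US", "UK", "ZZ", "DE"],
  })
  VALID_COUNTRIES = {"US", "UK", "DE", "FR"}

  # Each rule reports the fraction of records that conform.
  rules = {
      "customer_id is populated": (customers["customer_id"].notna().mean(), 1.00),
      "email is populated":       (customers["email"].notna().mean(),       0.95),
      "country is a valid code":  (customers["country"].isin(VALID_COUNTRIES).mean(), 0.98),
  }

  for name, (conformance, threshold) in rules.items():
      status = "meets expectation" if conformance >= threshold else "below threshold"
      print(f"{name:<26} {conformance:6.1%} (threshold {threshold:.0%})  {status}")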

This leads into the second aspect of data quality management: monitoring the efficiency of the data governance and data stewardship activities. As a data quality program matures, issue management transitions from reactive to proactive, and that maturation can be scored through continuous monitoring of data quality. In the optimal environment, data stewards allocate time to address the most critical issues as they are identified early in the processing streams. Less efficient organizations have stewards reacting to issues at their manifestation point, by which time those issues may have already caused significant business repercussions.

The importance of designing a data quality scorecard

Therefore, organizations that inspect, monitor and measure the performance of data quality initiatives on an ongoing basis, across all processing streams, can then populate a data quality scorecard reflecting the effectiveness of the program and the efficiency of its staff. Together, these two aspects reflect the value of the program and the way that it has been implemented. Focusing on both of these aspects provides a number of valuable benefits:

 

  • It can demonstrate the value proposition for maintaining the effort, even in the face of economic stress.
  • It can provide long-term justification for continued funding and growth of data quality management as the recession ends.
  • It can help identify additional areas with an acute data quality improvement need that can help support the organization's survival during a recession.
  • It will demonstrate an example of proactive value management to other organizations.
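
To make the scorecard idea concrete, here is a minimal sketch of rolling individual rule conformance measurements up into one score per processing stream. The stream names, rule weights and measured values are hypothetical; a real scorecard would be populated from the continuous monitoring described above.

  # Hypothetical measurements: (processing stream, rule, weight, conformance).
  from collections import defaultdict

  measurements = [
      ("order intake", "customer_id populated",  0.5, 0.999),
      ("order intake", "valid product code",     0.5, 0.970),
      ("billing feed", "amount is non-negative", 0.7, 0.992),
      ("billing feed", "currency code is valid", 0.3, 0.940),
  ]

  totals = defaultdict(lambda: {"weight": 0.0, "weighted": 0.0})
  for stream, rule, weight, conformance in measurements:
      totals[stream]["weight"] += weight
      totals[stream]["weighted"] += weight * conformance

  # The scorecard: one weighted score per processing stream.
  for stream, agg in totals.items():
      print(f"{stream:<14} data quality score: {agg['weighted'] / agg['weight']:.1%}")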

On the other hand, it may turn out that this data quality assessment will show that the organization does not get a reasonable return on its data quality management investment. In this case, it provides an opportunity to reduce operating costs associated with the areas of missed expectations. While this is unlikely, it does demonstrate a level of accountability that should pervade all management activities. It is more likely, however, that this process will only strengthen the view that a data quality management program is fundamental to the ultimate success of the business.


About the author

 

David Loshin is the president of Knowledge Integrity, Inc., a consulting company focusing on customized information management solutions, including information quality consulting and training, business intelligence, metadata and data standards management. David is an industry thought leader and one of Knowledge Integrity's most recognized experts in information management. He writes for many industry publications, creates and teaches courses for The Data Warehousing Institute and other venues, and regularly presents at the annual DAMA/Meta Data conference. David is the author of "Enterprise Knowledge Management - The Data Quality Approach," which describes a revolutionary strategy for defining, managing and implementing business rules affecting data quality management. His other book, "Business Intelligence: The Savvy Manager's Guide," has been hailed as a leading BI resource. He can be reached via his website, knowledge-integrity.com.

This was first published in July 2009
