Striking a Balance: A Thought on Data Quality Management

According to the Data Warehousing Institute, American businesses lose an estimated six hundred billion dollars annually as a result of poor data quality. While more and more companies recognize that data is an important corporate asset, many still ask:

  • Who is responsible for all of the data?
  • What problems exist with the data?
  • What requirements must be in place to achieve high quality?
  • Can the return on investment be quantified?

This is where Data Quality Management (DQM) comes into play. The goal of DQM is to establish the roles, responsibilities, policies, and processes needed to attack data quality issues. Cooperation and discipline are essential: everyone must recognize the problems with the data and participate in determining a solution. The IT department in particular plays a major part in DQM, since it is in charge of the overall environment: the architecture, systems, databases, and so on.

Once problems are recognized and everyone is on board, standards can be set. If your goal is a perfect database, you have set your expectations too high; there is no such thing as a perfect, “spotless” database. Data quality is a balance between accuracy and completeness, and in practice data cannot be 100% accurate and 100% complete at the same time. We often find that we must sacrifice one for the other. Depending on the data’s value, one can decide to keep a record with an error or toss it.
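The keep-or-toss decision can be made concrete with a simple rule: score each record and keep it only if the score clears a threshold. A minimal Python sketch, where the field names and the threshold are hypothetical and chosen only for illustration:

```python
# Toy illustration of the accuracy/completeness tradeoff: score each
# record's completeness and decide whether to keep or toss it.
# REQUIRED_FIELDS and the threshold are hypothetical examples.

REQUIRED_FIELDS = ["customer_id", "email", "postal_code"]

def completeness(record: dict) -> float:
    """Fraction of required fields that are present and non-empty."""
    filled = sum(1 for f in REQUIRED_FIELDS if record.get(f))
    return filled / len(REQUIRED_FIELDS)

def keep_record(record: dict, threshold: float = 0.6) -> bool:
    """Keep a record whose completeness meets the threshold.

    A stricter threshold favors accuracy (fewer suspect records
    survive); a looser one favors completeness (more records survive,
    but with gaps).
    """
    return completeness(record) >= threshold

records = [
    {"customer_id": 1, "email": "a@example.com", "postal_code": "12345"},
    {"customer_id": 2, "email": "", "postal_code": "67890"},  # missing email
    {"customer_id": 3, "email": "", "postal_code": ""},       # mostly empty
]

kept = [r for r in records if keep_record(r)]  # record 3 is tossed
```

Raising or lowering the threshold is exactly the business judgment described above: how much incompleteness is a record's value worth?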

Managing data quality is a continuous process that must be revisited frequently; it is important to keep your data current in as close to real time as possible. If you’re worried about the cost of managing your data, you shouldn’t be. Quality is free; it is the “unquality” that costs money.