Data Quality : 2011 : January

How often do we find currency and gold lying unprotected in office cubes or corridors?  How often do we find piles of garbage in and around office buildings? And even if we did find them occasionally, how often would we find the gist of it summarized, packaged and sent to most of the senior managers, along with many other goodies, undetected? The answer is, “Never.”  Physical asset management and waste disposal systems have evolved over time to handle these things very effectively.   Few would disagree that personal or corporate data is one of our most important assets.  Yet we can all find many examples around us where bad data is not cleaned up, or important data is not protected and/or not available for use when needed.  The main purpose of this article is to present a strong case for developing data asset and garbage management systems comparable to those that evolved for physical assets and waste.



Data can be like garbage, distorting actionable information, or it can be like gold, requiring very careful protection and attention.  Back in 2007, Gartner predicted that over the following two years, more than 25 percent of critical data in Fortune 1000 companies would continue to be flawed; that is, the information would be inaccurate, incomplete or duplicated.  Gartner also expected that three-quarters of large enterprises would make little to no progress toward improving Data Quality (DQ) until 2010.  What is amazing to me is that day by day, year by year, much of our corporate data has become unmanageable garbage, while our most valuable data is left unprotected and vulnerable to opportunistic thieves.  It is therefore not an exaggeration to argue that the current global economic crisis is fundamentally due to the piles of “garbage” data that potentially sealed off actionable information from reaching the right people at the right time until it was too late.  Similarly, many of the threats that have materialized, including those that challenged our national security, have involved one or more incidents of important data that did not become actionable information when needed, or of unprotected information that was stolen.


Even though everyone has a role to play in data quality assurance, a Data Quality initiative must be driven from the top to become successful and sustainable.  Given the broad scope of Data Quality, the biggest challenge is defining the boundaries and justifying the Return on Investment (ROI).  In the article “The ROI of Data Quality,” Len Dubois presented six different representative cases and models for calculating the ROI of Enterprise-wide Data Quality (EWDQ).  Bob Lambert cleverly articulated what sponsors need in his article “DQ, he isn’t so dumb he just needs glasses,” suggesting, "...give Sancho (in this case the project team) a chance to speak to the reality of the situation, and hand to Don Quixote (project sponsors) the eyeglasses of in-depth visibility into real costs..."





It is “Data,” folks, that we often find as “Garbage in the Lockers and Gold on the Streets.” Defining the scope, boundaries and ROI of Enterprise-wide Data Quality are the first major hurdles to overcome in developing solutions that address Data Quality issues.

