A doughnut hole is an odd expression to describe a gap or a discontinuity, most often of a programmatic kind. Most recently the expression’s been in the news around the so-called Medicare doughnut hole, which occurs when prescription drug coverage ends once reimbursements reach a threshold level. Thereafter the patient is responsible for the full cost of prescription medication until another threshold is reached and Medicare kicks back in again.
Efforts around data cleansing and getting data fit for corporate purposes can also hit a doughnut hole. This can happen mid-project, once data starts getting cleansed and users begin to see results based on the transformed data. Euphoria over seeing reports and analytics that are far better than anything previously available (if available at all) sometimes results in the view: “OK, we’ve gone far enough, we’re clean enough, let’s stop and start using this data, there’s gold here!”
Other times it’s a more formal or conscious decision to cut back on cleansing, as when users decide they’re comfortable with a level of data maturity that may be less than 100%. Such a “suboptimal” condition might be all that’s required to run the business effectively. Take vendor fields, for example: perhaps not all vendor fields are equally critical, and those that are not do not need to be filled in 100%.
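The idea of “good enough” completeness can be made concrete with per-field fill-rate targets. The sketch below is a minimal illustration, assuming a hypothetical vendor master table; the field names and thresholds are invented for the example, not taken from any particular system.

```python
# A minimal sketch of "good enough" completeness checking for a vendor
# master table. Field names and thresholds are hypothetical examples:
# critical fields demand 100% fill, less critical ones a relaxed target.
REQUIRED_THRESHOLDS = {
    "vendor_id": 1.00,   # critical: must always be filled
    "tax_id": 1.00,      # critical for payment processing
    "fax_number": 0.50,  # rarely used; half-filled is acceptable
}

def completeness(records, field):
    """Fraction of records where `field` is present and non-empty."""
    if not records:
        return 0.0
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records)

def fields_below_threshold(records, thresholds):
    """Return {field: fill_rate} for fields missing their target."""
    return {
        field: completeness(records, field)
        for field, target in thresholds.items()
        if completeness(records, field) < target
    }

vendors = [
    {"vendor_id": "V001", "tax_id": "12-345", "fax_number": ""},
    {"vendor_id": "V002", "tax_id": "67-890", "fax_number": "555-0101"},
]

# vendor_id and tax_id are 100% filled; fax_number is 50% filled, which
# still meets its relaxed 0.5 threshold, so nothing is flagged.
print(fields_below_threshold(vendors, REQUIRED_THRESHOLDS))
```

A report like this lets the business decide, field by field, when “less than 100%” is an acceptable stopping point rather than a defect.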
But as BI becomes more pervasive and the business continues to value accurate reporting, data can cross the doughnut hole when demand for better and more complete reporting picks up. This fuels the completion of data quality work to even higher standards, supporting the continually evolving requirements of the business. Coming out on the other side of the doughnut hole is a sweet dessert for those in the business of refining data for consumption!
Read more great blogs by the data experts at Utopia at bit.ly/ad2BhD