It’s been known to make grown men and women cringe in fear and weep.
It’s the prospect of having to migrate legacy data to a new system.
The people who sold the new system promised efficiencies, better reporting, more productivity, ROI, reduced TCO, and so on through the alphabet. Didn’t they know that the legacy systems had evolved over time with scant regard to data governance? Data structures had likewise changed, often reflecting the biases and peccadilloes of the data architects or DBAs who happened to be in charge at the time.
As a result, data “quality” was inconsistent, if present at all: not just in the nonconformity of individual data elements, but also in the uneven contents of fields themselves.
The challenge of rationalizing, normalizing, and cleansing the entire mishmash of data as a prerequisite to loading it into the new system: that's where the nightmare, the data migraine, resides. The prospect of scrolling through screens of Excel or Access files to find non-conforming data, and of maintaining glossaries of standard data elements (remembering which had already been assigned and which still had to be created!), is a headache and a half. And the time to do it all: go-live promised a mere 12 months away by some management honcho who'd never examined a data file in his life. Oh, the agony; oh, the nightmare; oh, the migraine!
This is the time to investigate data quality and ETL tools; time to see whether the problem can be automated, routinized, and industrialized, Advilized out of existence. Fortunately, there are tools that can make data cleansing, normalization, and rationalization relatively easy and largely automated. In fact, it's possible to pre-process data so that it can be loaded into the new system with 100% success the first time around. Engage a data quality expert and find out how to turn a migraine into a high-gain, career-advancing success!
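To make the idea of automated pre-processing concrete, here is a minimal sketch of rule-based cleansing and normalization. The field names, glossary values, and rules are all hypothetical, invented purely for illustration; a real migration would drive these rules from the target system's data dictionary.

```python
import re

# Hypothetical glossary mapping legacy free-text values to standard codes.
STATE_GLOSSARY = {
    "calif.": "CA", "california": "CA", "ca": "CA",
    "new york": "NY", "n.y.": "NY", "ny": "NY",
}

NON_DIGITS = re.compile(r"\D+")  # used to strip punctuation from phone numbers


def cleanse_record(record):
    """Normalize one legacy record; return (clean_record, list_of_issues)."""
    issues = []
    # Trim stray whitespace from every string field.
    clean = {k: v.strip() if isinstance(v, str) else v for k, v in record.items()}

    # Rationalize the state field against the glossary of standard codes.
    state = clean.get("state", "").lower()
    if state in STATE_GLOSSARY:
        clean["state"] = STATE_GLOSSARY[state]
    else:
        issues.append(f"non-conforming state: {clean.get('state')!r}")

    # Normalize phone numbers to bare 10-digit strings.
    digits = NON_DIGITS.sub("", clean.get("phone", ""))
    if len(digits) == 10:
        clean["phone"] = digits
    else:
        issues.append(f"non-conforming phone: {clean.get('phone')!r}")

    return clean, issues


legacy = {"name": "  Jane Doe ", "state": "Calif.", "phone": "(212) 555-0147"}
record, issues = cleanse_record(legacy)
print(record, issues)
```

Records that come back with an empty issues list are ready to load; the rest go into an exception queue for a human to review, which is exactly the drudgery the tools in this space industrialize.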