As much as we bang on about customer experience, communication and marketing strategies, loyalty, ROI and real-time – all driven, of course, by insight and modelling – the success of all of these things comes down to the lowest common denominator: the quality of the data.
Without clean, deduped, suppressed data – call it what you will – CRM programmes are at best inefficient and at worst totally useless in terms of driving a profitable campaign. In many cases it is poor-quality data, not CRM itself, that lies behind the so-called ‘death of CRM’ headlines we have seen over the past few years. Often the blame really lies with the data manager.
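To make "clean, deduped, suppressed" concrete, here is a minimal sketch of what that hygiene pass might look like. The field names, the email-based matching key and the suppression list are illustrative assumptions, not a prescribed implementation.

```python
# Illustrative data-hygiene pass: normalise, dedupe, then suppress.
# Field names and matching on email alone are assumptions for the sketch.

def normalise(email):
    """Lower-case and strip whitespace so near-duplicates compare equal."""
    return email.strip().lower()

def clean(contacts, suppression_list):
    """Return contacts deduped on email, minus suppressed addresses."""
    suppressed = {normalise(e) for e in suppression_list}
    seen = set()
    cleaned = []
    for contact in contacts:
        email = normalise(contact["email"])
        if email in suppressed or email in seen:
            continue  # drop opt-outs and repeat records
        seen.add(email)
        cleaned.append({**contact, "email": email})
    return cleaned

contacts = [
    {"name": "Ann Smith", "email": "Ann.Smith@example.com"},
    {"name": "Ann Smith", "email": "ann.smith@example.com "},  # duplicate
    {"name": "Bob Jones", "email": "bob@example.com"},         # opted out
]
result = clean(contacts, suppression_list=["bob@example.com"])
# Only one record survives: the normalised Ann Smith entry.
```

In practice the matching key would be fuzzier (name plus postcode, say), but the principle – normalise before you compare – is the same.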
Dashboards showing key business metrics are useful, especially when they give the boardroom visibility of marketing activity and – hopefully – success. The problem is that they rarely focus on data quality. Think about it: how often do senior marketing managers ask for a review of data quality, or even do a litmus test on how accurate it is? Although there is a clear line of sight between data quality, results and revenue, the line "it's too expensive to maintain data properly and continually" is trotted out. Wouldn't it be better if marketing departments worked out the cost to the business of not doing so?
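The litmus test need not be elaborate. As a hedged sketch, two numbers a dashboard could surface are completeness (how many records have every required field filled) and duplicate rate; the field names here are assumptions.

```python
# Hypothetical data-quality metrics for a dashboard: completeness and
# duplicate rate. Required fields and the email key are assumptions.

def quality_metrics(records, required_fields):
    """Return the share of complete records and the share of duplicates."""
    total = len(records)
    complete = sum(
        all(r.get(f) for f in required_fields) for r in records
    )
    emails = [r["email"].strip().lower() for r in records if r.get("email")]
    duplicates = len(emails) - len(set(emails))
    return {
        "completeness": complete / total if total else 0.0,
        "duplicate_rate": duplicates / total if total else 0.0,
    }
```

Put those two figures next to campaign response rates and the line of sight between data quality and revenue becomes hard to ignore.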
The key metrics will depend on your marketing activity and business objectives, but shouldn't be hard to identify as long as the marketing and data managers work together on overall business requirements. What really needs to happen is commercial reform where the data becomes the engine room of the business, and C-level management are brought into the process so that data becomes a top-down matter involving many facets of the company. Paying lip service to data quality is surely no longer adequate at a time when every penny spent needs to work its hardest to generate a return.
It’s time that business adopted a systematic approach to data quality management. It requires ongoing budget allocation, and shouldn’t just be treated as a one-off project. What is the point in refreshing or overhauling an entire database at great cost if the data is allowed to stagnate again during subsequent years? There would be no long-term gain.
Data experts should follow a simple four-step strategy for implementing data quality best practice. First comes optimisation: get a complete view of the information available so you can draw up a tactical list of projects to tackle, leading to incremental gains in value and operational efficiency. Second, consistency: use the whole value chain by aggregating data from applications across the business; a unified, standard approach works best. Third, simplification: reduce the complexity of your business by integrating information across applications. And finally, be thorough and create a complete picture of the data you hold. Failure to correctly populate any one field could render the whole record unfit for purpose.
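The consistency, integration and thoroughness steps above can be sketched in a few lines. This is a toy illustration under stated assumptions – two source systems, email as the matching key, and three required fields – not a definitive implementation.

```python
# Toy sketch of steps two to four: aggregate data from multiple
# applications into one standard view, then flag incomplete records.
# Source systems, field names and the email key are assumptions.

REQUIRED = ("name", "email", "postcode")

def unify(crm_records, billing_records):
    """Merge two applications into a single view keyed on email."""
    merged = {}
    for source in (crm_records, billing_records):   # consistency: whole value chain
        for rec in source:
            key = rec["email"].strip().lower()      # one standard key
            merged.setdefault(key, {}).update(
                {k: v for k, v in rec.items() if v} # integration: one record per customer
            )
            merged[key]["email"] = key              # keep the normalised form
    # thoroughness: flag any record with a required field missing
    for rec in merged.values():
        rec["fit_for_purpose"] = all(rec.get(f) for f in REQUIRED)
    return merged
```

The final flag makes the last point tangible: one empty postcode and the record fails the fitness check, however complete the rest of it is.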
The benefits of investing in end-to-end, ongoing data quality management seem blatantly obvious. All that is required is a bit of forward thinking and some expert input to push data up the food chain and let the boardroom know the advantages of permanently polishing the database, rather than just giving it a spring clean.