Keeping within an appropriate limit

Data quality is a major issue for marketing, but it is one the whole business needs to own. David Reed finds out how brands are turning to a new generation of tools to fix the broken processes that create data errors in the first place.


How do you feel about speed cameras? Are they an effective way of reducing traffic accidents, reminding even careful drivers to be aware of potential risks on the road? Or do you see them as just another way for councils to raise revenue by catching out the occasionally heavy-footed motorist?

Much the same debate takes place around data quality. A close reading of the Data Protection Act reveals that keeping data accurate and up to date is a legal requirement, just as road traffic law requires you to keep to the speed limit.

Membership of the Direct Marketing Association mandates the use of suppression and data cleansing.

Just about every data practitioner you ask would agree that hygiene is part of good practice. Even so, as many as one third of all direct mail campaigns are still not screened against goneaway and deceased files, according to some estimates. So it is likely that a similar number of organisations have not put routine data quality processes in place upstream. Why should it be so difficult to make a business case for investing in data quality?

One reason may be that there are few consequences for non-regulated industries. “For us in financial services, it is a little bit easier to make the case because we have a fearsome regulatory body behind us,” says Ian Dawson, manager, data services, marketing at More Th>n. “It is important to manage our risk and compliance.”

His company holds licences for all major suppression files, covering goneaways and deceaseds. “The company accepts they should be used,” notes Dawson. Justifying the cost has been made easier with the introduction of Royal Mail’s sustainable mail tariff. Part of its technical standard requires lists to be robustly cleansed against suppression files in return for reduced mailing costs.
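As an illustration of what that screening step involves, here is a minimal sketch in Python of suppressing a mailing list against a goneaway or deceased file before output. The record layout, file format and surname-plus-postcode match key are all assumptions for the example; commercial screening runs against licensed suppression products and uses far more sophisticated fuzzy name-and-address matching.

```python
import csv

def load_suppression_keys(path):
    """Read a suppression file and build a set of match keys.
    A crude surname + postcode key, purely for illustration."""
    keys = set()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):  # assumes 'surname' and 'postcode' columns
            keys.add((row["surname"].upper(),
                      row["postcode"].replace(" ", "").upper()))
    return keys

def screen_mailing_list(records, suppression_keys):
    """Split a mailing list into mailable and suppressed records."""
    mailable, suppressed = [], []
    for rec in records:
        key = (rec["surname"].upper(),
               rec["postcode"].replace(" ", "").upper())
        (suppressed if key in suppression_keys else mailable).append(rec)
    return mailable, suppressed
```

Every suppressed record is postage saved, which is what makes the reduced tariff straightforward to cost-justify.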

If cost-saving at the output stage is one way of justifying the cost of data quality, other regulatory requirements provide more strategic arguments. Financial Services Authority regulations require companies to track complaints, for example. More Th>n tracks a key performance indicator for its level one complaints handling process to show whether activity is rising or falling.


“The FSA will routinely visit our sites and look to measure how we are running our business. They will be interested in complaints and the top reasons for them. If data quality is in the top three, they would expect us to minimise and correct that,” says Dawson. “Our compliance teams understand what marketing is trying to do.”

While name and address data has well-established reference data sets and technologies for maintaining data quality, he identifies email addresses as a weak spot. With online businesses increasingly using email as the key customer identifier, the lack of validation techniques (other than syntactical) is an obvious issue.

That leaves digital data quality as an unknown variable. It is standard practice for external suppliers to provide clients with data audits of their customer data. When it comes to the email address field, all that can be said is whether it is populated or not.
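To see why syntax is such a weak guarantee, consider a minimal sketch of a syntactic email check. The pattern below is deliberately simple (real validators track RFC 5322 far more closely), but even a perfect syntax check shares the same limitation: it cannot tell whether the mailbox exists or belongs to the customer.

```python
import re

# A deliberately loose pattern: something@something.something
EMAIL_SYNTAX = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def looks_like_an_email(address: str) -> bool:
    """True if the address is well formed - nothing more."""
    return bool(EMAIL_SYNTAX.match(address))

print(looks_like_an_email("jane.smith@example.co.uk"))   # True
print(looks_like_an_email("no-such-box@example.co.uk"))  # True, but may bounce
print(looks_like_an_email("jane.smith@"))                # False
```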

For the other dimensions of contact data, clearer indicators can be provided. “Typically 5 to 10 per cent of data may not match the latest version of PAF,” says David Barker, head of customer data integration at Acxiom. “Data decays over time and every field has a level of error. A database will typically have 2 to 5 per cent deceaseds in it, depending on the industry sector. For example, a charity will have a higher rate because it has an older supporter base. You will also find 3 to 6 per cent goneaways and about 5 per cent duplicates.”
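A hedged sketch of how such an audit might be tallied: assume upstream matching against PAF and the suppression files has already stamped each record with boolean flags, and the audit simply reports the rates Barker quotes. The flag names and record layout are invented for the example, not Acxiom's actual process.

```python
def audit_rates(records):
    """Percentage of records carrying each data quality flag."""
    flags = ["paf_mismatch", "deceased", "goneaway", "duplicate"]
    n = len(records)
    return {f: 100.0 * sum(bool(r.get(f)) for r in records) / n
            for f in flags}

# A toy file of four records
sample = [
    {"paf_mismatch": True},
    {"deceased": True},
    {"goneaway": True, "duplicate": True},
    {},
]
for flag, pct in audit_rates(sample).items():
    print(f"{flag}: {pct:.1f}%")
```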

Providing a “visually arresting” report on data quality is one way in which his company is helping clients to make the business case for investment. While internal indicators of quality are important, Acxiom is increasingly providing a benchmark against other data cleanses it has carried out. “When we say you have got a 3 per cent error rate, the first question is whether that is good or bad,” says Barker.

Marketing has been the main function putting money into data quality for a long time. But Barker says a shift is now occurring. “It is now coming to us with a master data management project, asking for help to achieve the customer master in that project. They have a data quality budget in place to do that,” he says.

What this trend indicates is that data quality is now being understood as a pervasive issue that requires business processes to be re-engineered, rather than a local problem to be solved with point solutions. Unless this happens, companies face an endless loop in which data is allowed to enter systems with mistakes at point of capture, only to be put right at point of use.

For this shift to take place, a cultural change is also required. “What I found in the early days of our big BT data quality programme was that people were in denial about it,” says Nigel Turner, client services, BT P&SD. “They don’t want to admit that they may have built a new marketing system and the data in it is not good.”

Turner identifies a paradox at work: “It is easier to prove the business case if you allow the data to get dirty. Then you have got hard evidence that it is not clean and you can quickly show that there are financial or regulatory impacts from that, so data quality has to happen.”

One aspect of building a business case he emphasises is to include a “do nothing option”. There will always be an argument that data can be left for longer without being improved. Yet business is unlikely to stand still in that time, leading to an even greater impact from poor data.

“It is easier to do that in an environment where you are expanding marketing activity. For example, with the launch of BT Broadband it made sense to put in data quality because at the start, it only had a base of 300,000 people, but we knew in five years it would grow to five million,” he says.

Short-termism may still win out, especially if the business just needs to keep its cash in the bank. Data quality problems tend to multiply the longer they are left, however, making the eventual fix more expensive. To make any kind of case for investment, Turner says, “data quality people have to put it in terms the business can understand.”

That means less focus on the technical dimensions of data and far more on the way poor data interferes with the correct handling of key business processes. Surveys of customer data management by Aberdeen Research have substantiated how data quality can improve productivity, comparing how leaders and laggards performed in getting fit-for-purpose data into the systems where it was needed.

Mature users of customer data management – typically those with a single customer view – were delivering clean data to the business in an average of 39 days. Among laggards, this was taking nine months to achieve.

“Business users are getting much more involved in data quality,” notes Ed Wrazen, vice-president product marketing at Harte-Hanks Trillium Software. “They are the ones looking to drive out value from customer data, gain insights by understanding loyalty and behaviour, as well as how the customer interacts with the organisation.” Poor data quality can make all of these less accurate and effective.


“Another area driving investment is asset management. Organisations are looking at their plant, products, materials, suppliers to find efficiencies. Data about all of those drives service, reporting and everything else,” says Wrazen.

Nectar has members in half of all UK households, so its data quality is highly visible. Yet as Andrew Bridges, data quality manager, points out: “We had collected a vast amount of data in a very short space of time via a number of incoming sources.” With a quarterly mailing to 15 million collectors, if 10 per cent of those records were inaccurate, that would create £1.8 million in waste a year, based on production costs of 30p per pack.
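Bridges's figure reconciles if it is read as an annual cost across the four quarterly runs, as the arithmetic below shows; the 10 per cent inaccuracy rate is his illustrative assumption, not Nectar's measured error rate.

```python
packs_per_mailing = 15_000_000   # quarterly mailing to 15 million collectors
error_rate = 0.10                # the hypothetical 10 per cent
cost_per_pack = 0.30             # 30p production cost, in pounds
mailings_per_year = 4            # quarterly

waste_per_mailing = packs_per_mailing * error_rate * cost_per_pack
print(f"£{waste_per_mailing:,.0f} per mailing")                   # £450,000
print(f"£{waste_per_mailing * mailings_per_year:,.0f} per year")  # £1,800,000
```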

“That is a dramatic figure. And with 70 per cent of the business involved in that mailing, it was easy to get support for investing in suppression and validation processes,” says Bridges. Nectar holds licences for all the major suppression files and manages data quality through the Cygnus and Trillium Software platforms.

The company recognised the fundamental importance of good data quality to its business model, serving both members and brand partners. For that reason, Bridges says, “marketing doesn’t drive data quality, it is down to operations to look at what processes need to be put in place, if they are working and whether they are doing what they are supposed to.”

Exception reporting shows when data is coming into the business that does not meet certain criteria. That does not necessarily mean a membership application will be declined. “We will mark that file for follow-up with the member – we put the onus on the collector to fill in the missing parts by identifying the benefits they will get,” he says.

By allowing business users to manage data quality themselves, Nectar has made savings against the outsourced database bureau service it used until last year. The data quality team monitors data feeds to spot whether a process needs to be changed. A traffic light system of reporting gives the board an insight into overall data quality, while complaints received by the call centre are also used as a measure.
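A minimal sketch of that exception-and-traffic-light pattern: applications failing capture rules are flagged for member follow-up rather than declined, and the exception rate rolls up into a red/amber/green status for the board. The rules and thresholds here are invented for illustration, not Nectar's actual criteria.

```python
REQUIRED_FIELDS = ["name", "address", "postcode"]

def find_exceptions(application):
    """Return the capture rules this application fails."""
    return [f for f in REQUIRED_FIELDS if not application.get(f)]

def traffic_light(exception_rate):
    """Roll an exception rate up to a board-level RAG status
    (thresholds are hypothetical)."""
    if exception_rate < 0.02:
        return "green"
    if exception_rate < 0.05:
        return "amber"
    return "red"

applications = [
    {"name": "A Collector", "address": "1 High St", "postcode": "AB1 2CD"},
    {"name": "B Collector", "address": "", "postcode": "EF3 4GH"},  # flagged
]
flagged = [(app, find_exceptions(app)) for app in applications]
rate = sum(1 for _, ex in flagged if ex) / len(flagged)
print(f"exception rate {rate:.0%} -> {traffic_light(rate)}")
```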

Not everything can be improved, however – 70 per cent of the 100,000 monthly applications now come via the web, where onscreen prompts can be used, but the remainder are paper-based, where errors are more common. “We have to offer that route, but it is where the majority of the missing information happens,” says Bridges.

What this example indicates is the complex challenge presented by data quality. It is not the simple task of making data fit for purpose at point of usage that has tended to be the dominant model in recent years. Instead, it goes to the heart of how a business is run and decisions about which processes to improve or leave unaltered.

“What we have seen in the last few years is data quality moving from project-based, point solutions into enterprise-level competency centres,” says Bert Oosterhof, corporate senior architect and director of technology EMEA for Informatica. With the introduction of data governance structures, data quality is being pushed out across organisations as a concern for all business users, rather than being parked in IT.

“It doesn’t belong there,” says Oosterhof. His company has partnered with Global Data Excellence, a consultancy focused on providing financial arguments and frameworks for the data quality business case. As these larger, more formal solutions continue to emerge, companies will find it easier to get the backing they need to put things right in their data. Until they do, the risk of being fined for breaking the law remains high.

DATA QUALITY PROCESSES STILL LACKING

Although name and address data is being collected by 85 per cent of organisations, only 40 per cent validate these records using software. According to a survey carried out by Capscan, a discrepancy exists between the importance placed on data and the way its quality is being ensured. This can be seen in the types of software deployed: address management solutions are present in 41.6 per cent of companies, while 36.2 per cent have deduplication systems and 31.5 per cent are using identity authentication solutions. That leaves a big gap between the volume of data being collected and its fitness for purpose. However, the survey did find growth in the use of bureau-based cleansing and suppression services – up to 18.1 per cent in 2010 from just 7.4 per cent in 2008 – and online data cleansing, which 14.8 per cent make use of. Data quality is getting some recognition, but not yet at the scale required.
