It’s time to measure up

Measuring the performance of DM campaigns should be a priority, but many companies are still not providing suitable data to show how effective their marketing is, says David Reed

Do you measure up? Marketers are increasingly being put to the test with demands to prove that the money they spend is actually achieving something. The problem for many of them is that it can be extremely difficult to provide that evidence.

In a survey carried out among members of the US-based Chief Marketing Officer Council, only 16.8 per cent of the 320 companies that responded had formal marketing performance measurement systems in place. This is despite some of them spending as much as 25 per cent of corporate revenue on marketing.

Not surprisingly, almost 80 per cent said they were dissatisfied with their ability to demonstrate a return on investment, and 90 per cent claimed performance measurement had become a priority. As the finance department increases the pressure, marketers will have to respond with better reporting.

Start at the beginning

But where to begin? Assuming the will exists to start measuring marketing outcomes properly, the information is likely to reside in a wide range of locations across the business. Transactional data, inbound contacts and responses all yield hard data from a range of sources, while market research surveys provide insights into soft measures. All of these operate independently of each other and on different cycles.

EHS Brann data planning director Amanda Arthur says: “Clients have measures for campaign evaluation and they report back on all the campaigns they have run quarterly. Often, direct marketing managers are looking at response, and advertising managers are looking at brand surveys; and never the twain shall meet.”

She notes that five years ago, few clients were concerned about this disconnected measurement. But now there are few who are not pursuing a more integrated set of metrics that will consistently demonstrate what marketing has achieved.

One of the first hurdles encountered, however, is how long it can take to measure how effective a campaign has been. Digital marketing campaigns can show response instantly, direct mail shots within days or weeks, but shifts in awareness and attitudes achieved by advertising may take months to show up.

Devil in the detail

The level of detail required also has to be decided on. “It varies from client to client. If they are reasonably technology-literate, they will want to see the 90 to 100 charts that went into producing the analysis. More senior marketers want you to pick out the key points. In some cases, we just produce well-designed spreadsheets,” says Arthur.

Among marketers who are keen to face up to the challenge of improved measurement, there is a growing desire to put their department on the same footing as other functions. This means assembling data and measures on a regular, monthly cycle in reports with clear indicators attached.

“Key performance indicators can be set against campaign targets and objectives to see how campaigns compare against each other or previous results,” says Target Direct Marketing senior data analyst Simon Metcalfe.

A highway code

“Automatic traffic light reporting – green, amber or red for good, satisfactory or bad – makes simple analysis available to the whole marketing department. It’s not necessary to be a data analyst to glean basic performance information,” he says.
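As a rough illustration of the idea, a traffic-light report can be little more than a rule that compares each key performance indicator with its target. The thresholds and KPI names in the sketch below are assumptions for the sake of the example, not figures from Target Direct Marketing.

```python
# A minimal sketch of the traffic-light reporting described above.
# The thresholds (on or above target = green, within 90 per cent of
# target = amber, below that = red) are illustrative assumptions.

def traffic_light(actual: float, target: float, amber_floor: float = 0.9) -> str:
    """Classify a KPI against its campaign target as green, amber or red."""
    if target <= 0:
        raise ValueError("target must be positive")
    ratio = actual / target
    if ratio >= 1.0:
        return "green"
    return "amber" if ratio >= amber_floor else "red"

# Hypothetical campaign KPIs: name -> (actual, target)
kpis = {
    "response_rate_pct": (2.1, 2.0),
    "conversion_rate_pct": (0.85, 1.0),
    "revenue_per_piece": (1.80, 1.90),
}

for name, (actual, target) in kpis.items():
    print(f"{name}: {traffic_light(actual, target)}")
# response_rate_pct: green
# conversion_rate_pct: red
# revenue_per_piece: amber
```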

High-end analytical software vendors have spotted the opportunity and are eagerly pushing their solutions at marketing departments. The same reporting packages used by business analysts, such as Business Objects, Cognos or SAS, can create exactly the kind of dashboard indicators now being pursued.

From the other direction, customer relationship management (CRM) applications from the likes of Siebel, PeopleSoft, Oracle or AmDocs, have all integrated analytics into their solutions. Soon there will be little excuse for any employee with access to a front-office software package to remain ignorant of how well they are doing.

Yet deploying these solutions in the marketing department may not be the right approach. “Clients can over-engineer their data integration efforts for the needs of tactical campaigns and for measuring the effectiveness of their marketing. The IT department sees it as justifying a big project lasting three or four years,” warns Quadstone vice-president of business development Rob Bruce.

He cites a large telecommunications company in the US which decided it needed to create a customer experience data warehouse. It had budgeted up to $5m (£2.7m) over five years for the project. “We said: ‘How about doing it in one-tenth of the time and for one-tenth of the cost?’” says Bruce.

Instead of introducing a new, fixed IT architecture into which data is poured from operational systems, his company’s application can extract the necessary information from existing systems. BT Retail is using it to build a holistic measurement of the relationship it has with clients.

“Whether the customer is satisfied is the result of a wide variety of factors, many of which lie outside the marketing department,” he says. T-Mobile in the US, for example, is drawing together key metrics from 27 different systems, none of which holds the entire answer on its own.
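In that spirit, a composite measure drawn from several operational systems might be assembled along the following lines. The source systems, metrics and weights here are invented for illustration rather than taken from T-Mobile.

```python
# Hypothetical sketch of drawing satisfaction-related metrics from several
# operational systems into one composite score. The system names, metrics
# and weights are invented for the example.

from typing import Dict

# Each source system contributes one or more metrics, normalised to 0-1.
source_metrics: Dict[str, Dict[str, float]] = {
    "billing":      {"bills_without_dispute": 0.92},
    "call_centre":  {"first_call_resolution": 0.81},
    "network_ops":  {"call_quality_score": 0.95},
    "web_selfcare": {"online_task_completion": 0.74},
}

weights = {
    "bills_without_dispute": 0.2,
    "first_call_resolution": 0.3,
    "call_quality_score": 0.3,
    "online_task_completion": 0.2,
}

def composite_score(sources: Dict[str, Dict[str, float]],
                    weights: Dict[str, float]) -> float:
    """Weighted average of metrics gathered from every source system."""
    flat = {m: v for metrics in sources.values() for m, v in metrics.items()}
    return sum(flat[m] * w for m, w in weights.items()) / sum(weights.values())

print(f"Composite satisfaction score: {composite_score(source_metrics, weights):.2f}")
```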

This does underline how political marketing performance measurement can become. “One of the things that happens is that certain controllers of data have a lot of power from holding it. They say they can’t give it to marketers in case they jump to the wrong conclusions,” says Bruce.

Clarity Blue is another software vendor which provides a strong analytical platform overlaid with process management and reporting tools. It allows many of the chores involved in tracking marketing outcomes to be automated, such as matching responders and buyers back to prospect lists.
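A simplified sketch of that kind of match-back is shown below, assuming a clean shared customer identifier; in practice the join usually relies on fuzzier name-and-address matching.

```python
# Illustrative match-back of responders and buyers to the mailed prospect
# list. Field names and the single customer_id key are assumptions; real
# match-back typically uses name-and-address matching.

prospects = [
    {"customer_id": "C001", "segment": "lapsed"},
    {"customer_id": "C002", "segment": "new"},
    {"customer_id": "C003", "segment": "loyal"},
]
responders = {"C001", "C003"}   # replied to the campaign
buyers = {"C003"}               # went on to purchase

def match_back(prospects, responders, buyers):
    """Flag each mailed prospect with its response and purchase outcome."""
    return [{**p,
             "responded": p["customer_id"] in responders,
             "purchased": p["customer_id"] in buyers}
            for p in prospects]

for row in match_back(prospects, responders, buyers):
    print(row)
```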

Song and dance routine

“Traditionally, measuring performance has been overlooked. Despite the big song and dance about direct marketing measurement and monitoring, it simply doesn’t happen,” says Clarity Blue director of consulting practice Marc Dench.

He blames the constant pressure to get tactical campaigns out of the door for undermining more reflective practices. “Clients know what they should do in theory, but the data is not in a single place and it requires a small army of analysts. So often assumptions are used instead,” says Dench.

No appliance of science?

Within marketing itself, there is a growing tension between viewing the discipline as an art and viewing it as a science. Intuitive decision-making is one of the great attractions of the job, with many big-name marketers justly famous for their risk-taking and apparently flawless gut feel.

But as the world of business becomes ever more measured, accountable and data-driven, it is hard to see how marketing can continue to resist. Organising itself to be genuinely more insight-led and to learn from previous results could be a real challenge, however.

“Some clients have involved us with their agency. When that happens, it is amazing how well it works. But a big problem in that approach is the number of disparate companies involved in marketing – there can be six or seven around the table,” says Dench.

While some IT departments are eyeing the automation of marketing metrics as a juicy, big-budget project for themselves, others say they simply cannot accommodate marketing’s demands. Without internal support, the project risks running into the sand – unless alternative options are considered.

DoubleClick has been steadily evolving its proposition to the point where it now hosts marketing databases overlaid with analytical packages that offer a single customer and channel view. More than 50 clients use this on an outsourced, application service provider basis.

DoubleClick senior vice-president for marketing automation Court Cunningham says: “At the Gartner CRM summit early this year, the number one issue was the need to have integrated marketing technology and the ability to operate across all channels.”

Many big client companies are realising that their existing organisation, based around separate lines of business, is no longer effective. “They each have unfettered access to the customer, they can mail or e-mail them all the time. There is no co-ordination and customers don’t like that,” he says.

Creating a commonly shared platform based around a unified customer view can yield big benefits. One US catalogue company working with DoubleClick was mailing 151 distinct customer segments. It was able to identify 21 segments that did not generate the required revenue, meaning it could cut its direct mail print run by more than 250,000 and save $400,000 (£220,000) in the process.

Generation game

“It also found nine segments that its old methodology would not have mailed. By reintroducing the company through follow-up mailing, it generated $150,000 (£82,000) in incremental sales,” says Cunningham.
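Taken together, the quoted figures allow a rough back-of-envelope calculation; the implied cost per mail piece is only an inference from the rounded numbers above.

```python
# Back-of-envelope arithmetic from the figures quoted above. The cost per
# piece is only implied by the rounded numbers, and savings and incremental
# sales are not strictly comparable, so treat this as illustration only.

pieces_cut = 250_000          # mail pieces no longer printed and posted
saving = 400_000              # reported saving, USD
incremental_sales = 150_000   # from re-mailing the nine rediscovered segments, USD

implied_cost_per_piece = saving / pieces_cut
print(f"Implied cost per piece: ${implied_cost_per_piece:.2f}")        # ~$1.60
print(f"Combined headline benefit: ${saving + incremental_sales:,}")   # $550,000
```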

Tracking response and sales across channels is becoming one of the biggest challenges for marketers. Research by DoubleClick has shown that one third of catalogue customers buy from the website, but the most common reason for not making a purchase online is not being able to find an item they have seen in print. Quick-order systems using product identity codes can help to resolve this.
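A quick-order system of that kind can be as simple as resolving the code printed beside each catalogue item directly to its online product record; the codes and items below are invented for illustration.

```python
# Toy illustration of a catalogue quick-order lookup: the customer keys in
# the product code printed beside an item and is taken straight to it
# online. The codes and products are invented for the example.

catalogue = {
    "KT-4471": {"name": "Garden kneeler", "price": 19.99, "url": "/p/kt-4471"},
    "HM-2210": {"name": "Cast-iron skillet", "price": 34.50, "url": "/p/hm-2210"},
}

def quick_order(code: str):
    """Resolve a printed product code to its online product record, if any."""
    return catalogue.get(code.strip().upper())  # fall back to site search if None

print(quick_order("kt-4471"))
```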

Equally, US retailer Sears has learned that many of its store customers have researched items before they shop, and go to its stores holding print-outs from the website. It now offers an order-and-collect service which means a customer can go to the store’s loading bay to collect a product, bypassing the checkouts entirely.

Significantly, it has taken Sears 18 months to integrate its systems in this way. Therein lies probably the single biggest challenge to achieving integrated marketing performance measurement. Systems are poor or missing and IT departments view marketers with suspicion. Only with some high-level banging of heads together will the necessary collaboration finally occur.
