Much has been made of the recent trend for clients to spend more below the line and demand greater integration between their marketing activities. Among the claims made, especially by agencies, is that this demonstrates the greater effectiveness of promotional techniques over advertising and that below-the-line work is more accountable.
Yet for many users of sales promotions, the experience is akin to having jumped from the frying pan into the fire. “Historically, sales promotions have had a reputation of not knowing how effective they are,” says Tom Hings, marketing manager, premium brands development, at Carlsberg-Tetley.
With more money being spent on sales promotions, the situation is proving intolerable. “Because sales promotions are becoming more important in the mix, and because marketing funds have to be justified more than they used to be, you have to evaluate the effectiveness of the activity,” says Hings.
But the sales promotion companies may find it difficult to achieve this. Having taken on board the need to develop database skills and to produce promotions that, at best, build brands and, at worst, do not damage them, they may now find that measuring effectiveness proves the harder task.
The first step towards this may be to introduce research and testing into their solutions to clients’ briefs. Option One managing director Louise Wall says: “Look at what ad agencies spend on research – both in pre-testing and post-evaluation. It is 1.5 per cent to two per cent of the client’s spend.” An avid believer in research, she has been arguing the case for its use with promotions for some time.
“Virtually every pitch where we have done research we have pulled it off, because it provided us with knowledge supplied by the responders we would be targeting. Having their verbatim replies is very useful,” she says.
Running focus groups on promotional ideas is a relatively new concept, yet it may hold the answers clients and agencies are looking for. Hings says the sort of questions he needs to ask in research are along the lines of: “What aspect, in terms of the mechanics, works better than others – for example, a scratchcard versus a collection scheme?”
But the answers such tests provide may not always be comfortable for the client. Impact FCA! recently won an account from direct household and motor insurance company AGF, based on its research-centred approach. To help design the service to be offered and decide how it should be communicated, a day was spent mixing together 80 people from the client company, the agency and consumers, without any of them knowing who the others were.
When asked to give their opinions of insurance companies, the consumers used words such as “liars, thieves and bastards”, much to the insurer’s surprise. Impact FCA! chief executive Chris Parry says the experience altered both the attitude of the company to its customers and its belief that cheaper premiums were all that mattered.
In launching the new insurance service Help, AGF recognised that processing claims quickly was the main priority. “In fact, consumers felt so strongly that they should be treated as assets, and not liabilities, that they suggested the new insurance company should be penalised if their claims were not authorised within 48 hours,” says Parry.
But promoters may not want to hear what their customers have to say. It is perhaps no surprise, then, that no research agencies specialise in sales promotions.
Market Research Society director general Michael Warren admits: “It is true to say that we haven’t had any significant coverage of that side of things at our conferences in the past.” But it is a situation he believes can be readily changed.
There are other structural reasons why it may be difficult for agencies to make planning, testing or research part of the service. Not least, because it has not historically been required. “One reason why a lot of sales promotion companies don’t have planning is because it is difficult to add afterwards,” says Mark Tomlin, planning director at integrated agency MBA.
“All too often sales promotions are seen as a tactical, short-term, problem-solving tool. Obviously, it has those capabilities. However, unless you have strategic thinking involved early on, it is not easy to see how you could justify involving it later,” he says.
The burden of bringing strategic thinking – with its related tools of planning and research – into the sales promotion arena also rests on the shoulders of clients. Hings acknowledges that pre-testing a promotional proposition is not easy and may be resisted by clients who want to get on with building sales.
One way to encourage a more measured approach may be to introduce it through techniques from other disciplines that are known to work. The growth of integration and the development of agencies such as MBA are a reflection of some clients’ desire to see the same measurement they get from advertising research and direct marketing applied to their other marketing activities.
CSP direct marketing manager Joseph Garton is fine-tuning promotional concepts. His approach is “to use direct mail to test mechanics, such as cash-backs, coupons and buy-two-get-one-free offers, to see the response from the mailing piece and then apply that to other promotions”.
The value of this method is that it allows the precise targeting of offers, without interference from competitors’ messages at the point of sale. But Garton warns that promoters need to use a mix of research techniques to get a full understanding of their customers’ reactions.
“We use AGB’s Superpanel as a way of checking behavioural response, to see the things people have bought. Other research, such as questionnaires where they tick boxes, is about an attitude. What consumers say and do can be very different,” he says.
There is no doubt that many of the more complex tools developed by market researchers for use in ads could be put to good use in sales promotions. Establishing a brand’s true stature before doing anything that might change it is sensible enough. Understanding whether consumers really do prefer instant-win to free mail-ins could save money in the long term.
If agencies are to deliver those benefits, they will need to be given briefs that set out clear parameters. “You can measure the effect of a promotion on sales provided you put in controls before the launch,” says Wall. “It is no good afterwards. You have got to establish whether you are looking at sales in the year to date or like-for-like, or at pricing. And whether you need to take into account competitor activity or advertising done at the same time.”
Sophisticated research programmes can meet all of these requirements. What is really at issue is whether the industry wants them, and who will pay for them. Wall insists more testing and planning has to come because “there is no point measuring sales promotions by the number of redemptions”.
“Performance-related pay is becoming more prevalent. It is very hard for sales promotions, though not as hard as for advertising, to evaluate effectiveness in terms of sales. But that is what you could reward the agency on,” says Hings.
Client cultures may also need to change to accommodate greater strategic input by promotions agencies – a likely outcome of introducing planning. Tomlin says the drive towards integration is already producing a divide between those clients that want one agency to handle everything and those who resist it.
“Integration is a growing market, but it won’t be ‘the’ market. It will be a sector for particular types of client.
“Big players tend to see their own marketing department as the integrator. But we are finding that a lot of marketing departments were hacked away during the recession, so integration has been de facto out-sourced,” says Tomlin.
Unless clients’ desire for measurable marketing spend is matched by spending on measurement, and agencies’ desire to be strategic is backed by investment in the core skills to deliver, neither will take the next step forward.