TARGET PRACTICE

Increased familiarity with PCs, technological improvements and falling prices have led to a wider appreciation and application of technology to analyse databases. Ultimately, this enables more refined targeting of the consumer.

The volume of customer data now being collected is so large that you need to know exactly how to crunch it in order to predict how likely customers are to buy your products.

“Reaching your customers is vital,” says Beth Barling-Twigg, a consultant with research and consulting company Ovum. She adds: “Most companies can’t afford to waste time and money on mass marketing. Instead they hone their marketing strategy and aim their products at smaller, more manageable segments of the market.”

According to Barling-Twigg, the outcome is the difference between “buckshot marketing”, where you hope that your wide, but almost random, coverage will mean you reach at least a few customers, and “sniping”, where thorough planning and forethought mean you hit the target.

There is a family of software that can handle the kind of decision support and analysis that marketing needs to manipulate data effectively: executive information systems (EIS). These show key performance data about each section of a company and automatically build a rolling, up-to-date profile of a business. Their main advantage is that they can reduce huge amounts of information to just the essential facts.
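To make that reduction concrete, the sketch below (in Python, with invented department names and figures, since no real EIS code or data is reproduced here) shows the kind of job an EIS does behind the scenes: rolling a pile of detailed transaction records up into a few essential figures per department.

```python
from collections import defaultdict

# Hypothetical transaction records of the kind an EIS would summarise.
transactions = [
    {"department": "Paint",    "sales": 1250.0, "cost": 900.0},
    {"department": "Paint",    "sales": 430.0,  "cost": 310.0},
    {"department": "Timber",   "sales": 2100.0, "cost": 1700.0},
    {"department": "Lighting", "sales": 640.0,  "cost": 520.0},
]

# Reduce the detail to the essential facts: sales, cost and margin per department.
summary = defaultdict(lambda: {"sales": 0.0, "cost": 0.0})
for t in transactions:
    summary[t["department"]]["sales"] += t["sales"]
    summary[t["department"]]["cost"] += t["cost"]

for dept, figures in sorted(summary.items()):
    margin = figures["sales"] - figures["cost"]
    print(f"{dept:10s} sales={figures['sales']:8.2f} margin={margin:8.2f}")
```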

The technology is becoming easier to use, thanks to the availability of packages now written to run under the Microsoft Windows standard. And it sits on relatively inexpensive, yet powerful, desktop PCs and servers.

“More and more marketers have experience with PCs these days. They may also use them at home,” says Greg Bradford, managing director of CACI, the market analysis, information systems and direct marketing company. “Windows has made many applications familiar, so it’s much easier for marketing people to understand and use them.”

Just a few years ago, EIS were locked away in the boardroom. These days, they can help any sales and marketing manager look beyond summaries to obtain the finest level of detail about a particular product, department or customer.

Take DIY retailer Do It All: Its senior management team is relying on Planning Sciences’ Gentia EIS to help it spot where the company is not making money. It is trying to chop 60 of the most unprofitable stores in the chain and make the remaining locations more profitable.

Gentia is, therefore, analysing five main areas of performance which it presents on a screen as a series of application modules. These are: traded performance (which includes information on sales, productivity, etc); market information (market share, spend and profitability); financial measurements (profit, cashflow, costs); internal development (which includes the progress being made on managers’ performance objectives); and customer satisfaction measurements (market research and other survey findings).

Do It All’s marketing and merchandise director, David Clayton-Smith, has also been an early user and executive sponsor of the Gentia system. He was quick to realise the benefit to sales and marketing of the kind of drill-down analysis Gentia provides: “With Gentia, I’m able to manipulate the figures and find the underlying trends more quickly than I would normally have been able to do.”

He can quickly begin to paint himself a picture of what’s been happening to the business overnight. Before the system’s introduction, Clayton-Smith had to wade through bits of paper.

“Now the information is delivered when I need it,” he says. “By the 8.30 morning meeting, we’ve highlighted some of the areas where we’ve got some sales or supply problems, and we can agree a plan of action before the office has opened.”
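The drill-down Clayton-Smith describes amounts to taking the same underlying records and regrouping them at progressively finer levels of detail. The following Python fragment is a toy illustration of that idea; the regions, stores and products are invented, and it stands in for, rather than reproduces, anything Gentia actually does.

```python
from collections import Counter

# Invented sales records; a real EIS would pull these from the corporate database.
sales = [
    {"region": "North", "store": "Leeds",   "product": "Paint",  "value": 300},
    {"region": "North", "store": "Leeds",   "product": "Timber", "value": 120},
    {"region": "North", "store": "York",    "product": "Paint",  "value": 80},
    {"region": "South", "store": "Reading", "product": "Paint",  "value": 450},
]

def roll_up(records, key):
    """Summarise sales value by a single attribute."""
    totals = Counter()
    for r in records:
        totals[r[key]] += r["value"]
    return dict(totals)

# Top level: the regional summary a board report would show.
print(roll_up(sales, "region"))

# Drill down: restrict to one region, then regroup by store and by product.
north = [r for r in sales if r["region"] == "North"]
print(roll_up(north, "store"))
print(roll_up(north, "product"))
```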

The mass of data now being manipulated on the desktop is so great that users are often unsure what patterns it may contain. There is, therefore, a major requirement for technologies that can make sense of it all.

“Much of the technology of the next decade will be oriented toward helping people process the enormous quantity of information available to them,” says the SAS Institute’s founder and president, Dr Jim Goodnight.

“Such technology, I think, will fall into two areas. One will be information filtering – technologies designed to help us sort out only the information that’s most important to us. The other will be focused on improving human capacity for absorbing vast quantities of information,” he concludes.

“Intelligent marketing” will be the buzzword for the next millennium – which means using advanced analysis techniques such as data mining (seeing patterns in complex combinations of data) to make databases more efficient.
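One of the simplest forms of the pattern-spotting that data mining refers to is counting which combinations of items turn up together unusually often. The Python sketch below illustrates that co-occurrence counting with invented customer “baskets” of financial products; real data mining tools apply far more sophisticated statistics to the same end.

```python
from collections import Counter
from itertools import combinations

# Invented customer "baskets" of financial products held.
baskets = [
    {"current account", "credit card"},
    {"current account", "mortgage", "home insurance"},
    {"current account", "credit card", "personal loan"},
    {"mortgage", "home insurance"},
]

# Count how often each pair of products is held by the same customer.
pair_counts = Counter()
for basket in baskets:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

# The most frequent pairings hint at patterns worth investigating further.
for pair, count in pair_counts.most_common(3):
    print(pair, count)
```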

NatWest is tackling this. It has invested £100,000 in a desktop data mining tool, Brann Viper (from Brann Software), which provides its marketing departments with easier access to their database of 50 million records.

The company has recently divided into six key business units – card, retail banking, insurance, life and investment, mortgage, and corporate business services. To gain maximum benefit from the commonly-held data in its Customer Information System, NatWest was looking for software that would provide easy access to its customer information and help its marketing users analyse the data in a way that fitted their working style.

“We could see we had significant opportunities for making use of our data,” says NatWest’s national marketing manager, Nigel Gatehouse. “We have huge numbers of established customers, millions of customer contacts, and a unique picture of the financial affairs of millions of people,” he says. “Our marketing professionals needed a desktop data analysis tool that would kickstart their thinking and then help to build a ‘train-of-thought’ analysis process. They needed to be able to test hypotheses and work in a dynamic context to plan and create new products and business,” he adds.

Brann Viper can handle large volumes of data and provides NatWest with performance that, in the past, would only have been possible on a sizeable mainframe computer. Initially, the product has been set up on workstations in nine offices. Campaign, brand strategy and business development managers have been trained to use the system. They are checking the quality of the data held and learning what the data contains and how to manipulate and visualise it.

American Express is using highly complex neural computing formulae to scrutinise the hundreds of millions of entries in its database, which record how and where its cardholders spend their money. Cardholders receive an extra page of “offer” information with their bills, customised and printed in only two hours using database publishing software developed by Group 1, a US software company.

Micro-marketing computer models work out the offers – which can range from car hire offers to cut-price deals in French restaurants – that best suit the buying habits of each customer.

“Basically, marketing modellers take the purchasing data and assign each customer a score,” says Barry Hill, Amex’s senior vice-president of product development. “The promotional offers we have from our merchants can then be matched against each customer’s propensity to respond to a specific offer.”
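Stripped of the neural network mathematics, the process Hill describes has two steps: score each customer’s propensity to respond to each offer, then serve the offer with the highest score. The sketch below illustrates that matching logic in Python; the spending profiles, offers and scoring rule are all invented stand-ins for Amex’s models.

```python
# Invented cardholder spending profiles (share of spend by category).
customers = {
    "C1001": {"travel": 0.6, "dining": 0.1, "retail": 0.3},
    "C1002": {"travel": 0.1, "dining": 0.7, "retail": 0.2},
}

# Each offer is weighted towards the spending categories it appeals to.
offers = {
    "car hire discount":        {"travel": 1.0},
    "French restaurant deal":   {"dining": 1.0},
    "department store voucher": {"retail": 1.0},
}

def propensity(profile, offer_weights):
    """Toy propensity score: overlap between spend profile and offer weights.
    A real modeller would use a fitted statistical or neural model here."""
    return sum(profile.get(cat, 0.0) * w for cat, w in offer_weights.items())

# Match each customer to the offer they are most likely to respond to.
for cust_id, profile in customers.items():
    best = max(offers, key=lambda name: propensity(profile, offers[name]))
    print(cust_id, "->", best)
```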

Gerald Chertavian, managing director of the database marketing company Conduit Communications, insists that all these systems should be evaluated in terms of cost effectiveness.

“In general, there have been some very significant advances in data manipulation software over the past few years,” he says. “So-called ‘multi-dimensional analysis’, ‘object-based reporting’ and ‘intelligent agents’ are but a few of these achievements. However, the real question is: Do these software tools provide a return on investment?” says Chertavian.

The answer is driven partly by the declining cost of the software tools themselves, but also by their improved ability to access corporate information from databases that may be spread across the company.

“An enormous amount of data which would be of use to marketing departments is held elsewhere within the company and they don’t have easy access to it,” says Gavin Roach, senior consultant within IBM’s newly-formed Network Computing Group (formerly the Client Server Group).

Combined, these two improvements are significantly increasing the cost-effectiveness of data manipulation.

“The tools being used today are not difficult to learn to use, although they do require a clear idea of what information you are looking for and why,” adds Chertavian. “This is where many seem to have difficulty, largely because they have not yet defined the key pieces of information which drive their business forward.

“For instance, if as marketers we are most interested in cross-selling, then the data that we need to have on the database must reflect current product usage versus potential product usage,” he continues. “Without this data, the best data manipulation tools in the world aren’t going to help you cross-sell.”
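Chertavian’s cross-selling example can be made concrete: if the database records both the products each customer currently holds and the products their profile suggests they could hold, the cross-sell candidates are simply the gap between the two sets. A minimal Python sketch, with invented customers and products:

```python
# Invented records: products each customer currently holds, and the products
# their segment or profile suggests they could plausibly hold.
current_usage = {
    "C1001": {"current account", "credit card"},
    "C1002": {"current account"},
}
potential_usage = {
    "C1001": {"current account", "credit card", "personal loan"},
    "C1002": {"current account", "credit card", "mortgage"},
}

# Cross-sell candidates are the products in the potential set but not yet held.
for cust_id, held in current_usage.items():
    gaps = potential_usage[cust_id] - held
    if gaps:
        print(cust_id, "cross-sell candidates:", sorted(gaps))
```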

Additionally, data manipulation tools do require some understanding of technology as well as an analytical approach. This is an area where marketing departments need to build their skills over time.

“At the end of the day, analysis requires an analytical mind,” says Chertavian, “and marketers of the future will need to be both technically literate and analytical.”

The gap between marketers’ understanding and their technical ability is one he expects to narrow, particularly as technology advances and becomes a more standard part of the marketing remit.