A decade ago, the marketing director of a major high street bank confessed to me: “Until we put in our new computer system last month, I couldn’t have told you how many customers we had. I could have told you how many accounts we had, but there was no way we could relate accounts to the individuals who held them.”
At the time, that wasn’t such a remarkable situation for a bank to be in: marketing, for the financial services industry, was a relatively new concept, and the banks and insurance companies were drafting in packaged goods experts to help them get up to speed.
But you might expect that after ten years of exposure to consumer marketing, and with the white heat of the information technology revolution supposedly transforming the way we all do business, everybody in financial services would by now have a good idea who their customers are, what products they are using, and what they are likely to be interested in buying.
Yet only last week, one data mining expert described how his company was helping a major financial institution perform exactly the same exercise as this particular bank did ten years ago – reconciling account details with actual people.
The truth is that the power of the computer to collect raw data about consumers and their habits has far outstripped the ability of the marketer to process it into usable, value-added information.
Fertile ground for those companies which offer data mining software packages and related consultancy services: but there are those who question whether the approaches many of those companies are offering are actually as valuable as they seem.
Robin Coles, director of retail solutions for STS, says bluntly: “We don’t believe in data mining per se. A lot of people are buying data mining software, but what they’re doing is creating a data landfill, and never getting anything useful out of it.”
Coles’ point is echoed by Mike Smith, director of European marketing at data mining software company Applix UK, who says: “A few years ago, people were getting into data warehouses. But how do you access the information?”
STS itself offers a package called MarketWorks, with users including Calvin Klein, Giorgio Armani and Warner Bros. Coles says: “The idea behind the product is that the marketing department has to be able to use it itself, rather than having to pass queries through to the IT department to run.”
It used to be that the marketing department would hold a meeting on a Friday morning to discuss the questions it wanted the corporate mainframe to answer. This list would be given to the IT department, which would set the computer running over the weekend. If everything worked, the answers would be ready by Monday morning.
Then, IT departments were the guardians of the database. Now, everyone has a PC on their desk, and every department has its own databases of information on the people it has regular contact with – and frequently those databases are, at best, structured differently in terms of how many fields make up a record and where data is stored, and at worst held in incompatible file formats.
That explains the data warehouse phenomenon, where clients were encouraged to create super databases, amalgamating all their in-house databases into one. But the data warehouse movement seems to have had its day.
And, as Michael Page, director of interactive services at database and direct marketing firm Acxiom explains, there are good reasons why separate databases need to be maintained for separate departments.
For one thing, companies may not wish to change the formats of databases, which have been built up over long periods of time (and which were perhaps once kept on paper), particularly if those databases are central to a business’s operations. “I can understand why people are terrified of changing their databases – once you’ve got the core to work, you don’t want to risk changing things,” says Page.
This is one reason why so many data mining companies are now advising clients to transfer their data over to a completely new database if they want to use it for marketing analysis purposes, leaving the original untouched.
Page says: “The methodology has been don’t mess with the central operational system. Take the data out and set up a marketing database on a separate system.”
At one time, databases had to be held on central computers, because desktop PCs simply didn’t have the processing power to handle sophisticated analysis, let alone the memory to hold that many records. When PCs did become powerful enough, different departments tended to buy different database programmes and set up their own files.
But the development over the past 18 months of client-server based systems and Intranets has had a major impact on the use of data mining by marketers.
“Client-server” systems allow companies to store the database on a central computer (the server) with individual PCs on people’s desks (the clients) pulling off the data over the Intranet as it is needed. If it sounds a lot like the old days of dumb terminals, it is – except that the PC isn’t dumb, and can be used for other tasks.
And, with the latest programming platforms, such as Sun’s Java, there is not even a need for each PC to have its own copy of the analysis software – PCs can download the programmes they need to crunch the data at the same time as they download the data itself.
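The client-server arrangement described above can be sketched in miniature: a central machine holds the customer database, and desktop clients pull off only the records they need over the network. This is an illustrative sketch, not any vendor's actual system – the record, port handling and field names are all invented for the example.

```python
# Minimal client-server sketch: the database lives on one central
# machine; clients request records over the network as needed.
# All data and names here are illustrative.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# The central database lives on the server, not on each desktop PC.
CUSTOMERS = {"1001": {"name": "A. Smith", "accounts": 3}}

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Look up the requested customer record and return it as JSON.
        record = CUSTOMERS.get(self.path.strip("/"), {})
        body = json.dumps(record).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

# Bind to port 0 so the OS picks a free port for the demo.
server = HTTPServer(("127.0.0.1", 0), Handler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# The "client" side: fetch one record from the central store.
with urlopen(f"http://127.0.0.1:{port}/1001") as resp:
    record = json.loads(resp.read())
print(record["name"])

server.shutdown()
```

The desktop machine here is not a dumb terminal: it parses, analyses and displays the data itself, and is free for other tasks when it is not querying the server.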
When it comes to analysis packages, though, there seems to be some dispute about the relative validity of the statistical techniques that lie at the core of them.
Kevin Slatter is data services account director of The Computing Group. His company uses two data analysis packages, DB Query and SBase. DB Query is primarily used by The Computing Group in-house, although it has sold the package to a number of clients. SBase works on client-server technology and allows for three-dimensional analysis of data. What that means, Slatter says, is that whereas a basic analysis tool might look at a three-by-three matrix, so giving nine cells, SBase adds a third axis, making a total of 27 cells.
He says: “The key to data mining is looking at commonalities between different kinds of data, and three axes give you a hell of a lot more flexibility.”
Slatter admits that “to statisticians, that’s not greatly new. But you’re now seeing packages on the market which are user-friendly enough to put on a desktop to be used by the marketer. That means response rates should be increasing, and costs falling.”
But while Slatter might be promoting the applicability of three-dimensional modelling, Applix’s Mike Smith weighs in on behalf of OLAP – On-Line Analytical Processing.
As with three-dimensional analysis, OLAP isn’t new – but Smith says that it has traditionally been restricted to large-scale number crunching for financial services and insurance companies, looking for ways to screen out risks.
He adds: “Those were the sort of people who understood statistics and who had huge amounts of money to spend on computers and developing software.”
Now, however, it is becoming available for PC-based systems and even notebook computers, with Applix’s software including features which allow distribution over the Internet and Intranets. And Applix, which has been a long-term player in the financial services database analysis market, has started to offer its expertise to marketers in the packaged goods area.
Apparently, OLAP “used to require that you knew what you were going to ask the database”. Hence the need for the Friday meetings, to discuss just what questions the marketing department wanted answered.
But obviously if you can use an analysis programme to highlight interesting commonalities within your database (or databases), that makes it much more flexible and useful than if you need to know the exact question you want to ask in advance. And the latest programmes also allow the use of more recent – even real time – data, rather than historical data as has been the case in the past.
Smith says: “OLAP isn’t new, but analysis packages used to be reactive. Now we’re training clients on how they can become proactive. We’re helping them look at how you acquire customers and how you can keep them longer.”
And the stress is on the training and helping, rather than on the software itself. Companies like Applix or The Computing Group do not sell themselves on the strength of their software packages, but on their consultancy skills – in other words, helping clients work out what information they need, how to set up the database so that it functions, and how to assess the results.
Steve Delo, head of modelling and analysis at geodemographic company CACI, says, “The most important tool in data mining is not sophisticated up-to-date software, but the telephone. Data mining is at its most effective when regular communication is established between business management and the analyst. Data mining that can drive business, including customer acquisition and retention, is the result of vision and of people who ask the right questions.”
Delo says that “being able to interrogate vast numbers of records at speed is of no use if the pattern identified adds nothing to the understanding of customers, or if ideas cannot be implemented”. As a case in point, Delo cites the classic “nappies and beer” scenario, where a retailer noticed that if a man shopping on a Friday night bought nappies, he would most likely also buy beer, and vice versa. Delo asks: “But what retailer is going to reconfigure the store layout for nappies and beer just for Friday nights?”
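The “nappies and beer” pattern is a co-occurrence count: how often do two products turn up in the same basket? A toy version can be sketched in a few lines – the baskets below are invented, and this is the bare counting idea rather than any particular vendor's algorithm.

```python
# Toy market-basket analysis: count how often pairs of products
# appear together in the same basket. Baskets are invented.
from collections import Counter
from itertools import combinations

baskets = [
    {"nappies", "beer", "crisps"},
    {"nappies", "beer"},
    {"milk", "bread"},
    {"nappies", "milk"},
]

pair_counts = Counter()
for basket in baskets:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

# ("beer", "nappies") co-occurs in 2 of 4 baskets - the kind of
# commonality a data mining pass would surface automatically.
top_pair, count = pair_counts.most_common(1)[0]
print(top_pair, count)
```

Finding the pattern is the cheap part, which is Delo's point: the expensive question is whether anyone can, or should, act on it.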