Shifting predictive models to a more optimal paradigm

Combining engineering and statistical disciplines, the Smart Technology Research Centre at Bournemouth University looks set to challenge the way models are built. Professor Bogdan Gabrys explains.

If you want to understand customer behaviour in order to predict churn, where would you look for predictive models?

The obvious answer would be established statistical models that allow you to optimise marketing activity within an operational environment. But what if there was a better model to be found in ecology or biology and their understanding of natural systems?

Likewise, if you need an analytical resource to transform performance in your marketing or customer management function, where would you look to find it? Typically, it might be among one of the numerous specialist data insight houses that have been spawned from DunnHumby. But what if you could get something altogether fresher by drawing on the world of academic research?

Bournemouth University’s Smart Technology Research Centre has already proven its abilities on churn projects for BT, transactional analysis for Screwfix Direct and others. Now with an analytical software platform in development, the centre is about to become a competitor to existing marketing analysts and the software vendors they tend to rely on.

“One of the things I’m excited about is the paradigm shift away from the knowledge-intensive and time-consuming approaches that have dominated over the last century, towards engineering-based thinking,” says Professor Bogdan Gabrys of the STRC. The older approach will be familiar to anybody used to marketing-oriented analytics and its constant drive to optimise outcomes.

“What happens is that you optimise a very narrow set of scenarios, which is very inflexible,” he says. By contrast, the approach adopted by the centre is for modelling that draws on multiple competing possibilities in order to identify the best one.

“As part of the shift, the work we do is moving from a single, optimised model tuned to death by an expert and then operationalised to a solution that is very performance-based, driven by multi-level optimisation algorithms. As part of the platform we are developing, we are trying to incorporate competing arguments and have solutions working in parallel, then picking the best one,” says Gabrys.
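The idea of running competing models in parallel and promoting the best performer can be illustrated with a minimal sketch. This is not STRC's platform or algorithm, just an assumed toy version of the principle: several candidate models (here, simple tenure-cutoff rules on synthetic churn data) are evaluated side by side and the one with the best held-out score wins.

```python
# Hypothetical sketch of the "competing models" idea: fit several
# candidates, score each on held-out data, promote the best one.
# Data, model names and cutoffs are all illustrative.
import random

random.seed(0)

# Synthetic "churn" data: x is months of tenure, y is 1 if the
# customer churned (here, everyone under 12 months churns).
data = [(x, 1 if x < 12 else 0) for x in range(0, 48, 2)]
random.shuffle(data)
train, valid = data[:16], data[16:]

def threshold_model(cutoff):
    # Predict churn whenever tenure falls below the cutoff.
    return lambda x: 1 if x < cutoff else 0

def accuracy(model, rows):
    return sum(1 for x, y in rows if model(x) == y) / len(rows)

# Competing candidates, evaluated in parallel (conceptually).
candidates = {f"cutoff<{c}": threshold_model(c) for c in (6, 12, 24)}
scores = {name: accuracy(m, valid) for name, m in candidates.items()}
best = max(scores, key=scores.get)
```

In a live system the candidate pool would be refreshed and re-scored continuously, so the "best" model changes as the data does, which is the dynamic behaviour the article describes.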

This new solution is more dynamic and collaborative than the conventional approach to analytics. Whether it is the automated modelling in the software or the individual analysts working on a project, the result is a constant sequence of solutions that respond to changes in the environment or marketplace, rather than assuming that the perfect model at the start of a project will still be the best one by the end.

Creating the next best model can take time, which is why having an automated system running in the background will ultimately allow STRC to ensure its human experts are constantly focused on the higher level goals, rather than the short-term needs. Those experts are drawn from across disciplines, including ecology and biology as well as engineering, to ensure fresh thinking is constantly applied to problems.

“We might look at how social networks operate and some of the algorithms we would build might draw on behaviour you can observe in nature,” points out Gabrys. “You can gain a lot of insight from different sciences.”

STRC itself was formed in 2007 from the merger of two research groups, one working on computational intelligence – building predictive models based on the processing of high-volume data sets and automating those techniques – and the other on bio-medical engineering – extracting information from sensory systems.

The heart of the centre is about automating these processes to allow multi-strand models to work in competition, rather than relying on a single expert or model with a narrow focus. The data sets used and the purpose for the model can be any number of things, ranging from airline load modelling to distribution or risk.

Gabrys says the centre continues to pursue academic research and has recently been part of a tripartite consortium that won a multi-million euro research project into chemical engineering issues. But as its work on churn prediction for BT demonstrates, the commercial market beckons.

Launching a predictive modelling system into a market dominated by SAS and with competition from SPSS and KXEN will be a challenge. But Gabrys argues that, “their solutions are very limited and usually rely on generic techniques like linear regression, decision trees or neural networks. What we are working on is a pervasively adaptive solution where you don’t have to rely on just one approach.”

David Reed, editor, Data Strategy


Rules are rules, even in a recession

Marketing Week

Is it possible for the direct marketing industry to leave behind its “we-know-best carpet bomber” past? Based on some recent analysis, it would seem that the issues of royalty reporting and usage reports have taken a major step backwards over the last two years. It is understandable that a reseller marketplace may want to create efficiencies during a recession, but that doesn’t mean throwing the rule-book away.

