
First Look – ThinkAnalytics


I spent some time with Peter Docherty of ThinkAnalytics recently, talking about their decisioning product. ThinkAnalytics grew out of K.wiz, a small team focused on automating the data mining process, with experience in data mining, real-time telecoms monitoring and data compression. They delivered an open, component-based platform designed so the engine could be embedded into other applications to automate analysis, in real time and with high performance. On top of this platform they have built a Content Recommendations Engine that combines consumer preferences, content features, implicit behavioral signals and more (there is a sketch of this blending idea below). The Content Recommendations Engine is initially aimed at the media industry (with an out-of-the-box product for TV listings and Video-on-Demand content) but can also be implemented for other sectors. They are CRM focused, mostly in B2C, and Telco and Media are their top markets.

They have focused on making it easy to embed the whole application inside something else. In particular, easy embedding requires no manual transformation of data to model – something that often takes a lot of work with other products. The ease of embedding is particularly important in CRM, their core market, as many organizations have multiple CRM systems and delivering consistent decisions across all of them is a real pain.
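To make the recommendations blending concrete, here is a minimal sketch of how explicit preferences and implicit signals might be scored against an item's content features. All class and method names, and the fixed weights, are my own illustrative assumptions – this is not ThinkAnalytics' actual API.

```java
import java.util.Map;

/**
 * Illustrative hybrid scorer: blend a viewer's stated preferences and
 * implicit (behavioral) signals against an item's content features.
 */
public class HybridScorer {

    // Placeholder weights; a real engine would learn these from data.
    private static final double W_EXPLICIT = 0.6;
    private static final double W_IMPLICIT = 0.4;

    /** Dot product over the feature keys the two maps share. */
    static double overlap(Map<String, Double> a, Map<String, Double> b) {
        double sum = 0.0;
        for (Map.Entry<String, Double> e : a.entrySet()) {
            Double v = b.get(e.getKey());
            if (v != null) sum += e.getValue() * v;
        }
        return sum;
    }

    /** Higher score = stronger recommendation for this viewer/item pair. */
    public static double score(Map<String, Double> statedPreferences,
                               Map<String, Double> implicitSignals,
                               Map<String, Double> contentFeatures) {
        return W_EXPLICIT * overlap(statedPreferences, contentFeatures)
             + W_IMPLICIT * overlap(implicitSignals, contentFeatures);
    }

    public static void main(String[] args) {
        Map<String, Double> prefs    = Map.of("drama", 0.9, "sci-fi", 0.2);
        Map<String, Double> implicit = Map.of("drama", 0.7, "late-night", 0.5);
        Map<String, Double> item     = Map.of("drama", 1.0, "late-night", 1.0);
        System.out.println("score = " + score(prefs, implicit, item));
    }
}
```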

ThinkAnalytics have also automated the process of creating and updating models – they often use the same algorithms and techniques as other data mining and analytic tools, but they do not assume that a statistician or analyst is the only user. ThinkAnalytics also give business users control of the models, avoiding the “black box” problem typical of other CRM analytics. This focus on model management by business users is something you don’t see very often – unlike, say, rules management by business users.

The tool itself has a nice mix of analytics and rules, though it is a little analytics-centric rather than completely even-handed. Their experience is that 80% of analysts’ time is spent on foundation tasks, so they have delivered lots of automation for these tasks and for rebuilding models as data and circumstances change. They are very focused on ensuring that this automation is not “black box” so that the resulting models can be understood and validated. They support all the usual analytic techniques but the engine is also completely extensible – their own algorithms use the same APIs as the pluggable interface open to others. They integrate with SAS as well, allowing reuse of SAS routines for data cleansing and manipulation, for instance.
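That pluggable interface is easier to picture with a sketch. Assuming a hypothetical contract (these interface and class names are mine, not ThinkAnalytics’), an engine that consumes its own algorithms through the same API it exposes to third parties might look like this:

```java
import java.util.List;

/** Hypothetical plug-in contract: vendor and third-party algorithms
 *  implement the same interface, so the engine treats them uniformly. */
interface MiningAlgorithm {
    String name();
    Model train(List<double[]> rows, int targetColumn);
}

/** A trained model; functional so trivial models can be lambdas. */
interface Model {
    double predict(double[] row);
}

/** A trivial third-party plug-in: predicts the target column's mean. */
class MeanPredictor implements MiningAlgorithm {
    public String name() { return "mean-baseline"; }

    public Model train(List<double[]> rows, int targetColumn) {
        double mean = rows.stream()
                          .mapToDouble(r -> r[targetColumn])
                          .average()
                          .orElse(0.0);
        return row -> mean;
    }
}
```

The design point is that nothing distinguishes a built-in algorithm from an external one at the API level, which is what makes the engine genuinely extensible.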

In addition to the Content Recommendations Engine, they have two essentially identical products – one aimed at supporting real-time execution and one at batch execution. The capabilities and design tools appear to be the same; what runs in real time is the execution of models and rules, not model development. The workbench and deployment are all handled from a single, Java-based product designed from the ground up to be distributed, extensible and easy to embed.
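The value of keeping the two products essentially identical is that one trained model can serve both execution styles. A rough sketch of that design point, with hypothetical names (this is not their actual API):

```java
import java.util.List;
import java.util.function.Function;

/** One trained model, two execution paths: the benefit of essentially
 *  identical real-time and batch products. Illustrative only. */
class DualModeScorer {
    private final Function<double[], Double> model; // trained elsewhere

    DualModeScorer(Function<double[], Double> model) { this.model = model; }

    /** Real-time path: score a single record as a request arrives. */
    double scoreOne(double[] record) {
        return model.apply(record);
    }

    /** Batch path: the same model applied across a whole data set. */
    double[] scoreAll(List<double[]> records) {
        return records.stream().mapToDouble(model::apply).toArray();
    }
}
```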

The workbench has a repository that supports collaborative model development and allows sharing and reuse of model components, data transformations and so on. Reporting and visualization for analysts are both extensive and well integrated. They have the concept of a process plan, which is somewhat like a decision flow but more data-centric: some process plans are designed to develop models, while others are specified to use those models in decisions. Rules and ruleflows can be integrated into process plans, and plan steps can be linked to the visualization tools – allowing a plan to be defined primarily to populate a visualization, for instance. While the rules capabilities do not seem as robust as those of a pure-play BRMS, there is plenty of functionality and it is nicely integrated with the models. Business users can be given controlled access to both rules and models, and there are some nice automatic monitoring tools to do things such as notify modelers when the predictive power of a model drops below a certain level.
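That last monitoring feature is a concrete mechanism worth sketching: track a model’s recent predictive power and alert when it sags below a threshold. The class, the rolling-window scheme and the notification hook here are all my assumptions, not their implementation.

```java
import java.util.ArrayDeque;
import java.util.Deque;

/** Sketch of model-power monitoring: track rolling accuracy and
 *  notify the modeler when it drops below a threshold. Illustrative. */
class ModelPowerMonitor {
    private final Deque<Boolean> window = new ArrayDeque<>();
    private final int windowSize;
    private final double threshold;
    private final Runnable notifyModeler;

    ModelPowerMonitor(int windowSize, double threshold, Runnable notifyModeler) {
        this.windowSize = windowSize;
        this.threshold = threshold;
        this.notifyModeler = notifyModeler;
    }

    /** Record one scored outcome; alert once the window is full and weak. */
    void record(boolean predictionWasCorrect) {
        window.addLast(predictionWasCorrect);
        if (window.size() > windowSize) window.removeFirst();
        if (window.size() == windowSize && accuracy() < threshold) {
            notifyModeler.run();
        }
    }

    double accuracy() {
        long correct = window.stream().filter(b -> b).count();
        return (double) correct / window.size();
    }
}
```

Wired up, something like `new ModelPowerMonitor(1000, 0.7, () -> sendAlert())` (with `sendAlert` standing in for whatever notification channel you use) would fire whenever rolling accuracy over the last thousand decisions falls under 70% – the kind of automatic check that keeps deployed models honest.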
