Table of contents for Analytic Practitioners Speak
- A Practitioner Speaks: Analytics and Decision Management
- A Practitioner Speaks: Top challenges for analytic professionals
- A Practitioner Speaks: Requirements for analytic projects
- Another analytic practitioner speaks – an interview with Tracy Altman
- Analytic practitioners speak – an interview with author Nauman Sheikh
- Analytic practitioners speak – an interview with Lee Feinberg of DecisionViz
- Another analytic practitioner speaks: an interview with Matt Kitching of Apption
One of the questions I asked Andrea Scarso, co-founder and COO of MoneyFarm, when I interviewed him was “In your experience, what are some of the top challenges for analytic professionals in terms of maximizing the business impact of what they do?” In this post I’ll take a couple of his comments and expand on them a little.
“Sometimes I observe too much abstraction in model development. The improvement of the chosen metric becomes an objective by itself, and the only one.”
I see this too. It’s easy for an analytic team to become very focused on the performance of the analytic model – how predictive it is, how precise it is – using measures such as a model’s “Gini” (measuring discrimination) or its “KS” (measuring fit). Obviously it is important that a predictive analytic model is predictive, that it discriminates between (for instance) fraudulent and non-fraudulent cases, and that it fits your test data well. The problem is that focusing on these measures can lead the team to regard the model, and the production of an accurate model, as the end rather than just as the means.
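To make these two metrics concrete, here is a minimal sketch of how they are commonly computed for a binary classifier’s scores. The data is purely illustrative, and a real team would typically reach for a statistics library rather than hand-rolling this:

```python
def ks_statistic(scores, labels):
    """KS: the maximum gap between the cumulative score distributions
    of positive (e.g. fraudulent) and negative cases."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    best = 0.0
    for t in sorted(set(scores)):
        cdf_pos = sum(s <= t for s in pos) / len(pos)
        cdf_neg = sum(s <= t for s in neg) / len(neg)
        best = max(best, abs(cdf_pos - cdf_neg))
    return best

def gini(scores, labels):
    """Gini = 2*AUC - 1, where AUC is the probability that a randomly
    chosen positive case outscores a randomly chosen negative one."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return 2 * wins / (len(pos) * len(neg)) - 1

# Illustrative scores and true outcomes (1 = fraud, 0 = not fraud).
scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.2, 0.1]
labels = [1,   1,   0,   1,   0,   1,   0,   0]
print(gini(scores, labels), ks_statistic(scores, labels))
```

A higher Gini or KS says the model separates the classes better – it says nothing about whether anyone will act differently as a result, which is exactly the trap discussed above.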
Predictive analytic models don’t do anything – they just make predictions. As someone once told me, they turn uncertainty about the future into a usable probability. But they don’t act on that probability. It’s critical to remember this, and to make sure that your analytic team is focused not only on the analytic it is trying to build but also on the decision-making it is trying to influence. To make sure this is clear I strongly recommend that analytic teams develop a model of the decision-making they are trying to influence – see this white paper on Decision Modeling for Predictive Analytics Projects for details. Building up a decision model that shows where in the modeled decision-making the analytic plays a role, and what the other critical elements are (like regulations and policies or business expertise), helps focus the team on the business context for the analytic model (as I discuss in the role of decision modeling in a data-driven culture).
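The point that a model’s probability is just one input to a decision, alongside policies and regulations, can be sketched in a few lines. Everything here is hypothetical (the function name, thresholds, and rules are invented for illustration, not taken from any real system):

```python
def refer_claim_for_review(fraud_probability, claim_amount, repeat_claimant):
    """Hypothetical claims decision: the model's probability is only
    one input among policy and regulatory-style rules."""
    # Policy rule: very large claims are always reviewed, whatever the score.
    if claim_amount > 50_000:
        return True
    # Business-expertise rule: repeat claimants get a lower referral bar.
    threshold = 0.5 if repeat_claimant else 0.8
    # Only here, at the point of decision, does the model's output matter.
    return fraud_probability >= threshold

print(refer_claim_for_review(0.6, 1_000, repeat_claimant=True))
print(refer_claim_for_review(0.6, 1_000, repeat_claimant=False))
```

Even this toy version makes the structure visible: improving the model changes only one branch of the decision, so the team can see exactly how much (and how little) the analytic contributes.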
Interested? Why not come to the upcoming IIA Webinar: Improve Analytic Results with Decision Modeling
These models are also easy for non-data scientists to understand, so they help build collaboration across business, IT and analytics teams. This kind of collaboration (what I call the three-legged stool) is crucial, and setting up mixed teams at the very beginning of an “analytic” project is emerging as a solid best practice. Decision modeling is a technique around which the three groups can coalesce, one that gives them a framework for agreeing on the project, its scope and its purpose.
Andrea went on to say:
“Another common risk is that the professional developing the model loses sight of (or ignores from the beginning) the business objective.”
This is another reason I really like decision models for analytic projects. If the team has built a model of the decision-making to be influenced by the analytic, then it is in a position to tie that decision-making to the business objectives or KPIs it impacts. Think about it – if you make a repeatable decision differently, you are going to have an impact on your operational outcomes. Whether you are treating customers differently, flagging different transactions as fraudulent, or something else entirely, changing decisions changes results. In fact nothing else really does – if you don’t change your decision-making, you are unlikely to have any impact on your results. With a decision model in hand you can identify the specific business objectives, measures or KPIs that will be impacted by a change in any part of that model.
This linkage of decisions to objectives and KPIs helps move the discussion away from technical measures of model performance and toward measures of business performance. You can tell a business owner to use a new model not because it has a great KS but because it will help claims adjusters flag fraudulent claims more accurately and so reduce the fraudulent claims paid. You can also prioritize possible uses of your scarce data science resources based on which analytics will influence the decisions with the most business impact, all in terms of the objectives or KPIs that matter to business executives. This may seem obvious, but it can be critical to the success of an analytic project.
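To illustrate what evaluating in business terms might look like, here is a minimal sketch in which a decision policy is judged by the KPI it moves (money paid out on fraudulent claims) rather than by a model metric. All data, names, and thresholds are invented for the example:

```python
# Illustrative claims: each has an amount, a model score, and a
# (retrospective) ground-truth fraud flag.
claims = [
    {"amount": 1_200, "score": 0.9, "fraud": True},
    {"amount": 800,   "score": 0.4, "fraud": True},
    {"amount": 5_000, "score": 0.7, "fraud": False},
    {"amount": 300,   "score": 0.2, "fraud": False},
]

def fraud_paid(claims, threshold):
    """Business KPI: total paid on fraudulent claims under a policy
    that refers (and so does not auto-pay) claims scoring at or
    above the threshold."""
    return sum(c["amount"] for c in claims
               if c["fraud"] and c["score"] < threshold)

print(fraud_paid(claims, 0.8))  # lenient referral policy
print(fraud_paid(claims, 0.3))  # aggressive referral policy
```

The same comparison could of course weigh the cost of reviewing legitimate claims; the point is only that the number being optimized is one a business executive recognizes.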
And let’s give Andrea the last word:
“The best way to overcome these challenges is to strictly tie the analytic development to the decisions taken in the organization.”
If you want to use Decision Modeling to improve your analytic projects, check out our Decision Modeling services and sign up for a 60-day free trial of DecisionsFirst Modeler, our cloud-based, collaborative decision modeling platform. If you need more general help adopting predictive analytics, check out our services to help you get started with predictive analytics.