Table of contents for Analytic Practitioners Speak
- A Practitioner Speaks: Analytics and Decision Management
- A Practitioner Speaks: Top challenges for analytic professionals
- A Practitioner Speaks: Requirements for analytic projects
- Another analytic practitioner speaks – an interview with Tracy Altman
- Analytic practitioners speak – an interview with author Nauman Sheikh
- Analytic practitioners speak – an interview with Lee Feinberg of DecisionViz
- Another analytic practitioner speaks: an interview with Matt Kitching of Apption
I have been interviewing analytic practitioners periodically – last year I interviewed Andrea Scarso, CEO of MoneyFarm, and more recently I interviewed Tracy Allison Altman, co-founder of Ugly Research. Next up is Nauman Sheikh, a seasoned data and analytics professional. Nauman was introduced to me by a client who really liked his book Implementing Analytics: A Blueprint for Design, Development, and Adoption (as do I). Nauman and I share a passion for making sure that analytics are not just developed but deployed into decision-making systems and processes.
What’s your background, how did you come to be working in analytics?
My background is purely in computer science. I started in software development with object-oriented programming and then switched to database development and data modeling. That led me to work on data warehouse systems very early in my career – I dealt with a billion-row fact table back in 1998, which was certainly very Big Data for those times. I built and rolled out a lot of data warehouse systems and data marts, and in 2006 I started providing data to statisticians. That got me curious about what data they needed, why, and what they were doing with it. So I learned the fundamentals of regression analysis and prediction in direct marketing and consumer risk at Experian. When I started working in their global consulting practice in emerging markets, I realized the challenges of analytics adoption presented by relying on statisticians, and I experimented with machine learning back in 2009. I have been focused on building analytics solutions using data mining techniques ever since.
What are the primary kinds of analytics you build at the moment?
There are several active projects using predictive modeling (data mining based) to detect fraud in government welfare programs like Medicaid, Unemployment Insurance, and subsidized housing. There is also some R&D going on with clustering and outlier detection, as well as social network analysis, for fraud detection.
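To make the outlier-detection idea concrete, here is a minimal sketch of my own – not Nauman's implementation – that flags unusually large values with a simple z-score rule on synthetic claim amounts:

```python
import statistics

def flag_outliers(amounts, z_threshold=2.0):
    """Return the amounts whose z-score exceeds the threshold."""
    mean = statistics.mean(amounts)
    stdev = statistics.stdev(amounts)
    return [a for a in amounts if abs(a - mean) / stdev > z_threshold]

# Synthetic claim amounts: mostly routine, one extreme value.
claims = [120, 135, 110, 140, 125, 130, 118, 5000]
print(flag_outliers(claims))  # → [5000]
```

Real fraud-detection systems use multivariate techniques (clustering, isolation forests, network analysis); the univariate z-score here only illustrates the core idea of scoring each case by its deviation from the norm.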
In your experience what are some of the top challenges for analytic professionals in terms of maximizing the business impact of what they do?
I see two trends that I believe are both going to make it very challenging for analytics professionals to justify their value to the business and management. One trend is too much focus on Big Data technology, like high-end database appliances, in-memory systems, and NoSQL-type technology. The other trend is focusing too much on the model, especially using statistical techniques, because it is very difficult to explain these models to business users. The first trend will get you to invest too much money up-front before any meaningful value can be achieved, and management will get impatient – the same thing happened with early data warehousing projects, which were too expensive, too large, and took too long to deliver value. The second trend leaves the business unable to understand and make use of the models, especially if they have not been actively engaged throughout the model development process.
What have you found that helps meet these challenges? How have you evolved your approach to analytics to maximize the business impact of what you do?
My approach to analytics and to these challenges is “simplicity”. Existing data warehousing infrastructure and technology should be used to build and roll out the first few projects, building the momentum for larger investment in Big Data technology. Business engagement will also improve if you simplify the model in terms of its use rather than its underlying mathematics. Simplifying the use of a model always means engaging the business at the operations level on what the model will produce (almost treating the model as a black box) and how they would use it in their day-to-day operations. They should understand the use of a model before a single data field or variable is selected for analysis and model development. Once the model is ready, decision strategy simulations on historical data are also critical so they start believing in the black box – which they will never understand or appreciate no matter how good a model has been put together. If they see a retrospective analysis of their decisions supported by the model, they will adopt its output for their decisions, and that is the essence of analytics success: “complementing current business decisions rather than replacing them”.
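The retrospective analysis described above can be sketched as a replay over historical cases. This is a hypothetical illustration with made-up scores and outcomes, not Nauman's actual simulation: the model's flag is compared against what the existing process caught.

```python
# Hypothetical historical cases: (model_score, flagged_by_staff, turned_out_fraudulent)
historical_cases = [
    (0.91, True,  True),
    (0.85, False, True),   # missed by staff, would be caught by the model
    (0.40, False, False),
    (0.75, False, True),   # missed by staff, would be caught by the model
    (0.30, True,  False),  # staff false positive
    (0.20, False, False),
]

THRESHOLD = 0.7  # assumed cut-off for the model's "review this case" flag

# Replay history: how many fraud cases would each approach have caught?
model_hits = sum(1 for s, _, fraud in historical_cases if fraud and s >= THRESHOLD)
staff_hits = sum(1 for _, flagged, fraud in historical_cases if fraud and flagged)
print(f"fraud cases caught: model {model_hits}, existing process {staff_hits}")
```

Showing this kind of side-by-side comparison on the business's own history is what builds trust in the black box before it touches live decisions.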
How, specifically, do you develop requirements for analytic projects?
Again, simplicity. I start with their existing data warehouse reporting – the metrics and KPIs that they understand and live by on a day-to-day basis – and I simply ask them: what if I could predict this same KPI a few days or weeks before the report comes out? What would you do with that information? That gets them going; they start to understand the model’s output and begin asking questions about the operational decisions they could alter, given some knowledge beforehand, to ensure the eventual metric hits expectations. I then provide some hand-holding around decision modeling, showing them how to look at their business processes and how to model decisions and experiment. The discussion is then really taken over by the business SME, and I just sit back and take notes.
There’s a growing interest in rigorously modeling decisions as part of specifying the requirements for an analytic project. How do you see this approach adding value for analytics professionals?
It’s a useful exercise, but it shouldn’t be done in a vacuum. What I mean is that the conversation shouldn’t start with “what are your business process flows and where are the decision points in those flows?” The discussion should start with an objective around their operational metrics and KPIs – how would they improve those if they employed analytics? Once the objective is understood, the decision modeling exercise should kick in within that context. Extending that capability to run simulations and what-ifs has to be an integral part of decision modeling, as does a champion-challenger approach to keep testing new decision strategies.
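A champion-challenger approach can be sketched as deterministic routing of cases between two decision strategies. This is a hypothetical illustration (the function name, seed, and 10% challenger share are all my own assumptions):

```python
import random

def route_case(case_id, challenger_share=0.1, seed=42):
    """Assign a case to the champion or challenger decision strategy.

    challenger_share is the fraction of traffic sent to the new strategy.
    Seeding per case makes the assignment reproducible, so the same case
    is always routed the same way when the history is replayed.
    """
    rng = random.Random(f"{seed}-{case_id}")
    return "challenger" if rng.random() < challenger_share else "champion"

# Route a batch of hypothetical case IDs and tally the split.
assignments = [route_case(i) for i in range(1000)]
print(assignments.count("challenger"))  # roughly 100 of 1000 cases
```

Once outcomes accumulate on both arms, the challenger strategy that beats the champion on the agreed KPI becomes the new champion, and the cycle repeats.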
Anything else you would like to share?
I see analytics the way Paul O’Neill saw safety for Alcoa when he took over back in 1987. If you are changing the culture of an organization so that people are required to review every business decision with an analytics magnifying glass and then optimize it, then the key to this cultural revolution across all aspects of the business has to be a simple message that everyone understands and follows – maybe they even start to understand and use it in their personal lives. Hence the key to the success and adoption of analytics is simplicity. If you are not familiar with the Alcoa case study, I would highly recommend its lessons for analytics.
Last question – what advice would you give analytic professionals to help them maximize the value they create for their organization?
The combination of business, technology, and mathematics is what makes analytics work. However, this combination of skills is very hard to come by, so my advice is not to try to master all three. Instead, improve communication and build a process so all three are addressed by the respective experts, with communication built around the “simplicity” of concepts so they understand each other.