Tom and Jeanne have written a new book (building on a paper they wrote some time ago) about what they call “analytic competitors”, that is to say companies that use their analytic prowess not just to enhance their operations but as their lead competitive differentiator. The book discusses a number of these analytic competitors and gives an overview of how analytics can be used in different areas of the business and how you can move up the analytic sophistication scale. So what are analytics? Tom and Jeanne say:
“the extensive use of data, statistical and quantitative analysis, explanatory and predictive models, and fact-based management to drive decisions and actions”
The book has two parts – one on the nature of analytical competition and one on building an analytic competency. The first describes an analytical competitor and how this approach can be used in both internal and external processes. The second lays out a roadmap for becoming an analytical competitor, how to manage analytical people, a quick overview of a business intelligence architecture and some predictions for the future.
They define an analytical competitor as an organization that uses analytics extensively and systematically to outthink and outexecute the competition. The analytics are in support of a strategic distinctive competency and they argue, persuasively, that without a distinctive capability you cannot be an analytic competitor. They also note that analytical competitors need a primary focus but that, once created, the culture of test-analyze-learn spreads widely. They argue that to be successful, analysis has to become a broad skill of the company, not just the province of a few rocket scientists, and they repeat the famous Red Sox story about a manager who did not believe the analytics and so lost the big game. While I agree with this overall, I think organizations sometimes confuse having an analytic mindset across the company with teaching everyone analytic skills. Many of the people in your organization do not, in fact, need analytic skills so much as an understanding of when analytics and other decision automation help and how to use them effectively. Indeed the authors refer to Malcolm Gladwell’s Blink, pointing out that even snap judgments are best when backed by an understanding of many previous encounters and the analysis of them. You may not be able to give all your staff the experience to improve their snap judgments, but you can use analytics to do so.
The book outlines what they call the four pillars of analytical competition – a distinctive capability, enterprise-wide analytics, senior management commitment and large-scale ambition. They lay out five stages of analytic competition, from “analytically impaired” to “analytic competitor” (something I saw Tom present at a Teradata conference). The importance of experimentation is made clear (e.g. Capital One runs 300 experiments on any given day) and the book repeatedly emphasizes the need for companies and executives to be willing to run the business “by the numbers”.
The book is full of stories about how companies compete analytically. Some of my favorites included:
Capital One’s focus on identifying and serving new market segments before its peers can. They have a lovely concept of “deaveraging” – breaking a segment into small segments for better targeting – that everyone should consider and that comes up in my posts on segmentation.
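The deaveraging idea is simple enough to sketch in code. This is a minimal illustration of the concept, not Capital One's actual method – the customer fields and the banding function are invented for the example.

```python
# "Deaveraging" sketch: break one averaged segment into finer
# sub-segments that can each be targeted on its own behavior.
from collections import defaultdict

def deaverage(customers, key):
    """Group one broad segment into sub-segments by `key`."""
    subsegments = defaultdict(list)
    for c in customers:
        subsegments[key(c)].append(c)
    return dict(subsegments)

# e.g. split "card holders" by spend band instead of treating the
# average card holder as one audience (hypothetical data):
cards = [{"id": 1, "monthly_spend": 120}, {"id": 2, "monthly_spend": 2400}]
bands = deaverage(cards, key=lambda c: "high" if c["monthly_spend"] > 1000 else "low")
```

The point is that each sub-segment can now get its own offer and its own model, rather than everyone getting the treatment tuned to the average.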
Marriott’s total hotel optimization shows the importance of new measures. They created one called “revenue opportunity” or what percentage of the theoretical maximum revenue they actually made. Not only did they get this to rise an amazing 8% but it shows the value of getting your measures right.
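The metric itself is just a ratio, which is part of why it works – everyone can understand it. Here is a minimal sketch; the dollar figures are hypothetical, not Marriott's numbers.

```python
# A minimal sketch of a "revenue opportunity" style metric:
# what fraction of the theoretical maximum revenue was actually made.
def revenue_opportunity(actual_revenue: float, theoretical_max: float) -> float:
    """Fraction of theoretical maximum revenue actually captured."""
    if theoretical_max <= 0:
        raise ValueError("theoretical maximum must be positive")
    return actual_revenue / theoretical_max

# e.g. a hotel that could have earned $200,000 at full occupancy and
# full rates, but actually earned $166,000 (invented figures):
print(f"{revenue_opportunity(166_000, 200_000):.0%}")  # prints "83%"
```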
Progressive Insurance is so certain that a competitor quoting a better rate than theirs is taking on an unprofitable customer that they are willing to tell you what their competitors’ rates are. I have blogged about Progressive before, and the book pointed out that they are now so fast-moving that they can often target a new segment and win much of it before competitors have managed to react!
The Veterans Administration’s use of evidence-based medicine and predictive analytics, along with automated decisions for treatment protocols, is noted, as is the fact that perhaps only 25-30% of medical decisions are scientifically based! Healthcare comes up a fair bit on the blog but I think there is a lot more to do here.
Honda makes good use of text analytics to flag early problems in cars by analyzing warranty claims and calls to headquarters from customers or dealers. This was a nice example where automated analysis and flagging was all that was needed to get value.
Toyota found that only 20% of possible users of yield management tools could use them effectively. Visualization tools helped, but it seems to me that this is one of the drivers for decision automation rather than other approaches – embedding better decisions in existing processes using decision services means less need for staff to learn to use the analytics.
Vertex, a pharmaceutical company, starts by identifying the right metric to measure success and then drills into the data needed to measure it. This is a great general approach – don’t just collect data, collect data with a purpose – “begin with the end in mind”.
Harrah’s focuses on real-time analytics at the point of sale so that action can be taken as data is being collected – e.g. when a customer is losing money, recommend and promote the buffet; when crowding in one area causes traffic jams, offer deals in slower parts of the casino.
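This kind of point-of-sale trigger is often just a small set of rules evaluated against each incoming event. A hedged sketch in the spirit of the Harrah's example – the event fields, thresholds and offers here are all invented:

```python
# Real-time event triggers: evaluate simple rules against each event
# as it arrives and return an action, or None if nothing fires.
# All field names and thresholds are illustrative assumptions.
def choose_offer(event: dict):
    """Pick an action from a point-of-sale event, or None."""
    if event.get("session_loss", 0) > 200:      # customer is losing money
        return "recommend and promote the buffet"
    if event.get("zone_crowding", 0.0) > 0.9:   # traffic jam in one area
        return "offer a deal in a slower part of the casino"
    return None
```

In a production system the thresholds would come from the analytics rather than being hard-coded, but the decision logic stays this simple at the point of execution.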
DnB NOR bank uses event triggers to prompt customer relationship offers, using analytics to trigger the right events.
The O2 mobile phone company uses personalized menus to maximize the value of limited phone screen real estate, applying predictive analytics to drive the personalization.
I loved this one as it is a great example of a hidden decision – the decision to display a certain set of options to a mobile phone user is often hidden as companies don’t think of each new list as a decision – they think of it as “the list”.
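Once you see the menu as a decision, the mechanics are straightforward: score the candidate options for this user and show the top few. A minimal sketch – the click probabilities here are a stand-in for whatever a real predictive model would produce:

```python
# Treating "the list" as a per-user decision: rank menu options by a
# predicted-value score instead of showing everyone the same fixed list.
def personalized_menu(options, score, slots=4):
    """Return the top-`slots` options for one user, highest score first."""
    return sorted(options, key=score, reverse=True)[:slots]

# e.g. hypothetical per-user click probabilities from a model:
clicks = {"sport": 0.6, "news": 0.2, "games": 0.5, "weather": 0.1, "music": 0.4}
menu = personalized_menu(list(clicks), score=clicks.get, slots=3)
# menu is ['sport', 'games', 'music'] for this user
```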
CEMEX used analytics and GPS to move the focus from the sale of a commodity (cement) to the delivery window. They went from three hours for a change to 20 minutes. Sometimes the power of analytics only comes with a different view of the problem.
Netflix focuses on giving each customer a personalized website experience based on recommendations, ratings and segmentation. Again, another example of regarding every single visit to the website as requiring a decision about what to display. I call this extreme personalization.
Tom and Jeanne also referenced another book I liked, Good to Great, about the power of “breakthrough results come about by a series of good decisions, diligently executed and accumulated one on top of another”. This is the mindset I think you need for Enterprise Decision Management – this focus on improving lots of operational decisions, not one big strategic one. The authors also note that analytics are a way to confront the “brutal facts”.
The book has a great list of questions to ask about a new initiative – how will it make you more competitive, what data do you need, does the technology work, etc. – but one was particularly important: “What complementary changes need to be made in order to take full advantage of new capabilities?” This resonated with me as I have seen companies get in trouble by focusing only on one stage in their process. This is why there is the “E” word in EDM. It is not because you must do decision management at the enterprise level to get value but because you must take a broader view of decision management for maximum value.
They outline a number of ways to get a competitive advantage from data – by collecting unique data, manipulating data better, using a unique algorithm or embedding it in a unique process. Regardless of the competitive approach, the need for analytical executives to be willing to act on the results of analyses was clear. Segmenting your customers is not enough; you must differentiate your treatment of them to make a difference. As I have said before, those who act first win.
There is a lot in the book about data quality – a major focus on getting a single version of the truth and clean, accurate data. Clearly an analytic competitor will spend more time and effort on data quality than others, but is this cause or effect? My sense is that focusing on getting the data right first, without a view of the kind of analysis you are attempting, will get you into trouble as well as delay the benefits. Indeed I don’t think you should try and collect and clean all your data before doing analytics. Instead I would say figure out what analytics you need, then see if you have or can get the data you need, and fix the problems with that data. Tom and Jeanne seemed to imply that consistent, quality data across the board was essential for analytic competition and I am not sure I buy that.
Take one of their examples – a bank refusing to waive a $35 bounced check fee for a customer who had a $100M trust fund. Does the data need to be integrated to fix this problem? Well, integrating the customer data would be one way. But what about sharing the insight? The fact of a $100M trust fund could lead the private banking group to identify the person as an excellent customer and this fact could be shared. There is still some integration – you must be able to identify that the customer is the same person in each case – but you don’t necessarily have to integrate all the data.
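To make the distinction concrete, here is a minimal sketch of the "share the insight" alternative. The only integration assumed is a common customer identifier; the shared store and flag names are hypothetical, not any bank's actual system.

```python
# Share one insight keyed by a common customer id, rather than
# integrating the two lines of business's full data sets.
insights = {}  # shared store: customer_id -> insight

def record_high_value(customer_id):
    """Private banking flags a high-value customer (e.g. the trust fund)."""
    insights[customer_id] = "high-value"

def should_waive_fee(customer_id):
    """Retail banking checks the shared insight before charging a fee."""
    return insights.get(customer_id) == "high-value"

record_high_value("cust-42")
```

The retail system never sees the trust fund data itself – it only sees the conclusion the private banking group drew from it.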
Another area where I take a slightly different approach is in decision automation. The authors talked about automated decision-making as a powerful tool for “more advanced analytical competitors”. I am not convinced that the degree of development towards being an “analytical competitor” is a driving force for decision automation. I think one could become a very successful analytical competitor without decision automation and that companies can get results from decision automation without great analytical sophistication. I believe these two aspects are, if you like, orthogonal. Indeed I speak to many customers who are using very sophisticated decision automation applications who are not, in fact, very analytically sophisticated.
In a similar vein, I think they miss an option when they talk about tools for amateurs. They outline three options – give them powerful analytic/data mining tools, have the system spit out the “right” answer, or give them a spreadsheet. I think there is a fourth option – automate the decision but have it return many options. This does not require the automation to be “perfect” but it does help the individual make better decisions. This kind of decision automation can return, say, a number of options along with the reasoning for each option and perhaps some visualization of the analytics that go with it. Then the individual is not wading through the whole report, say, but looking at 3-4 snippets each focused around a specific treatment option. Decision automation need not be all or nothing.
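A sketch of what this fourth option might look like – automation that returns a few candidate treatments, each paired with its reasoning, rather than one answer or a raw analytics tool. The rules, thresholds and customer fields are illustrative assumptions:

```python
# Decision automation that returns options with reasoning, leaving the
# final choice to the individual. All rules here are invented examples.
def treatment_options(customer, max_options=4):
    """Return (treatment, reasoning) pairs for one customer."""
    options = []
    if customer.get("months_inactive", 0) > 6:
        options.append(("win-back offer", "inactive for over six months"))
    if customer.get("lifetime_value", 0) > 10_000:
        options.append(("assign account manager", "high lifetime value"))
    options.append(("standard contact", "default treatment"))
    return options[:max_options]  # a few focused snippets, not a report
```

The person reviewing the output sees three or four short, reasoned suggestions instead of either a black-box answer or a full analytic workbench.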
The book ends with a great list of changes coming, several of which resonated with me:
More automated decisions
This is, of course, the subject of this entire blog, but you might like this post about a great paper Tom Davenport did on decision automation
More real-time decisions
Definitely. The whole real-time enterprise thing is happening.
Greater use of alerts
More prediction and less reporting
I have blogged about this a fair bit and a recent TDWI report talked about the move to predictive reporting and predictive analytics
More mining of text
This is an area in which I am going to try and blog some more. I just scanned the blog and could not find very much.