Tom Davenport had a great article recently on Data Informed – “Printing Money” with Operational Machine Learning. His intro paragraph is great:
Organizations have made large investments in big data platforms, but many are struggling to realize business value. While most have anecdotal stories of insights that drive value, most still rely only upon storage cost savings when assessing platform benefits. At the same time, most organizations have treated machine learning and other cognitive technologies as “science projects” that don’t support key processes and don’t deliver substantial value.
This is exactly the problem I see – people have spent money on data infrastructure and analytics without any sense of what decision they might improve with them. By taking a data-first, technology-first approach, these organizations have spent money without a clear sense of how they will show an ROI on it. He goes on to talk about how embedding these technologies into operational systems has really added value to operational processes – this, of course, is the essence of decision management, as he points out. As he also points out, this approach is well established; it's just much more accessible and price-performant now than it used to be. It has always been high ROI, but it used to be high investment too – now it's more practical to show an ROI on a lower investment.
In the example he discusses in the article, the solution stack
…includes machine learning models to customize offers, an open-source solution for run-time decisioning, and a scoring service to match customers and offers
Tom goes on to identify the classic elements of a solution for this kind of problem:
- A Decision Service
This is literally a service that makes decisions, answering questions for other services and processes. Identifying, developing and deploying decision services is absolutely core to success with these kinds of technology. The graphic on the right shows how we think about Decision Services (see also the sketch after this list):
  - It runs on your standard platform to support your processes/event processing systems
  - It combines business rules, analytics (including machine learning) and cognitive technology as necessary to make the decision
- A Learning Service
This is what we call decision monitoring and improvement. You connect the Decision Service to your performance management environment so you can see how different decision choices affect business results. This increasingly includes the kind of automated learning Tom is talking about to improve the predictive power of the analytics and to teach your cognitive engine new concepts, but it can also involve human intervention to improve decision-making.
- A Decision Management interface
The reason for using business rules and a BRMS in the Decision Service is to expose this kind of management environment to business owners.
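
To make the first two elements a little more concrete, here is a minimal sketch, in Python, of how a decision service might combine business rules with a machine-learning score and log each decision so a learning service has something to measure. The customer attributes, thresholds, offer names and the toy churn score are all hypothetical illustrations, not Tom's example or any particular product.

```python
# Minimal sketch: a decision service that combines business rules with an
# ML score, plus a simple decision log for later monitoring and improvement.
# All names, thresholds and the "model" score are hypothetical.

from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List


@dataclass
class Customer:
    customer_id: str
    tenure_months: int
    monthly_spend: float
    churn_score: float  # would come from a trained ML model in practice


@dataclass
class Decision:
    customer_id: str
    offer: str
    reason: str
    decided_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


# The "learning service" side: the decision service writes here so that
# decision choices can later be compared against business results.
DECISION_LOG: List[Decision] = []


def decide_offer(customer: Customer) -> Decision:
    """Decision service entry point: answers 'which offer?' for one customer."""
    # Business rules handle the clear-cut, policy-driven cases first.
    if customer.tenure_months < 3:
        decision = Decision(customer.customer_id, "welcome_bundle",
                            "rule: new customer")
    elif customer.monthly_spend > 500:
        decision = Decision(customer.customer_id, "premium_upgrade",
                            "rule: high spend")
    # The ML score drives the decision where the rules are silent.
    elif customer.churn_score > 0.7:
        decision = Decision(customer.customer_id, "retention_discount",
                            "model: high churn risk")
    else:
        decision = Decision(customer.customer_id, "standard_offer", "default")

    DECISION_LOG.append(decision)  # feeds decision monitoring / improvement
    return decision


if __name__ == "__main__":
    d = decide_offer(Customer("C-123", tenure_months=18,
                              monthly_spend=120.0, churn_score=0.82))
    print(d.offer, "-", d.reason)  # retention_discount - model: high churn risk
```

The design point is simply that the rules encode explicit policy, the model score drives the cases the rules leave open, and every decision is logged so the learning loop can compare decision choices against business results.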
We have seen this combination work over and over again at clients – mostly with human-driven learning, to be fair, as machine learning in this context is still pretty new. Our experience is that one of the keys to success is a clear understanding of the decision-making involved, and for that we use decision modeling. You can learn more about decision modeling from this white paper on framing analytic requirements, by reading this research brief (if you are a member of the International Institute for Analytics – the organization Tom founded) or by checking out these blog posts on the analytic value chain.
I have a lot more on how these decision management technologies work together in the Decision Management Systems Platform Technologies report, which will be updated with more on machine learning and cognitive technology in the coming months.