
I am going to be at IBM InterConnect this week. I am speaking with Kaiser Permanente at 2pm on Monday – Pioneering Decision Services with Decision Modeling at Kaiser Permanente – so come by and hear me and Renee speak about our successes with decision modeling with DMN (Decision Model and Notation), business rules and IBM Operational Decision Manager (ODM). I’ll have some copies of my books to give away and it’s a great story – well worth hearing.

Right after, I will be easy to find at the “Meet an IBM Champion” booth. Come by and ask me questions about rules, analytics, decision management, decision modeling or whatever! In general I will be around until early Wednesday morning and checking my email, plus I will be blogging from the conference (here on JTonEDM) when I get a chance as well as tweeting @jamet123.

Come by and say hi!

 

The Decision Management Systems Platform Technologies Report began in early 2012 as a way to share our research and experience in building Decision Management Systems. Since then we have extended, updated and revised the report many times. This week we released the latest version – Version 8 – with a new, easier to use format. There is so much content in the report now that one great big document no longer works. To make it easier to use we have broken it down into a set of pieces. We have also added content on technology for modeling decisions, significantly upgraded the section on monitoring and improving decisions and given the whole thing a refresh.

The documents are:

  • Introducing Decision Management Systems
  • Use Cases for Decision Management Systems
  • Best Practices in Decision Management Systems
  • Five Key Capabilities
    • Managing Decision Logic With Business Rules
    • Embedding Predictive Analytics
    • Optimizing and Simulating Decisions
    • Monitoring and Improving Decisions
    • Modeling Decisions
  • Selecting Products for Building Decision Management Systems

The first three are an excellent introduction for business or technical readers, while the others are more focused on those who are selecting or using the technologies described. You can download them for free on the Decision Management Systems Platform Technologies Report page.

As always, don’t forget that we have extensive experience helping organizations like yours define, configure and implement Decision Management Systems that deliver on the value propositions described in the Report. Our clients are leading companies in insurance, banking, manufacturing, telecommunications, travel and leisure, health management, and retail. Contact us if we can help.

I caught up with the folks from Conductrics to learn about their 3.0 release (I have blogged about Conductrics before). Conductrics has been in the decision optimization business for many years now. At its core Conductrics is about assigning customers to experiences. They recently released 3.0 with some new features.

Conductrics Express is a point-and-click tool to help set up tests and personalized experiences for web sites. It’s a browser extension with lots of control. Historically Conductrics has been more API-centric. Some audiences really like a “headless”, API-based platform but others want something more UI-based. In addition, quick tests or temporary projects are common and the new visual setup lets them be configured quickly. For instance, figuring out which customers get which experience sometimes requires some quick A/B testing or experimentation and there is no time to work with IT. Conductrics Express sits on top of the API so it can be evolved and integrated with other API-based projects.

To make it easier to use machine learning, the new version supports explicit induced rules. This gives you an interpretable AI as it converts complex optimization logic into easily digestible, human-readable decision rules. Users can apply it for audience discovery and can either have it drive the experience or just “listen” to see what it would have recommended. This engine does trial-and-error or A/B testing and, as you collect data, it builds a decision tree for audience segmentation.
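Conductrics hasn’t published its internals, but the general technique – inducing human-readable rules from experiment data – can be sketched. Here is a minimal, hypothetical illustration using a shallow decision tree as the interpretable segmentation model; the visitor attributes and data are invented:

```python
# Hypothetical experiment log: visitor attributes plus whether they converted.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

log = pd.DataFrame({
    "is_mobile":    [1, 1, 0, 0, 1, 0, 1, 0],
    "is_returning": [0, 1, 0, 1, 1, 0, 0, 1],
    "converted":    [0, 1, 0, 1, 1, 0, 0, 1],
})

# A shallow tree keeps the induced rules human-readable.
tree = DecisionTreeClassifier(max_depth=2, min_samples_leaf=2)
tree.fit(log[["is_mobile", "is_returning"]], log["converted"])

# Render the tree as explicit if/then rules - the "interpretable AI" idea:
# each path from root to leaf is a readable audience segment.
print(export_text(tree, feature_names=["is_mobile", "is_returning"]))
```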

One of the nice features of the engine is that it predicts not only the likelihood of success for offers but also how likely an option is to be the best one. This enables you to identify experiences that are clearly better than the alternatives despite having a low absolute chance of success, as well as those that seem significantly better but where there is a high degree of uncertainty. This reflects the reality that additional targeting is lower value for some audiences (sometimes there’s just not much difference between best and worst). This lets you see the marginal benefit of targeting (vs. simply picking A or B) and allows you to see poorly served audiences.
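To make that distinction concrete, here is a hedged sketch (not Conductrics’ actual math) of estimating both each option’s expected success rate and the probability that each option is the best one, by sampling from Beta posteriors; the trial counts are made up:

```python
import numpy as np

rng = np.random.default_rng(42)
# (successes, failures) per option - made-up counts for illustration.
# C has few trials: a high observed rate but lots of uncertainty.
options = {"A": (12, 88), "B": (18, 82), "C": (3, 7)}

# Sample each option's success rate from its Beta posterior.
draws = np.column_stack([rng.beta(s + 1, f + 1, size=100_000)
                         for s, f in options.values()])
# P(best) = how often each option has the highest sampled rate.
p_best = np.bincount(draws.argmax(axis=1), minlength=len(options)) / len(draws)

for i, (name, (s, f)) in enumerate(options.items()):
    print(f"{name}: mean rate {(s + 1) / (s + f + 2):.2f}, P(best) {p_best[i]:.2f}")
```

Note how an option with few trials (like C above) can show a high mean success rate yet only a modest probability of being best – exactly the uncertainty distinction described above.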

The current version allows inline creation of options, easy specification of goals and has a JavaScript API that allows packaging of logic into a file that is locally available, e.g., in a mobile app. You can also group agents into a multivariate agent for reporting and create mutually exclusive agents to support more sophisticated analyses. Users can also add rules to constrain or force experiences, use predictive analytics in assignment or randomly assign people to learn from experiments. The objective can be a single recommendation or a list of recommendations. All this can be tied together using flowlets that specify logic to tie agents together using adaptive or random logic, allowing for more complex decisions where making one implies another.

Finally, there is a new faster JavaScript API to the existing web service API and a variety of deployment options from hosted to on-premise.

You can get more information on Conductrics here.

I have trained a lot of practitioners in the Decision Model and Notation (DMN) – I am closing in on 1,000 decision modeling trainees now – and one of the interesting questions is always their motivation for using decision models. As I look back across all the folks I have trained, four motivations seem to bubble to the top:

  1. Too many analytic projects go awry and they need a way to frame requirements for analytic projects to tie analytics to business outcomes
  2. They struggle with a “big bucket of rules” in their Business Rules Management System (BRMS) and need a better way to capture, manage and structure those rules
  3. They need clarity and consistency in how they make (or want to make) decisions that are likely to be only partially automated
  4. They need an executable representation of decision-making

These motivations have three things in common:

  1. They need to visually communicate their intent to a wide group of stakeholders, some of whom did not build the model
  2. They need a broad group of people – business experts, business analysts, data scientists, programmers and others – to be able to collaborate on the model
  3. They have tried using documents (requirements, lists of rules, spreadsheets) before but have found them inadequate

Decision models built using the DMN notation and a decent approach (such as the one Jan and I describe in Real-World Decision Modeling with DMN) deliver. DMN Decision Requirements models are easy for a wide group of people to collaborate on and clearly communicate the structure of decision-making. Combined with either DMN Decision Logic models or executable rule artifacts in a BRMS, they manage decision logic effectively and are an executable representation of your decision-making.

Now, there are those who feel the motivation for decision models is almost always a desire for an executable representation – that only a decision model that is complete and executable is really useful. While this may be true of BPMN tool vendors – who need executable DMN to make their process models go – it is far from true among decision modelers more generally. In fact, in our training and consulting work, projects where a decision requirements model alone is tremendously useful outnumber every other kind.

Among those who want execution there is a split also: Some want the model to be 100% executable – to be able to generate code from it. Others prefer (as I do personally) to link a business friendly logical model to executable elements in a more loosely coupled way. This allows the business to keep things in the requirements model that are not executable (such as how people should decide to consume the result or characterize an input) while IT can make some performance or platform-specific tweaks that don’t have to show up in the logical model. The cores are linked (most decisions in the model have an executable representation in a BRMS that is editable by the same people who edit the model) but not everything has to be.

Whether you like the “press the big red button” approach that generates code from a DMN model or a blended DMN + BRMS model for execution, never forget all the other use cases where clarity, collaboration, consistency and structure matter even though execution doesn’t. There is a wide range of valuable use cases for DMN – it’s not just about executable models.

Prompted by an interesting post by Bruce Silver called DMN and BPMN: Common Motivation, Different Outcome?

Forrester analyst Mike Gualtieri asks “if analytics does not lead to more informed decisions and more effective actions, then why do it at all?” Specifically in a great post What Exactly The Heck Are Prescriptive Analytics? he says (emphasis mine)

Prescriptive analytics is about using data and analytics to improve decisions and therefore the effectiveness of actions. Isn’t that what all analytics should be about? A hearty “yes” to that because, if analytics does not lead to more informed decisions and more effective actions, then why do it at all?
Enterprises must stop wasting time and money on unactionable analytics. These efforts don’t matter if the resulting analytics don’t lead to better insights and decisions that are specifically linked to measurable business outcomes.

Now he calls it Prescriptive Analytics and we call it Decision Management but we are aligned on the value proposition – operationalized analytic insight. I also really like his focus on the intersection between analytics and business rules (decision logic). As he points out, you can wrap business rules around analytics to make them actionable (starting with the analytic) and you can also start with the business rules and use predictive analytics to improve them. The combination of business rules and predictive analytics is the powerful one. And our experience shows that using a decision model, built using the Decision Model and Notation (DMN) approach, is a great way to show how you are going to use these technologies together to make better decisions and how those decisions are specifically linked to measurable business outcomes – metrics or objectives.

In fact I am in Asia this week presenting on a pilot that demonstrates exactly this. We have been developing a decision model and decisioning architecture, implementing that decision model in business rules and using analytics to inform those business rules as part of a pilot for a large enterprise out here. We are excited about the results as they show exactly what Mike – and I – have been saying: That a focus on decisions, tied to business outcomes, modeled and implemented with business rules and analytics leads to better business results.

If you would like to talk about a pilot or other help operationalizing your analytics, get in touch.

Thanks to my colleague Gagan Saxena for spotting this great post.

I recently wrote three articles for KDnuggets on the potential for decision modeling in the context of the CRISP-DM methodology for analytic projects:

  • Four Problems in Using CRISP-DM and How To Fix Them
    CRISP-DM is the leading approach for managing data mining, predictive analytic and data science projects. CRISP-DM is effective but many analytic projects neglect key elements of the approach.
  • Bringing Business Clarity To CRISP-DM
    Many analytic projects fail to understand the business problem they are trying to solve. Correctly applying decision modeling in the Business Understanding phase of CRISP-DM brings clarity to the business problem.
  • Fixing Deployment and Iteration Problems in CRISP-DM
    Many analytic models are not deployed effectively into production while others are not maintained or updated. Applying decision modeling and decision management technology within CRISP-DM addresses this.

Check them out. And if you are interested in how one global leader in information technology is using decision modeling to bring clarity to its analytic and data science programs, check out this Leading Practices Brief from the International Institute for Analytics. We have found that a focus on decision modeling early really helps get and keep analytics projects on track and makes it much easier to operationalize the results.

Zbigniew Misiak posted a great set of tips from practitioners BPM Skills in 2017 – Hot or Not – with my contribution here.  This is a great way to get some quick advice from a wide range of practitioners and experts. As someone with a particular focus on decision management and decision modeling I was struck by the fact that there were 5 distinct recommendations for DMN and decision modeling/decision management (including, obviously, mine).

Some quick follow up points on my note:

  • Hot
    • Decision Modeling and the DMN standard
      • There’s rapidly growing interest from companies in using decision modeling and the Decision Model and Notation (DMN) standard
      • Lots of this interest is coming from business rules practitioners
      • But process modelers using BPMN are keen to use it too as modeling decisions separately simplifies their process models dramatically
      • And analytic/data science teams are seeing the potential for decision models in bringing clarity to their business problem
    • Predictive analytics (not just process analytics)
      • Data mining, data science, predictive analytics, machine learning etc are all hot technologies
      • Practitioners need to move beyond dashboards and process analytics to include these more powerful analytic concepts
      • A focus on decisions and decision modeling is central to finding the places where these analytics will make a difference
    • Declarative modeling
      • Business rules are declarative. So, at some level, are analytics. Process models are procedural.
      • Declarative models like DMN allow you to describe decision making without overlaying the procedures we happen to follow today
  • Not
    • Modeling business rules outside the context of decisions
      • There’s no value to business rules that don’t describe how you make a decision
      • You can, of course, describe your data and your process in a rules-centric way and that’s fine
      • But don’t think there’s value in just having a big bucket of rules
    • Embedding business rules directly into processes
      • Decisions are where rules meet processes
      • Embedding rules as gateways or conditions simply makes your process messy and complex
      • Even embedding decision tables as separate tasks will only work for very simple scenarios
      • You need to model the decision and create a decision-making component, a decision service, for the process to use
    • Procedural documentation
      • Works fine for processes but a procedural approach for documenting analytics or rules simply documents how you do this today, not what you need it to do tomorrow

Check out everyone else’s thoughts in BPM Skills in 2017 – Hot or Not.

Tom Davenport had a great article recently on Data Informed – “Printing Money” with Operational Machine Learning. His intro paragraph is great:

Organizations have made large investments in big data platforms, but many are struggling to realize business value. While most have anecdotal stories of insights that drive value, most still rely only upon storage cost savings when assessing platform benefits. At the same time, most organizations have treated machine learning and other cognitive technologies as “science projects” that don’t support key processes and don’t deliver substantial value.

This is exactly the problem I see – people have spent money on data infrastructure and analytics without any sense of what decision they might improve with them. By taking a data-first and technology-first approach these organizations have spent money without a clear sense of how they will show an ROI on it. He goes on to talk about how embedding these technologies into operational systems has really added value to an operational process – this, of course, is the essence of decision management as he points out. As he also points out, this approach is well established; it’s just much more accessible and price-performant now than it used to be. It has always been high ROI but it used to be high investment too – now it’s more practical to show an ROI on lower investments.

In the example he discusses in the article, the solution stack

…includes machine learning models to customize offers, an open-source solution for run-time decisioning, and a scoring service to match customers and offers

Tom goes on to identify the classic elements of a solution for this kind of problem:

  • A Decision Service
    This is literally a service that makes decisions – answers questions – for other services and processes. Identifying, developing and deploying decision services is absolutely core to success with these kinds of technology (a minimal sketch follows this list). The graphic on the right shows how we think about Decision Services:

    • It runs on your standard platform to support your processes/event processing systems
    • It combines business rules, analytics (including machine learning) and cognitive as necessary to make the decision
  • A Learning Service
    This is what we call decision monitoring and improvement. You connect the Decision Service to your performance management environment so you can see how different decision choices affect business results. This increasingly includes the kind of automated learning Tom is talking about to improve the predictive power of the analytics and to teach your cognitive engine new concepts. But it can also involve human intervention to improve decision making.
  • Decision Management interface
    The reason for using business rules and BRMS in the Decision Service is to expose this kind of management environment to business owners.
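As promised above, here is a minimal sketch of the Decision Service pattern – business rules wrapped around a predictive score. Everything here (the customer attributes, thresholds and offer names) is hypothetical; it illustrates the shape of the component, not any particular client’s implementation:

```python
from dataclasses import dataclass

@dataclass
class Customer:
    tenure_months: int
    churn_score: float  # 0.0-1.0, supplied by a predictive model

def decide_offer(customer: Customer) -> str:
    """A decision service: answers one question for calling processes."""
    # Business rule: brand-new customers are excluded regardless of score.
    if customer.tenure_months < 3:
        return "no-offer"
    # Analytic rule: the machine-learning score drives the treatment.
    if customer.churn_score > 0.7:
        return "retention-discount"
    return "standard-offer"

print(decide_offer(Customer(tenure_months=24, churn_score=0.85)))  # retention-discount
```

The point of the pattern is that calling processes see only the question and the answer; the mix of rules and analytics behind it can change without the callers knowing.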

We have seen this combination work over and over again at clients – mostly with human-driven learning, to be fair, as machine learning in this context is still pretty new. Our experience is that one of the keys to success is a clear understanding of the decision-making involved and for that we use decision modeling. You can learn more about decision modeling from this white paper on framing analytic requirements, by reading this research brief (if you are a member of the International Institute for Analytics – the organization Tom founded) or by checking out these blog posts on the analytic value chain.

I have a lot more on how these decision management technologies work together in the Decision Management Systems Platform Technologies report which will be updated with more on machine learning and cognitive in the coming months.

Paul Harmon over on BPTrends interviewed Jan Purchase and me about our new book, Real-World Decision Modeling with DMN. The interview covers a pretty wide range of topics – a definition of decision modeling, the bottom-line value of modeling a decision, the difference between decisions and rules and how decision modeling helps projects. We talk about some of our memorable experiences of applying decision modeling, our favorite best practices and the value in applying decision modeling and the Decision Model and Notation (DMN) standard alongside BPMN. We outline our method for applying decision modeling and DMN, and talk a little about the book – who it is aimed at and what it contains (over 220 practical illustrations, 47 best practices, 13 common misconceptions to avoid, 12 patterns and approaches and 3 worked examples among other things). We end with some thoughts on how to get started.

Anyway, I hope you enjoy the interview over on the BPTrends site and, if you do, why not buy the book? Real-World Decision Modeling with DMN is available on Amazon and more details are available on the Meghan-Kiffer site.

I am delighted to announce that Jan Purchase, founder of Lux Magi, and I have finished what we believe is the definitive guide to decision modeling with the Object Management Group’s Decision Model and Notation (DMN) standard, Real-World Decision Modeling with DMN. The book is now on general release and available for purchase from Amazon and Barnes and Noble (full details on the Meghan Kiffer site).

If you are a regular follower of this blog you will know that decision modeling is an important technique for improving the effectiveness, consistency and agility of your organization’s operational decisions and a vital adjunct to the continuous improvement of your business processes.

We have tried hard to develop a truly comprehensive book that provides a complete explanation of decision modeling and how it aligns with Decision Management and, more broadly, with Digital Transformation. The book describes the DMN standard, focusing on the business benefits of using it. Full of examples and best practices developed on real projects, it will help new decision modelers to quickly get up to speed while also providing crucial patterns and advice for those with more experience. The book includes a detailed method for applying decision modeling in your organization and advice on how to select tools and start a pilot project. It contains:

  • Over 220 practical illustrations
  • 47 best practices
  • 13 common misconceptions to avoid
  • 12 patterns and approaches
  • 3 worked examples

Here are some of the great quotes from our initial reviewers:

“This comprehensive and incredibly useful book offers a wealth of practical advice to anyone interested in decision management and its potential to improve enterprise applications. Using a blend of user case studies, patterns, best practices and pragmatic techniques, “Real-World Decision Modeling with DMN” shows you how to discover, model, analyze and design the critical decisions that drive your business!”
—David Herring, Manager of BPM & ODM Delivery at Kaiser Permanente.

“Well written and very impressive in its scope.”
—Alan Fish, Principal Consultant, Decision Solutions, FICO and author of “Knowledge Automation: How to Implement Decision Management in Business Processes”.

“If you are looking for a complete treatise on decisions, look no further. Even though you end up in decision modeling and Decision Modeling Notation (DMN), you are treated to all the aspects of decisions explained with great care and clarity. This is a greatly needed book that will be studied by decision managers and practitioners for the next decade or more. It will end up on my physical and logical bookshelves.”
—Jim Sinur, Vice President and Research Fellow, Aragon Research.

“Written by two of the foremost experts in decision management, this book provides an extensive exploration of business decisions and how they are used in modern digital organizations. Taylor and Purchase distill for us the why, how, when and where to apply decision modeling in order to specify business decisions that are effective. Going beyond just an introduction to the Decision Model and Notation (DMN), the authors position this standard with respect to other well established standard notations such as BPMN. This is truly a first comprehensive handbook of decision management.”
—Denis Gagne, Chair of BPMN MIWG, CEO & CTO at Trisotech.

I very much hope you will buy and enjoy Real-World Decision Modeling with DMN. I’d love to hear what you think of it and stories about how it helped you on projects so please drop me a line.

I’ll leave the last word to Richard Soley, Chairman and CEO of the Object Management Group, who wrote the foreword for us:

“A well-defined, well-structured approach to Decision Modeling (using the OMG international DMN standard) gives a repeatable, consistent approach to decision-making and also allows the crucial ‘why?’ question to be answered — how did we come to this point and what do we do next? The key to accountability, repeatability, consistency and even agility is a well-defined approach to business decisions, and the standard and this book gets you there.”

What are you waiting for? Go buy it – Meghan Kiffer, Amazon, Barnes and Noble.

JMP 13 and JMP Pro 13 launched in September 2016. JMP is a growing business unit of SAS with more than $60M in revenue and a focus in manufacturing and life sciences. Its users are largely engineers, analysts, researchers and scientists. JMP has specialty products in Genomics and Life Sciences but this release is about the core products. The tool remains focused on design of experiments, predictive modeling, quality and reliability, and consumer research, though dashboards and data wrangling are being added too.

JMP 13 remains a fat client tool and with this release there are a couple of key themes:

  • Increased ease and efficiency of preparing data
  • Enriching the kinds of data supported – this release adds free text
  • Handling very wide data, especially as more memory is added to the desktop
  • Reporting, dashboards and the established priority focus areas remain important

New capabilities include:

  • Lots of focus on reducing the number of clicks for actions – so for instance creating a script to recreate an analysis on new or refreshing data is now 1 click not 2.
  • JMP tables created as the software is used can now be joined using standard query builder capabilities. This essentially creates a new JMP table that can then be analyzed. This includes filters and where clauses for instance. Building the new table also prototypes the SQL that would be required on the production systems.
  • A virtual join has been added to support situations where very tall datasets are involved. For instance, sensor data might have thousands of entries per sensor. Joining this to a table about the locations of sensors, say, might blow memory limits. The virtual join allows a join to be defined and used without ever filling memory with joined data (see the sketch after this list).
  • New dashboard building tools are included to make it easier to arrange several graphs onto a dashboard without having to go through the full application development tools. Multiple templates are provided for a simple drag and drop construction. Full interactive views (with all the support for JMP changes at runtime) or summary views are available for dashboards. Lists can be added as filters and one graph can be used to filter another – so as a user clicks on one graph, the data in the other is filtered to only those data selected in the first one.
  • Sharing reports and dashboards has been an increasing focus in recent versions. Stand-alone interactive HTML reports and dashboards have been improved. For instance, graph builder capabilities, making HTML graphs more interactive. Data for these files are embedded in the files – the data is a snapshot not live – but only the data that is needed. Multiple reports can be packaged up and an index page is generated to front end a set of independent reports.
  • JMP Pro has improved the ability to quickly organize and compare multiple models. Scripts can be published to a Formula Depot for multiple models and then used in a comparison without adding the corresponding columns to the data set. From here code can be generated for the models too – adding C, Python and JavaScript to SAS and SQL.
  • Text handling has been added too. JMP allows more analysis of text using the Text Explorer which handles various languages and identifies top phrases, word clouds etc. As usual in JMP, these are interactive and can be cross-tabbed and integrated with graphs and other presentations. JMP Pro allows this to be integrated with structured data to build a model.
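To illustrate the virtual join idea mentioned above – in pandas rather than JMP, and purely as a conceptual sketch – the point is to keep the small lookup table separate and resolve attributes per analysis instead of materializing them onto every row of the tall table:

```python
import pandas as pd

readings = pd.DataFrame({            # tall table: millions of rows in practice
    "sensor_id": [1, 2, 1, 3, 2],
    "value":     [20.1, 19.8, 20.4, 21.0, 19.9],
})
locations = pd.Series(               # short lookup table keyed by sensor_id
    {1: "plant-A", 2: "plant-A", 3: "plant-B"}, name="site")

# A materialized join would copy "site" onto every reading:
#   readings.join(locations, on="sensor_id")
# The virtual-join idea: resolve the attribute only when an analysis needs it.
by_site = readings.groupby(readings["sensor_id"].map(locations))["value"].mean()
print(by_site)
```

The mapped column is still built per analysis here; the point is simply that the full joined table is never materialized.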

For more information on JMP 13 click here.

Final session for me at Building Business Capability is Denis Gagne of Trisotech talking about the intersection of process, case and decision modeling – BPMN, CMMN and DMN – and the importance of these standards to business analysts.

Denis reminds us that all organizations are facing a new digital future with rapid technology change, changing customer behavior and increasing competition. Music, media, transportation and more have been disrupted and everyone is at risk of disruption. In response companies cannot just focus on operational improvement (good for the company, not for customers) or just on customer experience (expensive for the company) – we must focus on both – on transformation, on outcomes and on new business models. We must do new things that make our old things obsolete. And this requires LOB managers and executives, enterprise and business architects, business and process analysts to bring different perspectives to bear effectively.

Gartner has a great model of work efforts and all of these must be improved, requiring a focus on structured and unstructured processes as well as on decisions. And these improvement efforts must balance cost against quality, and time against value. Different approaches balance these things differently, leading to different kinds of improvement. In this world, he argues, business analysts need to have three practices in their toolbox:

  • Process Management – with BPMN, Business Process Model and Notation
  • Case Management – with CMMN, Case Management Model and Notation
  • Decision Management – with DMN, Decision Model and Notation

These notations provide unambiguous definitions for processes, cases and decisions. The models can be interchanged and, even more importantly, the skills in developing and reading them are transferable. This means that despite turnover, multiple parties and external resources, we still know what the model means.

The most common question he gets is when to use which. BPMN, he says, is about processing, CMMN is about managing context and DMN is about deciding. It’s not too hard… And he had a nice chart showing the difference between BPMN and CMMN that also showed the broad applicability of DMN.

  • BPMN allows you to define a process – a triggering event followed by a sequence of activities that lead to some business outcomes – using standard shapes to represent the elements.
  • CMMN allows you to define a case – in a context there is an event followed by a condition and an action – again using standard shapes.
  • DMN allows you to define decision-making – decisions, decision requirements and decision logic – also using standard shapes

He wrapped up with some great advice:

  • In BPMN and using too many gateways – you need DMN to model that decision-making
  • In BPMN and using too many intermediary or boundary events – you need to use CMMN to model the case work you have identified
  • In BPMN and using ad-hoc processes – replace them with CMMN
  • In CMMN and using lots of sequencing – use BPMN

The standards as a set he says allow you to build a more agile, intelligent and contextual business.

By the way, my new book on DMN – Real-World Decision Modeling with DMN – is available

 

Continuing blogging from Building Business Capability and it’s time for Jan Vanthienen to talk decision modeling for the business analyst. Jan is a leading researcher into decision modeling and decision tables at the University of Leuven.

Observation 1: Decisions are important for business, not only processes.

Too often, he says, the process is modeled with no real thought being given to the decision-making required – a claims process that does not really describe which claims will be paid, only how to pay or reject them. To describe them, companies must model them and the DMN (Decision Model and Notation) standard is ideal. DMN defines two layers:

  • The requirements for a decision in terms of other decisions, data and sources of knowledge.
  • The logic of the decisions, derived from these knowledge sources and applied to the data to actually make the decisions in the requirements.
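As a toy rendering of the two layers just listed – not DMN syntax, just an illustration with an invented claims example – the requirements layer names what each decision depends on, while the logic layer attaches executable logic per decision:

```python
# Requirements layer: what each decision depends on (decisions and input data).
requirements = {
    "Approve Claim":  {"decisions": ["Claim Validity"], "data": ["policy"]},
    "Claim Validity": {"decisions": [], "data": ["claim"]},
}

# Logic layer: how each decision is actually made from its requirements.
logic = {
    "Claim Validity": lambda inp: inp["claim"]["amount"] > 0,
    "Approve Claim":  lambda inp: inp["Claim Validity"] and inp["policy"]["active"],
}

def evaluate(decision, data):
    """Make a decision by first making the decisions it requires."""
    deps = {d: evaluate(d, data) for d in requirements[decision]["decisions"]}
    return logic[decision]({**data, **deps})

print(evaluate("Approve Claim", {"claim": {"amount": 1200}, "policy": {"active": True}}))
```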

Once you model the decision separately, much of the decision-making complexity that overloads a process model can be removed to simplify and clarify the process.

Observation 2: Decision logic should not be shown as process paths. 

Decisions should be modeled as peers of the processes that need them so they can change independently, be managed by different people etc. This reduces the complexity of processes by removing decision logic and decisions from the process model and instead linking the decision to a specific process task.

Observation 3: Multiple decisions in a process might be in a single decision model

Observation 4: Proven techniques exist to represent decision tables and decision models

From a methodology perspective we need to do three things

  1. Model the decision structure
    • Build a decision requirements graph – a decision requirements diagram – to show how the decisions, data and knowledge sources form a network describing the decision making and its structure.
    • Generally this is best done top down and descriptions of questions and answers in requirements or policies often lead very naturally to sub-decisions.
    • Descriptions of data elements are similarly clear. But if you just try to write a decision table against these data elements you will end up with an unwieldy model. Build a model first.
    • Decision requirements models are built using expertise, by seeing which conditions depend on the results of other decisions, and by normalizing the model to eliminate redundancy.
  2. Model a single decision table
    • Decision tables are not the only way to manage the business rules for a decision in the model but they are a very effective one and one that business users find very transparent.
    • Representing a description of logic as a decision table clarifies the logic and makes it consistent. And the logic can be read in several ways without ambiguity – what will I do if? OR when will I do x?
    • Decision tables can show rules as rows or as columns, with each column/row being based on a fact. This creates a repeated structure for each rule. And each rule combines only ANDed conditions (ORs lead to new rules, as they should).
    • The standard also defines a “hit indicator” to specify how to handle situations where multiple rules in a decision table are true. But stick to Unique and perhaps Any because the others get you into trouble – especially First, which is just a way to code ELSEs! (See the sketch after this list.)
    • Decision tables can be correct, compact and consistent but generally you can only get 2 of the 3. Start by focusing on correctness, then work to ensure consistency and take compactness only as a bonus.
    • Jan likes to use rules as rows or rules as columns depending on the shape of the table. I prefer to stick to rules as rows unless it’s really important…
    • You can build decision models and tables interactively with people, from text of policies or regulations, or by mining data/rules/code. You can also find all the possible condition values and outcomes and build an empty table. As you fill in the logic you can clarify and normalize.
    • DMN allows the modeling of decisions alongside processes, simplifying processes and creating a more declarative and understandable model of decision-making. It also separates the structure from the logic while allowing them to be linked and reused. DMN standardizes and extends decision modeling and decision table approaches but builds on a long history of techniques and approaches that work.
  3. Model the process connection
    • Sometimes the entire process is about a decision. Model the decision first and then think about how to execute it, how to build a process around it. A process might do all the sub-decisions in parallel to minimize the time taken OR specific decisions might be made first because they eliminate the need to do other parts of the process for efficiency OR some decisions might be made about several transactions at the same time in a group OR…
    • The decision model describes how to make a decision so that you can not only build a decision-making system but also answer questions about how the decision works, how it is made.
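Here is the sketch referenced above: a small, invented decision table with a Unique hit policy, where each rule is a row of ANDed conditions and overlapping rules are treated as a modeling error:

```python
# Each rule: (customer_type, large_order, discount) - ANDed conditions per row.
rules = [
    ("wholesale", True,  0.15),
    ("wholesale", False, 0.10),
    ("retail",    True,  0.05),
    ("retail",    False, 0.00),
]

def decide_discount(customer_type: str, order_size: int) -> float:
    large_order = order_size > 10
    matches = [d for (ct, big, d) in rules
               if ct == customer_type and big == large_order]
    # Unique hit policy: exactly one rule may fire for any case;
    # an overlap or a gap is a modeling error, not something to paper over.
    assert len(matches) == 1, "table is not Unique: rules overlap or leave a gap"
    return matches[0]

print(decide_discount("wholesale", 25))  # 0.15
```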

By the way, my new book on DMN – Real-World Decision Modeling with DMN – is available

Last day of blogging from Building Business Capability and the first topic is modern business architecture with Gagan Saxena, VP of consulting at Decision Management Solutions, and Andrew Ray of Goldman Sachs. The presentation is focused on a new approach to business architecture to address problems in the legacy approaches – one that connects business at rest to business in motion in a model-driven way.

A perfect architecture would provide an organizing framework that synchronizes

  • Business Strategy
  • Business Operations
  • Business Organization
  • Business Systems
  • Business Performance

At rest (design) and in motion (operations). And it’s essential that over-complex technical solutions are not developed for problems that could perhaps have been avoided.

Yet business architecture has added more and more techniques and areas of focus, making it more complex, and yet failing to address new areas. They call out two particular areas – knowledge management and performance management – that are both long standing and yet still not well addressed. Nothing is ever removed from the list, overwhelming anyone trying to stay in touch.

Business is changing. In particular, margins and time to respond have shrunk and put real pressure on point to point, static solutions. Each regulation has an impact too, often on many systems. But how to demonstrate compliance?

Instead of coding rules into each system, compliant decision-making can be defined in DMN (Decision Model and Notation) and shared between the systems. But there are many such regulations that overlap so a decision architecture had to be developed to manage this. Yet most business architecture approaches don’t include a decision architecture.

Historically, business architecture focused on “business at rest”. Documents are developed and published but they are not model-based, making it hard to keep a business architecture “alive” as systems are developed and evolved. Performance monitoring and learning is often dumped as deadlines loom and what is learned is often not fed back well enough.

Instead you want a model-based approach to support business in motion. These models tie directly to implementation, allowing the data captured in motion to be linked to the models and used to improve them. Decision models (DMN), process models (BPMN), data models and models of metrics or performance as a minimum.

To develop this set of models the team used

  • Business Model Canvas to define stakeholders and value chains
  • Capability map
  • Strategy Map
  • These led to a Balanced Scorecard to track progress
  • And a set of strategic initiatives and projects
  • Capabilities were defined and executed to drive
  • Business service performance which is fed back into the scorecard


The capabilities defined – around process, data, decisions or events – can be composed into higher level capabilities or patterns. Performance management is built in and the knowledge embodied in these capabilities can trace back to the real-world source. The embodied models can be automated and optimized while supporting agility and rapid, safe change.

The example they used is client onboarding. This SOUNDS simple but it’s really not, thanks to regulations (45 KYC regulations for instance), a network of legacy systems, many manual processes and a continual effort to reduce costs. “A house of cards” resulting in excessive time and cost to onboard a customer (thanks in large part to legal negotiation).

Normally the focus would have been on one specific item but the team instead has focused on a more holistic approach designed to address the whole front to back process. Yet an agile approach was essential to show progress while a coherent overall architecture had to be the result. The approach involved:

  • Business Model Canvas gave a big picture context of key elements and relationships without a lot of documentation.
  • Capability Map identified the capabilities needed across perspectives of strategic monitoring, customer management, process management and infrastructure management.
  • Strategy Map showed the dependencies between strategy themes and objectives.
  • Balanced Scorecard showed the objectives and metrics for each perspective with targets and deadlines.
  • Strategic Initiatives were defined to hit the targets defined for each of these objectives – many initiatives supported several objectives.
  • Capabilities are defined that are reused across processes. These are defined using one or more of
    • User Stories
    • Process Models
    • Data Models
    • Event Models
    • Decision Models

Results:

  • Showing progress against an agreed strategy has kept leaders engaged and sustained funding. A complete front to back design exists even though it is being delivered incrementally with agile methods.
  • The Business Capability model has helped ensure reusable assets are created and has helped focus a large agile program.
  • Model driven development has put business people in the driving seat, increasing agility and transparency while reducing time to value.
  • BPMN and DMN have been key both in engaging the client/business and in freeing up technology to focus on engineering problems.
  • DMN’s ability to trace from executable logic to a logical design to the original regulation allowed for compliance and allowed regulatory/policy knowledge to be automated in decisions.

By the way, my new book on DMN – Real-World Decision Modeling with DMN – is available

Continuing with blogs from Building Business Capability I am self-blogging the session I co-presented with David Herring, who leads the Process Transformation and Decision Management Program at a leading Northern California Healthcare organization, on “Pioneering Decision Services with Decision Modeling”.

David works at a large not-for-profit health plan that does everything from inpatient, to home health, hospitals, hospice, pharmacy, and insurance. 10M+ members, 17,000 doctors and nearly 200,000 total employees. They have been on a decision management journey. Beginning with SOA infrastructure they first added Business Process Management before adding in-context Decision Management to automate decisions. More recently they have begun to focus on predictive analytics, on cognitive services and on event-awareness and event processing. Their messaging bus now handles 2B messages a month and 500 web services are connected, supporting BPM, Decision Management, advanced analytics and performance management.

Decision Management Solutions has been working with them on their adoption of decision management and, in particular, their adoption of decision modeling. Decision Management relies on a clear understanding of the decisions involved and increasingly, therefore, on decision modeling. Decision modeling supports the whole lifecycle – giving an unambiguous definition at the beginning, scoping and specifying the interfaces of decision services and providing a framework for monitoring and improvement. Decision models built with the Decision Model and Notation (DMN) standard clearly express requirements by documenting decisions and sub-decisions, showing a precise structure for the decision-making, showing what information is consumed by the decision (and precisely where it is consumed) and identifying all the sources of knowledge – policies, regulations or expertise – that define the decision-making approach. Decisions in a decision model can also be connected to business processes, to goals and performance metrics, to organizational structures and to enterprise data.

Decision models in DMN have two layers – a decision requirements layer showing the structure and a decision logic layer expressing how each piece of decision-making executes. DMN decision requirements models can also be integrated with the business rules that implement them in a BRMS. The DMN standard is increasingly widely used, has broad industry support and is managed by the Object Management Group, a well established standards body. The standard is intended to “… provide a common notation that is readily understandable by all business users… a standardized bridge for the gap between the business decision design and decision implementation.”

DMN has many use cases:

  • Human Decision-making such as documenting human decision-making, improving human decision-making with analytics or training human decision-makers
  • Requirements for automated Decision-making including business rules discovery and analysis, framing predictive analytics or even dashboard design
  • Implementing automated Decision-making by completely specifying business rules, acting as a BRMS front-end or orchestrating complex decisioning technology

One of the projects that applied the approach is the Heart Failure Project. This arose because cardiologists needed a system to evaluate patients, using a simple set of conditions, to determine if a patient needs to be referred to a heart failure specialist. For this project, the methodology applied was:

  • Run discovery workshops
    These involved actual business owners – heart surgeons and heart transplant specialists – as well as analysts and IT resources. Elicitation techniques were used to describe the specific decisions involved, identify the metrics/KPIs impacted by these decisions and understand where the decisions fit in the overall system and process context.
  • Model and identify suitable decisions for automation in IBM’s Operational Decision Manager (ODM) BRMS
    Decisions were modeled and refined using DMN in DecisionsFirst Modeler. The initial decisions were decomposed to show the more granular and reusable decisions within them as well as the input data and knowledge sources involved. This used a mix of material from the workshops and ongoing Q&A with the experts.
  • Transform decision models into executable decision tables
    Initial decision tables had been sketched in Excel and linked to the model. As the model stabilized these decision tables were “normalized”. These tables were then added to IBM’s ODM and complete logic was specified, taking advantage of the web-based editor in ODM. Each table was linked to a specific decision in the decision model and the two platforms are integrated to make navigation easy.
  • Deploy decision tables in an ODM Decision Service
    A few additional elements, like a Rule Flow, were added to ODM and this allowed the logic in the tables to be deployed as a service that could be called from processes or from a UI such as that used by the doctors (a hypothetical invocation sketch follows this list).
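Here is the invocation sketch referenced above. The endpoint, payload and response are entirely hypothetical – the real interface depends on the ODM installation – but it shows how a process or UI might call the deployed decision service:

```python
import json
from urllib.request import Request, urlopen

# Hypothetical patient inputs matching the decision service's interface.
patient = {"ejectionFraction": 30, "nyhaClass": 3, "priorAdmissions": 2}

req = Request(
    "https://rules.example.org/decision/heartFailureReferral",  # hypothetical URL
    data=json.dumps(patient).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urlopen(req) as resp:
    decision = json.load(resp)
print(decision)  # e.g. {"referToSpecialist": true, "reasons": [...]}
```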

At the heart of this approach is iterative business-centric development. An initial high level model with a few decisions and “obvious” input data is developed. In each iteration a part of the model is developed into a more detailed clinical model and a set of decision tables identified for each element of this more detailed model. This allows for a highly iterative and agile approach while ensuring that each new iteration can clearly be put in context and ensuring that reuse is managed across the iterations (because decisions are reused between the diagrams through a shared underlying repository).

One of the key drivers for the project was that this one decision involved many documents. The current approach is to write a lot of clinical guideline documents. This results in information overload as well as significant regional variation. It can be hard to ensure that the authors of these documents are practicing specialists, and the documents are hard to maintain and rarely updated. The decision model combines all the available guidelines and documents into a single model. Each document is identified as a knowledge source and influences some parts of the overall decision-making.

Key benefits of this approach are consistency and consumability. Today all the various guidelines and some additional toolkits are shared through a library. Different regions leverage these to create local implementations reflecting the systems and processes they use. However this is largely cut-and-paste reuse. With the new approach, all the studies and research could be combined into a single decision model. This could then be used to deploy some standard decision services, available on the enterprise infrastructure, to make decisions whenever and wherever needed. Regional implementations could still reflect local systems and processes but could reach out to a shared decision service – ensuring that the clinical guidance was consistently applied.

When working with experts it is often hard to decide what should be automated and what should be left to experts. It is easy to overfit – building a system that does too much or tries to be too precise. In this example there was a key decision that should be made by a medical professional – a four quadrant decision known as the patient’s hemodynamic status. This was required by key decisions in the model. This should not be automated or made more complex as this describes exactly what a physician does. It can’t be generated from stored data nor is there any value in a more granular scale. The decision model allows this to be identified, described and then left outside the automation – but still in the model for clarity and management.

This approach works well for managing automation boundaries. Teams can develop a decision model for the whole decision and then use it to identify the scope of automation. This may leave “master” decisions to be made by people but more often identifies “feeder” decisions that should be left as manual. In fact the model supports automated, automated but overridable and completely manual decisions equally well, allowing the model to be used BEFORE these automation decisions have been made.

Final summary and recommendations:

  • Decision Modeling Workshops
    • Engage Business Owners
    • Reveal Automation Boundaries
    • Integrate Multiple Perspectives and Documents
  • Decision Modeling
    • Supports Iterative Development
    • Focuses BRMS Development
    • Avoids Overfitting
  • Decision Services
    • Improve Processes
    • Support SOA Best Practices

You can learn more about the integration between DecisionsFirst Modeler and IBM ODM here and my new book on DMN – Real-World Decision Modeling with DMN – is available

An additional blog post here on a session at Building Business Capability that I missed – Business Analysis for Data Science teams. I know Susan Meyer who presented it and we talked several times about her presentation. It’s a really key topic so I wanted to present a summary. Here goes:

There is a lot of interest and excitement right now around data science (data mining, predictive analytics, machine learning). But this is not so much a gold rush as a real indication of change. With more and more interest in this topic and a need for companies to use data science effectively, Susan sees a key role for business analysts (as do I).

She points out, correctly, that you don’t need to be a mathematician or statistician to contribute to data science teams. A business analyst who understands the process and has some disciplined curiosity can do a lot. And the process you need to know is CRISP-DM – the Cross Industry Standard Process for Data Mining (see these blog posts on CRISP-DM). This is an ideal process for business analysts as it’s iterative, begins with business knowledge and allows non-data scientists to be part of the team. She identifies 6 reasons business analysts can help:

  1. They know their vertical and domain
  2. They are used to agile, iterative projects
  3. They understand their company’s business model
  4. They can elicit requirements through data – and build a decision model to frame the analytic requirements
  5. They can partner on the architecture (especially for deployment and data sourcing)
  6. They can build and manage the feedback loop and the metrics involved

Business analysts can dramatically reduce the time spent on data and business understanding and improve the results by anchoring the data science in the real business problem.

Very cool.

Continuing to blog at Building Business Capability 2016 with Ron Ross talking about operational excellence. [My comments in italics]

He began by talking about the new technology available and its potential while expressing worries that technologies, and technological approaches, might not really change our businesses for the better. In particular he expressed concern that “channel mania” might be dragging companies into unhelpful behavior. That said, he says, perhaps this channel mania is just an evolution of the old problem of organizational silos.

It’s clear to him that moving forward technologically will remove people from the loop when it comes to making things right for the customer. A digital future will mean there’s no-one available to make things work for customers. The substitute for this is going to be injecting knowledge into these processes – using business rules (and decisions, I would add).

So Ron proposed three basic principles for Channel (or Silo) sanity:

  1. Follow the same basic rules through every channel – make decisions consistently about configuration, interactions etc
  2. Know what your rules are – how do you make these decisions?
  3. Give your rules a good life – manage them and treat them as an asset to be looked after.

It’s important not to let the technological tail wag the business dog. Make sure that technology choices are supporting business choices and not the reverse. Replacing brick and mortar stores with apps, automating up-sell and cross-sell, demonstrating compliance – all these require better management of knowledge (decision-making, business rules) once people are out of the loop.

The cheapest way to differentiate your business he argues is to decide differently – apply different business rules.  And if you don’t want to be different, use rules to make decisions consistently.

Ron took a minute to make his usual distinction about business rules:

  • They are what you need to run the business. You would need the rules even if you had no software
  • They are not data or metadata – they are rules
  • They are about business communication – not primarily an IT artifact but a business one (or at least a shared one).

Ron believes that the best approach to this is to capture these rules in a structured, natural-language-like form that is outside of any executable format. Here we disagree – I prefer a decision model supported by rules, not a rulebook of rules.

Ron then focused on 5 big challenges companies are facing when trying to become more operationally excellent.

  1. Know your customer
    This is particularly hard if you are using multiple channels. You need to manage the rules to ensure that you make consistent customer decisions across channels.
  2. Closing communication gaps
    Project teams worry about this a great deal but in fact this is a general problem across many projects. User stories and other agile approaches can leave too much communication hidden and in particular it can leave out the underlying knowledge that supports your organization. And reinventing this each project is unhelpful.
  3. Knowledge recovery, renewal and retention
    Tribal knowledge is a problem. One company found that 60% or more of the staff who have critical tribal knowledge will retire in the next 3 years. And companies end up putting those with critical knowledge into boxes because they can’t afford not to have access to that knowledge. Plus many of the rules currently embedded in systems are wrong, even when they can be extracted into something more readable than code. It’s critical for companies to manage their knowledge so it is “evergreen” and stays current and timely.
  4. Compliance
    With changing regulations and laws, with contracts and agreements, with warranties offered etc. Being able to demonstrate compliance and trace back to the original source is key. Ron thinks that a written rulebook based on RuleSpeak is a good way to link regulations to automated rules and that you should manage all your rules in this way.
    I disagree strongly with him on this. Our experience is that a decision model does a much better job of this (see this post). Writing a non-executable set of rules and trying to link those rules to the regulations and to the executable rules is much more costly in our experience, and a decision model works better for linkage. Don’t write two sets of rules (one executable, one not) – it will be expensive.
  5. Business Agility
    If IT costs are too high then even simple course corrections or changes are not made when the business sees the need to change. Fighting fires can keep anyone from focusing on serious problems. Managing rules can reduce the cost of change and so drive more business agility.

Ron concluded with a strong statement with which I agree – no new channels will ever emerge for which you will not need to apply business knowledge. So manage it. Though I would add that not all knowledge is business rules – an increasing amount is analytic and cognitive, and a decision model works for those too.

Continuing to blog from Building Business Capability 2016, I am listening to Railinc talking about next steps after getting funding for rules and process improvement. Railinc is a SaaS company supplying software to the rail industry and is modernizing how it implements rules and processes in its applications. The program covers all 6 product lines and up to 70 applications. This program is designed to:

  • Increase agility
  • Increase product quality
  • Increase Railinc knowledge
  • Reduce TCO

The program got approval in 2013, picked vendors and did technology PoCs in 2014, and last year focused on business methods for rules and process, piloting the approach in a single area. This year the program is enterprise-wide and is transitioning to include technology updates. Next year it is designed to wrap up with a transfer of operational ownership.

Change management, she says, is a journey and this program is mostly about change management. A plan and roadmap are good for keeping the route and destination in mind, but surprises, detours and side trips will happen! In this case, the plan had a clear set of target benefits but these are not a destination – the destination is the integration of these new technologies and behaviors into the company’s fabric, without needing a center of excellence, and a shift in focus from IT-centric to business-centric.

A big program, though, requires more specificity:

  • Become part of the fabric
    • New business analyst methods and tools around rules, vocabulary and stories
    • New technology like Red Hat BRMS, which means new (declarative) approaches
    • New governance processes and metrics, standards, audits etc
  • No need for a CoE to sustain
    • Empower business and IT leaders to drive and identify/fix behaviors that prevent this
    • Model behavior and take the hits as the program team, lead by example
    • Honest communication – be realistic about progress and minimize assumptions
  • Culture Shift to Business-enabled not IT-centric
    • Business ownership of externalized rules and process – very different from requirements with much more collaboration
    • Enable business operations – rule auditing, rules simulation
    • Increased collaboration

The overall approach is a classic change program:

[Figure: Railinc’s change program approach]

Each year tackled different work streams. Business assessments started in one product line and expanded, while business and technical POCs and development gradually added more applications to the scope. Training was across multiple business and technical roles and relied on learning by doing for a big part of it. The governance framework was designed early but not rolled out until later – it is being piloted now, with a final rollout planned to wrap up the program. Operational metrics development is ongoing, as this was not something the organization had before.

One key lesson of this program is that delays and setbacks are integral to success – you have to embrace them and see them as opportunities for learning and growth.

It’s also important, she says, to understand “paved roads” on your journey.

  • Cultural norms are important – a change from IT driving to business ownership is a big deal, for instance, and you must be clear about this.
  • Motivations have to change from schedule and budget to quality and agility
  • Habits have to change from siloed to collaborative
  • Change ownership has to go from the program team to operational teams

They have managed a lot of change: 7 compliant applications (22 on the roadmap), 1,400 terms, a changed SDLC, a repeatable way to pick technology, and over 75% of their analysts trained. In particular the analysts feel they have a voice and that the business/IT relationship has really improved.


Continuing with presentations at Building Business Capability, Kaiser Permanente and IBM presented on their decision management and cognitive platform for application development innovation. For those of you that don’t know, Kaiser is the nation’s largest not-for-profit health plan. They do everything from inpatient care to home health, hospitals, hospice, pharmacy and insurance – 10M+ members, 17,000 doctors and nearly 200,000 total employees.

IBM has been working with Kaiser for a while on their decision management journey. This began with SOA infrastructure, to which they first added BPM before adding in-context Decision Management to automate decisions. More recently they have begun to focus on predictive analytics, on cognitive services and on event-awareness and event processing, and most recently on decision modeling as a way to frame and structure their business rules and support analytic and cognitive development. They are using IBM’s BRMS, Operational Decision Manager (ODM). They are also, it should be noted, a Decision Management Solutions customer.

Kaiser works with IBM on try-storms. These are brainstorming sessions involving Kaiser and IBM teams that result not just in ideas but in working prototypes based on technology from business rules to predictive analytics, complex events to cognitive. These sessions illustrate the value of the technology, rapidly identify scenarios and projects that could use it and help internal customers understand the potential.

Two scenarios are going to be shown – member-centric notifications for health scenarios and fighting the flu with cognitive. These scenarios use a conceptual architecture involving rules and cognitive decision-making against traditional and newer data as well as events.

The first demonstration discussed notifications. Often notifications are too generic, too late and too hard to target/change.  The new approach leverages SOA and messaging to pull the right data, Decision Management to control the formatting and content, and cognitive services to handle unusual data. For example, consider air quality – data managed by The Weather Company – and how that might affect notifications.

The demo began by showing the rules for a decision and how they can be configured by business owners. The business user can set up a simulation to see how a particular metric is met by the current rules, allowing them to baseline the rules and see how they are working. If the results are not what they want, they can navigate to the rules for their decision – in this case a decision table. For instance, the severity level of air quality that results in messages could be changed to increase or decrease the number of warnings issued. The new rules can then be re-simulated to make sure the result is what was expected. This was a very rule-centric demo.
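To make the decision-table-plus-simulation idea concrete, here is a minimal sketch (not the actual ODM demo): the AQI thresholds, action names and function names are all hypothetical.

```python
# Hypothetical air-quality notification decision table. Each row is
# (AQI lower bound inclusive, AQI upper bound exclusive, action).
DECISION_TABLE = [
    (0, 101, "no_notification"),
    (101, 151, "notify_sensitive_members"),
    (151, 999, "notify_all_members"),
]

def decide_notification(aqi: int) -> str:
    """Return the notification action for an air quality index value."""
    for low, high, action in DECISION_TABLE:
        if low <= aqi < high:
            return action
    return "no_notification"

def simulate(historical_aqi_readings) -> dict:
    """Baseline the rules: count how often each action would fire."""
    counts = {}
    for aqi in historical_aqi_readings:
        action = decide_notification(aqi)
        counts[action] = counts.get(action, 0) + 1
    return counts

# Lowering the 151 threshold would increase the number of warnings issued -
# exactly the kind of change a business owner would re-simulate to check.
print(simulate([42, 130, 160, 95, 155]))
```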

The second demo focused on detecting flu outbreaks. If this could be done, Kaiser could use rules-based decision management to launch processes, alert people and more. This involves using event processing to detect and leverage structured events such as admissions or prescriptions. The triggering event is an admission related to something epidemic (rather than scheduled, say), and a rules-based decision then checks whether it was a flu admission. The event processing engine consumes events about flu admissions, checks whether there has been a cluster of flu admissions within a time window, and issues a notification. These notifications get aggregated from hospitals to counties to drive alerts back down to other hospitals in the county.
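A minimal sketch of that time-window logic, assuming a hypothetical 72-hour window and cluster threshold (one detector per hospital):

```python
from collections import deque
from datetime import datetime, timedelta

WINDOW = timedelta(hours=72)   # hypothetical look-back window
THRESHOLD = 5                  # hypothetical count that constitutes a cluster

class FluClusterDetector:
    """Consumes flu-admission events for one hospital and detects clusters."""
    def __init__(self, hospital: str):
        self.hospital = hospital
        self.admissions = deque()  # timestamps, oldest first

    def on_flu_admission(self, when: datetime):
        self.admissions.append(when)
        # Evict admissions that have fallen out of the time window
        while self.admissions and when - self.admissions[0] > WINDOW:
            self.admissions.popleft()
        if len(self.admissions) >= THRESHOLD:
            # In the demo, hospital-level alerts like this get aggregated
            # up to the county, which alerts the other hospitals there.
            return {"alert": "flu_cluster", "hospital": self.hospital,
                    "count": len(self.admissions)}
        return None

detector = FluClusterDetector("Oakland Medical Center")
start = datetime(2016, 11, 2, 8, 0)
for i in range(THRESHOLD):
    alert = detector.on_flu_admission(start + timedelta(hours=i))
print(alert)  # fires once enough admissions land inside the window
```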

This can then be combined with more predictive technologies. In this case, there are precursor symptoms that people experience before going to hospital. Tweeting or otherwise updating social media with these symptoms gives possible predictive data. Using Watson sentiment analysis one can analyze the social posts for tone (am I getting more sick and so more miserable) as well as for content including symptoms (not just mentions of the flu).

To make this work a set of Bluemix services are tied together to collect social media data using APIs, check the language and find the sentiment of the language. Another service is then used to classify the content as either about flu symptoms or not. This last was trained using a whole set of phrases that do describe symptoms and others that talk about the flu but not about being sick (such as tweets about epidemiology or CDC announcements). See this post for some details on Watson’s cognitive services.
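Here is a hedged sketch of that pipeline; every function below is a stand-in for the Bluemix/Watson services described, not their real APIs, and the phrases and scores are invented for illustration.

```python
def detect_language(text: str) -> str:
    # Stand-in for a language-identification service call.
    return "en"

def score_sentiment(text: str) -> float:
    # Stand-in for a sentiment service; -1.0 is most negative, 1.0 most positive.
    return -0.6 if "awful" in text.lower() else 0.1

def classify_flu_symptoms(text: str) -> str:
    # Stand-in for a classifier trained on symptom phrases vs. flu-adjacent
    # chatter (epidemiology discussion, CDC announcements and the like).
    symptoms = ("fever", "chills", "aching", "sore throat")
    return "flu_symptoms" if any(s in text.lower() for s in symptoms) else "other"

def process_post(post: dict):
    """Run one social post through the pipeline, keeping likely symptom reports."""
    text = post["text"]
    if detect_language(text) != "en":
        return None
    if classify_flu_symptoms(text) != "flu_symptoms":
        return None  # CDC announcements etc. are filtered out here
    return {"location": post.get("geo"), "sentiment": score_sentiment(text)}

print(process_post({"text": "Fever and chills all night - feeling awful", "geo": "Oakland"}))
```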

These alerts are great but, to be valuable, rules-based processing has to use them to take some kind of useful action. In this case the flu social media tracking was combined with other data to start predicting that admissions will rise soon in a particular area, allowing hospitals to see where they might need to prepare for flu admissions. The events being captured can also be analyzed geographically, allowing the movement of an epidemic to be predicted. This creates a layer of events – predicting flu is coming to an area, identifying people in an area talking about having the flu, and ultimately seeing them admitted. The rules-based event engine can use this to trigger increasingly precise and useful actions.
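One way to picture that escalation – the event layer names and actions here are hypothetical, not Kaiser’s:

```python
# Hypothetical mapping from event layers to increasingly precise actions.
ACTIONS = {
    "flu_predicted_for_area": "pre-position supplies and alert staff schedulers",
    "symptom_chatter_in_area": "warn local hospitals to expect admissions",
    "admission_cluster_detected": "trigger outbreak protocol and county alert",
}

def act_on(event_type: str) -> str:
    # Default action for events the rules don't recognize.
    return ACTIONS.get(event_type, "log and monitor")

for layer in ACTIONS:
    print(layer, "->", act_on(layer))
```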

A nice example of a rules-based infrastructure with a layer of analytic/cognitive enhancement. David wrapped up with a pitch for decision modeling, using the Decision Model and Notation (DMN) standard, as a way to exploit these decision-making technologies by engaging the business in configuring and designing the decision-making to be automated.

Don’t forget that David and I are speaking later in the week too with more on decision modeling.

I am attending this year’s Building Business Capability conference and blogging sessions like this one from Jim Sinur of Aragon Research. I gave a tutorial earlier in the week on decision modeling with DMN and will be speaking later in the week with David Herring of Kaiser Permanente (and signing my new book, Real-World Decision Modeling with DMN – available now – at the bookstore after the session). Jim’s presentation is on Delivering the Digital Dream for Real, part of his focus on the move to digital. He plans to share real-world examples, discuss technologies that are out there, and outline a methodology for applying them.

His premise is that companies can only hide from, submit to or leverage digital transformation – it can no longer be ignored. Digital is the new normal and someone will use digital transformation to compete with you.

Why should organizations start on this journey? Digital change is happening so companies have to decide on an appropriate response. Companies can be divided based on digital capability and leadership capability. Digital Masters show higher revenue and market valuation while beginners get heavily punished. Those being conservative about it lose revenue while those chasing fashions get a revenue bump but at a cost.

The business and technical forces pulling on you vary by industry. Jim gave an example of a company that seemed like it was in an interesting space but new technologies like social undermined their model.

The journey, if you have control of it, goes like this:

  1. Customer delight
  2. Business operations
  3. New products and services
  4. Business model transformation

But if you get disrupted in the meantime you will have to accelerate it. And improvements in business operations fund the innovations you need. The future of business operations is mobile-first, enlightened decisions and smart actions. Understanding the customer journey is critical as it identifies WHEN you have to make a decision and WHAT decision is going to move the customer forward. Some examples:

  • One example journey is of a customer in a store. As they move around and shop in specific areas, the store is going to use a digital platform with rules, analytics and cognitive capability to make good marketing and customer service decisions at the right moment.
  • Another example is doctors on their rounds. Same idea – what’s the journey, what do they need to know/have access to/have suggested to them at each point.
  • A third example was a surgery center using a digital platform and tags/sensors to improve operations. They would simulate the planned surgeries the night before and this gave them a baseline that could be used to drive visibility into variations. Plus the tags and models allowed them to find and inform loved ones, patients or staff as necessary.
  • Using IoT to manage power systems and help consumers reduce cost while helping the power company manage resources.
  • Using IoT to track dementia patients and use what’s known about the patient and their behavior to determine when to intervene and what kind of intervention is appropriate.
  • IoT data to manage very distributed infrastructure and manage the staff who need to work on it. Drones and more are adding to this mix.

We live in a world, he says, with emerging processes that “swarm” to follow some goal. Processes will be changing from flow-directed to goal-directed. Workers, knowledge-workers, managers, services (including analytic and cognitive services), robots and sensors will all be coordinated much more dynamically. This will require both simple and emergent processes ranging from structured processes to more case/collaborative ones to IoT/Agent/Collaborative based ones. And all of them will require smart decision-making.

He illustrated this with a company’s evolving approach to farming. First they scattered measurement pills to track water and fertilizer, sounding alarms when something was needed. Then they linked these pills to bots to take the farmers out of the loop. And now they are linking these to water runoff models to stop watering if water is coming anyway.

To get here, companies need a digital target: Goals, phases, benefits and costs; an idea of the organization, skills and technology that will be needed; actual POCs, a measurement approach and some capability for experimentation. This is based on executive vision, plans, customer input, competitive issues, constraints and technologies (both new and legacy). One of the most important things is to establish some business and technical principles that can be broadly applied, for example:

  • Attract and spoil customers
  • Create operational excellence
  • Business-first IT collaboration
  • Design for sustainability

New competencies include constituent engagement, hyper awareness, complex problem solving, anticipatory decision making, innovative productivity, operations agility and more.

A digital business platform has lots of technology – some of it is ready, some less so.

  • Collaboration, journey mapping, big and fast data, and pattern processing (analytics) are all good.
  • Cloud integration, advanced analytics, IoT, video and 3D printing, augmented reality, blockchain and cognitive are more or less ready

Leveraging these new technologies is a journey – a maturity model that ranges from initiating to opportunistic to managed, habitual and ultimately fully leveraged.

A digital platform helps organizations blend people, compute power and physical devices to deliver the best possible outcome. In an accident, for instance, your phone detects the accident and checks vital signs of passengers using smart clothes and the vehicle’s sensors. It communicates immediately with insurers and others to drive an automated yet highly tailored response. A platform that makes great and well informed decisions.

It needs to include:

  • Processes and Cases
  • Cognition and Calculations – Decisions
  • Data and System Integration
  • Machines and Sensors
  • Business Applications