
Reltio is a modern data management Platform as a Service (PaaS) company focused on delivering data-driven applications, founded in 2011 by folks from Siperian (which was acquired by Informatica). Unlike most data integration and MDM platforms, which are IT-focused, Reltio’s mission is to make it possible for business and IT teams in enterprises to “Be Right Faster” by building data-driven enterprise apps that deliver reliable data, relevant insights and recommended actions. They contrast these applications, based on broadly sourced, cross-functional data, with the traditional approach that delivers process-driven and siloed data. With data-driven applications, contextual, analytical and operational data can all be brought together. This requires a reliable data foundation.

Reltio Cloud, their modern data management Platform as a Service (PaaS), includes:

  • Master Data Management as the core for delivering a foundation of reliable data
  • Predictive Analytics and Machine Learning through the inclusion of Apache Spark in the platform
  • A Graph Model that allows for network analysis and integration across highly variable data sources
  • Big Data Scale and Performance so that transaction and newer data types can be managed, not just customer data
  • Workflow and collaboration capabilities to manage and curate data
  • Data as a Service at the core of the platform so that third-party data services can be easily integrated

The graph schema is key to Reltio, allowing them to store both entities and their relationships in a semantically rich way. Data is stored in a combination of Apache Cassandra, graph technology, and in-memory structures such as Elastic. It offers an extensible structure for an organization’s entities and relationships. Reltio Cloud collects data from multiple sources, then matches, merges and relates them to create these relationship graphs, and these graphs then underpin the data-driven applications being developed.
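To make the graph idea concrete, here is a minimal sketch (plain Python, purely illustrative – not Reltio’s actual data model or API) of master profiles and the typed relationships that a match-merge-relate process might produce:

```python
from dataclasses import dataclass, field

@dataclass
class Entity:
    """A master profile assembled from one or more source records."""
    entity_id: str
    entity_type: str                                 # e.g. "Person", "Organization"
    attributes: dict = field(default_factory=dict)
    source_ids: list = field(default_factory=list)   # contributing source records

@dataclass
class Relationship:
    """A typed, directed edge between two entities."""
    start: str                                       # entity_id of the source
    end: str                                         # entity_id of the target
    relation_type: str                               # e.g. "works_for"
    attributes: dict = field(default_factory=dict)

# A tiny graph: a person merged from CRM and ERP records, related to an organization
alice = Entity("e-1", "Person", {"name": "Alice Smith"}, ["crm:123", "erp:987"])
acme = Entity("e-2", "Organization", {"name": "Acme Corp"}, ["dnb:555"])
works_for = Relationship("e-1", "e-2", "works_for", {"title": "Director"})

graph = {"entities": {e.entity_id: e for e in (alice, acme)},
         "relationships": [works_for]}
print(graph["relationships"][0].relation_type)       # works_for
```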

Reltio Insights shares objects (built from the profile and transaction data) between Reltio Cloud and analytics environments like Spark (either the Reltio platform’s or a customer’s own) to create analytic insights. These insights are then integrated with the master data so that they can be made available to data-driven applications. Reltio Insights is designed to rapidly provision master and transactional data into a Spark environment. The resulting analytic insights are available throughout the Reltio environment, added back to the data: a customer’s churn propensity, for example, becomes an attribute of the customer profile.

The applications themselves can offer several different views – for instance, some users, such as data stewards, might see where the data came from and be able to interact with it to clean it up, while others might only see the final, integrated view. A standard feature of the apps is visualizing relationships, based on the underlying graph models. Some simple analysis, such as the distribution of transactions by channel, can be easily included, as can the results of more sophisticated analytics. Anything available in the Reltio data platform can be collaborated upon, managed and updated through data-driven operational applications. The data can then be used to drive analytical model development and to provision other operational applications. In addition, everything is tracked for audit and change purposes and the workflow engine can be used to manage requests for updates, changes and so on.

Everything in the platform is available as HTML5 widgets so that additional content, like Google Maps, can be easily embedded, and this means that Reltio content can also be easily embedded elsewhere. Many customers take advantage of this to mix and match Reltio content in other environments and vice versa. Similarly, all the data in Reltio Cloud is available from a REST API for use in all legacy operational and analytics systems.
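Because everything is exposed over REST, pulling this data into another environment is just an HTTP call. As a rough illustration only – the endpoint path, tenant and authentication below are hypothetical placeholders, not the documented API – fetching a single profile might look like this:

```python
import json
import urllib.request

# Hypothetical base URL and token - the real endpoint structure and
# authentication flow come from the vendor's API documentation.
BASE_URL = "https://example-tenant.example.com/api"
ACCESS_TOKEN = "replace-with-a-real-access-token"

def get_entity(entity_id: str) -> dict:
    """Fetch a single master profile as JSON over the REST API."""
    request = urllib.request.Request(
        f"{BASE_URL}/entities/{entity_id}",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}",
                 "Accept": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)

# profile = get_entity("e-1")
# print(profile.get("attributes", {}))
```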

You can get more information on Reltio here.

DecisionCAMP 2017 is coming up July 11-14, 2017 at Birkbeck College, University of London. This is going to be a great opportunity to learn about decision modeling, the Decision Model and Notation (DMN) standard and related topics. In fact, the week is full of great things to do if you are in London or can make it there.

You can register for DecisionCAMP here.

One of our clients was presenting recently at a TDWI conference and was picked up on TechTarget – Analytics teams give data science applications real scientific rigor. It’s a great article with some good tips about using a repeatable methodology like CRISP-DM, especially when combined with decision modeling as a way to capture business understanding and drive collaboration (see this post too on Helping your analytic projects succeed with decision modeling and CRISP-DM). As Anu Miller of Cisco put it:

We ask each other all the time, ‘What business decision are you looking to support?’

This focus on method and on business decisions also helps bring teamwork across the business/data science divide. As she went on to say:

Those things almost force you to be collaborative. There are no unicorns on our team. We have to work together.

All good advice. If you live in the Bay Area, you can hear me discuss some of the key aspects of this approach at the Global Big Data Conference, where I am presenting ‘Don’t Apply Big Data Analytics To The Wrong Problem: Put Decisions First’. If you don’t live locally, check out this case study: Bringing Clarity to Data Science Projects with Decision Modeling.

And remember, if you would like to talk about improving your data science approach or other help operationalizing your analytics, get in touch.

Equifax has been expanding beyond credit bureau data in recent years by providing better access to a broad range of their own fraud, employment, wealth, commercial and alternative data sources as well as third-party data, positioning themselves as an insights company. As part of this focus, their Decision Management platform, InterConnect, was rebuilt from scratch as a foundation for cloud-centric and multinational decisioning applications. InterConnect is designed to support a broad range of decision management solutions, with an initial focus on customer acquisition.

InterConnect is designed to be a secure cloud-based decision management platform to define and execute decision policies at the front line. It is focused on delivering robust data, powerful decisioning and streamlined technology.

There are four main decisioning tools in the platform:

  • Insight Gateway
    Streamlined transactional access to diverse data sources.
  • Attribute Navigator
    To manage and deploy the data catalog and derived attributes.
  • Model Integration tool
    A single tool to integrate, audit and deploy predictive analytic models into production.
  • Rules Editor
    A rules management environment for creating, testing and optimizing business rules

These four decisioning tools are presented in a common decision portal that is role-based, so only selected elements are exposed to users. This portal is gradually becoming the presentation layer for all Equifax services.

Insight Gateway gives real-time transactional access to data sources. This includes many standard data sources (such as Equifax’s own credit bureau data) as well as specific local data sources developed in each country. Insight Gateway uses a microservices architecture, JSON and self-description to make integration flexible and visual. It is supported by a Data Provisioning Tool that allows for discovery and conditional orchestration/integration.

Attribute Navigator allows users to create a catalog, define the attributes, test and deploy. It supports the definition of custom attributes against multi-bureau data, third party data or customer data. Customer data may be hosted by Equifax or can be sent on each individual transaction. The environment supports testing and auditing of attributes.

Model Integration Tool lets teams integrate, audit and deploy predictive analytic models. It supports scorecards as well as decision trees and ensembles. It can guide users through the integration of SAS, SPSS and R models to generate PMML ready for execution in the platform, as well as a specification document for compliance and governance. This generated PMML, or other PMML models, can be executed in the InterConnect platform using the well-known Zementis engines (both ADAPA for cloud deployment and the Zementis Hadoop plugin – reviewed here).
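To show what a deployable scorecard boils down to (a generic sketch of the technique, not Equifax’s Model Integration Tool or its generated PMML), an additive scorecard with reason codes can be expressed in a few lines:

```python
# A generic additive scorecard: each characteristic maps attribute values to
# points, and the lowest-scoring characteristics become reason codes.
SCORECARD = {
    "utilization": [(0.30, 60), (0.70, 35), (1.01, 10)],    # (upper bound, points)
    "months_on_file": [(24, 15), (60, 30), (9999, 50)],
    "recent_inquiries": [(1, 40), (3, 25), (99, 5)],
}
BASE_SCORE = 400

def score(applicant: dict, reason_count: int = 2):
    points = {}
    for characteristic, bands in SCORECARD.items():
        value = applicant[characteristic]
        for upper, pts in bands:
            if value < upper:
                points[characteristic] = pts
                break
    total = BASE_SCORE + sum(points.values())
    # Reason codes: the characteristics contributing the fewest points
    reasons = sorted(points, key=points.get)[:reason_count]
    return total, reasons

print(score({"utilization": 0.82, "months_on_file": 30, "recent_inquiries": 2}))
# (465, ['utilization', 'recent_inquiries'])
```

The reason codes, the characteristics contributing the fewest points, are part of what makes scorecards attractive for regulated credit decisions: every score comes with an explanation.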

Rules Editor is based on the modern rules platform provided by Sparkling Logic – SMARTS (most recently reviewed here). This provides a data-driven rule editor with authoring, visualization, testing and simulation all in one interface. The rule authoring environment supports cascading and inheritance, rule flow management, champion/challenger testing, execution tracing and reporting on key performance metrics.

Configuration of the four services for individual customer requirements and solution orchestration at runtime is delivered by Equifax’s professional services. InterConnect can be custom-designed or accessed as a pure platform utilizing all services or individual services as needed. It is available in the US, Canada, UK, Spain, Peru, and Chile. Argentina, Paraguay, Uruguay, Mexico, Australia and India are expected to be added in the future.

You can get more details on the platform here.

Humana presented at InterConnect 2017 on their use of business rules on z/OS. Humana is a 3M-member health insurer and a user of IBM Operational Decision Manager (ODM), IBM’s Business Rules Management System, and has been using it to modernize some of their key mainframe systems as part of their efforts to reuse existing assets. ODM runs on IBM z/OS for batch, CICS, standalone rules or WAS on z/OS, allowing them to run business rules on their mainframe systems. Using ODM allows Humana to reuse these assets while also transforming their development approach to be more responsive, more aligned with the business and more consistent, to ensure compliance and manage risk.

Humana uses ODM for Medicare, Claims, Enrollment and Dynamic forms:

  • Humana has 700 Medicare plans that have to be reviewed for CMS compliance. A .Net application integrated with the decision service answers the CMS questions with an SLA of 2 seconds. The environment allows the business to manage the 1,700 rules in the application using ODM Decision Center. This improves the change cycle from months to weeks.
  • Claims processing handles multiple procedure payments and member cost sharing, for instance. These run as COBOL CICS and batch systems with 500+ rules and decision tables and 3.5M ruleset executions daily. Manual rules that could not be coded in COBOL are now in ODM, increasing the rate of straight-through processing and driving savings – the savings in the first week exceeded the cost of development!
  • Enrollment for large and small groups uses 30+ rule applications to reduce enrollment from a week to real time.
  • Dynamic forms is for authorization, generating custom questionnaires dynamically. 70+ questionnaires can now be developed and tested by the business. The complete audit trail and the ability to make rapid changes have been key.

Architecturally:

  • Humana runs ODM on z/OS.
  • Rule Designer (IDE) is used to develop the vocabulary, templates, rules, etc. This is tested and then pushed to Decision Center for ongoing business user management.
  • Decision Center is used across all environments. This allows business user engagement in a thin client environment and can be synchronized with the technical environment. Decision Center supports deployment, versioning and more, and becomes the single source for rules. The support for testing and simulation is key and very popular with business users. They use both the Enterprise and Business Console, though the Business Console is the standard go-forward environment. All this runs on Z Linux, letting them take advantage of the integration of Z and DB2, the power of Z, etc.
  • Decision Center is used to deploy Decision Services to the various environments Humana uses – z/OS, Linux on Z, etc.
  • The Rule Execution Server Console is used to monitor executing rules, trace rule execution and manage the deployed services.
  • They take advantage of the Java/DB2/z integration and performance tuning to maximize the performance of their rule execution. They mix and match decision services deployed for interactive or batch use, integration with COBOL or CICS, and so on – there are lots of options for integrating the decision services into the mainframe environment (a minimal sketch of invoking such a service follows this list).
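ODM can also expose a deployed ruleset as a hosted decision service callable over HTTP, alongside the native COBOL/CICS and Java paths. As a minimal sketch (the host, path and payload here are hypothetical, not Humana’s actual services), a distributed caller might invoke such a decision service like this:

```python
import json
import urllib.request

# Hypothetical endpoint for a deployed ruleset; the real path depends on the
# RuleApp/ruleset names and on how the decision service is exposed.
DECISION_SERVICE_URL = ("https://rules.example.com/DecisionService/rest/"
                        "ClaimsRuleApp/ClaimPaymentRules")

def decide(payload: dict) -> dict:
    """POST a decision request and return the decision output as a dict."""
    request = urllib.request.Request(
        DECISION_SERVICE_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)

# decision = decide({"claim": {"amount": 1250.00, "procedureCode": "99213"}})
# print(decision.get("status"), decision.get("payableAmount"))
```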

Moving forward they are looking at some changes for high availability workload management as well as embedded batch for improved performance. Plus they want to complete the move to proper Decision Services from traditional rule applications.

Overall ODM on z/OS has delivered several benefits:

  • Cost savings
  • Improved time to market and business engagement
  • Single source of rules
  • Incremental adoption

State Farm presented at IBM InterConnect 2017 on their challenges with large scale deployment of IBM Operational Decision Manager (ODM) – IBM’s Business Rules Management System. State Farm has a set of specific things it wants out of its BRMS:

  • Well defined artifact lifecycle for auditing
  • Rigorous deployment process support for confidentiality and consistency
  • Self-service so authorized users can work 24×7

This is a pretty technical topic as State Farm is a large-scale user of rules and IBM ODM. They have >500 rule projects with rulesets that vary from 100 rules to 10,000, invoked everywhere from services called every few minutes to very large volume batch jobs. Some of the decisions are trivial but others have big legal implications and must be 100% right. 45 different teams with 430 users of Decision Center are working on projects with over 80 deployment targets on Linux and z/OS hosts.

They need RuleApps – the deployable units – to have well defined content, be accessible, controlled on a need-to-know basis and governed appropriately for their criticality. Each RuleApp version is built once to ensure consistency and decouple deployment from the Decision Center editing environment. They are then promoted through the test and production servers. It’s also important to manage the system users and business users appropriately.

Key precepts then:

  • RuleApp builds that are promoted for real use come from Decision Center
  • Well-formed RuleApp version baselines to track content
  • Self-service tooling to manage RuleApps, XOMs, builds, etc.
  • Keep users out of the RES consoles – make it so they don’t need to be in the console

The automation underlying this has been evolving and is now moving to Decision Services and the Decision Governance Framework as well as working only in Decision Center. UrbanCode is used to manage the deployment automation, accessing the Decision Center and other ODM APIs, storing the compiled artifacts and managing the solution. State Farm originally built this themselves but newer versions have UrbanCode plugins.

State Farm really wanted to manage the executable object model – the XOM – so they could make sure the XOMs needed by RuleApps were deployed and available. Newer versions of IBM ODM allow you to embed the XOM in the RuleApp deployment so it is self-contained and not dependent on the correct (separate) deployment of the XOM.

End-to-end traceability is the goal. Baselining the RuleApp means you know which rules and rule versions are in the RuleApp. In theory all you need to know is the ruleset that executed and the baseline of the RuleApp deployed – this tells you exactly which rules you executed. Decision Center tracks who deployed which RuleApp to where and when, linking the deployment to the RuleApp baseline. But to get this detail down to the ruleset level you need to add an interceptor that adds the baseline details to each ruleset.

Versioning is critical to this traceability. An intentional versioning scheme is a must, and deployment is done by replacing the old deployment explicitly so that version numbers are managed centrally. State Farm embeds version information in names. Most users just deploy asking for the latest version, and this works automatically, but this explicit approach gives control and options for using named versions when that is needed.
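To illustrate the kind of explicit, name-embedded versioning described above (the naming convention here is hypothetical, just to show the idea), simple tooling can parse versions out of RuleApp names and work out what the latest deployment of each one is:

```python
import re

# Hypothetical convention: <ruleapp>_v<major>.<minor>, e.g. "claims_v3.2"
NAME_PATTERN = re.compile(r"^(?P<app>[A-Za-z0-9]+)_v(?P<major>\d+)\.(?P<minor>\d+)$")

def parse(ruleapp_name: str):
    """Split a versioned RuleApp name into (app, major, minor)."""
    match = NAME_PATTERN.match(ruleapp_name)
    if not match:
        raise ValueError(f"Not a versioned RuleApp name: {ruleapp_name}")
    return match.group("app"), int(match.group("major")), int(match.group("minor"))

def latest(deployed_names):
    """Pick the highest version of each RuleApp from a list of deployed names."""
    best = {}
    for name in deployed_names:
        app, major, minor = parse(name)
        if app not in best or (major, minor) > best[app][0]:
            best[app] = ((major, minor), name)
    return {app: name for app, (_version, name) in best.items()}

print(latest(["claims_v3.1", "claims_v3.2", "enrollment_v1.0"]))
# {'claims': 'claims_v3.2', 'enrollment': 'enrollment_v1.0'}
```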

Lots of very geeky ODM info!

Ginni Rometty kicked off day two of the IBM InterConnect conference with a pitch that cloud is changing IT, business and indeed society. Cloud, and the IBM Cloud in particular, she says will allow a new generation of business and change the world. Cloud is already 17% of IBM’s business and clearly Ginni sees this growing rapidly. Three elements drive this:

  • IBM Cloud is Enterprise Strong
    Strong public infrastructure, industrialized hybrid, enterprise strong, choices and consistency, secure, and more. Plus an industry focus to support specific regulatory frameworks, APIs, etc. A pipeline of innovation is needed too – IBM is investing in blockchain, quantum computing and more to ensure their cloud continues to have innovative capabilities.
  • IBM Cloud is Data First
    The value of data, she says, is in the insights it generates for you – not democratizing and sharing your data, but letting you use your own data to drive your own insights. This relies on data diversity (to make sure that public, private and licensed data of various types can be used together) and data control (to protect the IP represented by your data).
  • IBM Cloud is Cognitive to the core
    Ginni feels that cognitive is going to drive the future and only companies adopting it will survive and thrive. Watson, she says, ranges from Machine Learning to AI to Cognitive. This has to be embedded in the cloud platform. And it needs new “senses” like the ability to process images and listen to audio, plus motion and sensors to provide “touch”. And Watson is being specialized and trained with data from industry-leading sources.

AT&T came up next to talk about the impact of broadband and pervasive mobile connections and how this enables access to the kind of cloud IBM is developing. AT&T also thinks content matters, especially in mobile as more and more content is being consumed on the mobile device. The work IBM and AT&T are doing essentially allows a BYOD mobile device to connect to IBM Cloud assets as effectively and securely as an on-premise network. Plus they are doing a ton of work around IoT and related areas.

Marc Benioff of Salesforce.com came up next to talk about how IBM is integrating Watson with Salesforce. 5,000 companies are both IBM and Salesforce customers. The integration is between Salesforce’s Einstein and IBM’s Watson. Critically, of course, this is about data – bringing the data Watson can access, like Weather Company data, together with the CRM data that Einstein works on. This ability to pull “back office” data and combine it with customer-facing data in the CRM environment allows for new customer-centric decisioning. Marc said that initially customers are focused on things like prioritization of calls or service issues and next best action. But he sees the potential for these AI technologies to really change how people work – enhancing the way people work – though this requires that companies use these technologies appropriately.

H&R Block came up next to talk about how they are using Watson and cognitive to change the way they provide tax advice to customers. The Watson component drives a natural language interview, is trained on the tax code to spot deductions and then makes additional suggestions for future changes that will help. A second screen engages the client directly, enabling an integrated conversation around the data being gathered. Interestingly, the staff using the software have found it engaging and feel like they are on the cutting edge, especially since it is branded with Watson. Watson, of course, is continuing to learn what works and what does not. They are looking into how to extend it from the in-person interviews to the online business and to customer support.

Royal Bank of Canada came on stage to discuss the move to mobile – to a digital experience – in banking. All this requires a focus on the cloud, and on building new capabilities on the cloud, to deliver the agility and pervasiveness that are needed. A microservices architecture creates “lego blocks” that can be easily integrated and deployed rapidly. This speeds development but it also changes the way things are built. And this takes more than just training; it requires certification, experiences that move people up the curve, and ongoing commitment. This matters because mobile is on the verge of overtaking online banking as an interaction channel. Online banking used to be one release a year; now it (and the mobile apps) do at least six.

Ginni wrapped up with a great conversation with Reshma Saujani, founder of Girls Who Code, about how IBM is helping them train a million girls to code. Very cool program and some great opportunities and ideas being generated. Lovely to hear some teens talking about working with cloud, AI, APIs and all the rest. And very funny that they had to explain who IBM was to their friends 🙂

I am going to be at IBM InterConnect this week. I am speaking with Kaiser Permanente at 2pm on Monday – Pioneering Decision Services with Decision Modeling at Kaiser Permanente – so come by and hear me and Renee speak about our successes with decision modeling with DMN (Decision Model and Notation), business rules and IBM Operational Decision Manager (ODM). I’ll have some copies of my books to give away and it’s a great story – well worth hearing.

Right after that I will be easy to find at the Meet an IBM Champion booth. Come by and ask me questions about rules, analytics, decision management, decision modeling or whatever! In general I will be around until early Wednesday morning and checking my email, plus I will be blogging from the conference (here on JTonEDM) when I get a chance as well as tweeting @jamet123.

Come by and say hi!

 

The Decision Management Systems Platform Technologies Report began in early 2012 as a way to share our research and experience in building Decision Management Systems. Since then we have extended, updated and revised the report many times. This week we released the latest version – Version 8 – with a new, easier to use format. There is so much content in the report now that one great big document no longer works. To make it easier to use we have broken it down into a set of pieces. We have also added content on technology for modeling decisions, significantly upgraded the section on monitoring and improving decisions and given the whole thing a refresh.

The documents are:

  • Introducing Decision Management Systems
  • Use Cases for Decision Management Systems
  • Best Practices in Decision Management Systems
  • Five Key Capabilities
    • Managing Decision Logic With Business Rules
    • Embedding Predictive Analytics
    • Optimizing and Simulating Decisions
    • Monitoring and Improving Decisions
    • Modeling Decisions
  • Selecting Products for Building Decision Management Systems

The first three are an excellent introduction for business or technical readers, while the others are more focused on those who are selecting or using the technologies described. You can download them for free on the Decision Management Systems Platform Technologies Report page.

As always, don’t forget that we have extensive experience helping organizations like yours define, configure and implement Decision Management Systems that deliver on the value propositions described in the Report. Our clients are leading companies in insurance, banking, manufacturing, telecommunications, travel and leisure, health management, and retail. Contact us if we can help.

I caught up with the folks from Conductrics to learn about their 3.0 release (I have blogged about Conductrics before). Conductrics has been in the decision optimization business for many years now. At its core Conductrics is about assigning customers to experiences. They recently released 3.0 with some new features.

Conductrics Express is a point-and-click tool to help set up tests and personalized experiences for web sites. It’s a browser extension with lots of control. Historically Conductrics has been more API-centric. Some audiences really like a “headless”, API-based platform but others want something more UI-based. In addition, quick tests or temporary projects are common and the new visual setup lets users quickly set something up. For instance, figuring out which customers get which experience sometimes requires some quick A/B testing or experimentation and there is no time to work with IT. Conductrics Express sits on top of the API so it can be evolved and integrated with other API-based projects.

To make it easier to use machine learning, the new version supports explicit induced rules. This gives you an interpretable AI as it converts complex optimization logic into easily digestible, human readable decision rules. Users can use it for audience discovery and can either have it drive the experience or just “listen” to see what it would have recommended. This engine does trial and error or A/B testing and as you collect data it builds a decision tree for audience segmentation.

One of the nice features of the engine is that it predicts the likelihood of success for each offer and also how likely each option is to be the best one. This enables you to identify both the experiences that are clearly better than the alternatives despite having a low chance of success and those that seem significantly better but where there is a high degree of uncertainty. This reflects the reality that additional targeting is lower value for some audiences (sometimes there’s just not much difference between best and worst). This lets you see the marginal benefit of targeting (versus just picking A or B) and allows you to see poorly served audiences.
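The difference between predicting “likelihood of success” and “how likely an option is to be the best” is easy to see with a quick simulation. This generic sketch (not Conductrics’ actual algorithm) estimates both from raw A/B counts using Beta posteriors:

```python
import random

def summarize(results, draws=10000):
    """results: {option: (successes, trials)}.
    Returns each option's observed success rate and its estimated
    probability of being the best option."""
    best_counts = {option: 0 for option in results}
    for _ in range(draws):
        # Sample a plausible conversion rate for each option from a Beta posterior
        samples = {option: random.betavariate(s + 1, n - s + 1)
                   for option, (s, n) in results.items()}
        best_counts[max(samples, key=samples.get)] += 1
    return {option: {"success_rate": s / n, "p_best": best_counts[option] / draws}
            for option, (s, n) in results.items()}

# Option B converts better in the data so far, but with far fewer trials
print(summarize({"A": (120, 1000), "B": (15, 100)}))
```

With numbers like these, option B has the higher observed success rate but far fewer trials, so the simulation typically shows meaningful residual uncertainty about whether it is really the best choice – exactly the distinction the engine surfaces.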

The current version allows inline creation of options, easy specification of goals and has a JavaScript API that allows packaging of logic into a file that is available locally, e.g. in a mobile app. You can also group agents into a multivariate agent for reporting and create mutually exclusive agents for more sophisticated analyses. Users can also add rules to constrain or force experiences, use predictive analytics in assignment or randomly assign people to learn from experiments. The objective can be a single recommendation or a list of recommendations. All this can be tied together using flowlets, which specify adaptive or random logic for chaining agents. This allows for more complex decisions where making one decision implies another.

Finally, there is a new, faster JavaScript API alongside the existing web service API and a variety of deployment options from hosted to on-premise.

You can get more information on Conductrics here.

I have trained a lot of practitioners in the Decision Model and Notation (DMN) – I am closing in on 1,000 decision modeling trainees now – and one of the interesting questions is always their motivation for using decision models. As I look back across all the folks I have trained, four motivations seem to bubble to the top:

  1. Too many analytic projects go awry and they need a way to frame requirements for analytic projects to tie analytics to business outcomes
  2. They struggle with a “big bucket of rules” in their Business Rules Management System (BRMS) and need a better way to capture, manage and structure those rules
  3. They need clarity and consistency in how they make (or want to make) decisions that are likely to be only partially automated
  4. They need an executable representation of decision-making

These motivations have three things in common:

  1. They need to visually communicate their intent to a wide group of stakeholders, some of whom did not build the model
  2. They need a broad group of people – business experts, business analysts, data scientists, programmers and others – to be able to collaborate on the model
  3. They have tried using documents (requirements, lists of rules, spreadsheets) before but have found them inadequate

Decision models built using the DMN notation and a decent approach (such as the one Jan and I describe in Real-World Decision Modeling with DMN) deliver. DMN Decision Requirements models are easy for a wide group of people to collaborate on and clearly communicate the structure of decision-making. Combined with either DMN Decision Logic models or executable rule artifacts in a BRMS, they manage decision logic effectively and are an executable representation of your decision-making.

Now, there are those who feel that the motivation for decision models is almost always a desire for an executable representation – that only a decision model that is complete and executable is really useful. While this may be true of BPMN tool vendors – who need executable DMN to make their process models go – it is far from true among decision modelers more generally. In fact, in our training and consulting projects we see more projects where a decision requirements model alone is tremendously useful than anything else.

Among those who want execution there is also a split: some want the model to be 100% executable – to be able to generate code from it. Others prefer (as I do personally) to link a business-friendly logical model to executable elements in a more loosely coupled way. This allows the business to keep things in the requirements model that are not executable (such as how people should decide to consume the result or characterize an input) while IT can make some performance or platform-specific tweaks that don’t have to show up in the logical model. The cores are linked (most decisions in the model have an executable representation in a BRMS that is editable by the same people who edit the model) but not everything has to be.

Whether you like the “press the big red button” approach that generates code from a DMN model or a blended DMN + BRMS model for execution, never forget all the other use cases where clarity, collaboration, consistency and structure matter even though execution doesn’t. There is a wide range of valuable use cases for DMN – it’s not just about executable models.

Prompted by an interesting post by Bruce Silver called DMN and BPMN: Common Motivation, Different Outcome?

Forrester analyst Mike Gualtieri asks “if analytics does not lead to more informed decisions and more effective actions, then why do it at all?” Specifically, in a great post, What Exactly The Heck Are Prescriptive Analytics?, he says (emphasis mine):

Prescriptive analytics is about using data and analytics to improve decisions and therefore the effectiveness of actions. Isn’t that what all analytics should be about? A hearty “yes” to that because, if analytics does not lead to more informed decisions and more effective actions, then why do it at all?
Enterprises must stop wasting time and money on unactionable analytics. These efforts don’t matter if the resulting analytics don’t lead to better insights and decisions that are specifically linked to measurable business outcomes.

Now he calls it Prescriptive Analytics and we call it Decision Management, but we are aligned on the value proposition – operationalized analytic insight. I also really like his focus on the intersection between analytics and business rules (decision logic). As he points out, you can wrap business rules around analytics to make them actionable (starting with the analytic) and you can also start with the business rules and use predictive analytics to improve them. The combination of business rules and predictive analytics is the powerful one. And our experience shows that using a decision model, built using the Decision Model and Notation (DMN) approach, is a great way to show how you are going to use these technologies together to make better decisions and how those decisions are specifically linked to measurable business outcomes – metrics or objectives.
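Here is a tiny, generic sketch of that combination (my illustration, not Mike’s, and not tied to any particular product): a predictive score feeds decision logic expressed as rules, and the rules decide the action:

```python
def churn_score(customer: dict) -> float:
    """Stand-in for a predictive model: returns a churn propensity between 0 and 1."""
    return min(1.0, 0.2 + 0.5 * customer["complaints"] / 5
                    + 0.3 * customer["months_since_purchase"] / 24)

def retention_action(customer: dict) -> str:
    """Business rules wrapped around the analytic to decide the action to take."""
    score = churn_score(customer)
    if customer["lifetime_value"] < 100:
        return "no action"                     # not worth a retention offer
    if score > 0.7:
        return "call from account manager"
    if score > 0.4:
        return "email discount offer"
    return "standard newsletter"

print(retention_action({"complaints": 3, "months_since_purchase": 10,
                        "lifetime_value": 850}))    # email discount offer
```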

In fact I am in Asia this week presenting on a pilot that demonstrates exactly this. We have been developing a decision model and decisioning architecture, implementing that decision model in business rules and using analytics to inform those business rules as part of a pilot for a large enterprise out here. We are excited about the results as they show exactly what Mike – and I – have been saying: That a focus on decisions, tied to business outcomes, modeled and implemented with business rules and analytics leads to better business results.

If you would like to talk about a pilot or other help operationalizing your analytics, get in touch.

Thanks to my colleague Gagan Saxena for spotting this great post.

I recently wrote three articles for KDnuggets on the potential for decision modeling in the context of the CRISP-DM methodology for analytic projects:

  • Four Problems in Using CRISP-DM and How To Fix Them
    CRISP-DM is the leading approach for managing data mining, predictive analytic and data science projects. CRISP-DM is effective but many analytic projects neglect key elements of the approach.
  • Bringing Business Clarity To CRISP-DM
    Many analytic projects fail to understand the business problem they are trying to solve. Correctly applying decision modeling in the Business Understanding phase of CRISP-DM brings clarity to the business problem.
  • Fixing Deployment and Iteration Problems in CRISP-DM
    Many analytic models are not deployed effectively into production while others are not maintained or updated. Applying decision modeling and decision management technology within CRISP-DM addresses this.

Check them out. And if you are interested in how one global leader in information technology is using decision modeling to bring clarity to its analytic and data science programs, check out this Leading Practices Brief from the International Institute for Analytics. We have found that a focus on decision modeling early really helps get and keep analytics projects on track and makes it much easier to operationalize the results.

Zbigniew Misiak posted a great set of tips from practitioners, BPM Skills in 2017 – Hot or Not, with my contribution here. This is a great way to get some quick advice from a wide range of practitioners and experts. As someone with a particular focus on decision management and decision modeling, I was struck by the fact that there were 5 distinct recommendations for DMN and decision modeling/decision management (including, obviously, mine).

Some quick follow up points on my note:

  • Hot
    • Decision Modeling and the DMN standard
      • There’s rapidly growing interest from companies in using decision modeling and the Decision Model and Notation (DMN) standard
      • Lots of this interest is coming from business rules practitioners
      • But process modelers using BPMN are keen to use it too as modeling decisions separately simplifies their process models dramatically
      • And analytic/data science teams are seeing the potential for decision models in bringing clarity to their business problem
    • Predictive analytics (not just process analytics)
      • Data mining, data science, predictive analytics, machine learning etc are all hot technologies
      • Practitioners need to move beyond dashboards and process analytics to include these more powerful analytic concepts
      • A focus on decisions and decision modeling is central to finding the places where these analytics will make a difference
    • Declarative modeling
      • Business rules are declarative. So, at some level, are analytics. Process models are procedural
      • Declarative models like DMN allow you to describe decision making without overlaying the procedures we happen to follow today
  • Not
    • Modeling business rules outside the context of decisions
      • There’s no value to business rules that don’t describe how you make a decision
      • You can, of course, describe your data and your process in a rules-centric way and that’s fine
      • But don’t think there’s value in just having a big bucket of rules
    • Embedding business rules directly into processes
      • Decisions are where rules meet processes
      • Embedded rules as gateways or conditions simply makes your process messy and complex
      • Even embedding decision tables as separate tasks will only work for very simple scenarios
      • You need to model the decision and create a decision-making component, a decision service, for the process to use
    • Procedural documentation
      • Works fine for processes, but a procedural approach for documenting analytics or rules simply documents how you do this today, not what you need it to do tomorrow

Check out everyone else’s thoughts in BPM Skills in 2017 – Hot or Not.

Tom Davenport had a great article recently on Data Informed – “Printing Money” with Operational Machine Learning. His intro paragraph is great:

Organizations have made large investments in big data platforms, but many are struggling to realize business value. While most have anecdotal stories of insights that drive value, most still rely only upon storage cost savings when assessing platform benefits. At the same time, most organizations have treated machine learning and other cognitive technologies as “science projects” that don’t support key processes and don’t deliver substantial value.

This is exactly the problem I see – people have spent money on data infrastructure and analytics without any sense of what decision they might improve with them. By taking a data-first and technology-first approach these organizations have spent money without a clear sense of how they will show an ROI on it. He goes on to talk about how embedding these technologies into operational systems has really added value to an operational process – this, of course, is the essence of decision management, as he points out. As he also points out, this approach is well established; it’s just much more accessible and price-performant now than it used to be. It’s always been high ROI but it used to be high investment also – now it’s more practical to show an ROI on lower investments.

In the example he discusses in the article, the solution stack

…includes machine learning models to customize offers, an open-source solution for run-time decisioning, and a scoring service to match customers and offers

Tom goes on to identify the classic elements of a solution for this kind of problem:

  • A Decision Service
    This is literally a service that makes decisions, answers questions, for other services and processes. Identifying, developing and deploying decision services is absolutely core to success with these kinds of technology. The graphic on the right shows how we think about Decision Services:

    • It runs on your standard platform to support your processes/event processing systems
    • It combines business rules, analytics (including machine learning) and cognitive as necessary to make the decision
  • A Learning Service
    This is what we call decision monitoring and improvement. You connect the Decision Service to your performance management environment so you can see how different decision choices affect business results. This increasingly includes the kind of automated learning Tom is talking about to improve the predictive power of the analytics and to teach your cognitive engine new concepts. But it can also involve human intervention to improve decision making (a minimal sketch of this kind of monitoring follows this list).
  • Decision Management interface
    The reason for using business rules and a BRMS in the Decision Service is to expose this kind of management environment to business owners.
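As a rough sketch of the monitoring half of that loop (generic, not a specific product’s feature), each decision can be logged along with the strategy that produced it and later joined to the observed outcome, so the business can compare how different decision choices actually perform:

```python
from collections import defaultdict

decision_log = []   # one record per decision made by the decision service

def record_decision(customer_id, strategy, action):
    decision_log.append({"customer": customer_id, "strategy": strategy,
                         "action": action, "outcome": None})

def record_outcome(customer_id, retained):
    for record in decision_log:
        if record["customer"] == customer_id and record["outcome"] is None:
            record["outcome"] = retained

def retention_rate_by_strategy():
    totals = defaultdict(lambda: [0, 0])       # strategy -> [retained, decided]
    for record in decision_log:
        if record["outcome"] is not None:
            totals[record["strategy"]][0] += int(record["outcome"])
            totals[record["strategy"]][1] += 1
    return {strategy: retained / decided
            for strategy, (retained, decided) in totals.items()}

record_decision("c1", "champion", "email offer")
record_outcome("c1", True)
record_decision("c2", "challenger", "call")
record_outcome("c2", False)
print(retention_rate_by_strategy())            # {'champion': 1.0, 'challenger': 0.0}
```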

We have seen this combination work over and over again at clients – mostly with human-driven learning, to be fair, as machine learning in this context is still pretty new. Our experience is that one of the keys to success is a clear understanding of the decision-making involved, and for that we use decision modeling. You can learn more about decision modeling from this white paper on framing analytic requirements, by reading this research brief (if you are a member of the International Institute for Analytics – the organization Tom founded) or by checking out these blog posts on the analytic value chain.

I have a lot more on how these decision management technologies work together in the Decision Management Systems Platform Technologies report which will be updated with more on machine learning and cognitive in the coming months.

Paul Harmon over on BPTrends interviewed Jan Purchase and me about our new book, Real-World Decision Modeling with DMN. The interview covers a pretty wide range of topics – a definition of decision modeling, the bottom-line value of modeling a decision, the difference between decisions and rules and how decision modeling helps projects. We talk about some of our memorable experiences of applying decision modeling, our favorite best practices and the value in applying decision modeling and the Decision Model and Notation (DMN) standard alongside BPMN. We outline our method for applying decision modeling and DMN, and talk a little about the book – who it is aimed at and what it contains (over 220 practical illustrations, 47 best practices, 13 common misconceptions to avoid, 12 patterns and approaches and 3 worked examples, among other things). We end with some thoughts on how to get started.

Anyway, I hope you enjoy the interview over on the BPTrends site and, if you do, why not buy the book? Real-World Decision Modeling with DMN is available on Amazon and more details are available on the Meghan-Kiffer site.

I am delighted to announce that Jan Purchase, founder of Lux Magi, and I have finished what we believe is the definitive guide to decision modeling with the Object Management Group’s Decision Model and Notation (DMN) standard, Real-World Decision Modeling with DMN. The book is now on general release and available for purchase from Amazon and Barnes and Noble (full details on the Meghan Kiffer site).

If you are a regular follower of this blog you will know that decision modeling is an important technique for improving the effectiveness, consistency and agility of your organization’s operational decisions and a vital adjunct to the continuous improvement of your business processes.

We have tried hard to develop a truly comprehensive book that provides a complete explanation of decision modeling and how it aligns with Decision Management and, more broadly, with Digital Transformation. The book describes the DMN standard, focusing on the business benefits of using it. Full of examples and best practices developed on real projects, it will help new decision modelers to quickly get up to speed while also providing crucial patterns and advice for those with more experience. The book includes a detailed method for applying decision modeling in your organization and advice on how to select tools and start a pilot project. It contains:

  • Over 220 practical illustrations
  • 47 best practices
  • 13 common misconceptions to avoid
  • 12 patterns and approaches
  • 3 worked examples

Here are some of the great quotes from our initial reviewers:

“This comprehensive and incredibly useful book offers a wealth of practical advice to anyone interested in decision management and its potential to improve enterprise applications. Using a blend of user case studies, patterns, best practices and pragmatic techniques, “Real-World Decision Modeling with DMN” shows you how to discover, model, analyze and design the critical decisions that drive your business!”
—David Herring, Manager of BPM & ODM Delivery at Kaiser Permanente.

“Well written and very impressive in its scope.”
—Alan Fish, Principal Consultant, Decision Solutions, FICO and author of “Knowledge Automation: How to Implement Decision Management in Business Processes”.

“If you are looking for a complete treatise on decisions, look no further. Even though you end up in decision modeling and Decision Modeling Notation (DMN), you are treated to all the aspects of decisions explained with great care and clarity. This is a greatly needed book that will be studied by decision managers and practitioners for the next decade or more. It will end up on my physical and logical bookshelves.”
—Jim Sinur, Vice President and Research Fellow, Aragon Research.

“Written by two of the foremost experts in decision management, this book provides an extensive exploration of business decisions and how they are used in modern digital organizations. Taylor and Purchase distill for us the why, how, when and where to apply decision modeling in order to specify business decisions that are effective. Going beyond just an introduction to the Decision Model and Notation (DMN), the authors position this standard with respect to other well established standard notations such as BPMN. This is truly a first comprehensive handbook of decision management.”
—Denis Gagne, Chair of BPMN MIWG, CEO & CTO at Trisotech.

I very much hope you will buy and enjoy Real-World Decision Modeling with DMN. I’d love to hear what you think of it and stories about how it helped you on projects so please drop me a line.

I’ll leave the last word to Richard Soley, Chairman and CEO of the Object Management Group, who wrote the foreword for us:

“A well-defined, well-structured approach to Decision Modeling (using the OMG international DMN standard) gives a repeatable, consistent approach to decision-making and also allows the crucial ‘why?’ question to be answered — how did we come to this point and what do we do next? The key to accountability, repeatability, consistency and even agility is a well-defined approach to business decisions, and the standard and this book gets you there.”

What are you waiting for? Go buy it – Meghan Kiffer, Amazon, Barnes and Noble.

JMP 13 and JMP Pro 13 launched in September 2016. JMP is a business unit of SAS with more than $60M in revenue, growing, and with a focus on manufacturing and life sciences. Its users are largely engineers, analysts, researchers and scientists. JMP has specialty products in Genomics and Life Sciences but this release is about the core products. The tool remains focused on design of experiments, predictive modeling, quality and reliability, and consumer research, though dashboards and data wrangling are being added too.

JMP 13 remains a fat client tool and with this release there are a few key themes:

  • Increased ease and efficiency of preparing data
  • Enriching the kinds of data handled – this release adds free text
  • Handling very wide data, especially as more memory is added to the desktop
  • Finally, reporting, dashboards and the priority focus areas remain important

New capabilities include:

  • Lots of focus on reducing the number of clicks for actions – for instance, creating a script to recreate an analysis on new or refreshed data is now one click, not two.
  • JMP tables created as the software is used can now be joined using standard query builder capabilities. This essentially creates a new JMP table that can then be analyzed. This includes filters and where clauses for instance. Building the new table also prototypes the SQL that would be required on the production systems.
  • A virtual join has also been added to support situations where very tall datasets are involved. For instance, sensor data might have thousands of entries per sensor. Joining this to a table about the locations of sensors, say, might exceed memory limits. The virtual join allows a join to be defined and used without ever filling memory with joined data (see the conceptual sketch after this list).
  • New dashboard building tools are included to make it easier to arrange several graphs onto a dashboard without having to go through the full application development tools. Multiple templates are provided for a simple drag and drop construction. Full interactive views (with all the support for JMP changes at runtime) or summary views are available for dashboards. Lists can be added as filters and one graph can be used to filter another – so as a user clicks on one graph, the data in the other is filtered to only those data selected in the first one.
  • Sharing reports and dashboards has been an increasing focus in recent versions. Stand-alone interactive HTML reports and dashboards have been improved – for instance, graph builder capabilities now carry over, making HTML graphs more interactive. Data for these files is embedded in the files – the data is a snapshot, not live – but only the data that is needed. Multiple reports can be packaged up and an index page is generated to front-end a set of independent reports.
  • JMP Pro has improved the ability to quickly organize and compare multiple models. Scripts can be published to a Formula Depot for multiple models and then used in a comparison without adding the corresponding columns to the data set. From here code can be generated for the models too – adding C, Python and JavaScript to SAS and SQL.
  • Text handling has been added too. JMP allows more analysis of text using the Text Explorer, which handles various languages and identifies top phrases, builds word clouds, etc. As usual in JMP, these are interactive and can be cross-tabbed and integrated with graphs and other presentations. JMP Pro allows this to be integrated with structured data to build a model.
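The virtual join pattern is easy to illustrate conceptually (this is a generic sketch of the idea, not JMP’s implementation): instead of materializing a joined table, stream the tall fact rows and look up the small dimension table on demand:

```python
import csv
from io import StringIO

# Small "dimension" table: sensor locations (easily fits in memory)
locations = {"s1": "Boiler Room", "s2": "Roof"}

# Tall "fact" data: could be millions of readings; a tiny CSV stands in here
readings = StringIO("sensor,temp\ns1,71\ns2,45\ns1,73\n")

def virtually_joined(reading_rows, dimension):
    """Yield joined rows one at a time instead of building a joined table."""
    for row in csv.DictReader(reading_rows):
        row["location"] = dimension.get(row["sensor"], "unknown")
        yield row

for joined_row in virtually_joined(readings, locations):
    print(joined_row["sensor"], joined_row["location"], joined_row["temp"])
```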

For more information on JMP 13 click here.

Final session for me at Building Business Capability is Denis Gagne of Trisotech talking about the intersection of process, case and decision modeling – BPMN, CMMN and DMN – and the importance of these standards to business analysts.

Denis reminds us that all organizations are facing a new digital future with rapid technology change, changing customer behavior and increasing competition. Music, media, transportation and more have been disrupted and everyone is at risk of disruption. In response companies cannot just focus on operational improvement (good for the company, not for customers) or just on customer experience (expensive for the company) – we must focus on both – on transformation, on outcomes and on new business models. We must do new things that make our old things obsolete. And this requires LOB managers and executives, enterprise and business architects, business and process analysts to bring different perspectives to bear effectively.

Gartner has a great model of work efforts, and all of these must be improved, requiring a focus on structured and unstructured processes as well as on decisions. These improvement efforts must balance cost against quality, and time against value. Different approaches balance these things differently, leading to different kinds of improvement. Business analysts in this world, he argues, therefore need to have three practices in their toolbox:

  • Process Management – with BPMN, Business Process Model and Notation
  • Case Management – with CMMN, Case Management Model and Notation
  • Decision Management – with DMN, Decision Model and Notation

These notations provide unambiguous definitions for processes, cases and decisions. The models can be interchanged and, even more importantly, the skills in developing and reading them are transferable. This means that despite turnover, multiple parties and external resources, we still know what a model means.

The most common question he gets is when to use which. BPMN, he says, is about processing, CMMN is about managing context and DMN is about deciding. It’s not too hard… And he had a nice chart showing the difference between BPMN and CMMN that also showed the broad applicability of DMN.

  • BPMN allows you to define a process – a triggering event followed by a sequence of activities that lead to some business outcomes – using standard shapes to represent the elements.
  • CMMN allows you to define a case – in a context there is an event followed by a condition and an action – again using standard shapes.
  • DMN allows you to define decision-making – decisions, decision requirements and decision logic – also using standard shapes

He wrapped up with some great advice:

  • In BPMN and using too many gateways – you need DMN to model that decision-making
  • In BPMN and using too many intermediary or boundary events – you need to use CMMN to model the case work you have identified
  • In BPMN and using ad-hoc processes – replace them with CMMN
  • In CMMN and using lots of sequencing – use BPMN

The standards as a set he says allow you to build a more agile, intelligent and contextual business.

By the way, my new book on DMN – Real-World Decision Modeling with DMN – is available

 

Continuing blogging from Building Business Capability, it’s time for Jan Vanthienen to talk about decision modeling for the business analyst. Jan is a leading researcher into decision modeling and decision tables at the University of Leuven.

Observation 1: Decisions are important for business, not only processes.

Too often, he says, the process is modeled with no real thought being given to the decision-making required – a claims process that does not really describe which claims will be paid, only how to pay or reject them. To describe these decisions, companies must model them, and the DMN (Decision Model and Notation) standard is ideal for this. DMN defines two layers:

  • The requirements for a decision in terms of other decisions, data and sources of knowledge.
  • The logic of the decisions, derived from these knowledge sources and applied to the data to actually make the decisions in the requirements.

Once you model the decision separately, much of the decision-making complexity that overloads a process model can be removed to simplify and clarify the process.

Observation 2: Decision logic should not be shown as process paths. 

Decisions should be modeled as peers of the processes that need them so they can change independently, be managed by different people etc. This reduces the complexity of processes by removing decision logic and decisions from the process model and instead linking the decision to a specific process task.

Observation 3: Multiple decisions in a process might be in a single decision model

Observation 4: Proven techniques exist to represent decision tables and decision models

From a methodology perspective we need to do three things:

  1. Model the decision structure
    • Build a decision requirements graph – a decision requirements diagram – to show how the decisions, data and knowledge sources form a network describing the decision making and its structure.
    • Generally this is best done top down and descriptions of questions and answers in requirements or policies often lead very naturally to sub-decisions.
    • Descriptions of data elements are similarly clear. But if you just try to write a decision table against these data elements you will end up with an unwieldy model. Build a model first.
    • Decision requirements models are built using expertise, by seeing which conditions depend on the results of other decisions, and by normalizing the model to eliminate redundancy.
  2. Model a single decision table
    • Decision tables are not the only way to manage the business rules for a decision in the model but they are a very effective one and one that business users find very transparent.
    • Representing a description of logic as a decision table clarifies and makes consistent the logic it represents. And the logic can be read in several ways without ambiguity – what will I do if…? Or when will I do x?
    • Decision tables can show rules as rows or as columns, with each column/row being based on a fact. This creates a repeated structure for each rule, and each rule only has ANDed conditions (ORs lead to new rules, as they should).
    • The standard also defines a “hit indicator” to define how to handle situations where multiple rules in a decision table are true. But stick to Unique and perhaps Any because the others get you into trouble… especially First, which is just a way to code ELSEs! (A minimal sketch of a Unique-hit table appears after this list.)
    • Decision tables can be correct, compact and consistent but generally you can only get two of the three. Start by focusing on correctness, then work to ensure consistency. Only take compactness as a bonus.
    • Jan likes to use rules as rows or rules as columns depending on the shape of the table. I prefer to stick to rules as rows unless it’s really important…
    • You can build decision models and tables interactively with people, from text of policies or regulations, or by mining data/rules/code. You can also find all the possible condition values and outcomes and build an empty table. As you fill in the logic you can clarify and normalize.
    • DMN allows the modeling of decisions alongside processes, simplifying processes and creating a more declarative and understandable model of decision-making. It also separates the structure from the logic while allowing them to be linked and reused. DMN standardizes and extends decision modeling and decision table approaches but builds on a long history of techniques and approaches that work.
  3. Model the process connection
    • Sometimes the entire process is about a decision. Model the decision first and then think about how to execute it, how to build a process around it. A process might do all the sub-decisions in parallel to minimize the time taken OR specific decisions might be made first because they eliminate the need to do other parts of the process for efficiency OR some decisions might be made about several transactions at the same time in a group OR…
    • The decision model describes how to make a decision so that you can not only build a decision-making system but also answer questions about how the decision works, how it is made.
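To make the decision table discussion concrete, here is a minimal sketch (plain Python rather than DMN’s FEEL or decision table notation) of evaluating a table with a Unique hit policy, where exactly one rule may match any given input:

```python
# Each rule pairs a condition over the inputs with an outcome. A Unique hit
# policy means exactly one rule may match for any combination of inputs.
DISCOUNT_TABLE = [
    (lambda d: d["order_total"] >= 1000 and d["loyalty"] == "gold", 0.15),
    (lambda d: d["order_total"] >= 1000 and d["loyalty"] != "gold", 0.10),
    (lambda d: d["order_total"] < 1000,                             0.00),
]

def decide_discount(inputs: dict) -> float:
    matches = [outcome for condition, outcome in DISCOUNT_TABLE if condition(inputs)]
    if len(matches) != 1:
        # Zero or multiple hits in a Unique table signals a modeling error
        raise ValueError(f"Expected exactly one matching rule, got {len(matches)}")
    return matches[0]

print(decide_discount({"order_total": 1200, "loyalty": "gold"}))    # 0.15
print(decide_discount({"order_total": 400, "loyalty": "silver"}))   # 0.0
```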

By the way, my new book on DMN – Real-World Decision Modeling with DMN – is available