
Decision Management Solutions joined the OneDecision.io consortium back in September and we have been working with them ever since, both within the Decision Model and Notation (DMN) standards process and on integration between the OneDecision.io Java-based reference implementation for DMN execution (which supports basic decision tables, JSON data types, and the standardized DMN XML interchange format) and DecisionsFirst Modeler, our decision requirements modeling platform.

We believe that the best way to integrate execution-oriented environments like OneDecision.io (or IBM Operational Decision Manager and other commercial Business Rules Management Systems) is by linking the decision requirements diagrams you build to the matching implementation in your target environment. We have now completed the initial prototype for the integration of DecisionsFirst Modeler Enterprise Edition with OneDecision.io and you can see the results in the video.

If you are interested in this integration, or any others, please get in touch – info@decisionsfirst.com.

I am working on a new book – Real-World Decision Modeling with DMN – with Jan Purchase, and he recently published another great post – How DMN Allows Business Rules to Scale. While decision modeling with DMN is not JUST about writing business rules (as I noted earlier), this is a great use case for it. Jan does a nice job outlining why it can be hard to scale business rules projects and how it gets especially hard when you start thinking about how to structure them.

I would add a couple of things:

  • We have found that the many-to-many relationship between process tasks and business rules is best managed using a decision model. While simply grouping rules into decision tables or rulesets helps with simple decisions, complex ones can end up smeared across multiple tasks if you are not careful. Structuring the decision explicitly using a decision model really helps.
  • When building decision models we regularly identify rules and facts (information) that are not, in the end, needed. The SMEs say they need this piece of information to make a decision or tell you that such and such is a rule. However, building a decision model forces real choices, and we often find that the way they REALLY make the decision does not use that information and that the rule, while potentially true in the general case, is not relevant to the specific project at hand. The decision model acts as a lens, focusing you on what you need to know to get something done – a decision to be made.
  • Decision models don’t assume business rules are the objective, allowing you to build them even if you are not sure you can or will document the business rules. As I said in this post, there are many reasons to model decisions, so you can start with a decision model without having to know where you are going to end up.

Decision modeling is a powerful tool and one you should be considering, especially if you are working in business rules.

I got an email today from a doctoral student trying to complete their dissertation. They are looking for 10 or so participants to complete data collection for a study on The Role of Data Governance Mechanism in Managing the Big Data Environment.

If you’re interested, please contact Stephanie by email for more information:
Stephanie Cutter

As regular readers know, I have been working on a new decision modeling book – Real-World Decision Modeling with DMN – with Jan Purchase. While you wait for this book from Meghan-Kiffer Press you might want to check out Questioning BPM? This was a fascinating exercise in which Paul Harmon and Roger Tregear asked a whole bunch of us – about 30 – to answer a set of questions about business process management.

I wrote on a couple of topics for this book:

  • Should BPM include decisions or not?
    When initiating a BPM project or setting up a BPM competency, organizations often wonder if they should include decisions, and business rules, in these BPM initiatives. The answer, as it so often seems to be, is both Yes and No.
  • Why do we need a separate modeling notation for decisions?
    The Object Management Group has long had a modeling notation for business processes – the Business Process Model and Notation. This has recently been joined by a decision modeling notation – the Decision Model and Notation or DMN (as well as the Case Management Model and Notation or CMMN). Why do those who model processes need to know this new notation and how should they use it alongside their process models?

The book is available now from amazon.com and as a Kindle edition. There’s a great summary on the MK Press site too – Questioning BPM?

If you want to learn more about decision modeling check out our white paper or sign up for our training.

Besides working on Real-World Decision Modeling with DMN, I recently contributed a piece on the use of decision modeling for framing predictive analytics to The Big Analytics Book, a book being produced by the folks at AnalyticsWeek. AnalyticsWeek rounded up 60+ thought leaders in analytics, including yours truly, and got us all to write a piece of advice. Mine focused on the potential for decision modeling to accurately and usefully frame analytics projects:

Framing analytics projects matters because it is easy to build a great analytic that does not truly impact business results – that does not improve decision-making. In the words of one of our customers “there’s too much white space between analytic success and business success”. Linking your metrics to the decisions that make a difference and then modeling these decisions really helps ensure your analytic projects are on-target.

It’s a fun book and you should check out the press release here and sign up for a copy at thebiganalytics.com.

If you want to learn more about decision modeling for analytics check out our white paper or sign up for our training.

As you may have noticed, I am working on a new book – Real-World Decision Modeling with DMN – with Jan Purchase. Yesterday Jan had a great blog post – Why Decision Modeling? (In 1000 Words). Jan makes some great points, emphasizing the value of decision modeling with DMN in:

  • Transparency of the logic
  • Separation of concerns between processes and decisions
  • Managing Complexity, Maintaining Integrity
  • Agile change of the way we make decisions
  • Standardization of representation
  • Traceability of our implementation

Jan and I are enjoying writing the book. One of the reasons it’s so much fun is that we both agree and bring slightly different perspectives – specifically I tend to spend more of my time building decision requirements models while Jan spends more time drilling the model all the way down to the level of tabular decision logic – decision tables. Reading his post I thought I would add a little additional perspective on the value of decision requirements models.


Decision requirements models are represented with one or more decision requirements diagrams like the one to the right. These show your decision (rectangle); the sub-decisions into which that decision can be decomposed – the other decisions that must be made first; the input data (ovals) provided to the decision; and the knowledge sources (documents) that contain the know-how you need to make the decisions.
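
As a rough illustration of this vocabulary, a decision requirements model can be sketched as a simple linked structure. The class and field names below are my own, not taken from the DMN specification, and the claims example is invented:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class InputData:          # drawn as an oval on the diagram
    name: str

@dataclass
class KnowledgeSource:    # drawn as a document shape
    name: str

@dataclass
class Decision:           # drawn as a rectangle
    name: str
    requires_decisions: List["Decision"] = field(default_factory=list)
    requires_inputs: List[InputData] = field(default_factory=list)
    authorities: List[KnowledgeSource] = field(default_factory=list)

# A small claims example: the top-level decision requires a sub-decision,
# which in turn requires input data and a knowledge source.
policy = KnowledgeSource("Fraud Policy Manual")
claim = InputData("Claim")
fraud_risk = Decision("Assess Fraud Risk",
                      requires_inputs=[claim],
                      authorities=[policy])
approve = Decision("Approve Claim", requires_decisions=[fraud_risk])

def walk(decision: Decision, depth: int = 0) -> None:
    """Print the requirements hierarchy, top decision first."""
    print("  " * depth + decision.name)
    for sub in decision.requires_decisions:
        walk(sub, depth + 1)

walk(approve)
```

Walking the structure this way is exactly the kind of traversal a modeling tool does when it traces the impact of a change through the diagram relationships.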

These diagrams are a key element for several of the benefits Jan identified:

  • Transparency: The diagrams are much easier to read than a set of rules would be. I recently built a model to match some existing business rules in a BRMS demo and the decision-making was immediately clearer.
  • Complexity: The diagrams manage complexity in your logic by breaking it down into self-contained pieces that are combined to make a decision.
  • Traceability: Tracing the impact of a change to a policy, shown as a knowledge source, is made much easier as you can walk through the model using the diagram relationships.

But they can do more too.

  • Risk Management
    Even if you don’t plan to write the decision logic, the business rules, for your decision the diagram brings clarity to the way you make decisions. This can be really valuable in a risk management context as it allows a clear definition of the decision-making approach that can be discussed, agreed and even shared with regulators.
  • Training
    We have had several customers take decision requirements models and use them to train front-line decision-makers like claims adjusters or underwriters. The models are more robust and easier to follow than the original document describing the decision.
  • Framing analytics
    When teams are going to use analytics, especially predictive analytics, to improve decision-making it is really important to frame the problem accurately and a model of the decision-making to be improved is perfect for this.
  • Orchestrating Decision Management Systems
    Decision Management Systems often involve business rules, predictive analytics, constraint-based optimization and adaptive control or A/B testing capabilities. How these pieces are being orchestrated to deliver value can be hard for non-technical people to understand – a decision requirements model makes it clear. In our decision modeling tool, DecisionsFirst Modeler, you can even build explicit links from the model to the implementation components.
  • Automation Boundaries
    One of the biggest challenges in automating decisions is determining what to automate and what to leave as a manual decision. A decision requirements model lets you discuss and agree on the decision-making and then consider what makes sense in terms of automation.

The book covers how to build these diagrams, as well as how to write decision logic, and discusses best practices for using these diagrams in all these different situations. If you want to know when Real-World Decision Modeling with DMN is available – and I hope you do – sign up for notification here. If you want something to read in the meantime, we have a white paper on decision modeling with DMN and some upcoming online training. We also offer services in decision management and decision modeling and you can schedule a free consultation.

Many organizations have buried their operational decision making in business processes and information systems, making it hard to optimize how these decisions are made. This matters because more and more of the value created by business processes is associated with these kinds of decisions. As more processes are digitized and automated to keep pace with today’s consumers the role of decision-making in these processes is growing and many business processes are essentially “decision processes” in this new digitized world – especially in the key business areas of risk management and customer centricity.

Those of us who work in this space have noted a couple of things:

  • The complexity of decision-making processes can be addressed by externalizing and modeling decisions.
  • Managing risk and customer centricity require both process and decision innovation.
  • Combining processes and decisions drives transformation and innovation in business operations creating real business value.

I am giving a webinar on this – How to Innovate Risk Management and Customer Centricity – with Roger Burlton of the Process Renewal Group, on February 17th at 11am Pacific. Roger and I both work with leading organizations striving to deliver excellence in risk management and customer centricity, and our experience makes it clear that a combined process and decision focus is the most effective way to improve business processes. You can register here.

I am delighted to announce a collaboration with Jan Purchase of LuxMagi on a new book, Real-World Decision Modeling with DMN. You can read the full announcement here; Real-World Decision Modeling with DMN will be available from MK Press in print and Kindle versions. As Richard Soley, who has graciously agreed to write a foreword for us, says:

“A well-defined, well-structured approach to Decision Modeling (using the OMG international DMN standard) gives a repeatable, consistent approach to decision-making and also allows the crucial ‘why?’ question to be answered—how did we come to this point and what do we do next?” said Richard Mark Soley, Ph.D., Chairman and CEO, Object Management Group, Inc. “The key to accountability, repeatability, consistency and even agility is a well-defined approach to business decisions, and the standard and this book gets you there.”

Our aim is to provide a comprehensive book with a complete explanation both of decision modeling as an approach and of the DMN standard itself, plus solid advice on the business benefits of using it and lots and lots of examples and best practices developed on real projects. Jan and I have been using decision modeling for years on all sorts of projects and we want to distill that experience into a book that will help new decision modelers get up to speed quickly while still offering those with more experience crucial patterns and advice.

The two companies have used decision modeling on many projects and between us we have probably taught over 1,000 people decision modeling and DMN. We have worked with business analysts, process analysts, developers, data scientists and subject matter experts – all of whom have found DMN an accessible yet precise way to describe business decisions. Decision modeling lets you capture, communicate and facilitate agile improvement in even the most complex of business decisions. It’s been critical for our clients’ compliance efforts and for staying up to date with regulations. We are going to bring this breadth of perspective and deep experience to the book so companies and individuals can successfully adopt decision modeling.

I wish I could tell you the book was ready but it’s not – Jan and I are working hard on it and have a tremendous amount of great material already done but we really want to produce a pretty definitive guide – not just to DMN but to decision modeling using DMN. As we work on finishing it, Jan and I will be posting about the book and the thoughts about decision modeling that writing it has provoked as well as asking questions to make sure we cover everything we need to and more. Watch this blog, Jan’s blog and the various LinkedIn groups we belong to for more updates. I hope you’ll engage with us and that you won’t find the wait too arduous.

To learn more and to sign up to be notified when it is published, visit http://www.mkpress.com/DMN/.

I have just finished updating Enterprise Scale Analytics with R with new data from the Rexer Analytics Survey for 2015.

As R has become more popular, the role of analytics has become increasingly important to organizations of every size. Increasingly, the focus is on enterprise-scale analytics—using advanced, predictive analytics to improve every decision across the organization. Enterprise-scale adoption of analytics requires a clear sense of analytic objectives; an ability to explore and understand very large volumes of data; scalable tools for preparing data and developing analytic models; and a rapid, scalable approach for deploying results.

According to the widely cited Rexer Analytics Survey, R usage has steadily increased in recent years. Organizations using R to develop analytic models face particular challenges when trying to scale their analytics efforts at an enterprise level. Complex data environments can make integrating all the data involved difficult. The typical R package is single-threaded and memory limited, creating challenges in handling today’s increasingly large data sets. These same limitations can mean it takes too long to analyze and develop models using this data. When all the analysis is done, deploying the results can add a final hurdle to achieving business value at scale.

Solutions such as Teradata Aster R that combine commercial capabilities with open source R offer a way to address these challenges. This paper introduces R, explores the challenges involved in scaling analytics across the enterprise, identifies the specific issues when using open source R at scale, and shows how Teradata Aster R can help address these issues.

You can download this white paper, sponsored by Teradata, here.

I am pleased to announce a new Decision Table Modeling online training offering taught by Professor Jan Vanthienen, a leading decision table expert. We are running a pilot of this class February 2-4 and you can get details, and a great price, here. To give you a taste of Jan’s approach, here’s a guest article by him.

The Value of Good Decision Table Modeling

By Jan Vanthienen, KU Leuven

Managing and modeling decisions is crucial for business. The new DMN (Decision Model and Notation) standard emphasizes the importance of business decisions, and also offers a standard notation and expression for decision requirements and decision logic.

Advantages of Good Decision Tables
Decision tables look straightforward: a number of rows or columns containing decision rules about combinations of condition expressions with their respective outcomes. The reason for their success is simply that every column or every row (depending on orientation) is about one specific condition. This fixed order of conditions allows a complete and easy overview of the decision rules for a specific decision. It also allows grouping of related rules into tables, thereby providing an overview of a large number of decision rules.

The real advantage for business, however, is the ability to obtain consistency, completeness and correctness of the decision logic. Avoiding redundancy and overlapping rules is a key element in constructing and maintaining decision tables that offer true value for business.
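
To make completeness and overlap concrete, here is a minimal sketch (with invented condition names and values, not any particular tool's algorithm) of the kind of check a decision table tool performs: enumerate every combination of condition values and flag combinations that no rule covers, or that more than one rule covers:

```python
from itertools import product

# A toy decision table: each rule maps a combination of condition
# values to an outcome. Conditions and outcomes are illustrative.
conditions = {
    "customer_type": ["new", "existing"],
    "order_size": ["small", "large"],
}

rules = [
    ({"customer_type": "new", "order_size": "small"}, "standard"),
    ({"customer_type": "new", "order_size": "large"}, "review"),
    ({"customer_type": "existing", "order_size": "small"}, "standard"),
    # Note: the (existing, large) combination is deliberately missing.
]

def check_table(conditions, rules):
    """Report missing and overlapping condition combinations."""
    covered = {}
    for antecedent, outcome in rules:
        key = tuple(antecedent[c] for c in conditions)
        covered.setdefault(key, []).append(outcome)
    all_combos = set(product(*conditions.values()))
    missing = sorted(all_combos - set(covered))
    overlapping = sorted(k for k, v in covered.items() if len(v) > 1)
    return missing, overlapping

missing, overlapping = check_table(conditions, rules)
print("Missing:", missing)        # -> [('existing', 'large')]
print("Overlapping:", overlapping)
```

The fixed order of conditions is what makes this enumeration possible: because every rule addresses the same conditions in the same order, gaps and overlaps can be detected mechanically rather than by inspection.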

Tables for Decision Logic Modeling
DMN provides constructs for both decision requirements and decision logic modeling. The requirements level allows for modeling and managing linked decisions, abstracting from the detailed logic of each decision. The decision logic level standardizes the way to express decision logic, e.g. in the form of decision tables. DMN provides a common model and notation that is readily understandable by all business users, from the business analysts needing to create initial decision requirements and then more detailed decision logic models, e.g. in the context of business processes, to the business people who will manage and monitor those decisions, and finally, to the developers responsible for implementing the decisions.

Decision logic modeling can take many forms, depending on the decision at hand, but decision tables are an important element. Most people know what decision tables look like: a number of rows or columns containing decision rules about combinations of condition expressions with their respective outcome. Decision tables have always been known for their ability to offer a compact, readable and manageable representation of decision logic. But many do not realize how the decision table concept has been refined throughout the years into a strict and powerful modeling technique (based on consistency by construction, normalization, completeness, correctness, etc.).

Decision Table Methodology
Different forms of decision tables exist in business practice, even under different names and with different semantics. What DMN offers is a standard notation and the ability to recognize and unambiguously interpret and exchange tables of rules in different forms. The core methodology to build sound decision tables is not part of DMN, but it still holds.

The decision table methodology offers:

  • Guidelines for composing effective decision tables (form, structure, meaning, etc.).
  • An overview of advantages and disadvantages of different types of decision tables (different hit policies).
  • A simple eight-step method to construct good decision tables, starting from the description of the decision and leading to compact, normalized and optimized decision tables.
  • A sound decomposition of the decision structure.
  • Best practices on obtaining completeness, consistency, readability, maintainability.
  • A transition from the specification to design, implementation and maintenance.

Sometimes Scott Adams just nails it and late last year I saw this great strip on The Generic Graph. Work with analytics long enough and you see something akin to this – something Mychelle Mollot of Klipfolio called Building a One-size-fits-all Dashboard – one of the 6 mistakes she talks about in this article that she pithily summarizes as the “this sucks for everyone” problem. Mychelle goes on to propose that the solution is not to create a generic dashboard for the broadest possible audience but to create multiple dashboards targeted to specific roles within the organization. I would agree but go further – design dashboards to help specific roles make specific decisions.

Many roles, especially operational roles, have to make many decisions. How they make these decisions – the actions they take as a consequence – determines whether they will meet their metrics. Displaying their metrics on a dashboard so they can see how well they are doing may be motivating, but to actually improve those metrics you will need to help them make better, more profitable decisions. Yet most organizations, most dashboard projects, have never really thought about the decisions made by the people they are trying to help – at least not in any systematic way. In fact, instead of building a dashboard to support decision-making explicitly, most projects begin, as Mychelle notes, by being “data-centric” – pulling together all the data that might help. This creates a lot of visual confusion and forces people to jump around multiple tabs or pages looking for the data they need right now to make the decision in front of them.

So how can you fix this problem? Well Mychelle lays out the first two steps:

  1. Figure out who your dashboards need to serve
  2. Start with the more junior roles, those with an operational focus

Then move on to the more decision-centric steps:

  1. List the metrics or KPIs they care about
  2. Identify the decisions – the choices – they make that have an impact on these metrics
  3. Model these decisions (using the new Decision Model and Notation standard and a tool like DecisionsFirst Modeler) to understand how they make (or should make) these decisions
    This will give you a sense of the information and knowledge they need as well as how the decision-making breaks down into more granular decisions
  4. Use this model to lay out your dashboards, one per decision, looking for opportunities to automate the lower-level decisions while you are at it
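
The steps above can be sketched as a simple mapping from a decision model to dashboard layouts. Everything here is hypothetical – the decisions, metrics and information items are invented for illustration:

```python
# Illustrative sketch: derive one dashboard per decision from a simple
# decision model, per steps 1-4 above. All names are hypothetical.
decision_model = {
    "Retain Customer": {
        "metrics": ["churn rate", "customer lifetime value"],
        "information_needed": ["usage history", "support tickets"],
        "sub_decisions": ["Assess Churn Risk", "Select Retention Offer"],
    },
    "Assess Churn Risk": {
        "metrics": ["churn rate"],
        "information_needed": ["usage history"],
        "sub_decisions": [],
    },
}

def dashboard_spec(model):
    """One dashboard per decision: its metrics plus the data it needs,
    with sub-decisions flagged as candidates for automation."""
    return {
        decision: {
            "panels": details["metrics"] + details["information_needed"],
            "candidates_for_automation": details["sub_decisions"],
        }
        for decision, details in model.items()
    }

specs = dashboard_spec(decision_model)
print(specs["Assess Churn Risk"]["panels"])
```

The point of the sketch is the design choice: each dashboard shows only what one decision needs, rather than pooling all possibly relevant data onto a single generic display.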

Decision-centric dashboards really work. Go build some.

Last year BPM expert Sandy Kemsley and I did some research on the infrastructure you need to develop excellent, smarter, mobile apps. Mobile devices have gained enough traction with consumers and employees to require mobile applications as a part of an enterprise strategy. However, these mobile apps must be more than mere information presenters – they need to be “smarter” than typical enterprise applications. In the report we discuss the requirements for excellence in enterprise mobile and the challenges of using a traditional enterprise platform for delivering modern mobile apps. While most enterprises have realized that they need to add a mobile development capability, we also identify the decision and process management capabilities enterprises need to drive excellence in their mobile applications. Only by applying process and decision management on the back end, as well as mobile app development on the front end, can enterprises deliver the next generation of smarter mobile apps.

Report and webinar recording here.

Bill Fair, one of the founders of Fair Isaac, once said that to succeed with analytics you had to “grab the decision by the throat and don’t let go”. As Big Data and analytics become ever more central to organizations, and as more and more money is spent on analytics, this advice seems particularly timely.

As I have said before, it’s easy to spend money on data infrastructure, especially big data infrastructure, and on analytics without seeing much of a return. Too many companies assume that if they just collect enough data, hire enough data scientists, build enough analytics – spend enough money – that somehow their business results will improve.

I have bad news – they won’t.

Data and analytics, even big data and advanced analytics, cannot improve your business results. At least not directly. What they can do is allow you to improve your decision-making. Improve your decision-making and your business results will improve – pay fewer fraudulent claims by more accurately deciding which claims are fraudulent, manage risk better by accurately deciding how much risk a loan or supplier represents, retain more customers by deciding who’s at risk and what will stop them churning, and on and on. Improved decision-making is what turns your data and analytic investment into better business results.

Which brings us back to Bill Fair and his pithy phrase:

  • Unless you know which decisions you need to improve, and what better decisions look like, you can’t improve them. Knowing which metrics matter and which decisions make a difference is critical to identifying where to apply analytics.
  • If you can’t separate the decision you make from the process that acts on it or the system that stores the data it needs, you can’t change and evolve the decision to see which analytics work and which do not. Changing the analytics you use to decide which claims are fraudulent should not – must not – involve a change to your claims process as well.
  • If you don’t have a clear sense of the decision and how it impacts your business it’s hard to identify the kind of analytics that will help. If you are making decisions about how to save someone who is threatening to churn right now you need different analytics than if you are trying to decide how to proactively stop someone churning next month, even though your objective is the same.
  • Only if you have a clear sense of the business and legal constraints on a decision can you design appropriate experiments to adapt and improve your decision-making. Your data may “speak” but the regulations yell, and it’s not generally practical just to turn the machine learning on and leave it to figure out what’s best.
  • Outcomes are not the same as decisions and analytic decision-making will not always result in good outcomes – it just results in more good outcomes overall. To analytically improve decisions you need to analyze the decisions you made and how they worked out. Yet most organizations only track outcomes and don’t record how they made decisions.
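
That last point – recording how decisions were made, not just how they turned out – can be sketched in a few lines. The field names, case identifiers and model versions below are all invented for illustration:

```python
import csv
import io
from datetime import datetime, timezone

# Hypothetical sketch: log each decision with its inputs and the
# rule/model version used, so outcomes recorded later (keyed by
# case_id) can be joined back to the decision that produced them.
FIELDS = ["timestamp", "case_id", "inputs", "decision", "model_version"]

def log_decision(writer, case_id, inputs, decision, model_version):
    """Append one decision record to the log."""
    writer.writerow({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "case_id": case_id,
        "inputs": repr(inputs),
        "decision": decision,
        "model_version": model_version,
    })

log_output = io.StringIO()          # stands in for a real log store
writer = csv.DictWriter(log_output, fieldnames=FIELDS)
writer.writeheader()
log_decision(writer, "claim-123", {"amount": 5400}, "refer", "fraud-model-v2")
print(log_output.getvalue())
```

With a log like this, analyzing decision performance becomes a join between decisions and outcomes rather than guesswork about how a given outcome came to be.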

So if you want to succeed with analytics, grab the decision by the throat and don’t let go.

I have spoken at Predictive Analytics World a few times and it’s a great event. Predictive Analytics World for Business is coming up April 3-7, 2016 in San Francisco. This is one of the best events for predictive analytics professionals, managers and commercial practitioners. The conference focuses on case studies about data science, on their business impact and increasingly on big data. PAW SF is set to be one of the largest cross-vendor predictive analytics events ever. The program has top predictive analytics experts, practitioners, authors and business thought leaders, including keynote addresses from: Patrick Surry, Chief Data Scientist at Hopper; Kim Larsen, Director of Client Algorithms at Stitch Fix; and Eric Siegel, Conference Founder of Predictive Analytics World; plus a special plenary session from industry heavyweight, Dr. John Elder, CEO & Founder of Elder Research.

You can click here to view the agenda, which has over 30 sessions across 3 tracks – “All Levels,” “Expert/Practitioner,” and “Financial Services.” The agenda includes case studies about Autodesk, Can I Rank?, Capital One, Chase, CIBC, City of Boston, Experian, GE, Hewlett Packard, Hopper, Incapsula, Lynda.com, Telenor, Mashable, Microsoft, Omaha Public Power District, PayPal, The Co-operators, WPC, Stitch Fix, US Bank, a large domestic car manufacturer and a major fashion and apparel retailer.

You can sign up for event updates here and if you have inquiries e-mail regsupport@risingmedia.com. Register here – the super early bird price ends this week.

One of my pet peeves is people who develop mobile applications and then insist on writing all their business logic in JavaScript or Node.js or something similar. As if the fact of delivering a mobile application overrides the need for business/IT collaboration around business logic, reduces the need for business agility or obviates requirements to show traceability and transparency in decision-making to regulators. Now there are several ways around this – using business rules management systems to build decision services that are cloud-centric or cloud-accessible, for instance – but the folks at InRule just announced an interesting one. The new release of InRule can generate JavaScript deployments for the rules you are managing, allowing rules to be deployed on a server and pushed into a browser when that’s applicable. I blogged about the preview release of this (see this post) and I think this marks an interesting and useful development. Something that makes it easy to push partial decisions (or even complete ones) into browser-based applications is definitely worth looking into.

I recently wrote a new white paper on how System Integrators (and perhaps internal consultants) can Sell Business Rules:

System Integrators (SIs) often deal with complex logic and decision-making in mission-critical systems that directly impact their clients’ business. One way to make managing this logic easier is through technology – like a business rules management system (BRMS). BRMS are particularly effective for complex processes like determining eligibility, compliance, claims processing and underwriting. This paper will discuss:

  • How an SI can “sell” clients on the value of a BRMS by focusing on the 3 key ways a BRMS delivers high Return on Investment (ROI)
  • The power of a BRMS to generate business value, develop efficiency and reduce maintenance costs
  • Ways to communicate the benefit of using a BRMS to your customers

You can download the paper from the sponsor, Progress, here.

My old friend Carsten Ziegler and some colleagues just published a new book on business rules in an SAP environment in German: Business Rules Management mit ABAP

The book is designed to help you integrate business rules with your ABAP environment. It shows how to model business rules in BRFplus to support and automate decisions. It discusses the benefits of SAP Decision Service Management for system-wide implementation as well as how to work with decision tables, decision trees, formulas and other types of rule expressions. It also has information on deployment scenarios, the application development approach and a worked example.

I wrote a foreword for them (Carsten assures me he translated it accurately) as I did for his last book on BRFplus. I also worked with him on content for Applying Real-World BPM in an SAP Environment.

I last got an update from Rapid Insight in 2014 and caught up with them again recently to discuss their 3.0 release. Rapid Insight was founded in 2003 and has over 200 client sites across education, healthcare and other industries. The product set is focused on predictive analytics, ad-hoc analysis and self-service data preparation.

Rapid Insight automates the process of building predictive models, making it easy to explore data and to quickly build models. The tool is designed to deliver automation in a flexible way, allowing users to automate as much or as little of the process as they prefer. The resulting models are tied back to the original data preparation processes so that they can easily be implemented and re-developed in the future.

They launched Rapid Insight Analytics 3.0 this year. The 3.0 product supports in-database and in-Hadoop optimization using SQL pushback and has been re-written to exploit multicore processing. The products remain Windows clients with broad access to data on any server or environment. The core user interface models the project workflow, with data access, data preparation and modeling steps laid out. New steps can be added from a palette and each can be configured as you would expect. Multiple streams can be defined and R can easily be integrated. New in 3.0 is the ability to define subroutines so that pieces of projects can be reused. Jobs can be scheduled and managed as you would expect.

The Analytics 3.0 environment begins in a new data-viewing environment that consumes the data processed by the flow. It automatically classifies the variables, identifies relationships between them and offers a wide range of (updated and refreshed) visualizations of the data. A report can be built incrementally as useful visualizations are identified, and these reports can now be shared immediately with others via the cloud. 3.0 also allows new clustering approaches and decision trees to be applied to the data.
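The kind of automatic variable classification described above can be sketched in plain Python. This is a hedged illustration of the general technique, not Rapid Insight's actual implementation; the column names, data and thresholds are invented:

```python
# Illustrative sketch of automatic variable classification for a data-viewing
# environment. NOT Rapid Insight's code; names and thresholds are invented.

def classify_variable(values, max_categories=10):
    """Classify a column as 'numeric' or 'categorical' from its values."""
    non_null = [v for v in values if v is not None]
    if non_null and all(isinstance(v, (int, float)) for v in non_null):
        # Numeric columns with only a few distinct levels (e.g. 0/1 flags)
        # usually behave categorically.
        if len(set(non_null)) <= max_categories:
            return "categorical"
        return "numeric"
    return "categorical"

columns = {
    "income":    [52000.0, 61000.5, 48000.0, 75500.25, 39000.0, 66200.0,
                  58900.0, 71000.0, 44300.5, 80100.0, 53750.0, 62480.0],
    "region":    ["north", "south", "east", "west"] * 3,
    "has_claim": [0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 0],
}

# Profile every column: income is numeric, region and has_claim categorical.
profile = {name: classify_variable(vals) for name, vals in columns.items()}
```

A real tool would layer relationship detection (correlations, cross-tabulations) and visualization on top of this kind of profiling.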

Models can still be developed automatically from the data, and the performance of this has been significantly improved in 3.0. Model building can be completely automated, or the user can pick specific variables and techniques, or take a hybrid approach, specifying some variables and letting the tool pick the others. The new environment makes it easy to transform and filter data in situ, new model analyses are available (and can be added to the reporting environment), and more information about distributions and the like is available instantly.

All the jobs can be invoked from a command line or other code environment and can execute against any number of records. Models can also be exported to PMML and deployed to a deployment environment.

More information is available on Rapid Insight here.

I caught up with Bruce Silver, well known for his work on business process modeling with BPMN, for his presentation on the Decision Model and Notation standard – DMN. Bruce began by introducing DMN – you can read my DMN introduction here – and emphasized that it is both a model (underpinnings) and a notation (for visual representation).  He identifies 5 key elements in DMN:

  • Decision Requirements Diagrams
  • Decision Tables
  • FEEL
  • Boxed Expressions
  • Metamodel and Schema

While I would argue that the first two and the last are the most important, Bruce feels all five deserve their place in the sun and that FEEL and Boxed Expressions are particularly underappreciated.

Bruce thinks that one can learn from the history of business process management. In 2000, for instance, business process work relied on long text requirements documents or poor proprietary modeling tools. In 2005, BPMN fundamentally changed this by standardizing the specification approach, but execution was defined separately in BPEL. By 2008 this had become unsustainable, and BPMN 2.0 in 2011 provided a standard notation with execution semantics, and the revolution really began. DMN, he thinks, will follow the same trajectory, with requirements documents being replaced now in 2015/2016 by decision requirements models and decision logic models. These are executable to some degree and can be linked to executable environments, but Bruce believes this will evolve to truly executable models in the next couple of years, transforming the decision management space.

Bruce went through the various elements, highlighting the key points of each.

  • Decision Requirements Diagrams, he believes, have power because they can include decisions that are human or external, allowing them to describe something larger than a decision service and to be managed and extended over time.
  • Decision Tables are a well-established topic (see Jan Vanthienen's work for instance) and DMN allows a wide range of tables to be defined in a standard way. There are some constraints, but really it just helps standardize things.
  • FEEL – the Friendly Enough Expression Language – is the third element and is hard to learn about because it is a LANGUAGE. It is an expression language that references variables defined in the models and produces an answer without side effects. It has some powerful features for decision-making such as iteration, filters, queries and list handling.
  • Boxed expressions are the next piece. Decision tables are one kind of boxed expression, but the concept goes beyond them, providing a boxed, laid-out approach to expressing decision logic. Bruce thinks this standardized way to show other kinds of logic is as important as the use of decision tables.
  • Finally there’s a metamodel and schema that will allow interchange of models between tools, a critical element. The new 1.1 specification has a much more robust schema and metamodel.

He discussed the conformance levels in the specification too but the consensus is these won’t matter very much as there’s no official testing of them. The one important element is that there is a subset of FEEL, Simple FEEL, designed to support decision tables only.

What’s missing? Well several things:

  • No UI for modeling data
  • No business glossary, though the details of one can be captured
  • No link to SBVR rules or policy rules
  • No methodology (as usual with notation standards)
  • Nothing about testing, about actions taken to access data or about execution optimization or error handling.

These areas are where tools can differentiate (though some may come to the standard in the future).

Bruce is working on his own methodology called DMN Method and Style. This is based on his experience with BPMN and aims to standardize some layout and style guidelines for decision tables and, to some extent, for decision requirements models. But the biggest thing is the focus on a method to develop decision requirements models – how to decompose your decision based on various structural patterns.

Bruce is very focused, quite rightly, on the overall decision as a thing – not just the pieces that can be executed in the process but the overall decision. Sub-decisions may be linked to process tasks for execution, subsets may be grouped into decision services, and so on. But execution should follow the decision – as Jan said, get the decision right first, then develop an implementation pattern.

Bruce seems to think that everyone else thinks decisions should be broken up into pieces in a process framework – while some people do, we (Decision Management Solutions) don’t – we always begin with the decision in mind and model/manage it as a whole. He makes a valid point that such a complete model may have a complex relationship to the process – the whole decision may not be executed in the process and/or several layers of decisions may be worth linking from the process model. This is exactly how we have been doing this for years – build the model for the whole decision, then figure out how to map elements of that to the process.

He wrapped up with a summary of the status – the DMN 1.1 specification is what you need for schema/metamodel information, but the DMN 1.0 specification describes the notation pretty well. 10 or 11 tools support DMN (including DecisionsFirst Modeler, Decision Management Solutions' tool). His book, he says, will be out later this year, and he hopes to offer training as well (Decision Management Solutions has already trained 700 or so people on decision modeling and has DMN training coming up in November).

I presented on the lessons we have learned deploying decision management and decision modeling at scale at various clients, specifically some large financial institutions. We have some case-study-style papers available (contact us for a copy) and here are the key takeaways:

  • Decision Modeling
    • Widely Accessible, Collaborative
      Lots of people can build and use decision models
    • Many Use Cases
      Decision models work for manual decisions, decision requirements and decision automation.
    • Sketchability Drives Creativity?
      The diagrams are easy to draw on a white board
    • Networks Not Flow
      Decision models have reuse and structure – they are networks not sequence
    • Find Gaps, Reuse, Duplication
      Models help you find gaps and identify reuse.
  • Processes and Decisions
    • Model Processes and Decisions
      Modeling them both really helps build simpler, smarter and more agile processes
    • Process Analysis is Not Enough
      Some processes can’t be improved by focusing on the process, only on the decisions
  • Architecture and Governance
    • Put Decisions In Context
      Map them to processes, data, metrics and more.
    • Reinvent Compliance
      Focus on building compliance decision models
    • Decisions Are Better Drivers
      Invest in technology based on how it improves decisions
    • Manage Decisions Before Rules
      Use decision models to drive business rules
  • Agile, Iterative
    • Decisions Are High Change
      So model them and use business rules to make sure you can change them easily
    • Model Iteratively
      Use decision models to frame and structure business rules iterations
    • Decisions Drive Data
      Focus on the data you need to make and improve decisions

Our upcoming live online training is a great way to get started quickly with the decision management approach and learn the decision modeling requirements technique. Contact us for an IIBA discount.