
Decision Services need more than rules – #rulesfest


I just completed my presentation at RulesFest so here are the key points.

For the RulesFest audience I assumed that they were either already using business rules or at least planning to be, and that they knew what a rule engine is and how it works. Before going on to my five points, let me define a decision and a decision service:

A decision has a number of characteristics. It involves selecting and committing to a course of action, not just knowing something. It involves some consideration of the data available now, and perhaps of historical data too, and it involves applying judgment, experience, policies or regulations that constrain how the decision can be made or guide how it should be made. I often find it helpful to phrase decisions as questions – should this claim be approved, what discount should this customer get, is this financial instrument going to bring the world financial system to its knees – that kind of thing. The possible answers are the actions the decision is selecting from, and answering the question means committing to a selection from that list.

So, if this is a decision, then a decision service is clearly going to be a service that makes decisions – a system component that performs a particular decision-making process for us. It is, if you like, a service that answers questions for other services. In an event-centric model, decision services should be replaced with decision agents or something similar, but I will use decision service as shorthand for a component, suited to your architecture, that makes decisions.
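To make that concrete, here is a minimal sketch of a decision service in Python. It is purely illustrative – the claim fields, the threshold and the actions are all invented – but it shows the shape of the thing: the data for one transaction goes in, the question is answered, and an action is committed to.

```python
# A minimal, illustrative decision service: the question is "should this
# claim be approved?" and the fields, threshold and actions are invented.
from dataclasses import dataclass

@dataclass
class Decision:
    question: str
    action: str    # the course of action the service commits to

def claim_approval_service(claim: dict) -> Decision:
    """Answer one question for the calling process and commit to an action."""
    question = "Should this claim be approved?"
    if claim.get("amount", 0) > 10_000 or claim.get("prior_fraud", False):
        return Decision(question, "refer_to_adjuster")
    return Decision(question, "approve")

# A calling process or service just asks the question and acts on the answer.
print(claim_approval_service({"amount": 2500, "prior_fraud": False}).action)
```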

There are five things you really need to know if you are going to build and deploy successful decision services:

  1. Decisions matter more than rules do
    One of the biggest problems I see when I work with companies adopting business rules is what I call the “Big bucket o’ rules” problem. They pick a business rules engine or a business rules management system and then they start capturing their rules. And they capture a lot of them – they interview experts, read policy manuals and reverse engineer code into rules. Especially when they reverse engineer rules they end up with a lot of fairly low-level rules. In one, big, bucket.
    The problem is that they can’t use this big bucket of rules. They can’t plug rules into a process easily, though they will try to add a rule here and a rule there on their (often very complex) process diagrams. They can’t really manage the rules either – there’s no organizational structure to these rules so they end up being grouped by source or by the person doing the rule identification.
    Rules come and go, and rules change. The need for a decision typically does not change. New decisions within a business are rare; new rules or changed rules are common. Connecting systems and processes to decisions – decision services – and driving those with rules will get you the results you are looking for. Coincidentally, decisions are also a great way to clean up those horrible processes your business users build. So begin with the decision in mind.
  2. Execution is less important than management
    IT people – technical people – worry about execution. They pick business rules products based on execution POCs. They design systems based on performance worries. They spend their time checking and tweaking performance. They write rules to maximize performance, even at the expense of readability. They want their rules-based systems to run FAST.
    But I have to tell you that execution is never the cause of failure for rules projects. Or if it is, it is not the performance of the rules piece that is at fault – it’s data access or some other aspect. Even when I come across companies ripping out rules implementations “because of performance”, it has never been the case that well-architected rules were what made the system perform poorly. The use of business rules was not causing the poor performance. The reality is that rule engines are fast, they work well, and all the major and most of the minor vendors (as well as the open source community) have produced reliable, fast products.
    So if rule execution does not cause real problems, what does? Well, it turns out that business rule management is critical. When you come to make a change to your rules – and you will – do you know where your rules are? Can the non-technical people you work with read your rules? Or are they gobbledegook? Are they organized for you or for them? How about the objects the rules work on? Are they modeled and described the way you see the data in the system, or the way the people who run the business see the data?
    One of the consequences of this, and one of the reasons it is so important, is that you don’t want to try and get all the rules at the beginning. Some rules are obvious, agreed, clear. Some are contentious, unusual, poorly understood. Some rules are fairly stable; others change so fast it’s not even funny. Don’t try and capture every rule at the beginning; instead, set up a process to continuously improve and extend the rules you have.
  3. Your data knows where (some of) your rules are
    People look for rules in many places. They read code (much more often than they should), they analyze policies and regulations, and they interview experts. But think about how people make decisions: they use their expertise, their know-how, which we can represent as rules; they apply policies and regulations (you hope they do, anyway) that also map to rules; they apply their personal biases and discrimination (we can probably skip implementing those); they look at the data available (this is what we send to the decision service); and they think about what worked or did not work in the past – they analyze historical data about the effectiveness of past decisions to draw conclusions that can help with the current decision.
    To bring this to bear we need to think about analytics – the use of mathematical techniques to extract insight from historical data. Folks in FICO R&D used to talk about analytics simplifying data to amplify its meaning. All that historical data, all that past transaction history, can be used to find some of the rules you are looking for. Historical data tells you what the thresholds should be – it lets you see the impact of using “less than 6” rather than “greater than 6.2”. And you can mine data for rules, using association rules or decision trees for instance (a sketch of this appears after the list below). Dean Abbott had a great presentation on this at PAW recently, for example.
  4. Data depth can improve its width
    Most IT people see historical data as something to be stored and regurgitated – something to be purged when it gets too old. But a depth of historical data can be turned into width – into new data elements created from historical trends and analysis. Predictive analytics turn uncertainty about the future into usable probability, and past behavior is a great predictor of future behavior. You can use analysis of historical data to essentially add new attributes – if you have a lot of historical data about marketing results, for instance, you can turn uncertainty about who will buy product X into a usable probability: how likely is it that this person will buy product X? Of course, once you have done that you need rules to deal with the “so what” problem – to act on these analytic insights. This is depth of data – lots of historical records below the one you are currently processing – turned into additional width – new attributes to assess (again, see the sketches after the list).
  5. You can’t use the same rules on every transaction
    Learning is a key element of reasoning systems. Because decisions are externalized from your systems and are under business control, you can see how these decisions might be made differently and experiment. The most effective way to do this is to integrate what is known as adaptive control, or champion/challenger, into your decision making – marketing and web people sometimes call this A/B testing. Capital One, for instance, is famous as a serious tester. It runs thousands of randomized tests to find out what kind of credit card solicitation is most effective. Should it offer a six-month teaser rate of 2% or a three-month teaser rate of 4%? On the web or in the mail it offers these alternative solicitations to random groups of prospects and analyzes the results. The idea is to constantly challenge the approach you use for most of your decisions with alternative decision-making approaches so that you learn what works and what does not (there is a minimal sketch of this after the list). The CEO of Harrah’s is famous for saying there are three ways to get fired from Harrah’s – steal from the company, sexually harass staff or guests, or do something without running a test with a control set!
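Picking up point 3, here is a minimal sketch, in Python with scikit-learn, of mining candidate rules from historical data. The file and column names are invented for the example; the point is that a shallow decision tree fitted to past decisions prints its splits as threshold conditions to review with the business, not rules to deploy blindly.

```python
# A sketch of "your data knows where your rules are": fit a shallow decision
# tree to historical decisions and read its splits as candidate rules.
# The file and column names below are hypothetical.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

history = pd.read_csv("claim_history.csv")            # past claims and outcomes
features = history[["claim_amount", "prior_claims"]]
outcome = history["claim_approved"]

# Keep the tree shallow so the extracted thresholds stay readable.
tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=50)
tree.fit(features, outcome)

# export_text prints the splits as nested conditions: candidate thresholds
# to review with the business, not rules to deploy as-is.
print(export_text(tree, feature_names=list(features.columns)))
```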
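For point 4, a sketch of turning depth into width: a simple propensity model trained on historical marketing data adds a new attribute, the probability that this customer will buy product X, and a rule then deals with the “so what”. The data fields, the model choice and the 0.6 cut-off are all assumptions made for the example.

```python
# A sketch of depth turned into width: historical data becomes a new
# attribute (a probability) that the rules can test. Names are invented.
import pandas as pd
from sklearn.linear_model import LogisticRegression

history = pd.read_csv("marketing_history.csv")
X_hist = history[["days_since_last_purchase", "past_purchases"]]
model = LogisticRegression().fit(X_hist, history["bought_product_x"])

def enrich(customer: dict) -> dict:
    """Add the depth-derived attribute to the record being processed."""
    row = pd.DataFrame([customer])[["days_since_last_purchase", "past_purchases"]]
    customer["p_buy_product_x"] = float(model.predict_proba(row)[0, 1])
    return customer

def offer_decision(customer: dict) -> str:
    """A rule handles the 'so what': act on the analytic insight."""
    customer = enrich(customer)
    return "make_offer" if customer["p_buy_product_x"] > 0.6 else "no_offer"

print(offer_decision({"days_since_last_purchase": 12, "past_purchases": 4}))
```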
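And for point 5, a minimal champion/challenger sketch: a small random share of transactions is decided by an alternative rule set, and the strategy used is recorded so the results can be compared later. The 5% share and the two strategies are purely illustrative; in practice the split and the outcome analysis would live inside the decision service itself.

```python
# A minimal champion/challenger (A/B) sketch: most decisions use the champion
# strategy, a random slice uses the challenger, and each decision records
# which strategy made it. The thresholds and the 5% share are illustrative.
import random

CHALLENGER_SHARE = 0.05   # fraction of decisions routed to the challenger

def champion(claim: dict) -> str:
    return "approve" if claim["amount"] < 5_000 else "refer"

def challenger(claim: dict) -> str:
    return "approve" if claim["amount"] < 7_500 else "refer"

def decide(claim: dict) -> dict:
    use_challenger = random.random() < CHALLENGER_SHARE
    action = challenger(claim) if use_challenger else champion(claim)
    # Log which strategy decided so outcomes can be compared over time.
    return {"action": action, "strategy": "challenger" if use_challenger else "champion"}

print(decide({"amount": 6_200}))
```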

Five points to bear in mind when adopting business rules. There’s lots more in the recorded webinars.



  • m ellard October 15, 2010, 9:24 am

    Excellent stuff this. Thanks for this post.
    Just saw a blog post that was talking specifically about data quality, but it made a point about rules in general that I think makes sense. It talks about rules as a service – and, when you think about it that way, rules exist to support decisions – so decisions end up front and center, and rules become more relevant and more streamlined. Here’s the link – maybe a worthwhile angle on this discussion:
    http://ebs.pbbiblogs.com/2010/09/29/a-services-approach-for-data-quality/

    • James Taylor October 15, 2010, 10:13 am

      Thanks for the comment. Data quality rules are a slightly different beast but I agree that a service-oriented approach would be useful. One of the reasons I focus on decisions rather than business rules is that business rules get everywhere – in user interfaces, in data quality, in processes. Keeping a focus on decision-making logic and delivering decision services is critical to success with a BRMS but it is certainly not the only use of rules.
      Check out this discussion over on ebizQ – “What exactly do you mean by business rules?”