Live from Business Rules Forum – Turbocharging Business Rules Engines with BI 2.0

Charles Nicholls of SeeWhy was up next, talking about Turbocharging business rules with BI 2.0. Charles wrote a nice little eBook called In Search of Insight (it’s free and you can download it here). I have also blogged about SeeWhy a couple of times – here and here.

Charles defined BI 2.0 as a business revolution caused by the move to SOA, which enables intelligence to be built into business processes. He made the point that this is not just about analysis of real-time data but also about how you act. He half-jokingly said that BI 1.0 lets you support the decisions you already made 🙂 BI 2.0, in contrast, is continuously calculated, flexible/adaptable, real-time, intelligent (self-tuning or adaptive) and closed loop (automatically adjusting based on results). BI 1.0 cannot be closed loop – it is “open loop” because it is performance reporting in retrospect. BI 2.0 would monitor customer metrics continuously and trigger action based on changes in them, and rules could use these constantly changing metrics to make better decisions.
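To make the closed-loop idea concrete, here is a minimal Python sketch – my own illustration, not SeeWhy's implementation – of a rule that runs against a continuously recalculated metric and self-tunes its threshold based on feedback about its own results. The metric, threshold and feedback scheme are all invented for the example:

```python
# Hypothetical sketch of a closed-loop monitor (not SeeWhy's API).
# A metric is recalculated continuously as events arrive; a rule fires
# when a value far exceeds it, and the threshold adjusts based on outcomes.

class ClosedLoopMonitor:
    def __init__(self, threshold: float):
        self.threshold = threshold
        self.count = 0
        self.mean = 0.0  # running mean of the metric, updated per event

    def observe(self, value: float) -> bool:
        """Update the continuously calculated metric and decide whether to act."""
        self.count += 1
        self.mean += (value - self.mean) / self.count  # incremental mean
        return value > self.mean * self.threshold      # rule over a live metric

    def feedback(self, acted: bool, was_correct: bool) -> None:
        """Close the loop: adjust the rule based on the result of the action."""
        if acted and not was_correct:
            self.threshold *= 1.05  # too many false alarms -> less sensitive
        elif not acted and not was_correct:
            self.threshold *= 0.95  # missed a real event -> more sensitive

monitor = ClosedLoopMonitor(threshold=2.0)
for amount in [10.0, 12.0, 11.0, 95.0]:
    if monitor.observe(amount):
        print(f"rule fired on {amount}")  # trigger the action
        monitor.feedback(acted=True, was_correct=True)
```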

Charles talks about intelligent processes as relevant, personal and responsive. Personal also means targeted, so that you avoid the problem of averages – one store may be out of stock even though, on average, there is plenty of supply. He makes the point that there is more and more data and, as a result, no one is going to be looking at most of it because there simply isn’t time.
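A trivial illustration of that problem of averages – the store names and stock figures here are made up:

```python
# The average stock level looks healthy while one store is actually out.
stock = {"store_a": 40, "store_b": 35, "store_c": 0}

average = sum(stock.values()) / len(stock)
print(f"average stock: {average:.0f}")  # 25 -- looks like plenty of supply

out_of_stock = [s for s, qty in stock.items() if qty == 0]
print(f"out of stock: {out_of_stock}")  # ['store_c'] -- the average hides it
```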

In BI 2.0, service-oriented, modular services replace monolithic applications, with a matching move from batch to real time. The BI 1.0 approach was database-centric, running queries against data at rest. SOA makes applications more layered, forcing you to handle data in flight, and so BI must be more event-based. It is very hard to add BI to distributed systems after the fact because of the distribution of data and processing – unless you can plug BI into your SOA.
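As a rough sketch of what plugging BI into an SOA looks like, here is a toy Python event bus with a BI service subscribed to it; the bus, topic and handler are hypothetical stand-ins for real middleware:

```python
# The point of the sketch: BI subscribes to events flowing through the SOA
# and sees data in flight, instead of querying data at rest after the fact.

from collections import defaultdict
from typing import Callable

class EventBus:
    """Minimal in-process stand-in for an SOA message bus."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self.subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        for handler in self.subscribers[topic]:
            handler(event)  # BI sees the event as it happens, not via a later query

bus = EventBus()
bus.subscribe("orders", lambda e: print(f"BI service saw order {e['id']} in flight"))
bus.publish("orders", {"id": 42, "amount": 19.99})
```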

Charles sees a move from presenting historical data to a real-time, automatic, event-driven interpretation of streams of data. En route we pass through Business Activity Monitoring, which gives more in-flight, real-time data but does not necessarily help me take the right action. His definition of BI 2.0 includes event stream processing – capture, analyze and act based on in-flight analysis of data as it streams in. It uses SOA to get access to a very wide range of data sources and to act in a variety of ways, and it typically runs in-memory as the cost of computing and memory plummets. In-memory means there is no need to store the data first, making it much more practical to process it in flight. These kinds of decision services might be based purely on the analytics or might use results from these kinds of analytics.
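A minimal sketch of that capture–analyze–act pattern, with the stream simulated and the whole pipeline held in memory rather than stored first; the values and windowing scheme are invented for illustration:

```python
# Capture -> analyze -> act over an event stream, entirely in memory.
# In practice the source would be a feed plugged in via the SOA.

import statistics

def capture():
    """Simulated in-flight event stream."""
    for value in [100, 102, 99, 101, 340, 98]:
        yield value

def analyze_and_act(stream, window=4):
    recent = []                      # in-memory window; no database involved
    for value in stream:
        if len(recent) >= 3 and value > statistics.mean(recent) * 2:
            print(f"act: {value} is far above the in-flight baseline")
        recent.append(value)
        recent = recent[-window:]    # keep only the recent window in memory

analyze_and_act(capture())
```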

He gave an example of updating a rules-based approach with an in-memory profiling approach, built around detecting fraud while avoiding false positives. False positives were running at 986/1000 – only 14 correct out of the first 1,000 alerts. A classic fraud rule might be 4+ card swipes for more than a certain amount in a certain period, and rules like these can have a high false positive ratio. A dynamic profile instead decides how likely a transaction is to be fraud given the particular cardholder, and this in-memory profiling drives down the false positives. Using in-memory analytics to build profiles means the rules have a constant stream of expected values against which to run. It also makes it possible to have one generic rule that uses the expected value for a given situation and a specific person, rather than many explicit rules for different situations. This technology works well as an adjunct to the kind of explicit rules and predictive analytics that show up in decision services.
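Here is a hedged sketch of that dynamic-profiling idea – not SeeWhy's actual algorithm – keeping a per-card running profile (Welford's method) so that one generic rule can be evaluated against each cardholder's own expected spend:

```python
# Instead of one static rule ("4+ swipes over $X in an hour"), keep a per-card
# in-memory profile and apply a single generic rule against each card's baseline.

import math

class CardProfile:
    """Running mean/variance of a cardholder's transaction amounts (Welford)."""
    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0

    def update(self, amount: float) -> None:
        self.n += 1
        delta = amount - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (amount - self.mean)

    def is_suspicious(self, amount: float, sigmas: float = 3.0) -> bool:
        if self.n < 5:                          # too little history to judge
            return False
        std = math.sqrt(self.m2 / (self.n - 1))
        return amount > self.mean + sigmas * max(std, 1.0)

profiles: dict[str, CardProfile] = {}

def check(card: str, amount: float) -> bool:
    """One generic rule, evaluated against this cardholder's expected spend."""
    profile = profiles.setdefault(card, CardProfile())
    flagged = profile.is_suspicious(amount)
    profile.update(amount)                      # profile keeps adapting either way
    return flagged

for amount in [20, 25, 22, 19, 24, 23, 400]:
    if check("card-1", amount):
        print(f"flag {amount}: far above card-1's expected value")
```

Because the profile is the stream of expected values, a transaction is judged against what is normal for that cardholder rather than against a one-size-fits-all threshold, which is what drives the false positives down.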

He ended by pointing out that while an artificially intelligent system would manage itself, a more practical (or “smart enough”) system would run itself but still expect someone to supervise it. An autopilot might be the perfect example.
