Ed Lynch, from the AptSoft acquisition, was up next for me, talking about using Business Event Processing for an agile business response. This was interesting given IBM’s recent announcements on business event processing.
Charles Brett from Forrester began with an overview of Event Processing. Charles, like many, had assumed that event processing was a narrowly focused approach suitable only for critical financial markets. Now, however, he feels that Event Processing is getting more attention, and that this interest often comes from the business. It has better awareness and more usage than Forrester expected.
They divide events into internal and external IT events (technical events either generated by internal equipment or fed in by an outside, technical vendor) and internal and external non-IT events (machines on the shop floor, news feeds, arrival/departure information, RFID etc). These different kinds of events are used in different categories of processing:
- Systems and Operations Event Processing
Classic IT monitoring
- Business Activity Monitoring
- Rules-based processing
Not sure I would group this as a kind of event handling but I see his point and I might call this “decision management” 🙂
- Complex Event Processing
He showed a quadrant chart with event complexity (simple to complex) on one axis and processing speed (human speed to machine speed) on the other. Complex Event Processing is the top-right segment in his mind.
- Business Event Management/Processing
- General Purpose Event-based Processing
These are not mutually exclusive. He gave some examples of event processing that were focused on non-IT events such as machine and pipeline events. One common factor is a desire to respond and change more quickly than IT can typically manage. His examples are mostly what I would consider event-based decision making in that it is not enough for the business to detect and correlate events, it must also be able to decide how to act in real-time, a classic decisioning problem. Many of the problems are more about using events to connect information from multiple systems than about any real-time correlation. The events could be written to a database and then processed overnight, for instance, and generate the same results. Nevertheless, it is the events to which the systems are responding I suppose.
He went on to discuss how Event Processing relates to SOA. Real-time is becoming more and more important; indeed, time in general is. Event processing can and should use service-based resources through an ESB, for instance, and in many ways Event Processing extends SOA.
Event processing is NOT about throughput but about complexity and the importance of how time is defined. Don’t hold off until you have high-throughput requirements, and listen to the business people who know what the business events really are. Event processing can and should extend IT and SOA. He ended with a slide, which I edited slightly:
Business people recognize the events that matter, IT rarely does.
This led into the IBM part of the presentation where Ed presented on Business Event Processing with IBM. A business person needs their business to respond to opportunities that can be sensed and processed. They might figure this out by data mining, by experience or by examining historical events. However, although the business understands the business events that matter, it can’t act on them itself.
The intent of the new product is to put power in the hands of business users so they can collaborate with IT – an attitude I like when discussing rules. The business users want to be able to access certain events and take certain actions, and these typically require the IT department to expose those event streams and action hooks.
They have identified 5 event processing patterns around detect, decide and respond:
- Information dissemination
- Business monitoring
- Active diagnostics
- Predictive Processing
- Service Availability
WebSphere Business Events is about helping the business folks manage the business monitoring scenario – to express and then deploy correlations and actions to be taken.
- IT goes first, surfacing events by giving meaning to existing technical events. Similarly, IT surfaces the available actions
- Business people use what is essentially a rule template to express a pattern or filter – Event-Condition-Action – where the events are those made available, as are the actions. The information in the event can be used in the conditions.
- The event flow is assembled by combining the events and filters into a flow: when an event arrives that passes the filter, you move on to the next step.
- The event flow ends with an action, from the list, to execute.
Essentially various event sources put events into the “event cloud”. Some of these have been surfaced so you can define correlation patterns and drive the actions you want using services, processes etc. Because the correlation patterns are declarative, you can keep changing them. You can add new steps and hot deploy them. WebSphere Business Events allows you to have business users manage these correlations dynamically. The simplicity of authoring is what they regard as the key factor – the use of declarative rules using a friendly, template driven user interface – the need to make sure rules can be managed in business terms is always key (wiki entry).
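To make the Event-Condition-Action template idea concrete, here is a minimal sketch of how such declarative rules might work. This is purely illustrative: the class names, the event names and the notion of a deployable `Rule` are my own assumptions, not the WebSphere Business Events API.

```python
# Minimal Event-Condition-Action sketch: IT surfaces named events and
# actions; business-defined rules filter events and trigger actions.
# All names here are illustrative, not the product's actual API.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Event:
    name: str                              # a surfaced event, e.g. "OrderPlaced"
    payload: dict = field(default_factory=dict)

@dataclass
class Rule:
    """One filled-in template: Event-Condition-Action."""
    event_name: str                        # which surfaced event to watch
    condition: Callable[[Event], bool]     # filter over the event's data
    action: Callable[[Event], None]        # one of the surfaced actions

class EventFlow:
    """Evaluates declarative rules as events arrive; new rules can be
    deployed while the flow is running (the 'hot deploy' idea)."""
    def __init__(self):
        self.rules: list[Rule] = []

    def deploy(self, rule: Rule) -> None:
        self.rules.append(rule)

    def on_event(self, event: Event) -> None:
        for rule in self.rules:
            if rule.event_name == event.name and rule.condition(event):
                rule.action(event)

# A business user's rule: "when a large order arrives, notify sales".
flow = EventFlow()
notified = []
flow.deploy(Rule(
    event_name="OrderPlaced",
    condition=lambda e: e.payload.get("amount", 0) > 10000,
    action=lambda e: notified.append(e.payload["customer"]),
))
flow.on_event(Event("OrderPlaced", {"customer": "Acme", "amount": 25000}))
flow.on_event(Event("OrderPlaced", {"customer": "Bob", "amount": 50}))
print(notified)  # only the large order triggers the action
```

Because the rules are data rather than code wired into a process, changing the business's response means deploying a different `Rule` rather than redeploying an application, which is the dynamic-change point made above.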
It is clear to me that one possible outcome is the triggering of a decision service that is also controlled by the business, though clearly there is overlap with the rules in the correlation. Perhaps I would describe these as event decisions or, as someone at Tibco once said, an Event-Decision Architecture. As usual, though, the issue is going to be how dependent the rules are on technology implementation. Those related to real business issues (good customer, best offer, eligible, product pricing) should be in decision services while those related to event correlation should not be.
An interesting session – more on event processing to come on the blog soon.