Some time ago a regular reader, Dave Wright, left a comment on a blog post I wrote as a guest. In it he asked, “What comes after EDM?” This is, of course, both an interesting and a difficult question. Thinking about it, I decided to split it into two parts – one about the technologies I see coming “after” the current set used in EDM, and one about the kind of business you can run once you adopt EDM. The second I will post tomorrow, and I would love comments on either, so let your creativity flow.
Today EDM relies on a number of key technologies:
- Business rules management
An ability to express the rules of a business, its business logic, in a friendly and declarative syntax – typically, though not always, through a business rules management system.
- Data mining and descriptive analytics
An ability to mine data and develop insights and business rules based on what has happened in the past
- Predictive analytics
An ability to use data about the past to make predictions about the future and represent those predictions as equations that can be executed – turning uncertainty into probability
- Optimization
An ability to define the tradeoffs in a problem and mathematically find the optimal solution given the business needs that must be met
- Decision services
An architecture that allows decisions to be extracted and managed as reusable, self-contained decision services and then composed into applications to run the business. This might be process-centric (BPM) or event-centric (EDA) but it is almost certainly built on a service-oriented approach and architecture (SOA).
- Data and information management
Everything from data warehouses to high-performance databases to performance management and reporting tools – for managing, ensuring the quality of, and understanding the information you have.
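To make the combination of these pieces concrete, here is a minimal sketch of a decision service that applies declarative business rules to a predictive score. All the names, weights, and thresholds are invented for illustration; a real deployment would use a BRMS and a statistically fitted model rather than hand-written Python.

```python
# Minimal decision-service sketch: a predictive score plus declarative
# business rules, packaged as one reusable function. All names, weights,
# and thresholds here are invented for illustration.

def credit_score(applicant):
    # A predictive model reduced to an executable equation
    # (a scorecard with made-up weights).
    return 640 + 4 * applicant["years_at_job"] - 30 * applicant["late_payments"]

# Business rules expressed declaratively as (condition, decision) pairs,
# evaluated in order - the kind of logic a BRMS would manage.
RULES = [
    (lambda a, s: s >= 700, "approve"),
    (lambda a, s: s >= 650 and a["late_payments"] == 0, "approve"),
    (lambda a, s: s < 550, "decline"),
]

def decide(applicant):
    """Self-contained decision service: score, then apply the rules."""
    score = credit_score(applicant)
    for condition, decision in RULES:
        if condition(applicant, score):
            return decision
    return "refer"  # no rule fired: route to a human

print(decide({"years_at_job": 20, "late_payments": 0}))  # approve
print(decide({"years_at_job": 1, "late_payments": 4}))   # decline
```

The point of the structure is separation of concerns: the model can be refit and the rules can be changed by business users independently, while the service interface the calling applications see stays the same.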
Adaptive control is missing from the list simply because it is a technique rather than a technology. Several changes in this technology list can be expected:
- More natural language and domain-specific processing
Already, rules languages are understandable to the business and easy to map to specific domains. Some, like Haley, are already close to natural language. Over the next few years this will increase as business users take more control of the logic in their systems and demand to express it in their own words and jargon.
Ade McCormack had a relevant analogy – think of an ethnic restaurant, he said. In the kitchen you can speak the language of the food; in the dining area you must speak the language of the diners. Increasingly, business diners will expect to express logic in their language, to the exclusion of the language of the IT cooks, and the software will evolve to make this possible.
- More automation in analytics
Already analytic tools like those from SAS, SPSS, InfoCentricity and Fair Isaac are automating more and more of the grunt work in the development of analytic models. Folks like KXEN have taken this far enough to allow non-statisticians to develop models. Both trends will continue and accelerate so that professional modelers are more productive and so that business users are able to do more of the modeling they need for themselves. This will mean more models being built, even in areas where the marginal value is small because the cost of building a model is so low.
The use of genetic algorithms and other evolutionary approaches, combined with ever-increasing processing power, will also mean that more models are built and compared or combined for existing problems, and this should mean ongoing improvement in the quality of models.
- Automated adaptive control
One of the consequences of automating model development is that there will be more models in production, and the value of updating each model will be smaller – if a model is not making that much difference, time and money cannot be spent refining it. This, plus the general desire to automate more of the process, will lead to automation of the adaptive control process itself. Automated routines will test new models – even ones expected to do poorly but that will shed light on the problem – and collect and use the results. Simple versions of this already exist for models where the results are very immediate (offer accepted or not), but the ability to consider longer-term effects and overall customer profitability in these automated systems will soon come.
- Broader data sources
The addition of unstructured and semi-structured text to decision automation is clearly happening already. Products exist to find insights in text and combine them with structured data to make better predictions. This will continue to evolve as new data sources, such as voice and video, become well enough understood to be analyzed by readily available computing power.
- Time-sequenced decision analysis will be automated
Today most decision analytics are focused around a single decision. Yet each decision is connected to prior and subsequent decisions and, in theory, you are trying to optimize the whole set of decisions over a customer’s lifecycle. When you get a customer to sign up for something, for instance, this creates new opportunities for subsequent decisions. Doing analytics across such time-sequenced decisions is still extraordinarily difficult, but the math involved is likely to get figured out and factored into decision making in the next few years.
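The evolutionary model-building idea mentioned above can be sketched in a few lines. Here a toy genetic algorithm evolves the coefficients of a simple linear model against synthetic historical data – everything here (the data, the population sizes, the mutation step) is illustrative, not a production modeling technique:

```python
import random

random.seed(42)

# Synthetic "history": y = 3*x + 7 plus noise; the GA must rediscover it.
data = [(x, 3 * x + 7 + random.uniform(-1, 1)) for x in range(20)]

def error(coeffs):
    # Fitness = squared prediction error over the historical data.
    a, b = coeffs
    return sum((a * x + b - y) ** 2 for x, y in data)

def mutate(coeffs):
    # Random small perturbation of a candidate model's coefficients.
    return tuple(c + random.uniform(-0.5, 0.5) for c in coeffs)

# Evolve a population of candidate models: keep the best, mutate survivors.
population = [(random.uniform(-10, 10), random.uniform(-10, 10)) for _ in range(30)]
for generation in range(200):
    population.sort(key=error)
    survivors = population[:10]  # selection: keep the fittest third
    population = survivors + [mutate(random.choice(survivors)) for _ in range(20)]

best = min(population, key=error)
print(best)  # approximately (3, 7)
```

With cheap computing power this kind of search can be run continuously against many problems at once, which is why the cost per model keeps falling.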
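The automated adaptive control described above – routinely routing a slice of decisions to challenger models and learning from the results – can be sketched as a simple epsilon-greedy test. The model names and acceptance rates below are invented; in a real system the "simulated customer response" would be actual outcomes fed back from production:

```python
import random

random.seed(1)

# Hypothetical models with different true (unknown to the system)
# offer-acceptance rates; in practice these would be deployed scorers.
true_accept_rate = {"champion": 0.10, "challenger_a": 0.14, "challenger_b": 0.06}

results = {m: {"offers": 0, "accepted": 0} for m in true_accept_rate}
EPSILON = 0.2  # fraction of traffic routed to random challengers

def pick_model():
    tried = [m for m, r in results.items() if r["offers"] > 0]
    if random.random() < EPSILON or not tried:
        return random.choice(list(results))  # explore: try any model
    # Exploit: use the empirically best-performing model so far.
    return max(tried, key=lambda m: results[m]["accepted"] / results[m]["offers"])

for _ in range(5000):
    model = pick_model()
    results[model]["offers"] += 1
    if random.random() < true_accept_rate[model]:  # simulated response
        results[model]["accepted"] += 1

best = max(results, key=lambda m: results[m]["accepted"] / max(results[m]["offers"], 1))
print(best)  # the system converges on the best-performing model
```

Extending this to longer-term effects mostly means changing what gets counted as "accepted" – lifetime profitability instead of an immediate response – which is exactly the harder problem the text predicts will be solved.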
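Time-sequenced decision analysis is essentially what Markov decision processes model, and a toy value iteration over a made-up customer lifecycle shows the idea. The states, actions, and payoffs are all invented; the point is that the best first decision depends on downstream value, not immediate reward:

```python
# Toy Markov decision process over a customer lifecycle. Each decision
# affects not just immediate profit but which state (and which future
# decisions) comes next. All numbers are illustrative.

# transitions[state][action] = (immediate_reward, next_state)
transitions = {
    "prospect": {"hard_sell": (5.0, "churned"), "nurture": (0.0, "customer")},
    "customer": {"upsell": (20.0, "loyal"), "ignore": (2.0, "churned")},
    "loyal":    {"retain": (10.0, "loyal")},
    "churned":  {},  # terminal: no further decisions
}
GAMMA = 0.9  # discount applied to future value

def value_iteration(iterations=100):
    # Repeatedly back up each state's value from its successors.
    value = {s: 0.0 for s in transitions}
    for _ in range(iterations):
        value = {
            s: max((r + GAMMA * value[s2] for r, s2 in acts.values()), default=0.0)
            for s, acts in transitions.items()
        }
    return value

V = value_iteration()
# Best first action from "prospect": compare total lifecycle value,
# not just the immediate reward of each action.
best = max(
    transitions["prospect"],
    key=lambda a: transitions["prospect"][a][0] + GAMMA * V[transitions["prospect"][a][1]],
)
print(best)  # "nurture" beats the larger immediate payoff of "hard_sell"
```

The hard part in practice is not this backup arithmetic but estimating the transitions and rewards from data, which is why the text calls the problem extraordinarily difficult today.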
I am sure there are others, but these were the ones that came to mind. Let me know what you think.
Tomorrow: how adopting EDM will change businesses.