It’s been a while since I did a product review on the blog, but I recently caught up with the team at Zoot and thought a blog post was in order.

Zoot, for those of you who don’t know them, deliver capabilities and services for automated decisioning across the customer credit lifecycle. They’ve been at this a while – 31 years and counting – and focus on delivering reliable, scalable and secure transactions in everything from customer acquisition to fraud detection, credit origination, collections and recovery. They have some very large financial institutions as clients, as well as some much smaller ones and a number of innovative fintech types.

Zoot’s customers all run their systems on Zoot infrastructure. Zoot has 5 data centers (2 in the US, 2 in the EU and a new one in Australia) for regional support and redundancy – though each is designed to be resilient independently and is regularly reviewed to make sure it can support 10x the average daily volume. These data centers run the Zoot solution framework – tools and services supporting a variety of capabilities including data access, user interfaces, decisioning and more.

The core of the Zoot platform is the combination of the WebRules® Live execution service and the WebRules® Builder configuration tools. These cover everything from designing, developing and deploying workflow and user interfaces to decisioning, attributes and source data mapping. Zoot’s focus is on making these tools and services modular, on test-driven development, and on reusability through a capabilities library. The same tools are used by Zoot to develop standard capabilities and custom components for customers and by customers to extend these and develop new capabilities themselves. Most clients begin with pre-built functionality and extend or customize it, though some are starting to use Zoot in a Platform as a Service way, building the whole application from scratch to run on the Zoot infrastructure.

Zoot’s library consists of hundreds of capability-based microservices across 7 broad areas:

  • Access Framework, to act as a client gateway and make it easy to bring real-time data into the environment and manage it.
  • User interface, to define responsive, mobile-friendly UIs that create web-based pages for customer service and other staff.
  • System automation, to handle background and management tasks.
  • Data and Service Acquisition, to integrate third party data into the decisioning from a wide range of providers and internal client systems.
  • Decisioning, to apply rules to the data and make decisions throughout the customer credit lifecycle.
  • Data Management, to manage the data created and tracked through the workflow, store it if necessary and deliver it to the customer’s environment.
  • Extensions, to fulfill the unique needs of clients, such as machine learning and AI models.

One of the key differentiators for the Zoot platform is the enormous range of data sources they provide components for. Any data source a customer might reasonably want to access to support their decisioning is integrated, allowing data from that source to be rapidly pulled into decisions without coding. Even when clients come up with new ones, Zoot says they can quickly and easily add new sources to the library.

WebRules Builder is a single environment for configuring and building all kinds of components. A set of dockable views can be used to manage the layout and users can flag specific components as favorites, use search to find things across the repository and navigate between elements that reference each other.

A flow chart metaphor is widely used to define the flow of data and logic. Components can be easily reused as sub-flows and the user can drill down into more detail when needed. Data is managed throughout the flows and simple point and click mapping makes it easy to show how external data is mapped into the decisioning. Flows can be wrapped around inbound adaptors to handle errors, alternative data sources etc. Libraries exist, and custom versions can be created with a collection of fields, flows, reports and other elements. These can be imported into specific projects, making the collection of assets available in a single action.

Within these flows the user can specify logic as either rules or decision tables. Decision tables are increasingly popular among Zoot’s customers, as among ours. A partner region allows for external code to be integrated into the client’s processes – for instance a machine learning model or external decisioning capability. An increasing number of clients are using this to integrate machine learning with their decisioning – though some of this is parallel running to see how these more opaque models compare to the established approaches already approved by regulators. Debugging tools show the path through the flows for a transaction, and all the data about the flow of transactions – which branch was taken, which rules fired – can be recorded for later analysis outside the platform.
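
Zoot’s decision tables are configured graphically in WebRules Builder, but if you haven’t worked with decision tables before, a minimal sketch shows the idea (the rules and fields here are invented for illustration, not Zoot’s format): each row pairs conditions with an outcome, and with a first-hit policy the first matching row wins.

```python
# Hypothetical decision table: each row is a rule with conditions on the
# inputs plus an outcome. Business users edit rows, not code.
RISK_TABLE = [
    # (min_score, max_score, has_prior_default, outcome)
    (700, 999, False, "approve"),
    (640, 699, False, "refer"),
    (0,   999, True,  "decline"),
    (0,   639, False, "decline"),
]

def decide_risk(score: int, has_prior_default: bool) -> str:
    """Return the outcome of the first matching row (first-hit policy)."""
    for min_s, max_s, prior, outcome in RISK_TABLE:
        if min_s <= score <= max_s and prior == has_prior_default:
            return outcome
    return "refer"  # default when no row matches

print(decide_risk(720, False))  # approve
print(decide_risk(650, True))   # decline
```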

Sample data for testing can be easily brought in, and Zoot also provides sample data from their third-party data interfaces to streamline this process. APIs and interfaces can be tested inside the design tools, with data entered being run through the logic and responses displayed in situ. Unit tests can be defined, managed and executed in the environment. Clients can handle their production data entirely outside Zoot, passing it in for processing, but a significant minority of clients use database capabilities to store data temporarily on the Zoot infrastructure. System scripts are used to make sure that all the data ends up back in the client’s systems of record and data lake when processing is complete.

Zoot occupies an interesting middle ground among decisioning providers. Everything is hosted on their infrastructure – clients don’t have the option to run it on their own infrastructure – and Zoot has invested heavily in providing a robust infrastructure to support this. Yet Zoot is not trying to “own” the customer data or do multi-customer analysis, as many SaaS and PaaS companies are – their customers own their own data. Indeed, Zoot makes a point of noting that all the data gets pushed out to the client nightly or weekly. This gives clients a managed infrastructure without losing control of their data – an interesting combination for many, I suspect.

More on the Zoot platform at https://zootsolutions.com

Today’s businesses are heavily digitized and must be structured so that, as customers interact through digital channels, the experience they get across human and digital touchpoints is handled seamlessly and effectively. As a result, the line between business and technology is gradually blurring. Yet most organizations still face many challenges with the agility, flexibility, and customer-driven focus of the business systems and processes needed to support new digital business models.

Digital Decisioning – the automation of digitized decision-making – fuses business and technology in the digital age, going beyond data-driven methods to integrate data with predictive analytics and support for AI applications. The book describes the approach with examples that business professionals will find easy to understand.

As someone who has worked in the field of rule-based AI for many years, I appreciate that James Taylor’s approach combines easy-to-understand rule-based AI, integration with analytic methods, the sophistication added by machine learning, and a continuous improvement loop. Please read it and use it as a reference book for system planning, development, and use.

eBook available at: http://www.contendo.jp/dd

Download the book summary flyer in Japanese here.

The single most critical and most neglected aspect of artificial intelligence (AI) projects is problem definition. All too often, teams start with data, determine what kind of machine learning (ML)/AI insights they can generate, and then go off to find someone in the business who can benefit from them. The result? Lots of successful AI pilots that never make it into production and never deliver viable, positive business outcomes.

It’s estimated that 97% of enterprises have invested in AI, but is it really serving the business?1

Gartner’s 2019 CIO survey points to the fact that, although 86% of respondents indicate that they either have AI on their radar or have initiated projects, only 4% of projects have actually been deployed.2

Susan Athey, Economics of Technology Professor at Stanford Graduate School of Business, calls out the gap between ambition and execution when it comes to AI projects: “Only one in 20 companies has extensively incorporated AI in offerings or processes. Across all organizations, only 14% of respondents believe that AI is currently having a large effect on their organization’s offerings.”3

So what’s the problem? For one thing, many AI projects are technology-led, focusing on algorithms or tools that teams are familiar with. Others start with whatever data the team happens to have available. But data is frequently siloed and difficult to access, so is it the right and relevant data? While it’s true that data, tools, and algorithms are vital for the success of AI projects, putting the focus on the technical aspects is risky. Combining readily available data with known tools and algorithms is certainly likely to produce an AI-driven result more quickly—but there’s no guarantee it will have business value.

There’s a better way. Though it may sound counter-intuitive, AI teams need to work backwards to get their projects into business production. In other words, they need to pinpoint where they want to end up and then figure out how to get there. For a more practical and rewarding payoff, they need to focus on decision-making and on what a better decision looks like. By collaborating with business units to define the decision-making that needs to be improved, identifying the kinds of ML/AI that would really help, and only then going to look for data, AI project teams will drive true business value.

So how does your team step out of its comfort zone and learn to work backwards? IBM Advisory Data Scientist Aakanksha Joshi and Decision Management Solutions CEO James Taylor will show you how to achieve success with your next AI project. They will be offering five lightning rounds at the IBM Digital Developer Conference, where you’ll gain data and AI skills from IBM experts, partners, and the worldwide community. You’ll have the opportunity to participate in hands-on experiences, hear IBM client stories, learn best practices, and more.

Data & AI 2021

June 8, 2021 | 24-hour conference begins: 10:00 am AEST

Free and on demand

Register today

We look forward to seeing you there!

1 Building the AI-Powered Organization, HBR, July 2019
2 2019 CIO Survey: CIOs Have Awoken to the Importance of AI
3 MIT Sloan Management Review September 06, 2017

Don’t jail your logic in code

Our friends at Data Decisioning forwarded an article from The Register recently – Inflexible prison software says inmates due for release should be kept locked up behind bars.

The gist of this story is that a module calculating release dates for prisoners was clearly implemented in exactly the same way the rest of the system was coded.

This is what we would call a “worst practice” because decisions are not the same as the rest of your system and should be implemented differently. Deciding things about customers, about transactions or, in this case, about prisoners is not the same as workflow or data management. Decision-making should be implemented as stateless, side-effect-free decision services using decisioning technology (decision models, business rules). Not code.

Why? Well decisions are different.

  • They are rich in business understanding (in this example, legal understanding).
  • They are prone to regular change (in this case because the political and social environment is changing).
  • And they are often required to be transparent (in this case to demonstrate that all the correct laws and regulations have been followed).

Code is impossible for business experts to verify, time consuming and expensive to change, and opaque. So writing code to implement decisions is a terrible idea. It’s like taking your business know-how and locking it in jail!
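
To make the distinction concrete, here is a minimal sketch of what a stateless, side-effect-free decision service looks like. The rules and numbers are entirely hypothetical (not the vendor’s actual logic), and Python stands in for whatever decisioning technology you would actually use – the point is that the legal parameters sit outside the control flow, where a BRMS would let experts change them without redeploying code.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Entirely hypothetical rule parameters - in a real system these would live
# in a BRMS, where legal experts could read and change them without touching
# or redeploying the service code.
GOOD_CONDUCT_CREDIT_PER_MONTH = {"standard": 3, "enhanced": 5}  # days

@dataclass(frozen=True)
class Sentence:
    start: date
    months: int
    program: str  # "standard" or "enhanced"

def release_date(s: Sentence) -> date:
    """Stateless and side-effect-free: the same input always gives the same
    answer, and nothing is written or triggered - the service only decides."""
    credit = GOOD_CONDUCT_CREDIT_PER_MONTH[s.program] * s.months
    return s.start + timedelta(days=s.months * 30 - credit)

print(release_date(Sentence(date(2021, 1, 1), months=12, program="standard")))
```

When the legislature changes the credit rates, the change lands in the rule parameters, not in a 2,000-hour programming project.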

Back to our example. In this case the code of this module “hasn’t been able to adapt” to new regulations even though it has been nearly 2 years! And regulations are generally signaled well in advance so they’ve really had more than two years. If they had recognized that decision-making should be implemented using decisioning technology, this would have been easy to fix.

The most revealing part of the story is the attitude of those who wrote the software. They dispute the use of the term “bug” to describe the system’s “lack of adaptability”. I guess being hard to change is a “feature”? This of course is nonsense. No-one is writing code for a completely rigid, defined, unchanging world. Lacking necessary adaptability is therefore definitely a bug.

What’s worse is that they knew all along that they would have to make changes! They say

It is not uncommon for new legislation to dictate changes to software systems. This translates to a change request and not a bug in the system.

No, wrong. It’s not a change request, it’s normal operations. New regulations are normal. Therefore change to regulatory rules is normal. Therefore being able to change the rules is normal too, and not a change request. The idea that this kind of change might require 2,000 hours of programming is nonsense. Leaving aside the apparently outrageous rates, this is terrible design and shows either a total disregard for their customer or a total lack of awareness of best practices (or perhaps both).

What this company did was take business domain know-how, business logic, and lock it away in opaque, hard to change, hard to manage code. And then when the inevitable happened and the rules changed, they failed their customer who now has to add workarounds and fixes outside the system – I can see the yellow stickies all over the terminals in my mind’s eye….

So, don’t be like those bozos. Identify the decisions in your systems. Model them so you understand them. And then implement them in a Business Rules Management System so your business partners can make their own changes when the regulations change, the market changes, customers change, their business changes or the world changes. Because it will.

Keep your logic out of code jail.

Bart de Langhe and Stefano Puntoni recently published a great article in the MIT Sloan Management Review called “Leading With Decision-Driven Data Analytics.” In contrast to so much of the literature that focuses first on data, they focus on decision-making. In fact they go so far as to say that:

“Leaders need to make sure that data analytics is decision-driven.”

They describe how focusing on data and on insights can lead companies down blind alleys and is not really a way to become “data-driven” at all. We like to say that companies should do analytics backwards. The authors focus on the purpose of data:

“Instead of finding a purpose for data, find data for a purpose. We call this approach decision-driven data analytics.”

They contrast this decision-centric approach to traditional data-centric ones very nicely:

“Data-driven decision-making anchors on available data. This often leads decision makers to focus on the wrong question. Decision-driven data analytics starts from a proper definition of the decision that needs to be made and the data that is needed to make that decision.”

This has been our experience as well. Companies that focus on the decision they want to improve before doing their analytic work are much more likely to succeed in operationalizing an analytic or data-driven approach. Bart and Stefano are focused on management decisions where we focus on operational ones, but the conclusions are the same.

They identify three steps to success:

  1. Identify the alternative courses of action.
  2. Determine what data is needed in order to rank alternative courses of action.
  3. Select the best course of action.

I would add to this only that building a decision model is a critical step between 1 and 2, especially for decisions you are going to make more than once. Defining a decision as a question and possible (alternative) actions is the right first step. To get from that to the data and analytics you need often involves breaking down the decision into sub-decisions and considering each of them independently. This is what a decision model is particularly good at. Applying their steps 2 and 3 to each sub-decision naturally leads “up” the model to a successful step 3 for the main decision.
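
To illustrate the decomposition (a hypothetical retention example of my own, not one from the paper), a decision model is just this kind of structure: a question with possible actions, broken into sub-decisions, each of which tells you what data – and what analytics – it needs.

```python
from dataclasses import dataclass, field

# A minimal, hypothetical decision model: each decision is a question with
# possible actions, the data it needs, and the sub-decisions it depends on.
@dataclass
class Decision:
    question: str
    actions: list
    data_needed: list = field(default_factory=list)
    sub_decisions: list = field(default_factory=list)

retention = Decision(
    question="How should we respond to this renewal?",
    actions=["standard renewal", "discount offer", "personal call"],
    sub_decisions=[
        Decision("How likely is this customer to churn?",
                 ["low", "medium", "high"],
                 data_needed=["usage history", "support tickets"]),
        Decision("How valuable is this customer?",
                 ["bronze", "silver", "gold"],
                 data_needed=["revenue", "tenure"]),
    ],
)

def data_requirements(d: Decision) -> list:
    """Walk 'down' the model, collecting the data each sub-decision needs."""
    needs = list(d.data_needed)
    for sub in d.sub_decisions:
        needs.extend(data_requirements(sub))
    return needs

print(data_requirements(retention))
```

Applying their steps 2 and 3 to each sub-decision in a structure like this is what leads “up” the model to the main decision.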

It’s a great paper and you should definitely read it. You might also enjoy these papers on decision modeling and on framing analytic requirements using decision modeling.

We help a lot of clients select, install and adopt a Business Rules Management System (BRMS). These clients are looking to automate decision-making with transparency, deliver business control of their critical decision-making logic and establish an ability to drive continuous improvement through simulation and impact analysis. Adopted correctly, these benefits ensure that a BRMS delivers a great ROI.

To maximize this ROI our clients are looking to get the benefits of their BRMS faster, spend less on implementing their BRMS and increase the size of the benefit they see. Here are some tips based on our experience:

Faster

The best way to get benefit from a BRMS faster is to get to a working decision service faster. More than anything, our experience shows this means capturing the requirements for that service faster.

For this we use decision modeling and the Decision Model and Notation (DMN) standard as well as our decision modeling software, DecisionsFirst™ Modeler. Our experience is that this can reduce the time to get your decision requirements and business rules right by 5-10x, getting you to an ROI months earlier than traditional rules-first analysis approaches.

Cheaper

Decision modeling also dramatically reduces the amount of re-work by getting the requirements right the first time. This lowers cost too. More importantly, it creates the kinds of rules that business users can maintain themselves, reducing IT costs by eliminating the need for projects to make rule changes. It lets you take more advantage of the simulation tools in your BRMS, reducing the need for, and cost of, testing.

Small, regular changes also cost less than waiting until there are enough changes to justify a project. And these updates are themselves much cheaper because a decision model makes it easier to tell what change is needed.

Bigger

Bigger ROI comes from using the BRMS on a larger scope, something that getting faster, cheaper projects will help ensure. But it also comes from creating an environment in which the business can truly take advantage of rapid business rules updates – something a BRMS is really good at but that goes unused all too often. The role of a decision model in creating an environment where this kind of rapid iteration is the norm really can’t be overstated – it’s the key.

So, if you want a bigger, faster, cheaper ROI from your BRMS, don’t forget to add decision modeling. Check out our recently updated white paper Maximizing the Value of Business Rules for more. If you’d like our help with selecting or adopting a BRMS, drop us a line.

POSITION FILLED

Decision Management Solutions is growing and looking for a Delivery Manager for its projects.

The Delivery Manager will be an experienced hybrid agile project manager and will be responsible for managing several concurrent, discipline-based, high-visibility projects using agile and fixed-milestone methods in a fast-paced environment that may cross multiple internal business divisions and services engagements.

Goals

  • Deliver agile projects that provide exceptional business value to users
  • Achieve a high level of performance and quality, and
  • Further the delivery discipline and supporting methodologies

The team at Machine Learning Week/Predictive Analytics World has announced the schedule for 2021 (virtual conference, May 24-28, 2021) and issued their call for speakers. This is a great conference and will be a great opportunity to present. As always those with case studies and real experience will be particularly welcome!

I will once again be chairing a business-oriented track focused on operationalization of models, business management of machine learning and best practices for extracting real business value from machine learning, AI and predictive analytics. So if you’d like to talk about THOSE issues, I’d really like you to apply! Feel free to reach out to me directly with questions, but do apply.

Topics you might think about presenting on:

  • Success stories on how you build analytic models that added real business value
  • Horror stories on how to build models that don’t add value
  • Project management approaches to engage the business and IT in analytic projects
  • What other technology you use besides your favorite ML/analytic workbench and why it helps you get to production
  • What you’ve learned about hiring, developing and training analytic talent
  • How your company learns and improves when it comes to machine learning and analytics – communities, wikis etc.
  • Rollout best (and worst) practices
  • Experience with ML Ops and other operationalization steps

Plus of course anything around best practices and experience actually building the models is always welcome!

Deadline is November 6, 2020! Sign up here.

Eric Siegel and I had a great discussion about doing Machine Learning BACKWARDS recently – you can watch the recording below or on our YouTube Channel. Eric, if you don’t know, is the founder of Predictive Analytics World, a leading consultant, and author of “Predictive Analytics”. You can also check out Eric’s new Coursera class.

This discussion was prompted by Eric and me talking about the rate of failure in Machine Learning projects. For instance, one survey said that 85% or more of machine learning projects fail to add business value – and that number has gone up, not down, in recent years. Our premise is that the best way to avoid these failures is to do machine learning backwards – to begin with the outcome you want, an improved decision, and work back to the models you need and the data that will let you build them.

At the end we took some questions and one of the questions we got was:

How do you recommend getting senior executives engaged?

First, we said, you need to focus the discussion on the value of deploying a solution, not on the core technology. This means you might want to avoid the words “model” or “predictive model” or “machine learning model”. Instead, focus on exactly which decisions within which large-scale operations are going to be improved, and to what degree they could potentially be improved. Then you can start to talk about probabilities – that these customers are much more likely to cancel, say – and how those probabilities help make decisions more profitably. After all, you can place customers into at least two very different groups based on those probabilities and treat them accordingly, generating differentiation.
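
A minimal sketch of what that looks like in practice – the names and thresholds here are invented, but the point is that the probability only matters because it changes the treatment each customer gets:

```python
# Hypothetical sketch: the model's churn probability is only interesting
# because it drives a different decision for different customers.
def retention_treatment(churn_probability: float) -> str:
    if churn_probability >= 0.6:
        return "proactive call with retention offer"
    if churn_probability >= 0.3:
        return "targeted email campaign"
    return "standard service"  # no costly intervention needed

for p in (0.75, 0.40, 0.10):
    print(p, "->", retention_treatment(p))
```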

I discussed one useful exercise we have done with executives. We start by asking them how they are measured – how they measure their own personal success, which metrics they care about because those metrics drive their bonus. Then we’ll ask them to identify the decisions that get made in the organization that have an impact on those metrics. The first few are always big strategic decisions that the executive team make.

If you keep pushing on it, though, gradually they realize that there are decisions made by all sorts of people in the organization and indeed by bits of software infrastructure that matter to the metric also. And while they trust their own judgment – they don’t need analytics – they are much less sure about the judgment further down the organization or in the IT department. Once they realize that machine learning is not about improving their personal decisions but about improving the quality of decision making at the operational frontline they get much more excited.

Machine learning teams often feel pressure to make a strategic difference to the company. They mistakenly assume that the way to do this is to have machine learning influence the company’s executives and executive-level decisions. This is a mistake. Better, instead, to work with executives to find the high volume, repeatable decisions that make a difference and use machine learning to improve them. Because these decisions are made so often, even small improvements multiply to give you a strategic impact.

Lots more good tips in the video. If you are interested in how we approach this why not read our white paper on Framing Analytic Requirements.

A few months back, Scott Adams posted a great Dilbert that I have been meaning to write about for a while (click on the image to see the original).

In the strip, Dilbert says “You don’t go to war with the data you need. You go to war with the data you have.”

Now Scott Adams was being funny but in fact there is a kernel of truth here. We come across many companies that are failing to apply data to their decision-making, delaying building predictive analytic models or postponing their adoption of machine learning because they don’t have the data they “need”. It’s not integrated enough, clean enough, precise enough or just not as good as it “will be soon”. This is a mistake. You should do as Dilbert advises, and “go to war with the data you have”.

The trick is to start with the decision you want to improve, rather than with the data. Understand the decision, model how you think you make that decision today, work with those who make the decision every day to capture your current approach. This decision making is possible with the data you have – it must be, as this is how you decide right now.

Now you can ask some interesting questions like:

  • What would help you make this decision more accurately?
  • Which pieces of the decision give you the most trouble?
  • Where do you spend your time in this decision?
  • Is the data you need to make this decision presented the way you use it in this decision?
  • Which pieces of this decision are data analysis – places where you decide something about the data so you can base some other decision on that analysis?

Sometimes the answer to these questions will lead you to new data or identify that your data needs to be improved. If it does, at least you can show exactly WHY you need that new data and so calculate an ROI. But often it reveals that you need to use the data you have in different ways.

The biggest benefit comes from identifying possible predictive models. Because you know how the decision is made, you will be able to see how accurate a predictive model must be to be useful. Often this is a lot lower than you think. We have had clients realize they only needed a model that was a little better than a coin flip and others who only needed 70-80% accuracy. You might need 99.99% but you probably don’t.
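
Here’s a toy worked example of why (all numbers invented, and the symmetric-accuracy assumption is deliberately simplistic): with a plausible mix of offer costs and customer values, a model only slightly better than a coin flip already pays for itself.

```python
# Hypothetical worked example: how accurate must a churn model be before
# acting on its flags beats making no retention offers at all?
OFFER_COST = 50    # cost of a retention offer
SAVE_VALUE = 500   # value of keeping a customer who would have churned
CHURN_RATE = 0.10  # base rate of churn in the portfolio

def value_per_customer(accuracy: float) -> float:
    """Expected value of offering only to flagged customers, assuming the
    (simplistic) same accuracy on churners and non-churners."""
    true_pos = CHURN_RATE * accuracy                # churners correctly flagged
    false_pos = (1 - CHURN_RATE) * (1 - accuracy)   # loyal customers flagged
    return true_pos * (SAVE_VALUE - OFFER_COST) - false_pos * OFFER_COST

for acc in (0.5, 0.6, 0.7, 0.99):
    print(f"accuracy {acc:.2f}: {value_per_customer(acc):+.2f} per customer")
```

In this example a coin flip (accuracy 0.5) exactly breaks even, 60% accuracy is already profitable, and the jump from 70% to 99% accuracy buys far less than the jump from 50% to 70% – which is why a business-driven accuracy target matters.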

Until you know, you can’t say whether your data is good enough or not. Without a business-driven target for accuracy, your data team will assume a model must be really accurate to be useful, and they could easily overshoot. Plus, many predictive models cope with missing and bad data quite well, or at least degrade gracefully when data quality is poor, allowing reasonable predictions even when the data is less good.

So, don’t wait for the data you think you need – start improving decisions with the data you have. It’s noble, it’s heroic and it works.

Working with companies that are investing in becoming analytic enterprises, we have determined that there are three critical success factors. Whether you are focused on business analytics, data mining, predictive analytics, machine learning, artificial intelligence, or all of the above, these factors will be critical. Check out these videos that talk about them:

  1. Analytic Enterprises Put Business Decisions First
    The first critical success factor for analytic enterprises is keeping the focus on business results by beginning (and ending) with business decisions, not analytic technology. https://youtu.be/DQ9GHSOxd9s
  2. Analytic Enterprises Predict, Prescribe, and Decide
    The second critical success factor for becoming an analytic enterprise is moving beyond reporting and analysis of the past to prediction and action by using more advanced analytics to predict, prescribe, and decide. https://youtu.be/2128C4p8wVM
  3. Analytic Enterprises Learn, Adapt and Improve
    The third critical success factor for becoming an analytic enterprise is recognizing that applying analytics is not a one-time exercise, and focusing on how to use analytics to learn, adapt, and continuously improve. https://youtu.be/rlnNtk9bSyc

And if you enjoy the videos, check out our white paper on building an Analytic Enterprise.

Gartner recently published a piece “Top 10 Trends in Data and Analytics, 2020” that you can currently get from our friends at ThoughtSpot (registration required). It’s an interesting report you should definitely check out.

My favorite section was the one on Decision Intelligence, within which they include the kind of digital decisioning or decision management I’ve been doing for the last couple of decades (and in which the firm I founded, Decision Management Solutions, specializes).

In this section they correctly point out that, while automating decisions is a critical component of digital decisioning, it’s not necessary to automate 100% of the decision 100% of the time. Often we find that an automated decision requires some human inputs, or that only a certain percentage of transactions can realistically be handled by a decision engine. We build decision models to understand the problem well enough to make these calls – to decide on the automation boundary – and it was great to see the shout out for decision modeling (and the Decision Model and Notation standard) in the report. The team at Gartner linked decision modeling to improved agility (faster changes), transparency and business user enablement – all key benefits we see in client after client.

Personally, I always get the biggest satisfaction from seeing how digital decisioning enables continuous improvement, generating the data you need to review, improve, simulate and compare decision-making approaches. As the report says, the key is to pass actionable insights directly to decision engines to act, and then enable humans to review the effectiveness of this and close the loop. Putting business owners in the driver’s seat for improving their own automated decisioning systems is a powerful tool that generates a huge ROI.

There were also some good pieces of advice on how to scale your Machine Learning (ML) and Artificial Intelligence (AI) efforts. I would add to their advice that scaling ML/AI in a fast changing world requires more than just adopting the right ML/AI techniques. It needs the active engagement of business domain knowledge through decision modeling and business rules too. No matter what you do to improve your AI/ML, there’s no substitute for combining it with in-house business knowledge. I also appreciated their comment that the approaches that got you to an AI pilot won’t get you to production – you need an approach like the one Cassie Kozyrkov discussed and I comment on in Some great advice on Machine Learning from Google (and me) or the ideas in this post on Most companies are not succeeding with advanced analytics. But you can.

Anyway, it’s well worth registering for and downloading. If you want to learn more about decision modeling, check out our great white paper on Decision Modeling with DMN and if you want an overview of our approach to machine learning, check out this paper on Enabling the Predictive Enterprise.


Last quarter Mark Breading of Strategy Meets Action wrote a couple of interesting pieces – COVID-19: A tipping point for insurance digital transformation? and Will COVID-19 Be The Tipping Point for Digital Transformation? I blogged about the general sense that digital transformation is being pushed by COVID-19 last week. But Mark makes a compelling case that this is particularly true in Insurance.

As Mark says

The work from home movement, voluntary or mandatory quarantining, retail store closures, and limits on public gatherings all serve to significantly increase our dependence on digital capabilities.

He goes on to talk specifically about insurance

Digital interaction capabilities: Self-service portals for agents and policyholders, websites that are easy to navigate and built using responsive design approaches, mobile apps for policy service and claims, and world-class call center technologies will become more critical than ever. Volumes are likely to increase as fewer face-to-face interactions occur by necessity.

We do a lot of work with insurers and they have historically been challenged when it comes to digitization:

  • Self-service portals only allow customers to do limited things – generally simple data updates or document review.
  • Agent portals are likewise often very passive, presenting data to agents but not really helping them manage their business.
  • Mobile apps focus on reporting the status of a policy service update or a claim when the customer just wants the update made or the claim paid.
  • Call centers increasingly have access to all your data but must constantly refer you to others for approvals.

The problem is that insurers have added digital channels, lightly digitized their data (scanned documents) and automated (digitized) processes without reviewing how they make decisions. They have paved the cowpath. And their use of Robotic Process Automation tools increases the odds that they will continue to do so. This is going to have to change. They are going to have to digitize decisions.

  • If policy update approvals are automated, customers can use the portal to do things not just request them.
  • If agency management decisions are automated, the agent portal can suggest how to grow the business and improve customer service, not just passively support an agent who might not know what to do next.
  • If claims handling decisions are automated, mobile apps can support submission of claims and then respond immediately with “this claim is approved and will be paid in XX amount”.
  • If underwriting decisions are automated, websites and mobile apps can issue binding quotes and kick off the onboarding process 24×7.
  • If call approvals are thought of as the decisions they are, not processes, then call center reps can immediately assist customers, not just promise to talk to their supervisor.

COVID-19 is driving digital transformation. Insurance is a decision-centric industry and only if decisions are also automated can it be transformed.

Check out other posts on this blog on COVID-19 or on our company blog. To learn more about decision modeling and how it can help deliver agility and efficiency, download our white paper Agility and Efficiency with Decision Modeling.


A survey in CIO magazine on IT leaders’ thinking in the current crisis revealed that a plurality (37%) chose digital transformation as their first priority to help the business persevere through the current disruption. Moreover, a full 61% of respondents agreed with the statement that the effects of the pandemic are actually accelerating digital transformation efforts.

We certainly see this in some of our customers. The movement of staff to working from home, the need for clients to interact remotely, restrictions on travel and meetings – all these are increasing the value of digital channels while also providing concrete motivation to get over hurdles previously seen as insurmountable.

What’s interesting, though, is the extent to which digital transformation driven by COVID-19 is decision-centric – how much it relies on digital decisions not just digital data, digital channels and digital processes.

COVID-19 means digitizing decisions about customer interactions so you can build and sustain profitable interactions with your customers. Decisions about how to personalize and target the content you display, the emails you send and the offers you make are essential. Companies are finding that they have neglected these digital decisions, investing in digital channels only to deliver cookie-cutter digital content when they could be engaging customers directly and precisely. Digital “micro” decisions need to be made for each customer, each time.

COVID-19 means that manual approvals, manual discount calculations, manual eligibility checks and manual pricing are all problematic. Many companies have found that their highly automated digital processes just route work to a person for critical decisions. With more remote workers, more workers having to flex their schedules to cope with home schooling and customers increasingly doing likewise, trying to coordinate human decision-makers for these transactional decisions is not getting it done. These decisions need to be digitized.

Finally COVID-19 means that the way people and resources are assigned needs to change. When everyone was in the office together, informal ways to assign work and manage resources worked OK. Now they’re a recipe for delay and confusion. Decisions about assignments and allocation need to move beyond first-in, first-out queues and informal group discussions to precise, data-driven and digitized decisions.

Digital transformation is happening faster thanks to COVID-19 but the real opportunity is for companies to digitize their customer and transaction decisions.

Check out other posts on this blog on COVID-19 or on our company blog. To learn more about decision modeling and how it can help deliver agility and efficiency, download our white paper Agility and Efficiency with Decision Modeling.


With everything else going on at the moment, you may be finding it difficult to think about improving your decision-making. Yet COVID-19 has added a new set of constraints on your decision-making and changed your objectives. This makes the management, and optimization, of decisions even more valuable.

A few weeks back we published a new white paper – The Customer Journey to Decision Optimization. Sponsored by FICO, this is available on their website. This paper lays out steps to adopt decision optimization, walking through how to codify your current practice, systematically improve your decision strategy and apply mathematical optimization.

We’ve published a couple of follow-up blog posts over on the company blog too.

Why not download the paper and start your own “Journey to Decision Optimization”?


We blogged previously about the need to react quickly and accurately to change and the importance of building and sustaining customer relationships remotely. One final area of focus is the need to be more flexible and dynamic in how resources are assigned and managed.

Companies are going to have more staff working remotely. Remote workers can’t share work as readily with their co-workers, or bounce assignments to others by simply standing up and waving across the room. A group of collocated workers could be assigned work pretty mindlessly and left to sort it out for themselves – simply queuing up the work for the group worked OK. This approach had problems before but now these problems become too serious to ignore. Organizations need to rethink how work is assigned and move away from first-in, first-out queues to more sophisticated allocation and assignment.

For instance, one call center we know supports remote workers and multiple locations by investing in its call routing approach. Instead of just dumping calls into a queue, the stated or predicted topic of the call as well as customer details and value are used to route it appropriately. To maximize the engagement and value of these calls, the organization also determines the best possible upsell or cross-sell to be made at the end of the call (including the possibility that not making one is the best option). Where possible, calls are routed not just to someone who can solve the problem the customer has but also to someone who’s good at selling the identified upsell. This focused assignment is highly dynamic and can be changed whenever new products or campaigns are launched or when the kinds of calls being received are impacted by outside events. Staff are given work they are good at, customers get agents who can help and who are enthusiastic about the products being up-sold, and results improve.
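
A minimal sketch of the idea (invented names and weights, not this call center’s actual algorithm): score each available agent on both the problem to be solved and the identified upsell, instead of handing the call to whoever is next in the queue.

```python
# Hypothetical value-based routing: pick the agent with the best combined
# score for solving the caller's problem and making the identified upsell.
def route_call(call, agents):
    def score(agent):
        skill = agent["skills"].get(call["topic"], 0)
        upsell = (agent["sales_skill"].get(call["upsell"], 0)
                  if call["upsell"] else 0)
        return skill * 2 + upsell  # weight solving the problem above selling

    return max(agents, key=score)

agents = [
    {"name": "Ana", "skills": {"billing": 5}, "sales_skill": {"premium_plan": 4}},
    {"name": "Ben", "skills": {"billing": 3}, "sales_skill": {"premium_plan": 1}},
]
call = {"topic": "billing", "upsell": "premium_plan"}  # upsell may be None
print(route_call(call, agents)["name"])  # Ana
```

Because the weights and skill profiles are data, not code, the routing can change as fast as products and campaigns do.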

Even when organizations have built dynamic allocation and assignment approaches, there are often overrides. When call volumes get high, when the quarter-end is looming or when marketing campaigns are in flight, all the sophistication is thrown away as the override kicks in. This is never great – overriding things like this inflicts long term damage on data-driven improvement for instance – but it becomes very high risk when circumstances may often result in new overrides. Like now, when the constant changes to COVID-19 rules and regulations are causing companies to keep overriding their systems and process to cope.

The new environment is going to require much more managed overrides and more rapid change. One of our clients, for instance, has a routing algorithm that considers a wide range of factors before deciding how to handle a particular product return. Various situations exist that “override” the default algorithm but these are built into it. Business users can simply identify that a particular circumstance has occurred and the built-in changes to the algorithm are triggered. The business gets the appropriate response, users still rely on the system in exactly the same way and all the usual data gets collected. The “override” gets 100% compliance from users because it’s built in, and the cost and impacts of the override are transparent because the system still runs and still creates the data needed for analysis.
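
A sketch of what a managed override can look like (a hypothetical example, not this client’s system): business users flip a named circumstance on or off, the behavior change is already built into the algorithm, and every transaction is still decided and logged by the system.

```python
# Hypothetical managed override: the override is part of the algorithm, so
# the system keeps running and keeps producing the data needed for analysis.
ACTIVE_CIRCUMSTANCES = set()  # e.g. toggled from a business admin screen

def returns_route(item):
    if "carrier_disruption" in ACTIVE_CIRCUMSTANCES:
        route = "hold_at_regional_depot"  # built-in override behavior
    elif item["value"] > 200:
        route = "inspect_then_restock"
    else:
        route = "restock"
    log = {"item": item["id"], "route": route,
           "overrides": sorted(ACTIVE_CIRCUMSTANCES)}  # data still captured
    print(log)
    return route

returns_route({"id": 1, "value": 350})
ACTIVE_CIRCUMSTANCES.add("carrier_disruption")  # business user flips the flag
returns_route({"id": 2, "value": 350})
```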

All of this brings up one last point. Precise resource assignment increasingly involves algorithms – machine learning and predictive analytics. Which customers are churn risks, which products will be appealing, who’s likely to be able to handle this problem – all of these are candidates for advanced analytic models and machine learning algorithms. Focusing on the business problem – how to do the assignment or allocation – frames your need for these analytics so you can not just build the algorithm, you can also get it across the “last mile” and into production.

These systems are going to require new decisions to be made – dynamic, managed, automated assignments and allocations will replace first in, first out queues and simple distribution. Processes will change to add these decisions. Analytic insight will matter in these decisions, as will expertise and experience. Batch analysis will be replaced with real-time scoring and simple logic with managed business rules. Decision models will help you integrate these elements to deliver the real-time decisions you need.

Dynamic, managed assignment is something you can build into your systems now. We’ll be posting regularly on ideas and approaches and producing some great content. To stay up on it, why not sign up for our newsletter.


We blogged about the need to react quickly and accurately to change already, as this is one of the key ways you can adapt your operations. A second is to ensure that you can build and sustain your customer relationships even as your customers become more remote and have fewer in-person interactions with you.

The new era is going to require much more automation to support remote interactions. Customers are going to be more remote. They’re going to be under more stress too, so quicker responses will be more valued than ever. And the staff you would need to support a manual decision may be remote themselves, making manual responses problematic. Automated customer interactions can’t be cookie-cutter, though. Customers still want to feel that you know them and value them. You can’t pick between automated and personalized when your customers want both. And you still need the flexibility we talked about earlier, because all of this is a moving target.

An organization we work with recently began investing in building and sustaining customer relationships online, having for years relied on in-person interactions. When prospective customers started looking online at information about products, this organization made sure it presented the right first product. Using everything it knew about the prospect, it presented targeted (and compliant) “first best offers” to these clients. Engaged customers could buy the product online, but it was a relatively complex product to buy, so many would choose, in the end, to talk to someone. The lead, the initial offer and the partially completed application were then all routed to the right point of contact. Rapid feedback on customer behavior and flexible automation let them keep the initial conversation feeling personalized and relevant even as products, marketing campaigns and the day-to-day business context changed. The automation meant that customers could avoid dealing with a person when automation was what they wanted, and could be routed to the right person when it wasn’t.

This organization applied this same mindset to helping its agents in this new, online engagement approach. Agents used to in-person meetings are threatened by digital channels, and by fully online direct-to-consumer competitors. Helping agents manage their customer portfolio, identifying opportunities to engage with existing customers around special events or campaigns, and helping the agents target customers and prospects precisely all help agents add value to the customer relationship, even as it becomes an increasingly electronic one.

To deliver systems that engage remotely with customers in this way requires automation of the underlying decision and the targeted application of machine learning. You can’t just look at the data though, you need to combine what the data tells you with what your expertise tells you and filter all this through the current (changing) state. The analytic insight you develop must be in service to the customer treatment decisions you are trying to make. Decision modeling will frame up the analytics you need and, integrated with business rules, deliver targeted customer treatment across all your channels.

We have a proven, established way to build systems that will engage customers remotely and we’ll be posting regularly on ideas and approaches and producing some great content. To stay up on it, sign up for our newsletter.


One of the key ways you can adapt your operations is to develop systems and processes that can react quickly and accurately to changes as they happen. The response of governments at the city, state and federal level is going to evolve as the situation changes. The behavior of customers, partners, suppliers, agents and distributors is going to change too. You need to keep operating through all this change, without creating unnecessary manual clean up, without “hacking” temporary fixes every few weeks and without putting yourself at risk for legal or compliance (or publicity) problems.

One of the persistent myths in systems development is that systems and processes can’t be both flexible and compliant. COVID-19 puts this old saying under real pressure. If you work in a regulated industry (insurance, financial services) or deal with regulated consumer data, your regulators are not suddenly going to give you a pass because there’s a pandemic on. They might be a little more flexible, give you a little more time to report things or adopt new policies, but they’re going to expect you to remain compliant – even as what it means to BE compliant changes all the time. Now you really need to be flexible AND compliant.

One marketing department we worked with has the kind of flexibility you need. They can change the marketing offers they make to new customers whenever they need to. They can try multiple approaches to picking an offer to see which works best in rapidly changing circumstances. They have the transparency and auditability they need to show regulators that they never tried to sell products to people not eligible for them. This kind of system allows you to turn on a dime as states relax and tighten restrictions, guidance changes and new polices and regulations come out.

The key to compliant flexibility lies in business enablement. Like the marketing department in this example, you need business owners to understand and be engaged in managing the system, to be empowered to make the changes they need. They are the ones closest to the regulations, closest to the customer, setting the policies. When they can see how the system is behaving, change the way the process works, and drive the outcomes they need then you’ll get the quick, accurate, compliant response to change that the new era is going to require.

Building these kinds of flexible yet compliant systems requires a focus on the decision-making embedded in them and on exposing that decision-making so business owners can change it themselves. The decision-making is what changes the most often, and the business owners are the ones who see those changes first and who understand them best. Decision modeling, business rules management and a focus on continuous improvement all contribute to developing these systems that can react quickly and accurately.

The good news is that there are proven, established ways to build these systems and we’ll be posting regularly on ideas and approaches and producing some great content. To stay up on these ideas, why not sign up for our newsletter.


The COVID-19 Coronavirus has upended the world economy – and your business. While the impacts to date have been dramatic, we have to face the fact that this is the beginning of a new normal – a world in which this virus circulates. This will have short- and medium-term consequences for your business regardless of the industry you operate in.

You’ve probably spent the last few weeks reconfiguring everything to function effectively as a remote enterprise. You’ve dealt with the immediate impacts like surges in VPN access, new laptop requirements, and making processes work when people are working from home. Plus, you’ve survived the immediate economic hit. In the coming weeks and months, organizations like yours will begin to focus on cost optimization and on seeking opportunities in the new environment. Some companies will succeed at this – they will adapt, survive and thrive in this new environment. Others will fail.

Now is the right time to be thinking about how you should adapt operations to cope with the new normal. It’s time to plan because this is not going away. The problems you are facing will change and evolve but there’s a new normal coming and you’ll want to have new systems and processes to cope.

  • How will you create targeted, engaging and profitable interactions with customers you will never meet? And sustain relationships you started offline now they must move online?
  • How will you create resilience in your supply chain and other processes, without tying up more capital? How do you ensure you can adapt rapidly to shifting targets, regulations and opportunities?
  • How will you adapt your operations to handle an increasingly volatile and dynamic world? Can you assign and allocate resources based on what’s happening now, or soon, not on how things used to work?

Our focus here at Decision Management Solutions is on digital decisioning – using our DecisionsFirst™ approach to apply technology and deliver automated solutions to decision-making problems. The systems we help our customers build address exactly these issues: they ensure you can personalize and target the digital interactions you have with customers; they deliver resilience, transparency and agility; and they support dynamic assignment and allocation.

We’ve learned a lot about building these kinds of systems over the decade or more we’ve been doing this and we’d like to share some tips and help you see how you can do it too. We’ll be posting regularly on ideas and approaches and producing some great content. To stay up on it, sign up for our newsletter.

Some of our old friends at Gartner have just published some great research on Decision Management. Specifically they have extended their work on Decision Management Suites (blogged about here) and focused on How to Choose Your Best-Fit Decision Management Suite Vendor [Gartner subscription or modest fee]. As they say in the intro:

Decision management suites go beyond business rule management systems by providing more features for designing, deploying, maintaining and auditing decision-making software. This report describes the steps data and analytics leaders should take to identify the best vendor for their business needs.


Selecting a decision management vendor or business rules management system is something we here at Decision Management Solutions do a lot so I was excited to see what the Gartner team had come up with.

They suggest that you identify potential providers by considering their support for both business rules and analytics. As I have described in my books (most recently Digital Decisioning: Using Decision Management to Deliver Business Impact from AI ), the decision services you build are going to need business rules, are likely to need machine learning or predictive analytics and may need optimization (though much less often). Streaming analytics and event processing are included in this list by the Gartner team but I see this more as a niche market – generally not relevant but occasionally critical.

Building decision services with a mix of these elements requires decision modeling – specifically decision requirements modeling. You can build a decision requirements model that pulls together business rules, machine learning and optimization to give you an effective, graphical blueprint. Don’t leave home without one. We build a lot of decision services and we would never do so without a decision model. You shouldn’t either.

Below is the list of topic areas they considered and they have some great content in the paper. Based on our experience developing decision services and modeling more than 4,000 decisions, I have a few additional comments:

  1. Ease of Authoring
    Our experience is that this is all about decision requirements models and associated decision tables. People often evaluate other authoring elements but they don’t really matter – business users like decision tables, and decision requirements models are essential for getting decision tables right.
  2. Application Solutions and Templates
  3. Operating Environment
  4. Build, Version and Deploy
    Always consider this as two separate threads – one focused on how non-technical users do versioning and deployment, one on integrating with IT’s processes like CI/CD. Don’t mix them and don’t assume that being good at one makes a platform good at the other. Some platforms are very programmer-friendly but baffle business users.
  5. Scalability and Latency
  6. Logging, Monitoring and Evaluating
    Remember that the data you capture to track how you made decisions and how that worked out for you will drive continuous improvement. This is much more important than the logging or monitoring of rule execution which is interesting only sometimes.
  7. Process Orchestration/Workflow
    Meh. Build stateless, side-effect free decision services and leave the workflow somewhere else.
  8. Simulation
    This is REALLY important. Don’t miss this. And don’t confuse it with testing, which it is not. Testing is to see if something is broken. Simulation shows the impact of a change. Business users make lots of changes that should not result in test failures. Make sure they can simulate the impact of a change before they commit it (see the sketch after this list).
  9. Rule Validation
  10. Microsoft Excel Support
    Not a fan. Excel is super flexible but that’s not necessarily a good thing. Either commit to a product that does everything inside Excel or one that provides decent editors. Don’t get stuck in the middle.
  11. Rule Harvesting
    Never harvest rules until you have a decision model in place. Ever. Really, just don’t. Please.
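
To make the simulation-versus-testing distinction from item 8 concrete, here is a minimal sketch (invented rules and data, not any particular product’s simulation feature): simulation runs the current and proposed logic over the same historical transactions and reports what would change, rather than checking whether anything fails.

```python
# Hypothetical simulation: nothing "fails" here - the question is what would
# change if we relaxed a cutoff, and by how much.
from collections import Counter

def current_rules(tx):
    return "approve" if tx["score"] >= 650 else "decline"

def proposed_rules(tx):
    return "approve" if tx["score"] >= 620 else "decline"

history = [{"score": s} for s in (580, 610, 630, 660, 700, 640)]

before = Counter(current_rules(t) for t in history)
after = Counter(proposed_rules(t) for t in history)
print("current: ", dict(before))   # {'decline': 4, 'approve': 2}
print("proposed:", dict(after))    # {'approve': 4, 'decline': 2}
```

A test suite would pass both rule sets; only a simulation tells the business user that the change doubles the approval rate.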

Supporting this paper is a Toolkit: Decision Management Suite Vendor Profiles [Gartner subscription required]. While I don’t think some of these are really Decision Management Suites – a couple are really workflow or streaming engines with some rules support thrown in – it’s mostly a good list and a reasonable framework to use. Just remember, as the authors point out, to weight the factors that matter to your project.

If you’d like our help selecting a vendor or platform, contact us and we’d be happy to talk you through what we do. If you want more background on decision requirements modeling, check out this paper.