
Paul Harmon over on BPTrends interviewed Jan Purchase and me about our new book, Real-World Decision Modeling with DMN. The interview covers a pretty wide range of topics – a definition of decision modeling, the bottom-line value of modeling a decision, the difference between decisions and rules, and how decision modeling helps projects. We talk about some of our memorable experiences of applying decision modeling, our favorite best practices, and the value of applying decision modeling and the Decision Model and Notation (DMN) standard alongside BPMN. We outline our method for applying decision modeling and DMN, and talk a little about the book – who it is aimed at and what it contains (over 220 practical illustrations, 47 best practices, 13 common misconceptions to avoid, 12 patterns and approaches, and 3 worked examples, among other things). We end with some thoughts on how to get started.

Anyway, I hope you enjoy the interview over on the BPTrends site and, if you do, why not buy the book? Real-World Decision Modeling with DMN is available on Amazon and more details are available on the Meghan-Kiffer site.

I am delighted to announce that Jan Purchase, founder of Lux Magi, and I have finished what we believe is the definitive guide to decision modeling with the Object Management Group’s Decision Model and Notation (DMN) standard, Real-World Decision Modeling with DMN. The book is now on general release and available for purchase from Amazon and Barnes and Noble (full details on the Meghan-Kiffer site).

If you are a regular follower of this blog you will know that decision modeling is an important technique for improving the effectiveness, consistency and agility of your organization’s operational decisions and a vital adjunct to the continuous improvement of your business processes.

We have tried hard to develop a truly comprehensive book that provides a complete explanation of decision modeling and how it aligns with Decision Management and more broadly with Digital Transformation. The book describes the DMN standard, focusing on the business benefits of using it. Full of examples and best practices developed on real projects, it will help new decision modelers to quickly get up to speed while also providing crucial patterns and advice for those with more experience. The book includes a detailed method for applying decision modeling in your organization and advice on how to select tools and start a pilot project. It contains:

  • Over 220 practical illustrations
  • 47 best practices
  • 13 common misconceptions to avoid
  • 12 patterns and approaches
  • 3 worked examples

Here are some of the great quotes from our initial reviewers:

“This comprehensive and incredibly useful book offers a wealth of practical advice to anyone interested in decision management and its potential to improve enterprise applications. Using a blend of user case studies, patterns, best practices and pragmatic techniques, “Real-World Decision Modeling with DMN” shows you how to discover, model, analyze and design the critical decisions that drive your business!”
—David Herring, Manager of BPM & ODM Delivery at Kaiser Permanente.

“Well written and very impressive in its scope.”
—Alan Fish, Principal Consultant, Decision Solutions, FICO and author of “Knowledge Automation: How to Implement Decision Management in Business Processes”.

“If you are looking for a complete treatise on decisions, look no further. Even though you end up in decision modeling and Decision Modeling Notation (DMN), you are treated to all the aspects of decisions explained with great care and clarity. This is a greatly needed book that will be studied by decision managers and practitioners for the next decade or more. It will end up on my physical and logical bookshelves.”
—Jim Sinur, Vice President and Research Fellow, Aragon Research.

“Written by two of the foremost experts in decision management, this book provides an extensive exploration of business decisions and how they are used in modern digital organizations. Taylor and Purchase distill for us the why, how, when and where to apply decision modeling in order to specify business decisions that are effective. Going beyond just an introduction to the Decision Model and Notation (DMN), the authors position this standard with respect to other well established standard notations such as BPMN. This is truly a first comprehensive handbook of decision management.”
—Denis Gagne, Chair of BPMN MIWG, CEO & CTO at Trisotech.

I very much hope you will buy and enjoy Real-World Decision Modeling with DMN. I’d love to hear what you think of it and stories about how it helped you on projects so please drop me a line.

I’ll leave the last word to  Richard Soley, Chairman and CEO of the Object Management Group who wrote the foreword for us:

“A well-defined, well-structured approach to Decision Modeling (using the OMG international DMN standard) gives a repeatable, consistent approach to decision-making and also allows the crucial ‘why?’ question to be answered — how did we come to this point and what do we do next? The key to accountability, repeatability, consistency and even agility is a well-defined approach to business decisions, and the standard and this book gets you there.”

What are you waiting for? Go buy it – Meghan Kiffer, Amazon, Barnes and Noble.

JMP 13 and JMP Pro 13 launched in September 2016. JMP is a business unit of SAS with more than $60M in revenue, growing, and with a focus on manufacturing and life sciences. Its users are largely engineers, analysts, researchers and scientists. JMP has specialty products in Genomics and Life Sciences but this release is about the core products. The tool remains focused on design of experiments, predictive modeling, quality and reliability, and consumer research, though dashboards and data wrangling are being added too.

JMP 13 remains a fat-client tool and this release has a few key themes:

  • Increased ease and efficiency of preparing data
  • Enriching the kinds of data handled – this release adds free text
  • Handling very wide data, especially as more memory is added to the desktop
  • Finally, reporting, dashboards and the existing priority focus areas remain important

New capabilities include:

  • Lots of focus on reducing the number of clicks for actions – for instance, creating a script to recreate an analysis on new or refreshed data is now one click, not two.
  • JMP tables created as the software is used can now be joined using standard query builder capabilities. This essentially creates a new JMP table that can then be analyzed. This includes filters and where clauses for instance. Building the new table also prototypes the SQL that would be required on the production systems.
  • A virtual join has also been added to support situations involving very tall datasets. For instance, sensor data might have thousands of entries per sensor. Joining this to a table about the locations of sensors, say, might blow memory limits. The virtual join allows a join to be defined and used without ever filling memory with joined data (the sketch after this list illustrates the idea).
  • New dashboard building tools are included to make it easier to arrange several graphs onto a dashboard without having to go through the full application development tools. Multiple templates are provided for a simple drag and drop construction. Full interactive views (with all the support for JMP changes at runtime) or summary views are available for dashboards. Lists can be added as filters and one graph can be used to filter another – so as a user clicks on one graph, the data in the other is filtered to only those data selected in the first one.
  • Sharing reports and dashboards has been an increasing focus in recent versions. Stand-alone interactive HTML reports and dashboards have been improved – for instance, graph builder capabilities have been added, making HTML graphs more interactive. Data for these files is embedded in the files – the data is a snapshot, not live – but only the data that is needed. Multiple reports can be packaged up and an index page is generated to front-end a set of independent reports.
  • JMP Pro has improved the ability to quickly organize and compare multiple models. Scripts can be published to a Formula Depot for multiple models and then used in a comparison without adding the corresponding columns to the data set. From here code can be generated for the models too – adding C, Python and JavaScript to SAS and SQL.
  • Text handling has been added too. JMP allows more analysis of text using the Text Explorer which handles various languages and identifies top phrases, word clouds etc. As usual in JMP, these are interactive and can be cross-tabbed and integrated with graphs and other presentations. JMP Pro allows this to be integrated with structured data to build a model.
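To make the virtual join idea concrete, here is a minimal Python sketch of the concept only – it is not JMP’s implementation or its JSL scripting language, and the sensor and location tables are invented for illustration. The point is that the small lookup table is consulted on demand instead of materializing a wide joined copy of the tall table in memory.

```python
# Illustrative sketch only - not JMP's implementation or its JSL scripting API.
# A "virtual join" keeps the tall fact table and the small lookup table
# separate and resolves lookup columns lazily, so the joined result is never
# held in memory all at once.

readings = [                       # very tall table: one row per sensor reading
    {"sensor_id": "S1", "value": 21.4},
    {"sensor_id": "S2", "value": 19.8},
    # ... millions more rows in practice
]

sensor_locations = {               # small table keyed by sensor_id
    "S1": "Boiler room",
    "S2": "Loading dock",
}

def rows_with_location(rows, locations):
    """Yield each reading with its location looked up on demand."""
    for row in rows:
        yield {**row, "location": locations.get(row["sensor_id"])}

for enriched in rows_with_location(readings, sensor_locations):
    print(enriched)
```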

For more information on JMP 13 click here.

Final session for me at Building Business Capability is Denis Gagne of Trisotech talking about the intersection of process, case and decision modeling – BPMN, CMMN and DMN – and the importance of these standards to business analysts.

Denis reminds us that all organizations are facing a new digital future with rapid technology change, changing customer behavior and increasing competition. Music, media, transportation and more have been disrupted and everyone is at risk of disruption. In response companies cannot just focus on operational improvement (good for the company, not for customers) or just on customer experience (expensive for the company) – we must focus on both – on transformation, on outcomes and on new business models. We must do new things that make our old things obsolete. And this requires LOB managers and executives, enterprise and business architects, business and process analysts to bring different perspectives to bear effectively.

Gartner has a great model of work efforts and all of these must be improved, requiring a focus on structured and unstructured processes as well as on decisions. And these improvement efforts must balance cost against quality, and time against value. Different approaches balance these things differently, leading to different kinds of improvement. Business analysts in this world, he argues, therefore need three practices in their toolbox:

  • Process Management – with BPMN, Business Process Model and Notation
  • Case Management – with CMMN, Case Management Model and Notation
  • Decision Management – with DMN, Decision Model and Notation

These notations provide unambiguous definitions for processes, cases and decisions. The models can be interchanged and, even more importantly, the skills in developing and reading them are transferable. This means that despite turnover, multiple parties and external resources, we still know what the model means.

The most common question he gets is when to use which. BPMN, he says, is about processing, CMMN is about managing context and DMN is about deciding. It’s not too hard… And he had a nice chart showing the difference between BPMN and CMMN that also showed the broad applicability of DMN.

  • BPMN allows you to define a process – a triggering event followed by a sequence of activities that lead to some business outcomes – using standard shapes to represent the elements.
  • CMMN allows you to define a case – in a context there is an event followed by a condition and an action – again using standard shapes.
  • DMN allows you to define decision-making – decisions, decision requirements and decision logic – also using standard shapes

He wrapped up with some great advice:

  • In BPMN and using too many gateways – you need DMN to model that decision-making (see the sketch after this list)
  • In BPMN and using too many intermediary or boundary events – you need to use CMMN to model the case work you have identified
  • In BPMN and using ad-hoc processes – replace them with CMMN
  • In CMMN and using lots of sequencing – use BPMN
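To illustrate that point about gateways, here is a rough Python sketch with an invented discount example (none of this comes from Denis’s talk): the same decision logic first buried in nested branching – the “too many gateways” smell in a process – and then pulled out into a single decision-table-style structure of the kind DMN is designed to manage.

```python
# Hypothetical discount example, used only to illustrate the advice above.
# Decision logic buried in process branching (the "too many gateways" smell):

def discount_via_gateways(customer_type, order_size):
    if customer_type == "GOLD":
        if order_size == "LARGE":
            return 0.15
        return 0.10
    if customer_type == "SILVER":
        if order_size == "LARGE":
            return 0.07
        return 0.05
    return 0.0

# The same logic expressed as a decision table (DMN-style), kept outside the
# process so it can be read, reviewed and changed as a single decision:

DISCOUNT_RULES = [
    # (customer_type, order_size, discount)
    ("GOLD",   "LARGE", 0.15),
    ("GOLD",   "SMALL", 0.10),
    ("SILVER", "LARGE", 0.07),
    ("SILVER", "SMALL", 0.05),
]

def discount_via_table(customer_type, order_size, default=0.0):
    for ct, size, discount in DISCOUNT_RULES:
        if ct == customer_type and size == order_size:
            return discount
    return default

assert discount_via_gateways("GOLD", "LARGE") == discount_via_table("GOLD", "LARGE")
```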

The standards as a set he says allow you to build a more agile, intelligent and contextual business.

By the way, my new book on DMN – Real-World Decision Modeling with DMN – is available

 

Continuing blogging from Building Business Capability and it’s time for Jan Vanthienen to talk decision modeling for the business analyst. Jan is a leading researcher into decision modeling and decision tables at the University of Leuven.

Observation 1: Decisions are important for business, not only processes.

Too often, he says, the process is modeled with no real thought being given to the decision-making required – a claims process that does not really describe which claims will be paid, only how to pay or reject them. To describe them, companies must model them and the DMN (Decision Model and Notation) standard is ideal. DMN defines two layers:

  • The requirements for a decision in terms of other decisions, data and sources of knowledge.
  • The logic of the decisions, derived from these knowledge sources and applied to the data to actually make the decisions in the requirements.

Once you model the decision separately, much of the decision-making complexity that overloads a process model can be removed to simplify and clarify the process.
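As a purely illustrative sketch of that requirements layer – not the DMN interchange format, and using an invented claims example in the spirit of the one above – the structure is essentially a graph in which a decision depends on sub-decisions, input data and knowledge sources:

```python
# Minimal sketch only - not the DMN interchange format. A decision requirements
# layer is a graph: each decision lists the sub-decisions, input data and
# knowledge sources it requires. Names below are invented for illustration.
decision_requirements = {
    "Decide claim payment": {
        "requires_decisions": ["Assess claim validity", "Calculate payout amount"],
        "requires_data": ["Claim", "Policy"],
        "knowledge_sources": ["Claims handling policy"],
    },
    "Assess claim validity": {
        "requires_decisions": [],
        "requires_data": ["Claim"],
        "knowledge_sources": ["Fraud guidelines"],
    },
    "Calculate payout amount": {
        "requires_decisions": [],
        "requires_data": ["Policy"],
        "knowledge_sources": ["Payout schedule"],
    },
}

def sub_decisions(decision):
    """Walk the graph to list every decision a given decision depends on."""
    node = decision_requirements.get(decision, {})
    deps = list(node.get("requires_decisions", []))
    for d in node.get("requires_decisions", []):
        deps.extend(sub_decisions(d))
    return deps

print(sub_decisions("Decide claim payment"))
```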

Observation 2: Decision logic should not be shown as process paths. 

Decisions should be modeled as peers of the processes that need them so they can change independently, be managed by different people etc. This reduces the complexity of processes by removing decision logic and decisions from the process model and instead linking the decision to a specific process task.

Observation 3: Multiple decisions in a process might be in a single decision model

Observation 4: Proven techniques exist to represent decision tables and decision models

From a methodology perspective we need to do three things

  1. Model the decision structure
    • Build a decision requirements graph – a decision requirements diagram – to show how the decisions, data and knowledge sources form a network describing the decision making and its structure.
    • Generally this is best done top down and descriptions of questions and answers in requirements or policies often lead very naturally to sub-decisions.
    • Descriptions of data elements are similarly clear. But if you just try to write a decision table against these data elements you will end up with an unwieldy model. Build a model first.
    • Decision requirement models are built using expertise, by seeing which conditions depend on the results of other decisions, and by normalizing the model to eliminate redundancy.
  2. Model a single decision table
    • Decision tables are not the only way to manage the business rules for a decision in the model but they are a very effective one and one that business users find very transparent.
    • Representing a description of logic as a decision table clarifies the logic and makes it consistent. And the logic can be read in several ways without ambiguity – what will I do if? OR when will I do x?
    • Decision tables can show rules as rows or as columns, with each column/row being based on a fact. This creates a repeated structure for each rule, and each rule contains only ANDed conditions (ORs lead to new rules, as they should).
    • The standard also defines a “hit indicator” to define how to handle situations where multiple rules in a decision table are true. But stick to Unique and perhaps Any because the others get you into trouble… especially First, which is just a way to code ELSEs! (See the sketch after this list.)
    • Decision tables can be correct, compact and consistent but generally you can only get 2 of the 3. Start by focusing on correctness, then work to ensure consistency, and only take compactness as a bonus.
    • Jan likes to use rules as rows or rules as columns depending on the shape of the table. I prefer to stick to rules as rows unless it’s really important…
    • You can build decision models and tables interactively with people, from text of policies or regulations, or by mining data/rules/code. You can also find all the possible condition values and outcomes and build an empty table. As you fill in the logic you can clarify and normalize.
    • DMN allows the modeling of decisions alongside processes, simplifying processes and creating a more declarative and understandable model of decision-making. It also separates the structure from the logic while allowing them to be linked and reused. DMN standardizes and extends decision modeling and decision table approaches but builds on a long history of techniques and approaches that work.
  3. Model the process connection
    • Sometimes the entire process is about a decision. Model the decision first and then think about how to execute it, how to build a process around it. A process might do all the sub-decisions in parallel to minimize the time taken OR specific decisions might be made first because they eliminate the need to do other parts of the process for efficiency OR some decisions might be made about several transactions at the same time in a group OR…
    • The decision model describes how to make a decision so that you can not only build a decision-making system but also answer questions about how the decision works, how it is made.
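To make the decision table and hit policy points concrete, here is a minimal Python sketch – not DMN’s FEEL expression language or any vendor engine, and the rules are invented – showing rules as rows with only ANDed conditions, evaluated under a Unique hit policy where overlapping rules are an error rather than being resolved by rule order (which is what First amounts to):

```python
# Minimal sketch, not DMN FEEL or any vendor engine: a decision table with
# rules as rows (ANDed conditions only) evaluated under a Unique hit policy.

RULES = [
    # Each rule: conditions dict -> outcome. The example content is hypothetical.
    ({"age_band": "senior", "claim": "high"}, "refer"),
    ({"age_band": "senior", "claim": "low"},  "approve"),
    ({"age_band": "adult",  "claim": "high"}, "review"),
    ({"age_band": "adult",  "claim": "low"},  "approve"),
]

def decide_unique(case):
    """Return the single matching outcome; Unique means overlapping rules are
    an error rather than being resolved by rule order (as First would do)."""
    hits = [outcome for conditions, outcome in RULES
            if all(case.get(k) == v for k, v in conditions.items())]
    if len(hits) > 1:
        raise ValueError(f"Table is not Unique: {len(hits)} rules match {case}")
    return hits[0] if hits else None   # None flags a gap in the table to review

print(decide_unique({"age_band": "senior", "claim": "high"}))  # refer
```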

By the way, my new book on DMN – Real-World Decision Modeling with DMN – is available

Last day of blogging from Building Business Capability and the first topic is modern business architecture with Gagan Saxena, VP of consulting at Decision Management Solutions, and Andrew Ray of Goldman Sachs. The presentation is focused on a new approach to business architecture to address problems in the legacy approaches – one that connects business at rest to business in motion in a model-driven way.

A perfect architecture would provide an organizing framework that synchronizes

  • Business Strategy
  • Business Operations
  • Business Organization
  • Business Systems
  • Business Performance

At rest (design) and in motion (operations). And it’s essential that over-complex technical solutions are not developed for problems that could perhaps have been avoided.

Yet business architecture has added more and more techniques and areas of focus, making it more complex, and yet failing to address new areas. They call out two particular areas – knowledge management and performance management – that are both long standing and yet still not well addressed. Nothing is ever removed from the list, overwhelming anyone trying to stay in touch.

Business is changing. In particular, margins and time to respond have shrunk and put real pressure on point to point, static solutions. Each regulation has an impact too, often on many systems. But how to demonstrate compliance?

Instead of coding rules into each system, compliant decision-making can be defined in DMN (Decision Model and Notation) and shared between the systems. But there are many such regulations that overlap so a decision architecture had to be developed to manage this. Yet most business architecture approaches don’t include a decision architecture.

Historically, business architecture was focused on “business at rest”. Documents are developed and published but they are not model-based, making it hard to keep a business architecture “alive” as systems are developed and evolved. Performance monitoring and learning is often dumped as deadlines loom and what is learned is often not fed back well enough.

Instead you want a model-based approach to support business in motion. These models tie directly to implementation, allowing the data captured in motion to be linked to the models and used to improve them. Decision models (DMN), process models (BPMN), data models and models of metrics or performance as a minimum.

To develop this set of models the team used

  • Business Model Canvas to  define stakeholders and value chains
  • Capability map
  • Strategy Map
  • These led to a Balanced Scorecard to track progress
  • And a set of strategic initiatives and projects
  • Capabilities were defined and executed to drive
  • Business service performance which is fed back into the scorecard


The capabilities defined – around process, data, decisions or events – can be composed into higher level capabilities or patterns. Performance management is built in and the knowledge embodied in these capabilities can trace back to the real-world source. The embodied models can be automated and optimized while supporting agility and rapid, safe change.

The example they used is client onboarding. This SOUNDS simple but it’s really not, thanks to regulations (45 KYC regulations for instance), a network of legacy systems, many manual processes and a continual effort to reduce costs. “A house of cards” resulting in excessive time and cost to onboard a customer (thanks in large part to legal negotiation).

Normally the focus would have been on one specific item but the team instead focused on a more holistic approach designed to address the whole front-to-back process. Yet an agile approach was essential to show progress while a coherent overall architecture had to be the result. The approach involved:

  • Business Model Canvas gave a big picture context of key elements and relationships without a lot of documentation.
  • Capability Map identified the capabilities needed across perspectives of strategic monitoring, customer management, process management and infrastructure management.
  • Strategy Map showed the dependencies between strategy themes and objectives.
  • Balanced Scorecard showed the objectives and metrics for each perspective with targets and deadlines.
  • Strategic Initiatives were defined to hit the targets defined for each of these objectives – many initiatives supported several objectives.
  • Capabilities are defined that are reused across processes. These are defined using one or more of
    • User Stories
    • Process Models
    • Data Models
    • Event Models
    • Decision Models

Results:

  • Showing progress against an agreed strategy has kept leaders engaged and sustained funding. A complete front to back design exists even though it is being delivered incrementally with agile methods.
  • The Business Capability model has helped ensure reusable assets are created and has helped focus a large agile program.
  • Model driven development has put business people in the driving seat, increasing agility and transparency while reducing time to value.
  • BPMN and DMN have been key both in engaging the client/business and in freeing up technology to focus on engineering problems.
  • DMN’s ability to trace from executable logic to a logical design to the original regulation allowed for compliance and allowed regulatory/policy knowledge to be automated in decisions.

By the way, my new book on DMN – Real-World Decision Modeling with DMN – is available

Continuing with blogs from Building Business Capability I am self-blogging the session I co-presented with David Herring, who leads the Process Transformation and Decision Management Program at a leading Northern California Healthcare organization, on “Pioneering Decision Services with Decision Modeling”.

David works at a large not-for-profit health plan that does everything from inpatient, to home health, hospitals, hospice, pharmacy, and insurance. 10M+ members, 17,000 doctors and nearly 200,000 total employees. They have been on a decision management journey. Beginning with SOA infrastructure they first added Business Process Management before adding in-context Decision Management to automate decisions. More recently they have begun to focus on predictive analytics, on cognitive services and on event-awareness and event processing. Their messaging bus now handles 2B messages a month and 500 web services are connected, supporting BPM, Decision Management, advanced analytics and performance management.

Decision Management Solutions has been working with them on their adoption of decision management and, in particular, their adoption of decision modeling. Decision Management relies on a clear understanding of the decisions involved and increasingly therefore on decision modeling. Decision modeling supports the whole lifecycle – giving an unambiguous definition at the beginning, scoping and specifying the interfaces of decision services and providing a framework for monitoring and improvement. Decision models with the Decision Model and Notation standard (DMN) clearly express requirements by documenting decisions and sub-decisions, showing a precise structure for the decision-making, showing what information is consumed by the decision (and precisely where it is consumed) and identifying all the sources of knowledge – policies, regulations or expertise – that define the decision-making approach. Decisions in a decision model can also be connected to business processes, to goals and performance metrics, to organizational structures and to enterprise data.

Decision models in DMN have two layers – a decision requirements layer showing the structure and a decision logic layer expressing how each piece of decision-making executes. DMN decision requirements models can also be integrated with the business rules that implement them in a BRMS. The DMN standard is increasingly widely used, has broad industry support and is managed by the Object Management Group, a well established standards body. The standard is intended to “… provide a common notation that is readily understandable by all business users… a standardized bridge for the gap between the business decision design and decision implementation.”

DMN has many use cases

  • Human Decision-making such as documenting human decision-making, improving human decision-making with analytics or training human decision-makers
  • Requirements for automated Decision-making including business rules discovery and analysis, framing predictive analytics or even dashboard design
  • Implementing automated Decision-making by completely specifying business rules, acting as a BRMS front-end or orchestrating complex decisioning technology

One of the projects that applied the approach is the Heart Failure Project. This arose because cardiologists need a system to evaluate patients, using a simple set of conditions, to determine if a patient needs to be referred to a heart failure specialist. For this project, the methodology applied was:

  • Run discovery workshops
    These involved actual business owners – heart surgeons and heart transplant specialists – as well as analysts and IT resources. Elicitation techniques were used to describe the specific decisions involved, identify the metrics/KPIs impacted by these decisions and understand where the decisions fit in the overall system and process context.
  • Model and identify suitable decisions for automation in IBM’s Operational Decision Manager (ODM) BRMS
    Decisions were modeled and refined using DMN in DecisionsFirst Modeler. The initial decisions were decomposed to show the more granular and reusable decisions within them as well as the input data and knowledge sources involved. This used a mix of material from the workshops and ongoing Q&A with the experts.
  • Transform decision models into executable decision tables
    Initial decision tables had been sketched in Excel and linked to the model. As the model stabilized these decision tables were “normalized”. These tables were then added to IBM’s ODM and complete logic was specified, taking advantage of the web-based editor in ODM. Each table was linked to a specific decision in the decision model and the two platforms are integrated to make navigation easy.
  • Deploy decision tables in an ODM Decision Service
    A few additional elements, like a Rule Flow, were added to ODM and this allowed the logic in the tables to be deployed as a service that could be called from processes or from a UI such as that used by the Doctors.

At the heart of this approach is iterative business-centric development. An initial high level model with a few decisions and “obvious” input data is developed. In each iteration a part of the model is developed into a more detailed clinical model and a set of decision tables identified for each element of this more detailed model. This allows for a highly iterative and agile approach while ensuring that each new iteration can clearly be put in context and ensuring that reuse is managed across the iterations (because decisions are reused between the diagrams through a shared underlying repository).

One of the key drivers for the project was that this one decision involved many documents. The current approach is to write a lot of clinical guideline documents. This results in information overload as well as significant regional variation. It can be hard to ensure that the authors of these documents are practicing specialists and the documents are hard to maintain and rarely updated. The decision model combines all the available guidelines and documents into a single model. Each document is identified as a knowledge source and influences some parts of the overall decision-making.

Key benefits of this approach are consistency and consumability. Today all the various guidelines and some additional toolkits are shared through a library. Different regions leverage these to create local implementations reflecting the systems and processes they use. However this is largely cut-and-paste reuse. With the new approach, all the studies and research could be combined into a single decision model. This could then be used to deploy some standard decision services, available on the enterprise infrastructure, to make decisions whenever and wherever needed. Regional implementations could still reflect local systems and processes but could reach out to a shared decision service – ensuring that the clinical guidance was consistently applied.

When working with experts it is often hard to decide what should be automated and what should be left to experts. It is easy to overfit – building a system that does too much or tries to be too precise. In this example there was a key decision that should be made by a medical professional – a four quadrant decision known as the patient’s hemodynamic status. This was required by key decisions in the model. This should not be automated or made more complex as this describes exactly what a physician does. It can’t be generated from stored data nor is there any value in a more granular scale. The decision model allows this to be identified, described and then left outside the automation – but still in the model for clarity and management.

This approach works well for managing automation boundaries. Teams can develop a decision model for the whole decision and then use it to identify the scope of automation. This may leave “master” decisions to be made by people but more often identifies “feeder” decisions that should be left as manual. In fact the model supports automated, automated but overridable and completely manual decisions equally well, allowing the model to be used BEFORE these automation decisions have been made.

Final summary and recommendations:

  • Decision Modeling Workshops
    • Engage Business Owners
    • Reveal Automation Boundaries
    • Integrate Multiple Perspectives and Documents
  • Decision Modeling
    • Supports Iterative Development
    • Focuses BRMS Development
    • Avoids Overfitting
  • Decision Services
    • Improve Processes
    • Support SOA Best Practices

You can learn more about the integration between DecisionsFirst Modeler and IBM ODM here and my new book on DMN – Real-World Decision Modeling with DMN – is available

An additional blog post here on a session at Building Business Capability that I missed – Business Analysis for Data Science teams. I know Susan Meyer who presented it and we talked several times about her presentation. It’s a really key topic so I wanted to present a summary. Here goes:

There is a lot of interest and excitement right now around data science (data mining, predictive analytics, machine learning). But this is not so much a gold rush as a real indication of change. With more and more interest in this topic and a need for companies to use data science effectively, Susan sees a key role for business analysts (as do I).

She points out, correctly, that you don’t need to be a mathematician or statistician to contribute to data science teams. A business analyst who understands the process and has some disciplined curiosity can do a lot. And the process you need to know is CRISP-DM – the Cross Industry Standard Process for Data Mining (see these blog posts on CRISP-DM). This is an ideal process for business analysts as it’s iterative, begins with business knowledge and allows non-data scientists to be part of the team. She identifies 6 reasons business analysts can help:

  1. They know their vertical and domain
  2. They are used to agile, iterative projects
  3. They understand their company’s business model
  4. They can elicit requirements through data – and build a decision model to frame the analytic requirements
  5. They can partner on the architecture (especially for deployment and data sourcing)
  6. They can build and manage the feedback loop and the metrics involved

Business analysts can dramatically reduce the time spent on data and business understanding and improve the results by anchoring the data science in the real business problem.

Very cool.

Continuing to blog at Building Business Capability 2016 with Ron Ross talking about operational excellence. [My comments in italics]

He began by talking about the new technology available and its potential while expressing worries that technologies, and technological approaches, might not really change our businesses for the better. In particular he expressed concern that “channel mania” might be dragging companies into unhelpful behavior. That said, he says, perhaps this channel mania is just an evolution of the old problem of organizational silos.

It’s clear to him that moving forward technologically will remove people from the loop when it comes to making things right for the customer. A digital future will mean there’s no one available to make things work for customers. The substitute for this is going to be injecting knowledge into these processes – using business rules (and decisions, I would add).

So Ron proposed some basic principles for Channel (or Silo) sanity:

  1. Follow the same basic rules through every channel – make decisions consistently about configuration, interactions etc
  2. Know what your rules are – how do you make these decisions?
  3. Give your rules a good life – manage them and treat them as an asset to be looked after.

It’s important not to let the technological tail wag the business dog. Make sure that technology choices are supporting business choices and not the reverse. Replacing brick and mortar stores with apps, automating up-sell and cross-sell, demonstrating compliance – all these require better management of knowledge (decision-making, business rules) once people are out of the loop.

The cheapest way to differentiate your business he argues is to decide differently – apply different business rules.  And if you don’t want to be different, use rules to make decisions consistently.

Ron took a minute to make his usual distinction about business rules:

  • They are what you need to run the business. You would need the rules even if you had no software
  • They are not data or metadata – they are rules
  • They are about business communication – not primarily an IT artifact but a business one (or at least a shared one).

Ron believes that the best approach to this is to capture these rules in a structured, natural-language-like form that is outside of any executable format. Here we disagree – I prefer a decision model supported by rules, not a rulebook of rules.

Ron then focused on 5 big challenges companies are facing when trying to become more operationally excellent.

  1. Know your customer
    This is particularly hard if you are using multiple channels. You need to manage the rules to ensure that you make consistent customer decisions across channels.
  2. Closing communication gaps
    Project teams worry about this a great deal but in fact this is a general problem across many projects. User stories and other agile approaches can leave too much communication hidden and in particular it can leave out the underlying knowledge that supports your organization. And reinventing this each project is unhelpful.
  3. Knowledge recovery, renewal and retention
    Tribal knowledge is a problem. One company found that 60% or more of the staff who have critical tribal knowledge will retire in the next 3 years. And companies end up putting those with critical knowledge into boxes because they can’t afford not to have access to that knowledge. Plus many of the rules currently embedded in systems are wrong, even when they can be extracted into something more readable than code. It’s critical for companies to manage their knowledge so it is “evergreen” and stays current and timely.
  4. Compliance
    With changing regulations and laws, with contracts and agreements, with warranties offered etc. Being able to demonstrate compliance and trace back to the original source is key. Ron thinks that a written rulebook based on RuleSpeak is a good way to link regulations to automated rules and that you should manage all your rules in this way.
    I disagree strongly with him on this. Our experience is that a decision model does a much better job of this (see this post). Writing a non executable set of rules and trying to link those rules to the regulations and to the executable rules is much more costly in our experience and a decision model works better for linkage. Don’t write two sets of rules (one executable, one not), it will be expensive.
  5. Business Agility
    If IT costs are too high then even simple course corrections or changes are not made when the business sees the need to change. Fighting fires can keep anyone from focusing on serious problems. Managing rules can reduce the cost of change and so drive more business agility.

Ron concluded with a strong statement with which I agree – no new channels will ever emerge for which you will not need to apply business knowledge. So manage it.  Though I would add that not all knowledge is business rules, an increasing amount is analytic and cognitive and a decision model works for those too.

Continuing to blog from Building Business Capability 2016, I am listening to Railinc talking about next steps after getting funding for rules and process improvement. Railinc is a SaaS company supplying software to the rail industry. Railinc is modernizing how it implements rules and processes in its applications. The program covers all 6 product lines and up to 70 applications. This program is designed to:

  • Increase agility
  • Increase product quality
  • Increase Railinc Knowledge
  • Reduce TCO

The program got approval in 2013, picked vendors and did technology PoCs in 2014 and last year focused on business methods for rules and process. A program was piloted in a single area. This year the program is enterprise-wide and is transitioning to include technology updates. Next year it is designed to wrap up with a transfer of operational ownership.

Change management, she says, is a journey and this program is mostly about change management. A plan and roadmap is good to keep the route and destination in mind but surprises, detours and side trips will happen! In this case, the plan had a clear set of target benefits but these are not a destination – the destination is the integration of these new technologies and behaviors into the company’s fabric, without needing a center of excellence, and shifting the focus to business-centric not IT-centric.

A big program, though, requires more specificity:

  • Become part of the fabric
    • New business analyst methods and tools around rules, vocabulary and stories
    • New technology like RedHat BRMS which means new (declarative) approaches
    • New governance processes and metrics, standards, audits etc
  • No need for a CoE to sustain
    • Empower business and IT leaders to drive and identify/fix behaviors that prevent this
    • Model behavior and take the hits as the program team, lead by example
    • Honest communication – reality about progress and minimize assumptions
  • Culture Shift to Business-enabled not IT-centric
    • Business ownership of externalized rules and process – very different from requirements with much more collaboration
    • Enable business operations – rule auditing, rules simulation
    • Increased collaboration

The overall approach is a classic change program:


Each year involved different work streams. Business assessments started in one product line and expanded, while business and technical POCs and development gradually added more applications to the scope. Training covered multiple business and technical roles and used learning by doing for a big part of it. The governance framework was designed early but not rolled out until later – they are piloting it now and planning a final rollout to wrap up. Operational metrics development is ongoing as this was not something the organization had.

One key lesson of this program is that delays and setbacks are integral to success – you have to embrace them and see them as opportunities for learning and growth.

It’s also important, she says, to understand “paved roads” on your journey.

  • Cultural Norms are important for instance so a change from IT driving to business ownership is a big deal and you must be clear about this.
  • Motivations have to change from schedule and budget to quality and agility
  • Habits have to change from siloed to collaborative
  • Change ownership has to go from the program team to operational teams

They have managed a lot of change: 7 compliant applications, 22 on the roadmap and 1,400 terms; they’ve changed their SDLC, developed a way to pick the technology, and trained over 75% of their analysts. In particular the analysts feel they have a voice and that the business/IT relationship has really improved.

 

Continuing with presentations at Building Business Capability, Kaiser Permanente and IBM presented on their decision management and cognitive platform for application development innovation. For those of you that don’t know, Kaiser is the Nation’s largest not-for-profit health plan. They do everything from inpatient, to home health, hospitals, hospice, pharmacy, and insurance. 10M+ members, 17,000 doctors and nearly 200,000 total employees.

IBM has been working with Kaiser for a while on their decision management journey. This began with SOA infrastructure; they first added BPM before adding in-context Decision Management to automate decisions. More recently they have begun to focus on predictive analytics, on cognitive services and on event-awareness and event processing. Most recently they have begun to focus on decision modeling as a way to frame and structure their business rules and support analytic and cognitive development. They are using IBM’s BRMS, Operational Decision Manager (ODM). They are also, it should be noted, a Decision Management Solutions customer.

Kaiser works with IBM on try-storms. These are brainstorming sessions involving Kaiser and IBM teams that result not just in ideas but in working prototypes based on technology from business rules to predictive analytics, complex events to cognitive. These sessions illustrate the value of the technology, rapidly identify scenarios and projects that could use it and help internal customers understand the potential.

Two scenarios are going to be shown – member-centric notifications for health scenarios and fighting the flu with cognitive. These scenarios use a conceptual architecture involving rules and cognitive decision-making against traditional and newer data as well as events.

The first demonstration discussed notifications. Often notifications are too generic, too late and too hard to target/change.  The new approach leverages SOA and messaging to pull the right data, Decision Management to control the formatting and content, and cognitive services to handle unusual data. For example, consider air quality – data managed by The Weather Company – and how that might affect notifications.

The demo began by showing the rules for a decision and how they can be configured by business owners. The business user can set up a simulation to see how a particular metric is met by the current rules. This allows them to baseline the rules to see how they are working. If that’s not working they can navigate to the rules for their decision – in this case as a decision table. For instance, the severity level of an air quality level that results in messages could be changed to increase or decrease the number of warnings issued. The new rules can be re-simulated to make sure the result is what was expected. This was a very rule-centric demo.

The second demo focused on detecting flu outbreaks. If this could be done, Kaiser could use rules-based decision management approaches to launch processes, alert people etc. This involves using event processing to detect and leverage structured events such as admissions or prescriptions. The event is an admission related to something epidemic (rather than scheduled, say). Then a rules-based decision checks to see if that was a flu admission. The event processing engine consumes events about flu admissions, checks to see if there has been a cluster of flu admissions in a time window and issues a notification. These get aggregated from hospitals to counties to drive alerts back down to other hospitals in the county.
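The clustering step described here is essentially a count over a sliding time window. As an illustration only – this is not IBM’s event engine, and the county name, threshold and window size below are invented – a sketch of the idea might look like:

```python
# Illustrative sketch of windowed event clustering, not IBM's event engine.
from datetime import datetime, timedelta
from collections import defaultdict, deque

WINDOW = timedelta(hours=24)
THRESHOLD = 3                     # hypothetical cluster size for illustration

recent = defaultdict(deque)       # county -> timestamps of recent flu admissions

def notify(county, count):
    print(f"Possible flu cluster in {county}: {count} admissions within 24h")

def on_flu_admission(county, at):
    """Handle one flu-admission event already identified by the upstream rules."""
    events = recent[county]
    events.append(at)
    while events and at - events[0] > WINDOW:   # drop events outside the window
        events.popleft()
    if len(events) >= THRESHOLD:
        notify(county, len(events))

# Three admissions in one county within a day trigger a notification.
for hour in (8, 13, 20):
    on_flu_admission("Example County", datetime(2016, 11, 2, hour, 0))
```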

This can then be combined with more predictive technologies. In this case, there are precursor symptoms that people experience before going to hospital. Tweeting or otherwise updating social media with these symptoms gives possible predictive data. Using Watson sentiment analysis one can analyze the social posts for tone (am I getting more sick and so more miserable) as well as for content including symptoms (not just mentions of the flu).

To make this work a set of Bluemix services are tied together to collect social media data using APIs, check the language and find the sentiment of the language. Another service is then used to classify the content as either about flu symptoms or not. This last was trained using a whole set of phrases that do describe symptoms and others that talk about the flu but not about being sick (such as tweets about epidemiology or CDC announcements). See this post for some details on Watson’s cognitive services.
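Schematically, the pipeline described above chains a handful of services: collect posts, check the language, score sentiment, classify symptom content. The sketch below shows only that shape – every function is a stand-in, not a real Bluemix or Watson API call, and the example posts are invented:

```python
# Schematic only: the pipeline shape described above, with stand-in functions.
# None of these are real Bluemix or Watson API calls or signatures.

def fetch_social_posts(query):          # stand-in for the social data API
    return ["feeling feverish and achy all day", "CDC posts new flu stats"]

def detect_language(text):              # stand-in for a language service
    return "en"

def score_sentiment(text):              # stand-in for sentiment analysis
    return -0.6 if "achy" in text else 0.1

def classify_flu_symptoms(text):        # stand-in for the trained classifier
    return "symptoms" if "feverish" in text else "not-symptoms"

def flu_signal(query):
    """Chain the services: only posts about actually being sick (not news or
    CDC chatter) contribute to the predictive signal."""
    signals = []
    for post in fetch_social_posts(query):
        if detect_language(post) != "en":
            continue
        if classify_flu_symptoms(post) == "symptoms":
            signals.append((post, score_sentiment(post)))
    return signals

print(flu_signal("flu"))
```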

These alerts are great but to be valuable, rules-based processing has to use this to take some kind of useful action. In this case the flu social media tracking was combined with other data to start predicting that admissions will rise soon in a particular area, allowing hospitals to see where they might need to prepare for flu admissions. The events being captured can also be analyzed geographically, allowing the movement of an epidemic to be predicted. This creates a layer of events – predicting flu is coming to an area, identifying people in an area talking about having the flu and ultimately getting admitted. The rules-based event engine can use this to trigger increasingly precise and useful actions.

A nice example of a rules-based infrastructure with a layer of analytic/cognitive enhancement. David wrapped up with a pitch for decision modeling, using the Decision Model and Notation (DMN) standard, as a way to exploit these decision-making technologies by engaging the business in configuring and designing the decision-making to be automated.

Don’t forget that David and I are speaking later in the week too with more on decision modeling.

I am attending this year’s Building Business Capability conference and blogging sessions like this one from Jim Sinur of Aragon Research. I gave a tutorial earlier in the week on decision modeling with DMN and will be speaking later in the week with David Herring of Kaiser Permanente (and signing my new book, Available now! Real-World Decision Modeling with DMN, at the bookstore after the session). Jim’s presentation is on Delivering the Digital Dream for Real, part of his focus on the move to digital. He plans to share real-world examples, discuss technologies that are out there, and outline a methodology for applying them.

His premise is that companies can only hide from, submit to or leverage digital transformation – it can no longer be ignored. Digital is the new normal and someone will use digital transformation to compete with you.

Why should organizations start on this journey? Digital change is happening so companies have to decide on an appropriate response. Companies can be divided based on digital capability and leadership capability. Digital Masters show higher revenue and market valuation while beginners get heavily punished. Those being conservative about it lose revenue while those chasing fashions get a revenue bump but at a cost.

The business and technical forces pulling on you vary by industry. Jim gave an example of a company that seemed like it was in an interesting space but new technologies like social undermined their model.

The journey, if you have control of it, goes like this:

  1. Customer delight
  2. Business operations
  3. New products and services
  4. Business model transformation

But if you get disrupted in the meantime you will have to accelerate it. And improvements in business operations fund the innovations you need. The future of business operations is mobile-first, enlightened decisions and smart actions. Understanding the customer journey is critical as it identifies WHEN you have to make a decision and WHAT decision is going to move the customer forward. Some examples:

  • One example journey is of a customer in a store. As they move around and shop in specific areas, the store is going to use a digital platform with rules, analytics and cognitive capability to make good marketing and customer service decisions at the right moment.
  • Another example is doctors on their rounds. Same idea – what’s the journey, what do they need to know/have access to/have suggested to them at each point.
  • A third example was a surgery center using a digital platform and tags/sensors to improve operations. They would simulate the planned surgeries the night before and this gave them a baseline that could be used to drive visibility into variations. Plus the tags and models allowed them to find and inform loved ones, patients or staff as necessary.
  • Using IoT to manage power systems and help consumers reduce cost while helping the power company manage resources.
  • Using IoT to track dementia patients and use what’s known about the patient and their behavior to track when to intervene and what kind of intervention is appropriate.
  • IoT data to manage very distributed infrastructure and manage the staff who need to work on it. Drones and more are adding to this mix.

We live in a world, he says, with emerging processes that “swarm” to follow some goal. Processes will be changing from flow-directed to goal-directed. Workers, knowledge-workers, managers, services (including analytic and cognitive services), robots and sensors will all be coordinated much more dynamically. This will require both simple and emergent processes ranging from structured processes to more case/collaborative ones to IoT/Agent/Collaborative based ones. And all of them will require smart decision-making.

He illustrated this with a company’s evolving approach to farming. First they scattered measurement pills to track water and fertilizer that would sound alarms when something was needed. Then they linked these pills to bots to take the farmers out of the loop. And now they are linking these to water runoff models to stop watering if water is coming anyway.

To get here, companies need a digital target: Goals, phases, benefits and costs; an idea of the organization, skills and technology that will be needed; actual POCs, a measurement approach and some capability for experimentation. This is based on executive vision, plans, customer input, competitive issues, constraints and technologies (both new and legacy). One of the most important things is to establish some business and technical principles that can be broadly applied for example:

  • Attract and spoil customers
  • Create operational excellence
  • Business first IT collaboration
  • Designing for sustainability

New competencies include constituent engagement, hyper awareness, complex problem solving, anticipatory decision making, innovative productivity, operations agility and more.

A digital business platform has lots of technology – some of it is ready, some less so.

  • Collaboration, journey mapping, big and fast data, and pattern processing (analytics) are all good.
  • Cloud integration, advanced analytics, IoT, video and 3D printing, augmented reality, blockchain and cognitive are more or less ready

Leveraging these new technologies is a journey – a maturity model that ranges from initiating to opportunistic to managed, habitual and ultimately fully leveraged.

A digital platform helps organizations blend people, compute power and physical devices to deliver the best possible outcome. In an accident, for instance, your phone detects the accident and checks vital signs of passengers using smart clothes and the vehicle’s sensors. It communicates immediately with insurers and others to drive an automated yet highly tailored response. A platform that makes great and well informed decisions.

It needs to include:

  • Processes and Cases
  • Cognition and Calculations – Decisions
  • Data and System Integration
  • Machines and Sensors
  • Business Applications

AllAnalytics recently asked its readers “What is the greatest danger spot for analytics projects?” and the results are pretty clear. Here’s a snapshot (the percentages have been pretty stable):


  • Top of the heap is “Identifying the Business Problem” with over 40%
  • Then it’s a close run thing between “Data sourcing” and “Putting data into action”, both at 20%

What’s interesting about this is that these problems don’t line up with the functionality in your typical analytic tool. What they do line up really well with, however, is decision modeling.

  1. Decision models scope and define the business problem for an analytic project really effectively. They focus everyone – business, IT and analytics teams alike – on the decision-making to be improved and put that decision-making into context by showing which organizations, goals, KPIs and processes are involved.
  2. Our experience is that decision modeling also really helps with data sourcing. Not directly but because it helps clarify what data you really need, what is really used to make the decision, and so reduces rework and wasted work around data.
  3. Finally decision modeling helps operationalize analytics, showing how the data-driven analytic can and should be used to make decisions and defining the automation necessary to make the analytic work.

So, if these problems resonate with your analytic experience, check out decision modeling as a cure.

Last session for me at IBM’s World of Watson is the keynote from IBM CEO, Ginni Rometty. Another very slick video on the different ways IBM’s cognitive, cloud and analytic solutions are being used around the world got us started. And again, IBM emphasized “with Watson” as part of their ongoing positioning of Watson as additive to people, not a threat to them. Then it was time for Ginni Rometty. The market for better decisions, she says, is $2T by 2025 – and better decision making is the name of the game. Ginni wants to show us three things

  • That Watson is the best AI platform for business
  • That Watson will let you build a cognitive business
  • That this is going to transform industries

She points out at once that Watson has rapidly achieved critical mass – with hundreds of millions of people being impacted already through medicine, shopping, insurance, weather, education and more. IBM, she says, has made three critical choices:

  1. That Watson is about augmenting and extending expertise – augmented intelligence
  2. That your data matters and you should be able to learn from this data, not others
  3. That Watson was going to be at the heart of a rich ecosystem with others also delivering value on Watson not just IBM – and that IBM would develop industry-specific infrastructure to make this work

There are five areas for competitive advantage, she says:

  • Deeper human engagement – with customers, employees, partners
  • Scaling expertise – make everyone perform at their best level, especially when people are retiring. And scale imagination – fashion, movies, cooking, music.
  • Embedding in every product to make those products more effective, more reliable
  • Change the operations in your company – make processes real-time aware, real-time learners.
  • Anywhere there is discovery and research where Watson can find new connections

Companies, she says, are working in all these areas, and across hundreds of projects four lessons have emerged:

  1. Better data, better outcomes – and you need new kinds of data too
  2. Training is not programming – more upfront work but then it keeps learning
  3. Cloud and cognitive go together and add more value used together
  4. You must address people’s concerns around ethics, transparency, jobs etc.

Ginni was joined onstage by Mary Barra, CEO of GM. GM has been developing OnStar for many years, connecting some of its cars to services. Automotive is being transformed by the explosion in connectivity, the electrification of cars, autonomous vehicles and the sharing economy. GM is trying to bring all these things together as part of reinventing itself moving forward. Some of these may change the relationship people have with car ownership and usage but, regardless, the vehicles themselves are going to be changed completely. People spend 46 minutes a day in their car and GM wants to safely give people this time back. OnStar Go is the new service that GM and IBM are working on together. It takes the connectivity and monitoring of OnStar and adds Watson capabilities to act as a personalized assistant for things like picking up prescriptions or groceries, getting gas, pre-ordering drive-through and more. And this arrives in 2017 model cars, not some distant future.

Next up is the impact of Watson on education. Teacher Adviser uses Watson to become an individualized assistant for teachers – providing best practices but also learning about a teacher’s individual preferences and their students. John King Jr, Secretary of Education, came up to join Ginni. John began with the good news – the highest high school graduation rate, more minority students going to college, more Pre-K, and more schools with STEM teachers and broadband connections. All good, but there’s more to do – poverty continues to be a predictor of poor education outcomes and there is a disconnect between the skills students learn and what companies need. There are many things that companies can do and ways things can be improved, but one possibility is something like Teacher Adviser. There’s a lot of advice out there but it’s hard to navigate and it’s tough to match specific best practices to specific students. Teachers, he says, really like the ability to use it to find the right thing to help specific students. He also likes the way it will learn from each teacher to help all teachers. And the ability to drive achievement broadly and deeply is a critical factor moving forward. A very inspiring conversation.

Healthcare is, Ginni says, IBM’s next moonshot and is therefore a real focus for IBM and its Watson business. As a data point, she says, it takes 12 years and $1Bn to get a drug to market, and few drugs make it to market at all. Dr Yitzhak Peterburg of Teva Pharmaceuticals joined Ginni to discuss this. Teva is the world’s #1 generics company and serves 200M people. Patients are also customers, and thinking of them as such – meeting their consumer expectations – is a challenge. They want personalized, appropriately priced, convenient and transparent treatments.

Teva sees big data opportunities in things like biosensor data and the potential for analytic/cognitive technologies like Watson to personalize medicine and find new uses for existing drugs, among much more. For instance, identifying patients at risk of asthma attacks using a cloud-connected inhaler can really help people manage this chronic disease – alerting them in advance so they can find their inhaler and be prepared to use it, or even take steps to prevent the attack. The investment in a cloud, cognitive, big data infrastructure is designed to create more opportunities like this. In the future, they can even see 3D printing being used to deliver tailored doses, to identify that a pill has been taken, or even to control the dose a pill releases – all managed by the system in a personalized way.

A clip from the 60 Minutes show on cognitive discussed how Watson tracks all the papers in cancer research and advises doctors on what care to suggest to patients for whom existing treatments have failed. In the back test, Watson matched 99% of the doctors’ suggestions and, in another 30% of cases, offered additional recommendations.

Continuing with the healthcare theme, Ginni is joined by Prof Miyano from Japan to discuss their use of Watson in cancer treatment. They have been scanning the genomes of patients, identifying mutations and then using Watson to find the best fit. This is hard because a typical genome has many mutations anyway and cancers can introduce many more. Huge numbers of new papers discuss treatments that might be suitable but there are millions of mutations. Watson makes it possible to find the right treatment, even in very difficult cases, giving the medical team critical information and insight.

Final section was a little lighter, with a discussion of hit songs. Alex Da Kid has been working with Watson to see how one might develop a hit song. Alex likes to ask people about very personal, very emotionally powerful moments and then use those to drive songs. He’s collaborating with Watson to analyze what people have been writing, lyrics from billboard songs and musical patterns. The result is a new song – Not Easy – that is doing well and he says he has several more resulting from the collaboration. A great change of pace.

And Ginni wrapped it up – we are, she said, changing the world and we are just getting started.

Continuing in the analyst program at IBM World of Watson we got an update on the evolution of IBM’s Cloud Platform from Bill Karpovich. Cloud, he says, is strategic: vendors must disrupt with cloud or be disrupted by it. Over the last few years, cloud has evolved from efficient public cloud infrastructure, to new applications built on hybrid clouds that leverage new and existing assets, to a future where processes will be reimagined to leverage services that are really only available in the cloud. This last step is happening faster than IBM expected and is focused on cognitive, analytics, higher-value packaged cloud services and more. Today IBM sees the various types of cloud solution (IaaS, PaaS, SaaS etc.) as increasingly less relevant individually – what customers want now is a cloud platform that aggregates and simplifies the consumption of new innovation (whether new hardware, new data or new cognitive capabilities).

IBM is now laser-focused on cognitive solutions and its cloud platform. IBM brands, R&D, acquisitions, open source and partners are increasingly centered around building the IBM Cloud Platform. Hundreds of new features are being added each quarter as more existing capabilities migrate and new ones are created or acquired.

IBM Bluemix is now the unifying brand for IBM’s cloud. It is built up in layers:

  • Domain and industry solutions built on
  • Developer Services delivered on
  • Infrastructure Services for Compute, Storage and Network delivered on
  • Physical Infrastructure – public, dedicated and local for hybrid clouds.
  • All secured and managed using methods and services

IBM notes that processors have become 8x faster in the last 5 years, networks 500x faster and storage 1000x faster. IBM therefore sees tremendous opportunity for a new level of cloud computing – supercomputing for all. In addition, the amount of capability available on Bluemix – the number of APIs, the number of developers – is exploding as a result.

Key elements of this new cloud include data and analytics services as well as Watson and cognitive services. In addition, IBM is trying to improve developer productivity by offering a choice of approaches with consistency: bare metal servers, virtual servers, containers, open PaaS environments, or “serverless” event-driven apps. This range of services allows customers to trade control against speed and ease of development as appropriate.

In general IBM is taking an open-first approach throughout this stack. If they can take an open source project, build on it and contribute improvements back, they will. If not, they might donate a chunk of work to create a new project, such as OpenWhisk.
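To give a flavor of the serverless end of that spectrum, here is a minimal OpenWhisk-style action sketch in Python. The action name, the parameter and the deployment commands shown in the comments are illustrative and assume a standard, already-configured `wsk` CLI – treat this as a sketch, not IBM documentation.

```python
# hello.py -- a minimal sketch of an OpenWhisk-style Python action,
# illustrating the "serverless" / event-driven option described above.
# Assumed deployment and invocation (requires a configured `wsk` CLI):
#
#   wsk action create hello hello.py
#   wsk action invoke hello --param name Watson --result

def main(params):
    """OpenWhisk passes invocation parameters as a dict and expects a dict back."""
    name = params.get("name", "world")
    return {"greeting": "Hello, {}!".format(name)}
```

The appeal of this model is that the developer supplies only the function; the platform handles provisioning, scaling and billing per invocation.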

Increasingly the most pragmatic and practical model for enterprises is a hybrid cloud strategy. This allows them to leverage data and IT assets that already exist and bring in new cloud services, as well as moving some existing functionality to the cloud. To support this, IBM offers:

  • Bluemix Public, multi-tenant on IBM cloud
  • Bluemix Dedicated, single-tenant on IBM Cloud
  • Bluemix Local, single-tenant on Premise

These are designed to create a borderless environment, allowing companies to mix and match. Beyond Bluemix, IBM Cloud includes all their SaaS applications as well as some other capabilities like object storage. IBM is also working with VMware to help companies move existing capabilities deployed on VMware onto the new cloud platform with minimum impact, and similarly with NetApp for storage.

This unification under the Bluemix brand is a journey, of course, with IBM gradually bringing all of their services under a single brand and with integrated catalog, purchasing, billing, reporting and management. New capabilities continue also to be added and IBM continues to invest in developing a next generation of cloud capabilities. One cloud platform, with data, analytics and other high value cognitive services that is designed for hybrid deployment and is both flexible and enterprise class.

One of the announcements at IBM’s World of Watson is of the new Watson Machine Learning Service. I got a chance to ask a few questions about this new capability. A couple of key elements emerged regarding the current platform and the immediate announcement.

First, some context. The Watson Data Platform (also announced at World of Watson) is designed to allow multiple roles to collaborate in developing advanced analytics. It provides a shared environment as well as capabilities such as annotations, versions, collaboration and edit history across all the various elements of the platform. One of the roles is the Data Scientist. Within the platform, the Data Scientist’s primary environment is a notebook metaphor (using the Jupyter notebook). The notebook environment in the Data Science Experience offers open source coding as well as some IBM extensions such as support for CPLEX.

The Watson Machine Learning Service is designed to both extend this notebook metaphor and allow a canvas to be used that is similar in style to SPSS Modeler – a more classic predictive analytic / data mining environment in which a set of nodes are linked together to define how an analytic model should be built. This more drag-and-drop environment will extend the ability to use these Machine Learning capabilities to audiences that are not programmers or data scientists. This canvas is a new web-based component. It leverages all the data integration and connection work done in the Watson Data Platform, allowing those developing analytic models to share the data ingestion and analysis being done on the platform. Initially the service and canvas offer Apache SparkML. Other open source algorithms as well as SPSS/IBM algorithms will be added moving forward.
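Since the service initially builds on Apache Spark ML, a minimal pyspark pipeline gives a sense of what such a model looks like under the hood. The data, column names and model choice below are invented for illustration; this is a generic Spark ML sketch, not the Watson ML canvas itself.

```python
# A minimal, self-contained Spark ML sketch (assumes pyspark is installed);
# illustrative only -- not the Watson Machine Learning Service.
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.appName("churn-sketch").getOrCreate()

# Hypothetical training data: two numeric features and a binary label.
df = spark.createDataFrame(
    [(34.0, 2.0, 0.0), (120.0, 9.0, 1.0), (61.0, 4.0, 0.0), (150.0, 12.0, 1.0)],
    ["monthly_spend", "support_calls", "churned"],
)

assembler = VectorAssembler(inputCols=["monthly_spend", "support_calls"],
                            outputCol="features")
lr = LogisticRegression(featuresCol="features", labelCol="churned")

# The pipeline bundles feature preparation with the model, mirroring the point
# below that deployed services include the pre-processing steps.
model = Pipeline(stages=[assembler, lr]).fit(df)
model.transform(df).select("churned", "prediction").show()
```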

Analytic models developed on the Watson ML service can be accessed throughout the Watson Data Platform, deployed as a Bluemix service with a REST API,  or teams can generate a PMML model for deployment to other environments. When deployed, any pre-processing or characteristic generation defined in the canvas is included so that the service takes the raw data defined and completes all the analytic processing required. These deployment options ensure that analytic models developed are not limited to use in other analytic tools but can be deployed for use in systems and processes throughout the organization.
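As a sketch of what “deployed as a service with a REST API” means for a consuming application, the snippet below posts one row of raw data to a scoring endpoint. The URL, token and payload shape are placeholders, not the documented Watson ML API.

```python
# Hedged sketch of calling a deployed scoring service over REST.
# The endpoint, token and payload structure are hypothetical placeholders.
import requests

SCORING_URL = "https://example.com/v1/deployments/churn-model/score"  # hypothetical
HEADERS = {"Authorization": "Bearer <access-token>",
           "Content-Type": "application/json"}

# One raw input row; the deployed service is assumed to apply any
# pre-processing defined in the canvas before scoring.
payload = {"fields": ["monthly_spend", "support_calls"],
           "values": [[92.0, 5.0]]}

response = requests.post(SCORING_URL, json=payload, headers=HEADERS, timeout=30)
response.raise_for_status()
print(response.json())  # e.g. a prediction and probability for the submitted row
```

The point is that any system or process able to make an HTTP call can consume the model, which is what lifts the analytic out of the analytics tools and into operations.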

Second keynote at IBM’s World of Watson is on the role of cloud and data as the foundation for cognitive systems and businesses. Bob Picciano from IBM’s analytics business kicked things off. Bob began with a discussion of how IT’s value has changed – from a focus on processing and how fast and cheaply it can be done to a focus on how much insight can be generated: from a process economy to an insight economy. Unless you get insight out of your IT investment you are going to fall behind. Watson and Cognitive, he says, fundamentally change how you get insight from your data – the kinds of data that can be used and how quickly and effectively that data can be turned into insight.

As this change happens, cloud and data technology must evolve too. IBM is announcing many new DB2 features for analytic processing in cloud deployments as well as an ever increasing set of analytic services on Bluemix, IBM’s cloud platform. Plus IBM is investing heavily in data and analytics open source programs like Apache Spark.

IBM is also taking advantage of ever-increasing compute power. Tools like Watson Analytics leverage this compute power and Watson’s cognitive approach to make it easier for folks – citizen analysts as they call them – to analyze their data and make analytic decisions. Not only does this make it easier for people to analyze data for themselves, it also lets them visualize and package it up to share as storybooks. IBM is now focusing on data engineers and data scientists as well as app developers to see how these technologies can help them too.

With that, Bob announced the IBM Watson Data Platform, the on-ramp for a cognitive business. It is designed to change how people interact with their data by using Watson to drive a new kind of data platform:

  • Multiple professionals can collaborate on a common platform (metadata, governance) while still using tools for their specific role
  • Cloud makes this scalable and accessible and delivers tremendous power
  • Open Source to leverage core open source programs and be open to integration with others to ensure every approach can be pulled together.

The platform is driven by cognitive and AI to deliver an immersive data analysis experience based on an open and open-source platform. The platform contains a set of services for data as well as higher level services like Watson Analytics.

Rob Thomas demonstrated the new platform. A data scientist can use The Data Science Experience to pull in various data sources, analyze them and visualize the result, in 3D if he wants. He can annotate and extend this as he works. The platform allows a real-time collaboration so he can show his boss, using the visualization as a frame for the discussion.

  • First the platform presents a palette to access data and ingest it with built in metadata and lineage creation.
  • Second, data discovery tools allow this data to be explored
  • Third, collaborative tools allow this to be shared
  • Fourth, this needs to be deployed and so IBM has launched a new Watson ML service.

Watson ML is available to the Data Science Experience as well as more generally on Bluemix and through integration with Watson Analytics.

Stories about the way the Toronto Raptors and RSGmedia use the new platform followed, focused particularly on collaboration and easy access to a wide range of data presented using compelling visualizations – maximizing the ways data can be used. RSGmedia summarized their four keys to success:

  • Data Layer based on IBM Watson Data Platform
  • Modeling and algorithms with Spark and Python
  • Interactive user engagement with Watson Analytics
  • Packaged up as offerings customers can consume

IBM sees a high-performance, cloud-first platform that supports collaboration as critical to success with data – to making data accessible and usable.

Next the CEO of The Weather Company came up to talk about data, especially fast-moving streaming data. He brought on American Airlines to talk about the importance of weather data. Weather, of course, is a huge deal for an airline. American uses the data and services of the Weather Company (and embedded people) to predict weather patterns and see when to stop and when to re-start operations. No obvious use of Cognitive here – just a description of Weather Company services – though he says they are starting to include more cognitive and predictive analytic capabilities, such as better weather bots, e.g. in Facebook Messenger.

For some reason, the Weather Company part of IBM is working on the new Watson Ad capability. This is designed to provide interactive ads powered by the Watson engine for conversation and natural language handling, delivering a more conversational interface for advertising and making ads more interactive and compelling. An interesting approach that might make ads a lot more engaging. Not sure why Weather powers these ads though….

Last section is about the role of cloud in all of this.  Robert LeBlanc comes on stage to talk about the IBM cloud and how it is designed to support this new cognitive era. In conversations with clients, he says, the focus is not about cheap any more, not about cost reduction, but has shifted to adding value. The ability to rapidly build cognitive applications on the Bluemix cloud is a key focus, with new services being added to cope with new data (such as video) and provide new ways to analyze this data. Bluemix is designed to provide cloud-based processing while ensuring that companies continue to own and get value from their own data.

IBM is also delivering an increasingly wide variety of services on a wide variety of hybrid clouds as well as on-premise or completely in the public cloud. IBM offers 48 global data centers to make it easy to host and access your data where you want. As applications have to be reworked to take advantage of cloud and analytics, IBM has tried to focus on open APIs so that you won’t need to rework things. The Bluemix garage methodology is also available to help organizations make the most of these capabilities. SAP joined IBM on stage to talk about how the IBM cloud is being leveraged by SAP with hundreds of clients moving their SAP capabilities to the IBM cloud.

During the last 10 minutes, an IBMer has been working away adding image processing to an existing app. A quick demo followed showing how the image recognition was able to identify hail damage from drone photos and then estimate repair costs. He trained Watson using photos of damage and no damage in a few minutes. Once Watson could recognize the hail damage it could be included in the app using Bluemix and available APIs.

And that’s a wrap for the keynotes.

It’s opening keynote time at IBM’s World of Watson 2016 and we kicked off with a video history of Watson from Jeopardy to today. Dr John Kelly of IBM got us started, emphasizing how rapidly interest in Watson has grown over the last year or two.  In August 2007, he says, a small team of researchers proposed to build an AI/Cognitive system – they felt they had the key techniques and technology developed to succeed even if many (all) previous attempts had failed. 5 years later Watson won its Jeopardy appearance.

Now, he says, Watson is starting to fundamentally change the way decision-making is done across many industries. In a few years, he says, things are going to really change – people making complex decisions will all want to consult a cognitive system to help them make a better one. Whether they are discussing mergers, considering a difficult cancer patient or something completely different. And beyond that, he thinks, Watson will begin to predict not just assist. But his focus remains on how Watson can and will help people make better decisions.

Tom Friedman, the author of The World is Flat, joined Dr Kelly on stage. He has been working on a new book called “Thank You For Being Late” – all about the value of giving people time to pause in an era of acceleration. He told a great story about meeting a guy at a parking garage and talking with him about how to write a column. He likes to provoke or illuminate with his columns, which requires understanding your position on the world, how you think the world works, and what you think about that.

Right now he sees the way the world “machine” works changing – the digital globalization of the market, Moore’s Law and technology and the rapid change of nature due to climate change and population growth. All three are hockey-stick graphs and all three interact with each other.  As he was looking at this he saw that 2007 was an important year – the iPhone, Facebook, Google buying Youtube, the price of sequencing a genome or solar power fell off cliffs, Intel moved off Silicon and much more – and IBM’s researchers started Watson. He thinks 2007 was a technology inflexion point – and we missed it because of the crash of 2008. In addition, the political impact of this was that much of the social and legal framework needed to cope with this change did not get built and so there is a major disconnect.

All of this technology change gets lumped into “the cloud” but he finds this too “soft” a word, as it is really a supernova of change. Storage, compute power and connectivity all came together in 2007 to deliver an invisible technology platform. This changes four things:

  • Power of 1 person to build or break is greater than ever
  • Power of many, of groups,  to change the world is greater
  • Power of flow as ideas flow around the world
  • Power of machines

And the power of machines was demonstrated by Watson winning Jeopardy in 2011. The world has never been the same since…

Politics, geopolitics, the workplace, ethics and community are all being dramatically changed by these trends of market, nature, and technology. The challenge is how to reimagine them. He talked briefly about three of them:

  • The workplace, for instance, has to figure out how machine technology and AI are going to change things – how to use intelligent assistance to change people’s jobs. This means new skills, continuous learning by employees and much more.
  • Politics is being blown up as things change – the parties were structured around old problems and not about these challenges of climate, technology and globalization. He used nature as an example of coping with change – sustainable, experimental, fill niches, patient, willing to kill failures etc. Politics, he says, is going to be overrun by the pace of change and only parties that can be adaptive to this new world will survive…
  •  Ethics is also going to change. As everything we do becomes digital – friendships, relationships, work and much more – we need to rethink value systems to work in this connected but not hierarchical environment. The power of one person in this environment is completely different – to make or to break everything. This means that how people think and act – their ethical view – really matters. We have to scale the golden rule – do unto others as you would have others do to you – to include everyone. Family, values, teaching, ethics all really matter.

A great speaker and a great speech. David Kenny, GM of Watson, drew the short straw and had to follow Tom.

David reiterated the focus on augmentation – that AI is augmented intelligence rather than artificial intelligence. We have a long history of using technology to augment our cognitive capabilities. As the world becomes awash in data the need to apply analytics and cognitive to make better decisions becomes even more important. David reiterated the four elements that IBM sees supporting this:

  • Cloud – the IBM cloud and hybrid cloud particularly
  • Content – managing both structured data and unstructured text and content
  • Compute – algorithms and services to understand and extract value from this content
  • Conversation –  human ways to work with these elements in conversational applications

Watson must be able to understand language, reason at scale, interact naturally – this last includes some new announcements for Apple iOS applications that can be connected to Watson. David used a few customer stories to illustrate Watson:

  • Bradesco, a South American bank, recently used Watson to support mobile applications and connected it to their legacy applications so employees could support customers more effectively.
  • Staples illustrated their Watson powered “easy button” – offering a chat bot or text or audio to help their staff and their customers. It handles more and more transactions, freeing customer service staff to work on more complex problems.
  • GSK – Theraflu – uses Watson to power an interactive tool for helping people find out how over the counter medicine might help or if they need something more. By answering questions they hope to help but also build brand loyalty.
  • KPMG uses Watson in its audit practice using it to help auditors find very detailed information about loan portfolios for instance and presenting it in a traditional format and with explanation/justification.
  • OmniEarth discussed how they work with municipalities to conserve water, especially outdoor watering. Satellite and other image data is processed: Watson is used to classify surfaces in the images to see how much water is being used.

Pearson (publisher of both Smart (Enough) Systems and Decision Management Systems) came up next to announce a partnership with IBM. As the number of students explodes, the challenge is making sure those students get a great teacher – how can higher ed be scaled? The partnership is about using Watson to help teachers and to help students be better prepared.

Sebastian Thrun of Udacity wrapped up the session. He began by talking about teaching AI at Stanford when it was still a niche topic and the shift, 5 years ago, when the class went online and 100,000 students took it. He uses self-driving cars to illustrate a critical point about cognitive technology – that everyone benefits when it learns from a mistake. People find it nearly impossible to learn from the mistakes of others, but self-driving cars and other cognitive systems can. Anything repetitive can be improved therefore more rapidly by cognitive systems. And this ability is going to help cognitive systems accelerate past people in many tasks at an ever increasing rate. Udacity is partnering with IBM to deliver a nanodegree program on artificial intelligence.

Continuing in the analyst program at IBM’s World of Watson event with Beth Smith, GM Offerings and Technology for IBM Watson, introducing some Watson elements for Conversation – one of the four C’s of Watson (Cloud, Content, Compute and Conversation).

Watson, at its core, is about finding knowledge in noisy data at enormous scale. Watson listens to signals, uses machine learning and deep learning to find patterns and then makes recommendations that it can explain. In the Conversation piece, for instance, there is Watson Conversation for developers (designed to be used with other content and services) and a configurable app – Watson Virtual Agent for customer service (built on top of the developer service).

Both kinds of products – developer services and configurable apps – are delivered continuously as cloud solutions with new capabilities being added. This development is increasingly informed by the interactions the deployed services are already handling.

Watson Conversation is a service that is free for developers to engage with. It comes with the tone analysis service integrated. The service has four main pillars:

  • The intents
    Each intent can have many example strings representing different ways to express it. The system uses these as a baseline for identifying the intent but will also learn other ways to recognize it.
  • The entities
    Can define new ones and can use system entities like date and time, percentage etc.
  • The dialog
    A flow can be defined for a dialog using steps and links
  • Improvement
    Can see interactions by intent, entities etc so can rapidly see what would help it be better.

It’s worth noting that there is no additional training step – as you add things to the definitions they are part of the system’s behavior. At any point the developer can use a try panel to see how a particular string is handled.
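For developers, the service is consumed over REST; the sketch below sends a single utterance to a workspace and reads back the detected intents, entities and dialog output. The endpoint path, version date and credentials are assumptions for illustration rather than a verified copy of the API reference.

```python
# Hedged sketch: sending one user utterance to a Conversation workspace over REST.
# The URL shape, version date and credentials below are placeholders/assumptions.
import requests

WORKSPACE_URL = ("https://gateway.watsonplatform.net/conversation/api/v1/"
                 "workspaces/<workspace-id>/message")      # assumed URL shape
params = {"version": "2016-09-20"}                          # assumed API version date
auth = ("<service-username>", "<service-password>")         # service credentials

payload = {"input": {"text": "I want to reset my password"}}

resp = requests.post(WORKSPACE_URL, params=params, auth=auth,
                     json=payload, timeout=30)
resp.raise_for_status()
result = resp.json()

# The response is expected to carry the detected intents and entities,
# plus the dialog's text output for this turn.
print(result.get("intents"), result.get("entities"))
print(result.get("output", {}).get("text"))
```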

Watson Virtual Agent is preconfigured on top of the Conversation service with pre-defined intent, entities etc. The service is configured using a more business-oriented UI based on tiles. There are various handlers for each tile – redirect, invoke your own workspace, escalate to a human agent, give a text response. This allows the business user to configure one of the 90+ predefined cross-industry intents (there are also a number of Telco-specific ones) with the option to link to a custom solution.

The app has some lightweight metrics and reporting built in around intents and entities – for instance what are the intents that lead to human interaction most often. In addition all the data can be exported for analysis elsewhere.

Another key piece of tooling is the Watson Knowledge Studio, designed to support exploration and discovery. This is designed to move beyond these kinds of structured conversations to a more general understanding of a domain, allowing a Watson service to apply a more domain-specific view of the content. Examples are reviewed and mapped to defined entities. Relations can be defined simply by linking entities graphically – allowing the organization that manufactures a product to be shown, for instance. The engine then uses the patterns in these examples to find similar patterns and so identify additional entities. It also uses the patterns to avoid over-classifying things based on simple format or data type. Domains defined in this way can then be applied using the Watson Alchemy Language Service or Watson Explorer.

In summary, the ability to merge and layer these services, using a rich set of APIs, as well as the ability to customize their behavior and apply customer domain knowledge, are critical to scaling Watson services.

Watson also operates in a wide variety of languages, with more being added – these are not translations but learning done in each language. In addition, there are starter kits, demos, sample code and more on the Watson Developer Cloud.

I am attending IBM’s World of Watson and will be blogging as much as I can. First up is a two-part session on Advanced Analytics and how you can put advanced analytics at the heart of a Cognitive strategy. Paul Zikopoulos kicked things off talking about the potential for data to transform business and the huge amount of data being generated 24×7. Yet, he points out, 24×7 real-time decisioning not so much. The IoT, of course, will explode even the current levels of data and that, he says, leads you to need Cognitive capabilities.

Self service, he says, has largely failed in most organizations. People are not really self-serving, they are still using other people’s creations – the new tools made it easier to build things but did not really change the paradigm. Self service though requires data that can be trusted, tools that allow business and IT to collaborate, ways to check and manage bias and much more. This complexity is what led IBM to create Watson Analytics. And at the end of the day it’s all about improving the quality of decision-making.

Mark Altshuller joined him to discuss the vision for analytics at IBM:

  • Expand the user base to include citizen analysts
  • Rethink the UI around IBM’s design thinking principles across products
  • Make it easier to connect and use data
  • Smarter self service

He presented the data to insight lifecycle – Operational Reports to Data Discovery/Predictive Analytics to Enhance/Operationalize to Smarter Decisions and repeat. This is going to frame the discussion in the session, he says. I like the focus on decision-making but I prefer to start with the decision and work back to the data 🙂

A short video of the new capabilities as part of the continuous delivery of IBM’s analytics platform followed with lots of new UI and functions. The new UI is also becoming more embeddable and customizable with new visualization capabilities (shared between Cognos and Watson Analytics), geospatial mapping and data management capabilities.

The new geospatial mapping capabilities are focused on a wide range of geospatial problems, up to and including displaying real-time data streams on maps (demonstrated by Mapbox, one of the new partners) and analyzing tweets to see who is a tourist and how they move relative to locals. Indeed, processing some data, like phone operating systems, can effectively reveal the map from the data alone and expose divides such as the gentrification of a city based on Apple v Android usage. More accurate processing even allows the lanes on roads to be identified and analyzed separately.

Data discovery was introduced in Watson Analytics, and usage of the original interface was instrumented and analyzed, allowing IBM to simplify and streamline the UI in more recent versions. This usage data also surfaced some very common and strong use cases, which could then be made much easier. Most recently, Cognos data packages can be included in Watson Analytics. Moving forward, Watson Analytics is focused on adding new algorithms and new uses for Cognitive.

For operationalization, Ritika Gunar came up to discuss IBM’s commitment to Apache Spark as the “Analytics Operating System”. This original commitment has led IBM to become a major Apache contributor. The new analytics IDE – the Data Science Experience – was next; it has been widely extended with partners. This IDE is focused on three things:

  • Built-in learning because things are changing fast
  • Create using open source or commercial add-in
  • Collaborate around the data and analytics

A very familiar modeling workflow UI has been used to make it easy for people to use the machine learning capabilities in the new environment. The environment also has versioning, collaboration tools, notebooks that support multiple environments, and job scheduling. SPSS models can be integrated and new models can be built in a drag-and-drop canvas – all on Apache Spark. This can be integrated with Watson ML for deployment.

The CIO of Ameritas came up to tell a customer story. Their focus was on self-service (no data scientists – business and actuarial people), cloud for agility and a vendor with longevity. They use Watson Analytics, SPSS Modeler on the Cloud and Cognos Analytics on the Cloud.  And they are watching Cognitive a lot as they see the application of Cognitive to analytics as a critical next step.

Alistair Rennie introduced some of IBM’s planning and forecasting capabilities. A Cognitive business, he says, is a thinking and agile business. It’s not just about improving decision-making; being able to share and extend that decision-making also drives agility. In particular, he says, this really changes the performance management and planning environment. The platform increasingly connects sales, operations and finance, delivering planning and forecasting capabilities that are more integrated.

Bill Guilmart introduced some clients. Zions Bancorporation came up to discuss their use of the integrated planning and compensation management tools as well as Watson Analytics; they particularly liked the more integrated, more real-time/on-demand approach as well as the better explanations. GCI Corporation (Alaskan telecommunications) also came up and talked about more integrated capital investment management, project tracking and more. Bill gave a quick run-through of the new collaboration and visualization capabilities in the planning and management tools.