Daniel Hernandez kicked things off with a discussion of data for AI. AI adoption, IBM says, is accelerating: 94% of companies believe it is important, but only 5% are adopting it aggressively. To address the issues holding adoption back, IBM introduced its ladder to AI (each rung resting on the one below):

  • Trust
  • based on Automate (ML)
  • based on Analyze (scale insights)
  • based on Organize (trusted foundation)
  • based on Collect (make data simple and accessible)

This implies you need a comprehensive data management strategy that captures all your data, at rest and in motion, in a cloud-like way (COLLECT). Then it requires a data catalog so the data can be understood and relied on (ORGANIZE). Analyzing this data requires an end-to-end stack for machine learning, data science and AI (ANALYZE). IBM Cloud Private for Data is designed to deliver these capabilities virtually everywhere and embeds the various analytic and AI runtimes. This frames the R&D work IBM is doing and where they expect to deliver new capabilities. Specifically:

  • A new free trial version is available (at a very long URL I couldn’t type quickly enough) so you can try it out.
  • Data Virtualization to allow users to query the entire enterprise (in a secure, fast, simple way) as though it were a single database – see the sketch after this list.
  • Deployable on Red Hat OpenShift with a commitment to certify the whole stack on the Red Hat PaaS/IaaS.
  • The partnership with Hortonworks has been extended to bring Hadoop to Docker/Kubernetes on Red Hat.
  • Working with Stack Overflow to support ai.stackexchange.com.
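
Conceptually, data virtualization lets a single query span systems that are physically separate. Here is a minimal sketch of that idea in Python using only the standard library – the table and column names are invented for illustration, and this is in no way IBM’s implementation:

```python
# Two separate "sources" queried through one SQL interface. The attached
# databases stand in for physically separate systems; names are made up.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("ATTACH DATABASE ':memory:' AS warehouse")
conn.execute("ATTACH DATABASE ':memory:' AS crm")

conn.execute("CREATE TABLE warehouse.orders (customer_id INT, amount REAL)")
conn.execute("CREATE TABLE crm.customers (customer_id INT, name TEXT)")
conn.executemany("INSERT INTO warehouse.orders VALUES (?, ?)",
                 [(1, 120.0), (2, 75.5), (1, 30.0)])
conn.executemany("INSERT INTO crm.customers VALUES (?, ?)",
                 [(1, "Acme"), (2, "Globex")])

# One query spans both "systems" as though they were a single database.
for name, total in conn.execute("""
        SELECT c.name, SUM(o.amount)
        FROM crm.customers c JOIN warehouse.orders o USING (customer_id)
        GROUP BY c.name"""):
    print(name, total)
```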

A demo of ICP for Data in the context of a preventative maintenance scenario followed. Key points of note:

  • All browser-based, of course
  • UI is structured around the steps in the ladder
  • An auto-discovery process ingests metadata and uses AI to discover characteristics; additional metadata can also be crowdsourced (see the sketch after this list)
  • Search is the key metaphor and crosses all the sources defined
  • Supports a rich set of visualization tools
  • Data science capabilities are focused on supporting open source frameworks and also include IBM Research work
  • All models are managed across dev, staging and production, with support for rolling updates and one-click deployment
  • CPLEX is also integrated into the platform for optimization
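
To make the auto-discovery point concrete, here is a hypothetical sketch of column profiling – inferring a type and some simple characteristics from a sample of values. Real catalog products do far more (ML-based classification, lineage, crowdsourced tags) and none of these names or rules come from IBM’s product:

```python
# Infer simple column characteristics from sample values; purely illustrative.
import re

def profile_column(name, values):
    non_null = [v for v in values if v not in (None, "")]
    profile = {"name": name,
               "null_rate": 1 - len(non_null) / len(values),
               "distinct": len(set(non_null))}
    # Crude, rule-based "AI": classify the column from its sample values.
    if all(re.fullmatch(r"-?\d+(\.\d+)?", str(v)) for v in non_null):
        profile["inferred_type"] = "numeric"
    elif all(re.fullmatch(r"[^@\s]+@[^@\s]+\.\w+", str(v)) for v in non_null):
        profile["inferred_type"] = "email"  # a guessed business classification
    else:
        profile["inferred_type"] = "text"
    return profile

print(profile_column("contact", ["a@x.com", "b@y.org", None]))
```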


IBM is hosting an event on its AI strategy.

Rob Thomas kicked off by asserting that all companies need an AI strategy and that getting success out of AI – 81% of projects fail due to data problems – involves a ladder of technology with data at the bottom and AI at the top. It’s also true that many AI projects are “boring”, automating important but unsexy tasks, but Rob points out that this builds ROI and positions you for success.

To deliver AI, IBM has the Watson stack – Watson ML for model execution, Watson Studio to build models (now incorporating Data Science Experience (DSX) and SPSS Modeler), APIs and packaged capabilities. The business value, however, comes from applications – mostly those developed by customers. And this remains IBM’s focus – how to get customers to succeed with AI applications.

Getting Watson and AI embedded into all their applications, and instrumenting those applications to provide data to Watson, is a long-term strategy for IBM.

Time for the first session.

Digital Insurance’s Insurance Analytics and AI event is coming to New York, November 27-28 – a new venue and date, replacing Austin, September 27-28, 2018. This is going to be a great place to learn how to use analytics, machine learning, data science and AI to modernize your insurance business. I’ll post more as the details firm up but I’m really excited about one session I know is happening.

I am participating in a panel on the Role of AI and Analytics in the Modernization of Insurance. I’m joining Craig Bedell, an IBM Industry Academy member for Insurance, as well as two analytic leaders – Hamilton Faris of Northwestern Mutual and Tom Warden of EMPLOYERS. We’ll be sharing our advice on modernizing insurance decision making across sales, underwriting, pricing, claims and much more. We’ll explore the Analytics and AI innovation journey and highlight how insurance firms that combine these efforts with operational decision management efforts are far more likely to succeed in digital transformation. It’s going to be great.

Register here – and do it before August 17 to get the Early Bird rate! See you in NY.

I am giving a webinar on Delivering the Business Value of Analytics, August 14th at 10am Pacific/1pm Eastern.

Many organizations still struggle to get a business return on their investment in advanced analytics. The biggest barrier? An inability to integrate analytics, especially predictive analytics, into frontline systems and business processes. Work with a number of global companies has revealed three critical success factors. By adopting a more decision-centric approach to analytics, changing the way requirements and business understanding are defined, and considering advanced analytics as one of a set of decision-making technologies, organizations can tie their analytics investments to business results and deliver the business value they are looking for.

This is a live webinar based on one of the highest rated sessions at Predictive Analytics World 2018. Attend to ask your questions live or, if you would like to attend but can’t make the webinar time, register to receive a copy of the presentation and a link to the recording.

Register here.

I’m a big believer in decision models using the DMN industry standard notation and Decision Management Solutions uses it on all our projects – we’ve modeled over 3,000 decisions and trained over 1,000 people. But we don’t use executable decision models very often and strongly disagree with those who say the only good decision model is an executable one.

This week I wrote a set of posts over on our corporate blog about the three reasons we don’t – business user engagement, maintenance, and analytics/AI – and about our vision of a virtual decision.

Check them out.

Karl Rexer of Rexer Analytics is at Predictive Analytics World this week (as am I) and he gave some quick highlights from the 2017 Rexer Analytics Data Science Survey. They’ve been running the survey since 2007 (and I have blogged about it regularly); the 2017 edition is the 8th, with 1,123 responses from 91 countries. Full details will be released soon but he highlighted some interesting facts:

  • Formal data science training is important to respondents (75% or so) with particular concerns about data preparation and misinterpreting results when people don’t have formal training.
  • Only about one third have seen problems with DIY analytic and data science tools, which is pretty good and getting better.
  • Most data scientists use multiple tools – an average of 5, still – with SQL, R and Python dominating across the board outside of academia.
  • R has shown rapid growth over the last few years with more usage and more primary usage every year and RStudio is now the dominant environment.
  • While there’s lots of interest in “deep learning”, 2/3 have not used deep learning at all with only 2% using it a lot so it’s not really a thing yet.
  • Job satisfaction is good and most data scientists are confident they could find a new job – not a big surprise.
  • People agree that a wide range of skills are needed with domain knowledge scoring very highly as important. Despite this recognition everyone still wants to learn technical skills first and foremost!

Looking forward to getting the full results.

We are relaunching our newsletter, in a GDPR-compliant way, with a focus on DecisionsFirst Digital Transformation. We were probably GDPR-compliant before but better safe than sorry.

We send emails about every 3 or 4 weeks with information about resources, events and articles on Decision Management topics. For example, case studies in digital transformation; decision modeling as a best practice for getting business value from your technology investments in business rules, predictive analytics, AI, and machine learning; upcoming webinars and events; training and more.

If you used to get it, or you want to start getting it, please sign up here.

An analytic enterprise uses analytics to solve its most critical run-the-business problems. It takes advantage of new tools and new data sources while ensuring analytic results are used in the real world. This is the third of three blog posts about how to become an analytic enterprise:

  1. Focus on business decision-making.
  2. Move beyond reporting to predict, prescribe, and decide.
  3. Use analytics to learn, adapt, and improve (this post).

Analytic enterprises don’t focus on big wins but on using analytics to learn what works, to adapt decision-making, and continuously improve results.

An analytic enterprise collects data about how it makes decisions and about the outcome of those decisions. It records the data that drove its analytics and the analytics that drove its decisions. Business outcomes may be recorded as structured data, as social media posts, as unstructured text, as web activity or even sensor data. This varied data is matched to decision-making and analytics so that the true impact of the analytics that drove those decisions can be assessed.

This continuous monitoring identifies new opportunities for analytics, which analytics need a refresh, where ML and AI might be valuable, and much more. An analytic enterprise learns from this data and moves rapidly to ensure that its decision-making is updated effectively. Its analytic platform links data, analytics and outcomes so it can close the loop and continuously improve.
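
As a minimal sketch of what closing the loop means in practice, the Python below logs each decision with the score that drove it, logs outcomes separately, and joins the two so the analytic’s real impact can be assessed. All names and values are illustrative:

```python
# Join decisions (and the scores that drove them) to their eventual outcomes.
decision_log = [
    {"case_id": 1, "score": 0.91, "decision": "approve"},
    {"case_id": 2, "score": 0.35, "decision": "refer"},
]
outcome_log = [
    {"case_id": 1, "outcome": "repaid"},
    {"case_id": 2, "outcome": "defaulted"},
]

outcomes = {o["case_id"]: o["outcome"] for o in outcome_log}
for d in decision_log:
    # Matching outcomes back to the decision and its driving score is what
    # lets the enterprise assess, and eventually retrain, its analytics.
    print(d["case_id"], d["decision"], d["score"], outcomes.get(d["case_id"]))
```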

Check out this video on how analytic enterprises learn, adapt and improve and download the new white paper Building an Analytic Enterprise (sponsored by the Teradata Analytics Platform).

An analytic enterprise uses analytics to solve its most critical run-the-business problems. It takes advantage of new tools and new data sources while ensuring analytic results are used in the real world. This is the second of three blog posts about how to become an analytic enterprise:

  1. Focus on business decision-making.
  2. Move beyond reporting to predict, prescribe, and decide (this post).
  3. Use analytics to learn, adapt, and improve.

Historically, most enterprises have focused on analytics for reporting and monitoring. Success as an analytic enterprise means using analytics to enable better, more data-driven decisions. This means shifting to predictive analytics that identify what is likely to happen in the future and prescriptive analytics that suggest the most appropriate decision or action. Predictive analytics give analytic enterprises a view ahead, so they can decide in a way that takes advantage of a fleeting opportunity or mitigates a potential risk.

Many of the decisions best suited to analytic improvement are operational decisions that are increasingly automated and embedded in IT infrastructure. If these decisions are to be improved, multiple predictive and prescriptive analytics must often be combined in a real-time system.

An analytic enterprise needs an analytic platform and data infrastructure that supports both predictive analytics and automation. It uses its analytic platform to embed predictive and prescriptive analytics in highly automated, real-time decisions.
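
As a hedged illustration of embedding predictive and prescriptive analytics in one automated decision, here is a sketch in Python. The “model” is a hand-coded stand-in for a real predictive analytic, and the thresholds and actions are invented:

```python
# Predictive layer: a stand-in scoring function returning a risk estimate.
def churn_risk(customer):
    score = 0.2
    if customer["months_since_purchase"] > 6:
        score += 0.4
    if customer["support_tickets"] > 3:
        score += 0.3
    return min(score, 1.0)

# Prescriptive layer: business rules turn the prediction into an action.
def retention_decision(customer):
    risk = churn_risk(customer)
    if risk > 0.7 and customer["lifetime_value"] > 1000:
        return "offer_discount"
    if risk > 0.7:
        return "send_reminder"
    return "no_action"

print(retention_decision({"months_since_purchase": 8,
                          "support_tickets": 5,
                          "lifetime_value": 2500}))
```

In a real-time deployment, logic like this would sit behind a decision service so that frontline systems can invoke it on every transaction.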

Check out this video on how analytic enterprises predict, prescribe and decide and download the new white paper Building an Analytic Enterprise (sponsored by the Teradata Analytics Platform).

An analytic enterprise uses analytics to solve its most critical run-the-business problems. It takes advantage of new tools and new data sources while ensuring analytic results are used in the real world. This is the first of three blog posts about how to become an analytic enterprise:

  1. Focus on business decision-making (this post).
  2. Move beyond reporting to predict, prescribe, and decide.
  3. Use analytics to learn, adapt, and improve.

Success in analytics means being business-led, not technology-led. Analytic projects that focus on data or algorithms prioritize being able to start quickly over business value. In contrast, a focus on improving business decision-making keeps business value front and center.

A focus on decision-making also acts as a touchstone, preventing a chase after the next shiny object. It provides a business justification for the data, tools, and algorithms that will be required.

Modeling the decision-making in a visual way, using a notation like the industry standard Decision Model and Notation (DMN), breaks down complex decisions and shows what analytic insight will help make the decision more effectively. These models show the impact of expertise, policies and regulations while also clearly showing what data is used in the decision.
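
To show what that decomposition looks like, here is a sketch that mirrors the structure of a DMN decision requirements diagram in plain Python – a top decision supported by two sub-decisions. The decision logic is invented; in a real model, each element would also link to its knowledge sources and input data:

```python
def affordability(income, obligations):      # sub-decision
    return income - obligations > 500

def risk_category(credit_score):             # sub-decision, typically backed
    return "low" if credit_score >= 680 else "high"  # by a knowledge source

def loan_eligibility(income, obligations, credit_score):  # top decision
    # Information requirements: the two sub-decisions feed the top decision.
    if affordability(income, obligations) and risk_category(credit_score) == "low":
        return "eligible"
    return "refer"

print(loan_eligibility(income=4000, obligations=2800, credit_score=710))
```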

When a decisions-first approach is combined with a flexible analytic platform, analytic teams are released from the constraints of their current tools or siloed data to focus on business value.

Check out this video on how analytic enterprises put business decisions first and download the new white paper Building an Analytic Enterprise (sponsored by the Teradata Analytics Platform).

A recent article on Dig-in talked about How insurers can think strategically about AI. It contained a killer quote from Chris Cheatham of RiskGenius:

A lot of times people jump in and try AI without understanding the problem they’re trying to solve. Find the problem first, then figure out if AI can solve it, and what else you need to get the full solution.

Exactly. We have found that focusing on decisions, not a separate AI initiative, delivers business value and a strong ROI. Only if you can define the decision you want to improve with AI, and build a real model of how that decision works – a decision model – can you put AI to work effectively. Check out our white paper on the approach – How to Succeed with AI – or drop us a line to learn more.

John Rymer and Mike Gualtieri of Forrester Research have just published a new piece of research – The Dawn Of Digital Decisioning: New Software Automates Immediate Insight-To-Action Cycles Crucial For Digital Business. This is a great paper – not only does it mention some of Decision Management Solutions’ clients as examples, it makes some great points about the power of Decision Management and some great recommendations about how best to approach Digital Decisioning.

Digital decisioning software capitalizes on analytical insights and machine learning models about customers and business operations to automate actions (including advising a human agent) for individual customers through the right channels.

In particular I liked the paper’s emphasis on keeping business rules and analytics/ML/AI integrated and its reminder to focus first on decisions (especially operational decisions) and not analytic insights. These are key elements in our DecisionsFirst methodology and platform and have proven themselves repeatedly in customer projects – including those mentioned in the report.

Our DecisionsFirst approach begins by discovering and modeling these operational decisions, then automating them as decision services and finally, as also noted in the report, creating a learn-and-improve feedback loop. As Mike and John suggest, we combine business rules, analytics and AI into highly automated services for decision-making and then tie this to business performance using decision models.

It’s a great report and one you should definitely read. I’ll leave you with one final quote from it:

Enterprises waste time and money on unactionable analytics and rigid applications. Digital decisioning can stop this insanity.

You can get the paper here if you are a Forrester client.

ACTICO has just released ACTICO Modeler 8 – the latest version of the product previously known as Visual Rules for Finance (see most recent review here). ACTICO Modeler is a project-based IDE. ACTICO users can now select whether to create a “classic” Rule Modeling project or a Decision Model and Notation (DMN) project. The DMN modeler supports Decision Requirements Diagrams, Business Knowledge Models (BKMs) and the full FEEL syntax.

Decision Requirements Diagrams are built using drag and drop or by working out from existing diagram elements. When a Decision, Input Data, Knowledge Source or BKM is selected its properties can be filled out and this includes linking to other objects, like organizational units, that are managed in the project. A decision model supports multiple diagrams on which objects can be reused – users can drag existing model objects from the project repository structure or search for them. Decisions, Input Data, Knowledge Sources and BKMs are genuinely shared across all the diagrams in a model’s project. Any change on one diagram is immediately reflected on all other diagrams.

Existing DMN models can be imported simply by dropping DMN XML files into the environment. As DMN 1.1 models don’t include diagram interchange information, users can simply add a new diagram to an imported project and drag elements onto it as needed.

All boxed expressions and full FEEL are supported – literal expressions, contexts, invocations, lists, relations, function definitions and decision tables. Validation is applied as syntax is edited using the classic squiggly red underline and supporting hints to correct it. A problems view summarizes all the problems in the current model and this is dynamic, updating as the model is edited. The core FEEL validations are in the product already and more are planned in coming releases.
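
For readers less familiar with DMN decision tables, the sketch below illustrates the semantics of a table with a UNIQUE hit policy – generic DMN behavior rather than ACTICO-specific code, with invented rules:

```python
# Each rule is a row: input tests plus an output. UNIQUE hit policy means
# exactly one rule may match any given set of inputs.
RULES = [
    (lambda age: age < 18,  lambda inc: True,        "decline"),
    (lambda age: age >= 18, lambda inc: inc < 2000,  "refer"),
    (lambda age: age >= 18, lambda inc: inc >= 2000, "accept"),
]

def decide(age, income):
    hits = [out for age_test, inc_test, out in RULES
            if age_test(age) and inc_test(income)]
    assert len(hits) == 1, "UNIQUE hit policy violated"
    return hits[0]

print(decide(age=30, income=2500))
```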

Decision services can be defined using their own diagram, allowing the user to show which decisions should be included in the decision service and which ones are invokable. All the information requirements that flow across the decision service boundary are defined. Each decision service has its own diagram and the relevant decisions are dragged from the project to create the decision service. The decision service can be invoked from the ACTICO classic rule representation. This allows, for instance, test cases to be reused and allows new DMN models to be deployed and managed using the existing server architecture. Individual decisions and BKMs can be tested using the same mechanism.

A view of the ACTICO DMN Modeler showing a Decision Requirements Diagram and a Decision Table for one of the Business Knowledge Models displayed.

You can get more information on the ACTICO DMN Modeler here.

Jim Sinur of Aragon Research recently published a new blog post, Mounting Pressure for Better Decisions. He argues, correctly, that decision-making is under pressure: there is more data available than ever before, organizations need to change how they make decisions more quickly in response to evolving circumstances, and there is a general need for speed in handling transactions.

We help companies improve decision-making by applying our DecisionsFirst Decision Management approach and by building Decision Management Systems for them. Combining decision models (built using the Decision Model and Notation or DMN standard) with powerful business rules management systems, advanced analytics (machine learning, predictive analytics) and AI, we help companies see a set of unique benefits:

  • Improved Consistency
    Decision models enable consistent decision-making across channels and people without imposing mindless consistency.
  • Increased Agility
    The systems we build are easy for the business to change in response to new business conditions because the business understands the decision models and owns the business rules that drive the system.
  • Reduced Latency
    The combination of business rules and advanced analytics enables higher rates of straight-through processing (automation) while also ensuring more clarity and less confusion for the transactions that must be handled manually.
  • Lower Cost
    Decision Management Systems reduce costs by ensuring less waste and rework, more STP and fewer manual touches.
  • Better Accuracy
    Decision Management Systems operationalize data-driven, analytical decisions throughout the organization to improve the accuracy of decisions everywhere.

If you are interested in learning more about Decision Management and the technology available for it, check out our Decision Management Systems Platform Technology Report or contact us for a free consultation.

I am co-hosting the TDWI Solution Summit Putting Big Data and Analytics to Work in Your Organization with Fern Halper, June 3-5 in Coronado, California. TDWI Solution Summits are great events and I am looking forward to it.

The agenda is focused on how to get full business value and organizational advantage from big data and analytics. Modern environments for analytics are complex but they are also an opportunity to develop a data-driven business.

I am kicking off the first full day with a session on Taking Advanced Analytics from Technical Investment to Business Value while helping Fern host the event and moderating a panel on building a successful program.

This is an invite-only event – apply here by April 6 if you are interested.

This year Predictive Analytics World is coming to Las Vegas June 3-7, 2018, for the largest Predictive Analytics World event ever. There’s a packed agenda and I am kicking off the Analytics operationalization and management track, which is full of great sessions focused on how to take analytics (and AI) and get real business value from them.

My session is on Delivering the Business Value of Analytics and I’m followed by some great sessions by Northwestern Mutual on overcoming obstacles and Quicken Loans on closing the communication gap. On the second day I am moderating a panel on Operationalizing Machine Learning: How to Ensure Value-Driven Deployment. There are sessions on recruiting, training and managing your data science practice – including one from my friends at Cisco – as well as discussions of analytic maturity – including one by Bill Franks, Chief Analytics Officer of the International Institute for Analytics (of which I am a faculty member).

There’s a ton of good material here, especially for those of you more interested in how to integrate analytics and data science into your business and IT architecture (and perhaps less interested in the mechanics of building models – something well covered in the other tracks).

I hope to see you there. Register here and use the code SPEAKERtaylor to get a 20% discount.

Maureen Fleming of IDC presented at IDC Directions on How Does Decision-Centric Computing Drive Digital Transformation? She kindly shared this presentation with me. Decision-centric computing, she says:

continuously receives and analyzes data to predict when decisions need to be made, systematically learns how to automate those decisions, and acts on each decision to improve performance.

Exactly. We call these Decision Management Systems but the concept is the same.

While the presentation focused on IoT and streaming scenarios, the concepts can be applied more generally – after all, many business scenarios are heading to a streaming solution. The most interesting piece was a graph titled “Predictive Analytics is Only a Piece of the Puzzle”. It shows that organizations that are mature in terms of predictive analytics use business rules a lot (70%), those that are in production with something use business rules a little (24%) and those that are stuck in development are not using them very much at all (5%).

This illustrates a point we make with analytics clients – a business rules management system is a great platform for deploying predictive analytics, especially when you apply Decision Management principles and decision modeling to build the rules in a decisions-first way.

For IDC subscribers, Maureen has written Introducing Decision-Centric Computing which has another great quote:

Without a way to incorporate decision automation to make repetitive decisions, enterprises will find it increasingly difficult to justify their investments in advanced analytics and risk failure to materialize the anticipated benefits.

Decision Management is a proven approach to delivering Decision-Centric Computing and using a Decisions First methodology effectively combines business rules and predictive analytics using decision modeling. What are you waiting for?

Jim Sinur, VP of Research and Aragon Fellow at Aragon Research, recently posted “Better Decisions with Decision Management” to his blog. Jim begins by describing Decision Management as “another discipline that will help consistently deliver better decisions”, especially when added to analytics and AI.

It’s great to have Jim’s focus turn to a Decision Management Framework and a Decision Management Platform – we are excited to see what he comes up with.

Of course we use Decision Management on all our projects, applying our unique Decisions First approach to ensure success. Check out the Decision Management Manifesto for our philosophy and if you want our take on a Decision Management Platform, check out the Decision Management Systems Platform Technologies Report with lots of detail on current technology and approaches, use cases and more.

Analytics are only valuable if your enterprise’s decision making changes for the better. You need to build an analytic enterprise that leverages analytics to inform strategy, empower people, and especially drive systems. An analytic enterprise uses analytics to solve its most critical run-the-business problems, and uses increasingly advanced and diverse analytics to maximize its ability to get value from data.

There are three critical success factors for building an analytic enterprise – focusing on business decisions, moving to predictive and prescriptive analytics, and focusing on continuous improvement rather than one-time big wins. You can learn more about how and why to become an analytic enterprise in this white paper, Building An Analytic Enterprise, and the associated webinar recording here.

This research was sponsored by Teradata.

Customers, IBM says, are moving to the cloud but they are transitioning through a hybrid solution. IBM is investing heavily in its cloud in terms of partnerships, technology, patents, volume, data centers, etc. They announced two new partnerships this week – Cloudflare and New Relic.

The One Cloud architecture is particularly focused on AI and analytics enablement – cloud infrastructure that assumes you want to use the data on the cloud to drive analytics and AI. It’s also very API-centric and designed to be managed programmatically. Plus the Watson APIs are fully integrated along with the various data capabilities IBM has been developing for its cloud.

IBM Cloud Private is IBM’s key platform for modernizing applications. They are adding capabilities around application transformation, developer tools and the data cloud. Integration across multiple clouds and deployment automation/management are also focus areas.

Transformation Advisor scans existing applications to assess the complexity of migrating an existing application to a container-based environment. If possible, it will automate the transformation to containers. Once containerized, the IBM Cloud Private catalog allows these applications and standard ones to be deployed to multiple instances and provides monitoring for them once deployed. Applications can also be pushed to public clouds and monitored there. Plus, of course, there’s a command line interface for all this.

All good stuff. Of course you should also think about replacing all that hard-wired code with decision-centric business rules too….