Karl Rexer of Rexer Analytics is at Predictive Analytics World this week (as am I) and he gave some quick highlights from the 2017 Rexer Analytics Data Science Survey. They've been running the survey since 2007 (and I have blogged about it regularly); the 2017 edition is the 8th, with 1,123 responses from 91 countries. Full details will be released soon but he highlighted some interesting facts:

  • Formal data science training is important to respondents (around 75%), with particular concerns about poor data preparation and misinterpreted results when people lack formal training.
  • Only about one third have seen problems with DIY analytics and data science tools, which is pretty good – and it's getting better.
  • Most data scientists use multiple tools – an average of 5, still – with SQL, R and Python dominating across the board outside of academia.
  • R has shown rapid growth over the last few years, with more usage and more primary usage every year, and RStudio is now the dominant environment.
  • While there’s lots of interest in “deep learning”, 2/3 have not used deep learning at all and only 2% use it a lot – so it’s not really mainstream yet.
  • Job satisfaction is good and most data scientists are confident they could find a new job – not a big surprise.
  • People agree that a wide range of skills is needed, with domain knowledge scoring very highly. Despite this recognition, everyone still wants to learn technical skills first and foremost!

Looking forward to getting the full results.

We are relaunching our newsletter, in a GDPR-compliant way, with a focus on DecisionsFirst Digital Transformation. We were probably GDPR-compliant before but better safe than sorry.

We send emails about every 3 or 4 weeks with information about resources, events and articles on Decision Management topics. For example, case studies in digital transformation; decision modeling as a best practice for getting business value from your technology investments in business rules, predictive analytics, AI, and machine learning; upcoming webinars and events; training and more.

If you used to get it, or you want to start getting it, please sign up here.

An analytic enterprise uses analytics to solve its most critical run-the-business problems. It takes advantage of new tools and new data sources while ensuring analytic results are used in the real world. This is the third of three blog posts about how to become an analytic enterprise:

  1. Focus on business decision-making.
  2. Move beyond reporting to predict, prescribe, and decide.
  3. Use analytics to learn, adapt, and improve (this post).

Analytic enterprises don’t focus on big wins but on using analytics to learn what works, adapt decision-making, and continuously improve results.

An analytic enterprise collects data about how it makes decisions and about the outcome of those decisions. It records the data that drove its analytics and the analytics that drove its decisions. Business outcomes may be recorded as structured data, as social media posts, as unstructured text, as web activity or even sensor data. This varied data is matched to decision-making and analytics so that the true impact of the analytics that drove those decisions can be assessed.

This continuous monitoring identifies new opportunities for analytics, which analytics need a refresh, where ML and AI might be valuable, and much more. An analytic enterprise learns from this data and moves rapidly to ensure that its decision-making is updated effectively. Its analytic platform links data, analytics and outcomes so it can close the loop and continuously improve.
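To make this closed loop concrete, here is a minimal sketch in Python – entirely illustrative, with invented names and structures – of a decision log that ties the input data and the analytic that scored it to the action taken and, later, the observed outcome:

```python
import datetime
import uuid

# Illustrative decision log: each record links the input data, the analytic
# that drove the decision, the action taken and (once known) the outcome.
decision_log = []

def record_decision(input_data, model_version, score, action):
    """Log a decision along with the data and analytic that drove it."""
    record = {
        "decision_id": str(uuid.uuid4()),
        "timestamp": datetime.datetime.utcnow().isoformat(),
        "input_data": input_data,
        "model_version": model_version,  # which analytic produced the score
        "score": score,
        "action": action,
        "outcome": None,                 # filled in later, closing the loop
    }
    decision_log.append(record)
    return record["decision_id"]

def record_outcome(decision_id, outcome):
    """Match an observed business outcome back to the decision that drove it."""
    for record in decision_log:
        if record["decision_id"] == decision_id:
            record["outcome"] = outcome

# A retention decision, later matched to its outcome so the impact of the
# model version that drove it can be assessed.
decision_id = record_decision({"customer": "C123", "months_inactive": 5},
                              model_version="churn-v7", score=0.82,
                              action="retention_offer")
record_outcome(decision_id, {"renewed": True})
```

With outcomes matched to decisions like this, results can be compared by model version – exactly the kind of assessment that closing the loop enables.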

Check out this video on how analytic enterprises learn, adapt and improve and download the new white paper Building an Analytic Enterprise (sponsored by the Teradata Analytics Platform).

An analytic enterprise uses analytics to solve its most critical run-the-business problems. It takes advantage of new tools and new data sources while ensuring analytic results are used in the real world. This is the second of three blog posts about how to become an analytic enterprise:

  1. Focus on business decision-making
  2. Move beyond reporting to predict, prescribe, and decide (this post).
  3. Use analytics to learn, adapt, and improve.

Historically, most enterprises have focused on analytics for reporting and monitoring. Success as an analytic enterprise means using analytics to enable better, more data-driven decisions. This means shifting to predictive analytics that identify what is likely to happen in the future and prescriptive analytics that suggest the most appropriate decision or action. Predictive analytics give analytic enterprises a view ahead, so they can decide in a way that takes advantage of a fleeting opportunity or mitigates a potential risk.

Many of the decisions best suited to analytic improvement are operational decisions that are increasingly automated and embedded in IT infrastructure. If these decisions are to be improved, multiple predictive and prescriptive analytics must often be combined in a real-time system.

An analytic enterprise needs an analytic platform and data infrastructure that supports both predictive analytics and automation. It uses its analytic platform to embed predictive and prescriptive analytics in highly automated, real-time decisions.
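As a sketch of the pattern (not any particular product's API – the thresholds and names are invented), here is how a predictive score and prescriptive business rules might combine in a single real-time decision:

```python
def predict_churn_risk(customer):
    """Stand-in for a deployed predictive model returning a probability."""
    # A real implementation would call a scoring engine or model API.
    return 0.82 if customer["months_inactive"] > 3 else 0.10

def decide_retention_action(customer):
    """Prescriptive layer: business rules applied to the predicted risk."""
    risk = predict_churn_risk(customer)
    if risk > 0.75 and customer["lifetime_value"] > 10_000:
        return "route_to_agent"        # high risk, high value: human touch
    if risk > 0.75:
        return "send_discount_offer"   # high risk: automated offer
    return "no_action"                 # low risk: leave well alone

# Embedded in an operational system, this runs on every transaction in real time.
print(decide_retention_action({"months_inactive": 5, "lifetime_value": 25_000}))
```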

Check out this video on how analytic enterprises predict, prescribe and decide and download the new white paper Building an Analytic Enterprise (sponsored by the Teradata Analytics Platform).

An analytic enterprise uses analytics to solve its most critical run-the-business problems. It takes advantage of new tools and new data sources while ensuring analytic results are used in the real world. This is the first of three blog posts about how to become an analytic enterprise:

  1. Focus on business decision-making (this post).
  2. Move beyond reporting to predict, prescribe, and decide.
  3. Use analytics to learn, adapt, and improve.

Success in analytics means being business-led, not technology-led. Analytic projects that focus on data or algorithms prioritize being able to start quickly over business value. In contrast, a focus on improving business decision-making keeps business value front and center.

A focus on decision-making also acts as a touchstone, preventing a chase after the next shiny object. It provides a business justification for the data, tools, and algorithms that will be required.

Modeling the decision-making in a visual way, using a notation like the industry-standard Decision Model and Notation (DMN), breaks down complex decisions and shows what analytic insight will help make the decision more effectively. These models show the impact of expertise, policies and regulations while also clearly showing what data is used in the decision.
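To illustrate the idea (a plain-Python sketch with invented names, not the DMN interchange format), a decision requirements model captures which sub-decisions, input data and knowledge sources a decision depends on:

```python
# An illustrative rendering of a small DMN-style decision requirements model:
# a top-level decision, the sub-decisions and input data it requires, and the
# knowledge sources (policies, regulations, expertise) that shape it.
decision_model = {
    "decision": "Determine Loan Eligibility",
    "requires_decisions": ["Assess Affordability", "Assess Credit Risk"],
    "requires_input_data": ["Application", "Credit Report"],
    "knowledge_sources": ["Lending Policy", "Applicable Regulations"],
}

# Walking the model shows exactly which data and expertise the decision
# depends on - the visibility a DMN diagram gives a project team.
for element, values in decision_model.items():
    print(f"{element}: {values}")
```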

When a decisions-first approach is combined with a flexible analytic platform, analytic teams are freed from the constraints of their current tools or siloed data to focus on business value.

Check out this video on how analytic enterprises put business decisions first and download the new white paper Building an Analytic Enterprise (sponsored by the Teradata Analytics Platform).

A recent article on Dig-in talked about How insurers can think strategically about AI. It contained a killer quote from Chris Cheatham of RiskGenius:

A lot of times people jump in and try AI without understanding the problem they’re trying to solve. Find the problem first, then figure out if AI can solve it, and what else you need to get the full solution.

Exactly. We have found that focusing on decisions, not a separate AI initiative, delivers business value and a strong ROI. Only if you can define the decision you want to improve with AI, and build a real model of how that decision works – a decision model – can you put AI to work effectively. Check out our white paper on the approach – How to Succeed with AI – or drop us a line to learn more.

John Rymer and Mike Gualtieri of Forrester Research have just published a new piece of research – The Dawn Of Digital Decisioning: New Software Automates Immediate Insight-To-Action Cycles Crucial For Digital Business. This is a great paper – not only does it mention some of Decision Management Solutions’ clients as examples, it makes some great points about the power of Decision Management and some great recommendations about how best to approach Digital Decisioning.

Digital decisioning software capitalizes on analytical insights and machine learning models about customers and business operations to automate actions (including advising a human agent) for individual customers through the right channels.

In particular I liked the paper’s emphasis on keeping business rules and analytics/ML/AI integrated and its reminder to focus first on decisions (especially operational decisions) and not analytic insights. These are key elements in our DecisionsFirst methodology and platform and have proven themselves repeatedly in customer projects – including those mentioned in the report.

Our DecisionsFirst approach begins by discovering and modeling these operational decisions, then automating them as decision services and finally, as also noted in the report, creating a learn and improve feedback loop.
As Mike and John suggest, we combine business rules, analytics and AI into highly automated services for decision-making then tie this to business performance using decision models.

It’s a great report and one you should definitely read. I’ll leave you with one final quote from it:

Enterprises waste time and money on unactionable analytics and rigid applications. Digital decisioning can stop this insanity.

You can get the paper here if you are a Forrester client.

ACTICO has just released ACTICO Modeler 8 – the latest version of the product previously known as Visual Rules for Finance (see most recent review here). ACTICO Modeler is a project-based IDE. ACTICO users can now select whether to create a “classic” Rule Modeling project or a Decision Model and Notation (DMN) project. The DMN modeler supports Decision Requirements Diagrams, Business Knowledge Models (BKMs) and the full FEEL syntax.

Decision Requirements Diagrams are built using drag and drop or by working out from existing diagram elements. When a Decision, Input Data, Knowledge Source or BKM is selected, its properties can be filled out, including links to other objects, like organizational units, that are managed in the project. A decision model supports multiple diagrams on which objects can be reused – users can drag existing model objects from the project repository structure or search for them. Decisions, Input Data, Knowledge Sources and BKMs are genuinely shared across all the diagrams in a model’s project: any change on one diagram is immediately reflected on all the others.

Existing DMN models can be imported simply by dropping DMN XML files into the environment. As DMN 1.1 models don’t have diagrams, users can simply add a new diagram to an imported project and drag elements on to it as needed.

All boxed expressions and full FEEL are supported – literal expressions, contexts, invocations, lists, relations, function definitions and decision tables. Validation is applied as syntax is edited, using the classic squiggly red underline plus hints on how to correct the problem. A problems view summarizes all the problems in the current model and is dynamic, updating as the model is edited. The core FEEL validations are in the product already and more are planned for coming releases.

Decision services can be defined using their own diagram, allowing the user to show which decisions are included in the decision service and which ones are invokable. All the information requirements that flow across the decision service boundary are defined. Each decision service has its own diagram and the relevant decisions are dragged from the project to create the decision service. The decision service can be invoked from the ACTICO classic rule representation. This allows, for instance, test cases to be reused, and lets new DMN models be deployed and managed using the existing server architecture. Individual decisions and BKMs can be tested using the same mechanism.
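Conceptually – and this is an illustrative Python sketch, not ACTICO's actual API – a decision service exposes selected decisions as invokable entry points, with the information requirements that cross the service boundary as its required inputs:

```python
class DecisionService:
    """Illustrative DMN-style decision service: a named set of decisions,
    some of which are exposed (invokable) across the service boundary."""

    def __init__(self, name, invokable, required_inputs):
        self.name = name
        self.invokable = set(invokable)              # decisions callers may invoke
        self.required_inputs = set(required_inputs)  # inputs crossing the boundary

    def invoke(self, decision, inputs):
        if decision not in self.invokable:
            raise ValueError(f"'{decision}' is not exposed by {self.name}")
        missing = self.required_inputs - inputs.keys()
        if missing:
            raise ValueError(f"missing required inputs: {sorted(missing)}")
        # A real engine would evaluate the decision's logic (FEEL expressions,
        # decision tables, BKMs); here we just confirm the contract holds.
        return f"evaluated '{decision}'"

service = DecisionService("Eligibility Service",
                          invokable=["Determine Eligibility"],
                          required_inputs=["Application", "Credit Report"])
print(service.invoke("Determine Eligibility",
                     {"Application": {"amount": 5000},
                      "Credit Report": {"score": 710}}))
```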

A view of the ACTICO DMN Modeler, showing a Decision Requirements Diagram and a Decision Table for one of the Business Knowledge Models.

You can get more information on the ACTICO DMN Modeler here.

Jim Sinur of Aragon Research recently published a new blog post, Mounting Pressure for Better Decisions. He argues, correctly, that decision-making is under pressure: there is more data available than ever before, organizations need to change how they make decisions faster in response to evolving circumstances, and there is a general need for speed in handling transactions.

We help companies improve decision-making by applying our DecisionsFirst Decision Management approach and by building Decision Management Systems for them. Combining decision models (built using the Decision Model and Notation or DMN standard) with powerful business rules management systems, advanced analytics (machine learning, predictive analytics) and AI, we help companies see a set of unique benefits:

  • Improved consistency
    Decision models enable consistent decision making across channels and people without imposing mindless consistency.
  • Increased Agility
    The systems we build are easy for the business to change in response to new business conditions because the business understands the decision models and owns the business rules that drive the system.
  • Reduced Latency
    The combination of business rules and advanced analytics enables higher rates of straight through processing (automation) while also ensuring more clarity and less confusion for the transactions that must be handled manually.
  • Lower Cost
    Decision Management Systems reduce costs by ensuring less waste and rework, more STP and fewer manual touches.
  • Better Accuracy
    Decision Management Systems operationalize data-driven, analytical decisions throughout the organization to improve the accuracy of decisions everywhere.

If you are interested in learning more about Decision Management and the technology available for it, check out our Decision Management Systems Platform Technology Report or contact us for a free consultation.

Maureen Fleming of IDC presented at IDC Directions on How Does Decision-Centric Computing Drive Digital Transformation? She kindly shared this presentation with me. Decision-centric computing, she says:

continuously receives and analyzes data to predict when decisions need to be made, systematically learns how to automate those decisions, and acts on each decision to improve performance.

Exactly. We call these Decision Management Systems but the concept is the same.

While the presentation focused on IoT and streaming scenarios, the concepts can be applied more generally – after all, many business scenarios are heading toward a streaming solution. The most interesting piece was a graph titled “Predictive Analytics is Only a Piece of the Puzzle”. It shows that organizations that are mature in predictive analytics use business rules a lot (70%), those that have something in production use business rules a little (24%), and those that are stuck in development barely use them at all (5%).

This illustrates a point we make with analytics clients – a business rules management system is a great platform for deploying predictive analytics, especially when you apply Decision Management principles and decision modeling to develop the rules in a decisions-first way.

For IDC subscribers, Maureen has written Introducing Decision-Centric Computing which has another great quote:

Without a way to incorporate decision automation to make repetitive decisions, enterprises will find it increasingly difficult to justify their investments in advanced analytics and risk failure to materialize the anticipated benefits.

Decision Management is a proven approach to delivering Decision-Centric Computing and using a Decisions First methodology effectively combines business rules and predictive analytics using decision modeling. What are you waiting for?

Jim Sinur, VP of Research and Aragon Fellow at Aragon Research, recently posted “Better Decisions with Decision Management” to his blog. Jim begins by describing Decision Management as “another discipline that will help consistently deliver better decisions”, especially when added to analytics and AI.

It’s great to have Jim’s focus turn to a Decision Management Framework and a Decision Management Platform – we are excited to see what he comes up with.

Of course we use Decision Management on all our projects, applying our unique Decisions First approach to ensure success. Check out the Decision Management Manifesto for our philosophy and if you want our take on a Decision Management Platform, check out the Decision Management Systems Platform Technologies Report with lots of detail on current technology and approaches, use cases and more.

Analytics are only valuable if your enterprise’s decision making changes for the better. You need to build an analytic enterprise that leverages analytics to inform strategy, empower people, and especially drive systems. An analytic enterprise uses analytics to solve its most critical run-the-business problems, and uses increasingly advanced and diverse analytics to maximize its ability to get value from data.

There are three critical success factors for building an analytic enterprise – focusing on business decisions, moving to predictive and prescriptive analytics, and focusing on continuous improvement rather than one-time big wins. You can learn more about how and why to become an analytic enterprise in this white paper Building An Analytic Enterprise and the associated webinar recording here.

This research was sponsored by Teradata.

Customers, IBM says, are moving to the cloud but they are transitioning through hybrid solutions. IBM is investing heavily in its cloud in terms of partnerships, technology, patents, volume, data centers and more. They announced two new partnerships this week – Cloudflare and New Relic.

The One Cloud architecture is particularly focused on AI and analytics enablement – cloud infrastructure that assumes you want to use the data on the cloud to drive analytics and AI. It’s also very API-centric and designed to be managed programmatically. Plus the Watson APIs are fully integrated along with the various data capabilities IBM has been developing for its cloud.

IBM Cloud Private is IBM’s key platform for modernizing applications. They are adding capabilities around application transformation, developer tools and the data cloud. Integration across multiple clouds and deployment automation /management are focus areas also.

Transformation Advisor scans existing applications to assess the complexity of migrating each one to a container-based environment. Where possible, it automates the transformation to containers. Once applications are containerized, the IBM Cloud Private catalog allows them, along with standard applications, to be deployed to multiple instances and provides monitoring for them once deployed. Applications can also be pushed to public clouds and monitored there. Plus, of course, there’s a command line interface for all this.

All good stuff. Of course you should also think about replacing all that hard-wired code with decision-centric business rules too….

Build a data-driven culture, where evidence-based decisions support bottom-line business objectives and AI is embedded into workflows across your organization. Ensure data is secure and accessible, wherever it lives, and turn insights from data into competitive advantage. Use the entire spectrum of data science, artificial intelligence and machine learning to lay a foundation for a fast-approaching future where AI isn’t just an advantage, it’s essential.

Rob Thomas, GM Analytics for IBM, kicked off a session on putting data to work with AI. Rob began by talking about the impact standard shipping containers had on the shipping industry and how a similar move is required in data – something that will make it easy to combine and analyze data in a standard way. Only this kind of data landscape, he argued, can support the systematic application of analytics and AI.

ING came on stage to talk about their information architecture – one that addresses regulatory issues but also makes it possible for everyone to access, understand and use data for better decisions. They pulled all their data into a data lake architecture and then mapped the core of this to a standard set of corporate data models and vocabulary based on industry models. Onto this they layered governance and more. This also supports the application of AI, both to improve the data and its metadata and to improve decision-making.

IBM has a new solution offering – IBM Cloud Private for Data. This is designed to provide an out-of-the-box environment for managing an organization’s data and supporting its broad and deep application of AI and analytics. It makes it easy to bring together on-premises and cloud data, tracks machine learning models running against the data and provides integrated search and preview across the metadata for all this data.

Beth Smith came on stage to add Watson and AI into this mix. Lots of organizations lack the AI skills they need so IBM is launching IBM Watson Studio to help AI teams collaborate around the data an organization has, working easily with the new IBM Cloud Private for Data. It’s open, supporting open source as well as IBM-specific AI capabilities like the pre-trained Watson APIs. It’s underpinned by a catalog that combines data and any analytics you have built against it. It also supports and automates many of the experimentation and training runs that good ML and AI models require – helping reduce the manual load on data scientists – while providing a rich visual interface for much of the work. It’s designed to make it easier to build, easier to run and easier to share the tasks needed for AI.

IBM has also been investing in the services support that companies need and launching the Data Science Elite Team to deliver initial free workshops to help companies get over the hump and get started with more sophisticated analytics and AI.

Nice to see the investment in making AI and analytics easier. I wish IBM would include its business rules management system, Operational Decision Manager, as part of this stack – that would make operationalizing the results much easier.

Ginni Rometty kicks off the main event with her opening keynote focusing on putting smart to work. Her premise is that everything could be changing now because business and technology architectures are changing at the same time – something that does not happen very often. The opportunity is for exponential change across all businesses thanks to the combination of data and AI. And she further argues that the fact that so much of the data that is needed is INSIDE companies makes it possible for established companies to compete – to disrupt and not just be disrupted.

Digital Platforms are the key to this. She emphasized that multiple platforms are going to matter – no one is going to use just one. These platforms will allow you to embed intelligence in every process across the organization. She feels that AI is going to be most effective when used in combination with people. And she encourages companies to go on offense – to use this intelligence not just to fix things but to really grow exponentially. Plus, IBM’s business model is not to monetize their clients’ data but to help their clients do so.

Social disruption is possible too – everyone needs to focus on trust, jobs/skills and inclusion. If AI is a complement to human intelligence then IBM thinks that all jobs will be disrupted – some will be eliminated, some will be created, all will be changed.

Lots of announcements are coming, she says: around cloud, especially making it easy to integrate private clouds into public ones; around strategic partners; and around Watson, especially making it easier to use Watson and embed it in work.

Customers came up next to help reinforce Ginni’s points. Verizon’s CEO was first, talking about 5G and about strategic partners in the API economy. In particular they want to build better ecosystems around their core transmission capability. He also emphasized the importance of data management and trust, especially for a network. Key point – build a platform but partner to build things on top of it.

Maersk – one of the world’s largest container companies – came up next to talk about how they worked with IBM to use blockchain to disrupt the way shipping works. Shipping companies are coming on board to digitize the way they share information about containers and vessels/vehicles, using blockchain to make it easier to share and update information in a trusted way. The organizations participating include government agencies, insurers, ports and many more. A good example of the value of building an open platform, not just a company one.

RBC – Royal Bank of Canada – came up next. One of the biggest changes, the CEO says, is the way people look for the financial services they need – they go online where before they would have come to a bank branch. Mobile and internet payment platforms mean that people don’t see the brand any more – they set the card up in an app once. And mobile is changing the way they run their back office systems. All of this puts pressure on their ability to develop everything – especially AI – so they are partnering and moving to cloud. And of course, because it’s money, you can’t just push something out there and see if it works – people want it fast but they want it secure and reliable too. In particular, RBC sees AI as a way to really improve customer service and customer engagement.

Ginni came back to re-emphasize that this is an inflection point, as simultaneous business architecture and technology architecture changes create a once-in-a-lifetime opportunity to become “an incumbent disrupter”.

DecisionCAMP 2018 is in Europe – Luxembourg, to be precise – September 17-19. This is a great event and well worth your time if you are interested in the nuts and bolts of decisioning technology, Decision Management or decision modeling. Last year’s event in London was great, with a wide range of presentations and lots of great content. Plus you get to meet a bunch of folks really committed to decision-making approaches and technologies.

Anyway, it’s time to submit papers – the Call for Papers is here. If you have something to say about decision modeling; the use of business rules, analytics or AI technology for decision automation; optimization; how decision management and blockchain can deliver smart contracts; or really anything else interesting and decision-centric, please go ahead and send a proposal. Like the rest of the committee, I am looking forward to seeing some great topics again this year.

Get those submissions in by March 25 if you can – or at least let us know you plan to!

The Decision Management Community is trying to establish a most influential people list for Decision Management specifically. The plan is to have members of the community vote based on nominations provided here. So if you have someone you think has demonstrated leadership, engagement and innovation in the Decision Management community, why not go ahead and nominate them? And if you are not already a member of the DM Community, why not register so you can vote and stay in touch with the articles and news the site collates.

AI is a decision-making technology. A focus on decisions, not a separate AI initiative, delivers business value and a strong ROI.

A recent HBS survey of executives adopting Artificial Intelligence (AI) provides critical context for companies considering how best to invest in AI:

  • Few companies have made much progress to date — most are experimenting. You still have time to consider how best to invest in AI.
  • AI works best in companies that have already invested in digitizing their business as it enhances digital channels, digital decisions and digital processes.
  • While there is plenty of hype, AI works when it is implemented correctly.

As companies invest in AI technologies, it is clear a technology-led approach does not work. To get business value from AI, companies should focus AI efforts on improving business decisions. We have just published a new brief that lays out a clear, straightforward approach to succeeding with AI by leading with business decisions. It can be applied if you have not yet begun, or used to focus and reset efforts that aren’t making the progress you desire.

Get the brief here.

OneClick.ai is a company taking advantage of the fact that many AI problems use similar approaches, reducing the time and cost of individual AI projects. It was founded and received its initial funding in 2017, and launched its product last year. The company has a core team of 8 in the US and China, with 40 active enterprise accounts supporting over 20,000 models.

OneClick.ai uses AI to build AI, helping companies get into AI more quickly and more cheaply. The intent is to give them fault-tolerant, scalable APIs for custom-built AI solutions in days or even hours instead of weeks and months. They aim to automate the end-to-end development of AI solutions based on deep learning, using meta-learning to design and evaluate millions of deep learning models and find the best ones. They are also working on capabilities to explain how those models work, addressing one of the concerns about deep learning – its lack of interpretability.

The product is aimed at non-technical users, with a chatbot interface to allow experts to interact with the trained models. Users can choose from public cloud, private cloud or hosted versions, and software vendors have access to an OEM version to integrate the technology into customized solutions. A wide range of AI use cases are supported, from classic predictions (weekly and monthly sales or equipment failure) to image recognition (recognizing brands in shelf images to see how much shelf space they have), classification (putting complaint emails into existing categories and identifying new problems) and semantic search (finding the most helpful supporting material for a fault). Several of their existing customers were already trying to use AI and have found OneClick.ai significantly quicker at getting to an accurate model.

The tool is browser-based and supports multiple projects. Each project has a chatbot that can answer data science questions. Data is provided by uploading flat files that contain a learning data set – numeric, categorical, date/time, text or images. Raw data is enough, but users can add domain-specific features if they have domain knowledge suggesting a feature will be helpful. Users can develop classification, regression, time-series forecasting, recommendation or clustering models and target various performance measures depending on the type of model – accuracy, mean absolute error, etc.

The engine builds many models and presents the best, from which the user can select the one they prefer (based on their preferred metric and the latency of the deployed model, which is calculated for each model). The engine automatically keeps 20% of the data out for testing and uses the other 80% for training. Under the covers, the engine keeps refining the techniques it uses based on previous training results. Once models are built, the chatbot can answer various questions about them, such as usage tips and model comparisons. Users can deploy the models as an API for real-time access with a few clicks. A future update will also allow model updates and deployment through an SDK.
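The underlying pattern – hold out a test set, train many candidate models, keep the best by the user's chosen metric – can be sketched generically. This scikit-learn example is purely illustrative and is not OneClick.ai's engine:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Illustrative automated model search: an 80/20 train/test split, several
# candidate models, and selection by the preferred metric (accuracy here).
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)  # keep 20% out for testing

candidates = [
    LogisticRegression(max_iter=1000),
    RandomForestClassifier(n_estimators=100, random_state=0),
]

best_model, best_score = None, -1.0
for model in candidates:
    model.fit(X_train, y_train)                             # train on the 80%
    score = accuracy_score(y_test, model.predict(X_test))   # score on the 20%
    if score > best_score:
        best_model, best_score = model, score

print(f"selected {type(best_model).__name__} (accuracy {best_score:.3f})")
```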

You can find out more here.

March 27-29 I am teaching a 3-part online live training class that will prepare you to be immediately effective in a modern, collaborative and DMN standards-based approach to decision modeling.

You’ll learn how to identify and prioritize the decisions that drive your business, see how to analyze and model these decisions, and understand the role these decisions play in delivering more powerful information systems.

Each step in the class is supported by interactive decision modeling work sessions focused on problems that reinforce key points. All the decision modeling and notation in the class conforms to the DMN standard, future-proofing your investment in decision modeling. DMN-based decision modeling works for business rules projects using a BRMS, predictive analytic or data science projects, manual or automated decisions and even AI.

Click here for more information and registration. Early bird pricing is available through March 1, 2018 so book now!