
Another product/solution focused session, this time on Cybersecurity. This is a relatively new product for SAS but they have production customers and an aggressive development plan. The core focus is detecting attackers who are on a network before they execute their attacks. For instance, in the Sony hack the attackers were probably on the network for 90 days downloading data, and for many days before that doing reconnaissance. The challenge in doing this comes from a set of issues:

  • Detection avoidance by criminals
  • Limits of signatures and rules that are time consuming and complex to manage
  • Economics of data scale given the amount of data involved
  • Analyst fatigue caused by false positives

NIST talks about five steps:

  • Identify
  • Protect and Detect
    Lots of technology is focused here, like firewalls, identity management, etc.
  • Respond
    More technology here, focused on generating alerts and having analysts prioritize and focus on the most serious ones
  • Recover

The key problem is that this still focuses on a “chase” mindset where everything is analyzed after the fact.

SAS Cybersecurity ingests network traffic data in real time and enriches it with business context such as that from a configuration management database (location, owner, etc.). This is used to identify peer groups. In-memory behavioral analytics are applied and presented through the investigation UI so analysts can focus on the most serious problems.

Critical to this is identifying the normal baseline (so you can see anomalies) when the number of devices is in the thousands and all the devices could be communicating with each other. A network of 10,000 devices might produce nearly 100,000,000 relationships, for instance. With this baseline you can detect anomalies. Machine learning can be used to learn what is causing these anomalies before driving analytically-driven triage so that analysts target the most serious problems.
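
To make the baseline idea concrete, here is a minimal sketch of flagging a device whose latest traffic is far from its peer-group baseline using a simple z-score. This is purely illustrative: the device names, traffic numbers and threshold are invented, and it is not how the SAS product actually works.

```python
from statistics import mean, stdev

# Hypothetical daily outbound byte counts per device within one peer group
# (devices with the same role/owner from the CMDB). All values are invented.
peer_group = {
    "ws-101": [1.2e6, 1.4e6, 1.1e6, 1.3e6],
    "ws-102": [1.0e6, 1.2e6, 1.1e6, 9.8e5],
    "ws-103": [1.1e6, 1.3e6, 1.2e6, 5.2e7],  # sudden exfiltration-like spike
}

Z_THRESHOLD = 3.0  # illustrative cut-off, not a recommended setting

def anomalies(group):
    """Flag devices whose latest traffic is far from the peer-group baseline."""
    baseline = [x for readings in group.values() for x in readings[:-1]]
    mu, sigma = mean(baseline), stdev(baseline)
    for device, readings in group.items():
        z = (readings[-1] - mu) / sigma if sigma else 0.0
        if abs(z) > Z_THRESHOLD:
            yield device, z

for device, z in anomalies(peer_group):
    print(f"{device}: latest traffic is {z:.0f} standard deviations above the peer baseline")
```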

Customer Intelligence is a core focus for SAS. Over the last year, real-time next best action, optimization and marketing efficiency have been driving investments in Customer Intelligence in the SAS customer base. More organizations have initiatives focused on improving the customer experience, integrating digital silos into a coherent digital experience and handling big data.

The Customer Intelligence product is designed to handle inbound and outbound channels through a common customer decision hub that handles the rules, actions and insight (analytics) for customer treatment. The current product has a strong history in banking and financial services but also has retail, communications and insurance customers.

Four big themes are driving Customer Intelligence:

  • Optimizing marketing
    Differentiated analytics and optimized marketing processes
  • Customer Journey
    Engage customers the way companies and their customers want across channels and devices
  • Unify customer data
    Even as the range of data increases and specifically across multiple channels
  • Digital ecosystem
    Support the huge array of marketing players – means being cloud, API-driven etc.

This leads to an extension of the customer intelligence suite deeper into additional channels – mobile and web content, email and advertising – that are customized, analytic and able to learn from new data.

A detailed walkthrough of how a marketing analyst might refine their customer targeting showed some nice task management, designing flows for offer targeting, analytic tools for comparing A/B test results, integrating anonymous behavior across multiple channels with profiles to drive interactions and much more.

SAS is taking its existing customer data hub and marketing operations investments and extending them deeper into newer digital channels and adding more sophisticated and yet more accessible analytics. Integrating with commercial marketing systems and content delivery systems in a more open way is a critical component so that the intelligence can be embedded into a typical heterogeneous marketing environment.

Next up at the SAS Inside Intelligence event are some technology highlights, each based around a day in the life of a particular role. Much of this is under NDA of course.

Ryan Schmiedl kicked off with a quick recap of last year’s technology – 150 significant releases across the SAS focus areas. In analytics for instance Factory Miner was rolled out (review here), Hadoop was a big focus in the data management area while Visual Analytics and Visual Statistics delivered new visualization capabilities and much more.  Customers, he says, are asking for simplicity with governance, new methods, real-time analytics, solutions that work for big and small problems and new use cases. They want a single integrated, interactive, modern and scalable environment. And that’s what SAS is planning to deliver. With that he introduced the first day in the life presentation – Data Scientist.

SAS loves Data Scientists, they say, and Data Scientists need three things:

  • The right analytic methods – a broad and deep array of these – that are scalable and readily available on premise or in the cloud.
  • A good user experience so they can exploit these methods. Organizations need this to work for both experienced data scientists and new entrants.
  • Access to these methods in the programming language they prefer. They also need to be able to mix visual and interactive tools with this programming plus they need to be able to automate tasks – to scale themselves.

Business Analysts are the second role to be considered. SAS Visual Analytics is SAS’ primary tool for business analysts, with BI, discovery and analytics capabilities in an easy to use UI. As was noted earlier, new visual interfaces for data wrangling as well as new data visualization capabilities are coming in the product, along with suggestions to help analysts when they get stuck. Mobile interfaces are popular with users for consuming analysis, so it is important to make it easy for business analysts to deliver reports or visuals that work on every UI. Meanwhile the Visual Analytics UI is being simplified.

Next up is a new one – Intelligence Analyst. These folks sit between data scientists and business analysts and are increasingly found in fraud and security prevention, where an automated environment uses analytics to flag items for investigation and those investigating also need to be able to do analytics interactively as part of their investigation. Providing a combined interface for these analysts is a key capability for the new fraud and security environment. This handles text analytics, search, network analysis and a bunch of other SAS capabilities in a nice integrated and configurable environment.

The final role-based demo is for IT Analysts. IT is focused on how fast it can fix problems, on making sure problems stay fixed and on keeping costs under control. New tools for managing the SAS environment and the events generated by it are designed to make it easy to find out about problems, program automated responses and investigate persistent problems.

A bonus demo focused on IoT – the Internet of Things. IoT has use cases in everything from connected cars to manufacturing, from retail to healthcare. IoT requires analytics – to turn all that data into something actionable – and it requires real-time, streaming analytics. IoT means access to data from devices, filtering and transformation of this data at source before transmitting it, analytics applied to the streaming data, storing and persisting the right data, and actively monitoring and managing deployed models as data changes. And then you need to be able to do ad-hoc analysis to see what changes you need to make moving forward.
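
As a rough illustration of analytics applied to data in flight, the sketch below keeps a sliding window per sensor and raises an alert when a reading drifts away from the window average. It is a generic sketch, not the SAS Event Stream Processing API; the sensor name, window size and drift threshold are all invented.

```python
from collections import defaultdict, deque

WINDOW = 20          # readings kept per sensor (illustrative)
DRIFT_LIMIT = 0.25   # alert if a reading deviates >25% from the window mean (illustrative)

windows = defaultdict(lambda: deque(maxlen=WINDOW))

def process(sensor_id, value):
    """Filter/score one streaming reading; return an alert dict or None."""
    window = windows[sensor_id]
    alert = None
    if len(window) == WINDOW:
        baseline = sum(window) / WINDOW
        if baseline and abs(value - baseline) / baseline > DRIFT_LIMIT:
            alert = {"sensor": sensor_id, "value": value, "baseline": baseline}
    window.append(value)   # persist only what the analysis needs
    return alert

# Simulated stream: a temperature sensor that suddenly drifts upward.
stream = [("boiler-7", 70.0 + i * 0.01) for i in range(WINDOW)] + [("boiler-7", 95.0)]
for sensor, value in stream:
    event = process(sensor, value)
    if event:
        print("ALERT:", event)
```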

There was a lot of new stuff demonstrated but it was not 100% clear what was under NDA and what was not so I was pretty conservative about what I blogged.

I am at the SAS Inside Intelligence event in Steamboat getting the annual update on all things SAS. First session of the day is the Executive Viewpoint. Jim Goodnight and Randy Guard kicked things off.

Creating a single global organization was a big part of last year with legal, finance, sales, marketing and more becoming global efforts. Marketing and sales in particular were rather too country-local. Marketing and sales globalization focused on GTM alignment, sales enablement, regional and global services and brand/creative direction. SAS has refocused its marketing in particular away from a channel-specific approach to a more customer-journey focused one (using, of course, SAS software). Each product line has been integrated into this approach to create a more coherent, global GTM plan. Added to this has been a set of industry templates, sales plays and use cases designed to make SAS and its partners more able to sell the capabilities it has by focusing on particular use cases. More advertising and new and expanded events are also driving this message harder into the market.

Sales in 2015 were strong – over $3.0B – with an 8-17% growth rate worldwide for first year license revenue, with deal sizes and the number of large deals both up as well. Overall revenue grew a little slower – 5-11% – but was also pretty strong. Risk and data management were particularly strong, with business visualization also showing good results. Modernizing existing customers to the latest and greatest was also a focus and apparently went well. Partner supported revenue grew 57%, showing an increase in partner engagement. Not much change from a sales perspective for 2016.

This modernization program is, I think, really important. Getting customers off the old versions of software and the old servers they run on is critical to sustaining these customers. Modernizing means customers are using the latest, scalable technology (like grid and the high performance analytics) as well as the newer tools like Enterprise Miner and Visual Analytics. Good stories from some customers seeing dramatic increases in performance, especially thanks to SAS Grid.

The history of the R&D program to improve performance runs back to 2009 and includes the core High Performance Architecture, the SAS LASR server and SAS CAS – Cloud Analytics Server – a massively parallel architecture developed in 2013 that balances in-memory and cloud. This new server has load balancing, failover, easy install and a highly scalable architecture to deliver elastic compute, as well as support for managing datasets that won’t fit in memory. This will ship in Q2 and then add REST, Java, Python and Lua interfaces in the fall so that it can be integrated into a modern IT environment.
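
To suggest what calling an analytics server over REST from a modern IT environment might look like, here is a minimal sketch using only Python's standard library. The endpoint URL, payload fields and response format are entirely hypothetical and are not the actual CAS interface.

```python
import json
from urllib import request

# Hypothetical endpoint and payload - NOT the actual SAS CAS REST API,
# just an illustration of server-side scoring over HTTP.
SCORING_URL = "https://analytics.example.com/score/creditRisk"

payload = {"income": 52000, "utilization": 0.43, "delinquencies": 1}

req = request.Request(
    SCORING_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

with request.urlopen(req) as resp:   # would fail without a real server behind the URL
    result = json.loads(resp.read())
print(result.get("score"))
```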

SAS is also planning to fight back against the growth of R in universities. SAS Analytics University Edition is free and complete for academics, has 500,000+ downloads, and is provided in on-demand (40,000+ users) and AWS (3,000 users) versions. SAS has also partnered with 37 Masters of Analytics programs and over 30 new joint certificate programs were added in 2015.

SAS continues to grow internationally with offices opening in Dublin and Paris as well as some new offices in the US (like Detroit). Plus the Cary campus is getting another new building. It continues to rank well on the great places to work surveys and to have local offices and presence.

A few additional facts on the business:

  • 49% in Americas, 38% in Europe and 13% in Asia Pacific
  • 26% in Banking, 15% in Government, 10% in Insurance. Interestingly only 5% in retail.
    • Banking growth being driven a lot by risk with the expansion of stress testing and regulatory requirements. Fraud also drove growth in Banking and Government.
    • Life Sciences at 7% started to include more sales and marketing, not just R&D, and this growth also came with a willingness to use the cloud.
    • Manufacturing is at 6% and IoT is an increasingly big deal for SAS in this area as manufacturers start to instrument their products.
  • SAS consistently invests heavily in R&D – 25% of revenue vs. an industry average of 12.5%.
  • Partnering is an increasing focus:
    • They wanted in 2015 to become the analytics partner of choice.
    • Their target is to have partners participate in 35% of new revenue by the end of 2018 while driving 25% of new deals.
    • For 2015 they hit 30% partner participation in new sales and 18% led by the partner, so good progress.
    • Partner resell revenue grew 3x with 200 resellers, 2 new OEM agreements and 9 Analytic Service Provider agreements.
  • SAS is investing more in its brand this year, building on the confidence people have in SAS products and adding a focus on clarity and compassion.

Driving forces for the SAS business are pretty obvious:

  • Data growth, new sources, new types
  • Analytics – consumable by everyone from data scientists to business user/application users
  • Self service and discovery and the enthusiasm for this in companies – expanding from visualization and into data wrangling/data blending.
  • Connected everything but so what…

And this results in a set of six focus areas for SAS:

  • Analytics
  • Data Management
  • Business Visualization
  • Customer Intelligence
  • Risk Management
  • Fraud and Security Intelligence

Plus some emerging areas including Cybersecurity and the Internet of Things.

Enabling technologies for all this include:

  • Data + processing power + Hadoop – put processing close to the data
  • Event Stream Processing as more data is “in flight”
  • In-memory Processing
  • Visualization

All of this being brought together with a strong focus on common user experiences and integration across products.

Lots of interesting additional news and some good choices by SAS presented under NDA. More on the technology later in the day.

Predictive analytics is a powerful tool for managing risk, reducing fraud and maximizing customer value. Those already succeeding with predictive analytics are looking for ways to scale and speed up their programs and make predictive analytics pervasive. But they know there is often a huge gap between having analytic insight and deriving business value from it – predictive analytic models need to be added to existing enterprise transaction systems or integrated with operational data infrastructure to impact day-to-day operations.

Meanwhile the analytic environment is getting increasingly complex with more data types, more algorithms and more tools including, of course, the explosion of interest in and use of R for data mining and predictive analytics. Getting value from all this increasingly means executing analytics in real-time to support straight through processing and event-based system responses.

There is also increasing pressure to scale predictive analytics cost-effectively. A streamlined, repeatable, and reliable approach to deploying predictive analytics is critical to getting value from predictive analytics at scale. This must handle the increasingly complex IT environment that contains everything from the latest big data infrastructure like Hadoop / Storm / Hive / Spark to transactional mainframe systems like IBM zSystems.

PMML – Predictive Model Markup Language – the XML standard for interchanging the definitions of predictive models is a key interoperability enabler, allowing organizations to develop models in multiple tools, integrate the work of multiple teams and deploy the results into a wide range of systems.
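
As a small illustration of why the standard helps, the sketch below uses only Python's standard library to open a PMML file and list the model type and input fields it declares, the kind of inspection any consuming system can do regardless of which tool produced the model. The file name is hypothetical; real deployment engines obviously do much more than this.

```python
import xml.etree.ElementTree as ET

def summarize_pmml(path):
    """List the model type(s) and declared data fields in a PMML document."""
    root = ET.parse(path).getroot()

    def local(tag):              # strip the PMML XML namespace from a tag
        return tag.rsplit("}", 1)[-1]

    model_types = {"TreeModel", "RegressionModel", "Scorecard",
                   "NeuralNetwork", "MiningModel", "GeneralRegressionModel"}
    models = [local(el.tag) for el in root.iter() if local(el.tag) in model_types]
    fields = [(el.get("name"), el.get("dataType"), el.get("optype"))
              for el in root.iter() if local(el.tag) == "DataField"]
    return models, fields

# Hypothetical file exported from a modeling tool (R, SAS, Python, etc.)
models, fields = summarize_pmml("credit_risk.pmml")
print("Models:", models)
for name, data_type, optype in fields:
    print(f"  field {name}: {data_type} ({optype})")
```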

I am working on a new paper on this and if you are interested you can get a copy by signing up for our forthcoming webinar – Predictive Analytics Deployment to Mainframe or Hadoop – on March 3 at 11am Pacific where I will be joined by Michael Zeller of Zementis.


Jan Purchase of LuxMagi and I are working away on our new book, Real-World Decision Modeling with DMN. Real-World Decision Modeling with DMN will be available from MK Press in print and Kindle versions in the coming months and I wanted to take a moment to talk about why Jan and I are the right people to be writing this. Our aim is to provide a comprehensive book that explains decision modeling as well as the DMN standard and that gives practical advice based on real projects.

Regular readers of the blog will have a perspective on me and why I am in a position to write such a book but for those without the history, here are some highlights:

  • I have been working in Decision Management since we first started using the phrase back in 2002 – there is at least one witness who claims I came up with the phrase – and have really done nothing else since then. Throughout this period one of the key challenges has been how best to structure, manage and document a decision so it can be effectively managed. We tried rule-centric approaches, process-centric approaches and document-centric approaches but until we started using decision modeling none of them was really satisfactory. This context makes me really value decision modeling and gives me a wide range of counter-examples!
  • As I got interested in decision modeling, I wrote a chapter for Larry and Barb’s book on their decision modeling approach, wrote the foreword to Alan Fish’s book on decision modeling and included the basic outline of what would become the DMN standard in my book Decision Management Systems.
  • Decision Management Solutions was one of the original submitters when the OMG requested proposals for a decision modeling standard and I have been an active participant in every step of the DMN process, both the original and subsequent revisions.
  • Our work with clients has involved building decision models for rules projects and for predictive analytics projects as well as for manual decision-making and dashboard efforts. We have built models in financial services, insurance, healthcare, telco, manufacturing and travel. We have also taught hundreds of people across dozens of companies how to do decision modeling.
  • My personal work with decision management technology vendors has exposed me to their clients too, providing a huge pool of experiences with decisioning technology on which to draw.
  • Plus I have written and co-written several books, most recently working with Tom Debevoise to write the first book to introduce DMN – The MicroGuide to Process and Decision Modeling in BPMN/DMN.

So why Jan? Jan too has a depth of experience that makes him a great choice for this book:

  • Jan has spent the last 13 years working with business rules and business rules technology. While structuring and managing business rules is not the only use case for decision modeling, it is a very powerful one and the primary focus of this book. Jan’s long-time focus on business rules gives him a huge array of examples and experiences on which to draw.
  • Part of this experience is with lots of different Business Rules Management Systems. Like me, Jan has seen the industry evolve and used multiple products giving him a breadth of perspective when it comes to how business rules can be presented to business owners, how SMEs can be engaged in managing business rules and much more.
  • Jan’s experience is intensely practical, working to develop the business rules directly as well as mentoring others who are developing business rules, providing training in decision modeling and business rules best practices and acting as a reviewer and advisor.
  • Jan has spent 19 years working in finance and has worked with 8 of the top 15 investment banks, for instance. He has worked on everything from liquidity to compliance, accounting to loans and asset classification – he has tremendous experience in one of the most complex, heavily regulated industries out there. Decision modeling has a critical role to play in a modern regulatory architecture so this experience is invaluable.
  • Before working with DMN on projects, Jan worked extensively with The Decision Model, giving him a perspective on decision modeling influenced by both of the major approaches out there.

Between the two of us we have a depth of experience we believe can make Real-world Decision Modeling with DMN not just a book about the notation and how to use it but a genuine best practices guide to decision modeling.

To learn more and to sign up to be notified when it is published, visit http://www.mkpress.com/DMN/.

This year DecisionCAMP will be hosted by the International Web Rule Symposium (RuleML) on July 7, 2016 at Stony Brook University, New York, USA. This year’s event will aim to summarize the current state in Decision Management with a particular focus on the use of the Decision Model and Notation (DMN) standard. As always it will show how people are building solutions to real-world business problems using various Decision Management tools and capabilities – everything from Business Rules Management Systems to data mining tools, optimization and constraint-based environments to machine learning and prescriptive analytics. DecisionCAMP gives you a chance to:

  • Learn about new trends in Decision Management technologies, and how they can be used to address your business problems
  • Share practical results of using various decision management technologies in business settings
  • Exchange best practices for using DMN and decision management technologies.

Right now we are looking for some great presentations so if you want to present at this event please submit the abstract of your presentation using EasyChair.

If you don’t feel you have something to share then at least make sure you put it on your calendar. See you there.

My friends at TransUnion have an interesting job opening for someone with a background in decision management and consulting. They are looking for a Sr. Director, Decision Services CoE.

The Sr. Director, Decision Services Center of Excellence (CoE) will be responsible for developing and driving strategy to grow the Decision Services business for TransUnion’s International business. The Sr. Director will leverage a direct and matrixed group of business and product professionals to meet business goals and drive superior internal and external customer satisfaction.

The job is based in Atlanta (I think) but focused on their international business and specifically on their use of their DecisionEdge decision management platform to deliver solutions around the world. TransUnion, for those who might think of them only as a credit bureau, is “dedicated to finding innovative ways information can be used to help people make better and smarter decisions. As a trusted provider of global information solutions, our mission is to help people around the world access the opportunities that lead to a higher quality of life, by helping organizations optimize their risk-based decisions and enabling consumers to understand and manage their personal information.”

Details here – apply direct with them.

Decision Management Solutions joined the OneDecision.io consortium back in September and we have been working with them ever since, both within the Decision Model and Notation (DMN) standards process and on providing some integration between the OneDecision.io Java-based reference implementation for DMN execution (which supports basic decision tables, JSON data types, and the standardized DMN XML interchange format) and DecisionsFirst Modeler, our decision requirements modeling platform.

We believe that the best way to integrate execution-oriented environments like OneDecision.io (or IBM Operational Decision Manager and other commercial Business Rules Management Systems) is by linking the decision requirements diagrams you build to the matching implementation in your target environment. We have now completed the initial prototype for the integration of DecisionsFirst Modeler Enterprise Edition with OneDecision.io and you can see the results in the video.

If you are interested in this integration, or any others, please get in touch – info@decisionsfirst.com.

I am working on a new book – Real-World Decision Modeling with DMN – with Jan Purchase and he recently posted another great post – How DMN Allows Business Rules to Scale. While decision modeling with DMN is not JUST about writing business rules (as I noted earlier), this is a great use case for it. Jan does a nice job outlining why it can be hard to scale business rules projects and how it gets especially hard when you start thinking about how to structure them.

I would add a couple of things:

  • We have found that the many:many relationship between process tasks and business rules is best managed using a decision model. While simply grouping rules into decision tables or rulesets helps with simple decisions, complex ones can end up smeared across multiple tasks if you are not careful. Structuring the decision explicitly using a decision model really helps.
  • When doing decision models we regularly identify rules and facts (information) that are not in the end needed. The SMEs say they need this piece of information to make a decision or tell you that such and such is a rule. However building a decision model forces real choices and we often find that the way they REALLY make the decision does not use that information and the rule, while potentially true in the general case, is not relevant to the specific project at hand. The decision model acts as a lens, focusing you on what you need to know to get something done – a decision to be made.
  • Decision models don’t assume business rules are the objective, allowing you to build them even if you are not sure you can/will document the business rules. As I said in this post on reasons to model decisions, there are many reasons to model decisions. This allows you to start with a decision model without having to know where you are going to end up.

Decision modeling is a powerful tool and one you should be considering, especially if you are working in business rules.

I got an email today from a doctoral student trying to complete their dissertation. They are looking for 10 or so participants to complete data collection for a study on The Role of Data Governance Mechanism in Managing the Big Data Environment.

If you’re interested, please contact Stephanie by email for more information:
Stephanie Cutter
sstreich@capellauniversity.edu

As regular readers know I have been working on a new decision modeling book – Real World Decision Modeling with DMN – with Jan Purchase. While you wait for this book from Meghan-Kiffer press you might want to check out Questioning BPM? This was a fascinating exercise in which Paul Harmon and Roger Tregear asked a whole bunch of us – about 30 – to answer a set of questions about business process management.

I wrote on a couple of topics for this book:

  • Should BPM include decisions or not?
    When initiating a BPM project or setting up a BPM competency, organizations often wonder if they should include decisions, and business rules, in these BPM initiatives. The answer, as it so often seems to be, is both Yes and No.
  • Why do we need a separate modelling notation for decisions?
    The Object Management Group has long had a modeling notation for business processes – the Business Process Model and Notation. This has recently been joined by a decision modeling notation – the Decision Model and Notation or DMN (as well as the Case Management Model and Notation or CMMN). Why do those who model processes need to know this new notation and how should they use it alongside their process models?

The book is available now from amazon.com and as a Kindle edition. There’s a great summary on the MK Press site too – Questioning BPM?

If you want to learn more about decision modeling check out our white paper or sign up for our training.



Besides working on Real-World Decision Modeling with DMN, I recently contributed a piece on the use of decision modeling for framing predictive analytics to The Big Analytics Book, a book being produced by the folks at AnalyticsWeek. AnalyticsWeek rounded up 60+ thought leaders in analytics, including yours truly, and got us all to write a piece of advice. Mine was focused on the potential for decision modeling to accurately and usefully frame analytics projects:

Framing analytics projects matters because it is easy to build a great analytic that does not truly impact business results – that does not improve decision-making. In the words of one of our customers “there’s too much white space between analytic success and business success”. Linking your metrics to the decisions that make a difference and then modeling these decisions really helps ensure your analytic projects are on-target.

It’s a fun book and you should check out the press release here and sign up for a copy at thebiganalytics.com.

If you want to learn more about decision modeling for analytics check out our white paper or sign up for our training.

As you may have noticed, I am working on a new book – Real-World Decision Modeling with DMN – with Jan Purchase. Yesterday Jan had a great blog post – Why Decision Modeling? (In 1000 Words). Jan makes some great points, emphasizing the value of decision modeling with DMN in:

  • Transparency of the logic
  • Separation of concerns between processes and decisions
  • Managing Complexity, Maintaining Integrity
  • Agile change of the way we make decisions
  • Standardization of representation
  • Traceability of our implementation

Jan and I are enjoying writing the book. One of the reasons it’s so much fun is that we both agree and bring slightly different perspectives – specifically I tend to spend more of my time building decision requirements models while Jan spends more time drilling the model all the way down to the level of tabular decision logic – decision tables. Reading his post I thought I would add a little additional perspective on the value of decision requirements models.

[Figure: overall decision requirements diagram]

Decision requirements models are represented with one or more decision requirements diagrams like the one to the right. These show your decision (rectangle), the sub-decisions into which that decision can be decomposed – the other decisions that are required to be made first, the input data (ovals) provided to the decision and the knowledge sources (documents) that contain the know-how you need to make the decisions.
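
To make that structure concrete, here is a small illustrative sketch (my own, not DecisionsFirst Modeler's data model) of a decision requirements model as plain objects, with a traversal that answers the traceability question discussed below: which decisions rely on a given knowledge source? The eligibility example is invented.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    kind: str                                         # "decision", "input_data" or "knowledge_source"
    requires: list = field(default_factory=list)      # information requirements
    authorities: list = field(default_factory=list)   # knowledge sources used

def decisions_affected_by(source_name, decisions):
    """Walk the model and list every decision that relies on a knowledge source."""
    affected = set()
    def walk(decision):
        if any(ks.name == source_name for ks in decision.authorities):
            affected.add(decision.name)
        for req in decision.requires:
            if req.kind == "decision":
                walk(req)
    for d in decisions:
        walk(d)
    return sorted(affected)

# Invented example: a top-level eligibility decision and its sub-decision.
policy = Node("Eligibility Policy v3", "knowledge_source")
applicant = Node("Applicant Data", "input_data")
risk = Node("Assess Risk", "decision", requires=[applicant], authorities=[policy])
eligibility = Node("Determine Eligibility", "decision",
                   requires=[risk, applicant], authorities=[policy])

print(decisions_affected_by("Eligibility Policy v3", [eligibility]))
# -> ['Assess Risk', 'Determine Eligibility']
```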

These diagrams are a key element for several of the benefits Jan identified:

  • Transparency: The diagrams are much easier to read than a set of rules would be. I recently built a model to match some existing business rules in a BRMS demo and it was immediately clearer what the decision-making was.
  • Complexity: The diagrams manage complexity in your logic by breaking it down into self-contained pieces that are combined to make a decision.
  • Traceability: Tracing the impact of a change to a policy, shown as a knowledge source, is made much easier as you can walk through the model using the diagram relationships.

But they can do more too.

  • Risk Management
    Even if you don’t plan to write the decision logic, the business rules, for your decision the diagram brings clarity to the way you make decisions. This can be really valuable in a risk management context as it allows a clear definition of the decision-making approach that can be discussed, agreed and even shared with regulators.
  • Training
    We have had several customers take decision requirements models and use them to train front-line decision-makers like claims adjusters or underwriters. The models are more robust and easier to follow than the original document describing the decision.
  • Framing analytics
    When teams are going to use analytics, especially predictive analytics, to improve decision-making it is really important to frame the problem accurately and a model of the decision-making to be improved is perfect for this.
  • Orchestrating Decision Management Systems
    Decision Management Systems often involve business rules, predictive analytics, constraint-based optimization and adaptive control or A/B testing capabilities. How these pieces are being orchestrated to deliver value can be hard for non-technical people to understand – a decision requirements model makes it clear. In our decision modeling tool, DecisionsFirst Modeler, you can even build explicit links from the model to the implementation components.
  • Automation Boundaries
    One of the biggest challenges in automating decisions is determining what to automate and what to leave as a manual decision. A decision requirements model lets you discuss and agree the decision-making and then consider what makes sense in terms of automation.

The book covers how to build these diagrams, as well as how to write decision logic, and discusses best practices for using these diagrams in all these different situations. If you want to know when Real-World Decision Modeling with DMN is available – and I hope you do – sign up for notification here. If you want something to read in the meantime we have a white paper on decision modeling with DMN and some upcoming online training. We also offer services in decision management and decision modeling and you can schedule a free consultation.

Many organizations have buried their operational decision making in business processes and information systems, making it hard to optimize how these decisions are made. This matters because more and more of the value created by business processes is associated with these kinds of decisions. As more processes are digitized and automated to keep pace with today’s consumers the role of decision-making in these processes is growing and many business processes are essentially “decision processes” in this new digitized world – especially in the key business areas of risk management and customer centricity.

Those of us who work in this space have noted a couple of things:

  • The complexity of decision-making processes can be addressed by externalizing and modeling decisions.
  • Managing risk and customer centricity require both process and decision innovation.
  • Combining processes and decisions drives transformation and innovation in business operations creating real business value.

I am giving a webinar on this – How to Innovate Risk Management and Customer Centricity – with Roger Burlton of the Process Renewal Group on February 17th at 11am Pacific. Roger and I both work with leading organizations striving to deliver excellence in risk management and customer centricity and our experience makes it clear that a combined process and decision focus is the most effective way to improve business processes. You can register here.

I am delighted to announce a collaboration with Jan Purchase of LuxMagi on a new book, Real-World Decision Modeling with DMN. You can read the full announcement here and Real-World Decision Modeling with DMN will be available from MK Press in print and Kindle versions. As Richard Soley, who has graciously agreed to write a foreword for us, says:

“A well-defined, well-structured approach to Decision Modeling (using the OMG international DMN standard) gives a repeatable, consistent approach to decision-making and also allows the crucial ‘why?’ question to be answered—how did we come to this point and what do we do next?” said Richard Mark Soley, Ph.D., Chairman and CEO, Object Management Group, Inc. “The key to accountability, repeatability, consistency and even agility is a well-defined approach to business decisions, and the standard and this book gets you there.”

Our aim is to provide a comprehensive book with both a complete explanation of decision modeling as an approach/technique and of the DMN standard itself, plus some solid advice on the business benefits of using it as well as lots and lots of examples and best practices developed on real projects. Jan and I have been using decision modeling for years on all sorts of projects and we want to distill that experience into a book that will help new decision modelers get up to speed quickly while still helping those with more experience with crucial patterns and advice.

The two companies have used decision modeling on many projects and between us we have probably taught over 1,000 people decision modeling and DMN. We have worked with business analysts, process analysts, developers, data scientists and subject matter experts – all of whom have found DMN an accessible yet precise way to describe business decisions. Decision modeling lets you capture, communicate and facilitate agile improvement in even the most complex of business decisions. It’s been critical for our clients’ compliance efforts and for staying up to date with regulations. We are going to bring this breadth of perspective and deep experience to the book so companies and individuals can successfully adopt decision modeling.

I wish I could tell you the book was ready but it’s not – Jan and I are working hard on it and have a tremendous amount of great material already done but we really want to produce a pretty definitive guide – not just to DMN but to decision modeling using DMN.  As we work on finishing it, Jan and I will be posting about the book and the thoughts about decision modeling that writing it has provoked as well as asking questions to make sure we cover everything we need to and more. Watch this blog, Jan’s blog and the various LinkedIn groups we belong to for more updates. I hope you’ll engage with us and that you won’t find the wait too arduous.

To learn more and to sign up to be notified when it is published, visit http://www.mkpress.com/DMN/.

I have just finished updating Enterprise Scale Analytics with R with new data from the Rexer Analytics Survey for 2015.

As R has become more popular, the role of analytics has become increasingly important to organizations of every size. Increasingly, the focus is on enterprise-scale analytics—using advanced, predictive analytics to improve every decision across the organization. Enterprise-scale adoption of analytics requires a clear sense of analytic objectives; an ability to explore and understand very large volumes of data; scalable tools for preparing data and developing analytic models; and a rapid, scalable approach for deploying results.

According to the widely cited Rexer Analytics Survey, R usage has steadily increased in recent years. Organizations using R to develop analytic models face particular challenges when trying to scale their analytics efforts at an enterprise level. Complex data environments can make integrating all the data involved difficult. The typical R package is single-threaded and memory limited, creating challenges in handling today’s increasingly large data sets. These same limitations can mean it takes too long to analyze and develop models using this data. When all the analysis is done, deploying the results can add a final hurdle to achieving business value at scale.

Solutions such as Teradata Aster R that combine commercial capabilities with open source R offer a way to address these challenges. This paper introduces R, explores the challenges involved in scaling analytics across the enterprise, identifies the specific issues when using open source R at scale, and shows how Teradata Aster R can help address these issues.

You can download this white paper, sponsored by Teradata, here.

I am pleased to announce a new Decision Table Modeling online training offering taught by Professor Jan Vanthienen, a leading decision table expert. We are running a pilot of this class February 2-4 and you can get details, and a great price, here. To give you a taste of Jan’s approach, here’s a guest article by him.

The Value of Good Decision Table Modeling

By Jan Vanthienen, KU Leuven

Managing and modeling decisions is crucial for business. The new DMN (Decision Model and Notation) standard emphasizes the importance of business decisions, and also offers a standard notation and expression for decision requirements and decision logic.

Advantages of Good Decision Tables
Decision tables look straightforward: a number of rows or columns containing decision rules about combinations of condition expressions with their respective outcomes. The reason for their success is simply that every column or every row (depending on orientation) is about one specific condition. This fixed order of conditions allows a complete and easy overview of the decision rules for a specific decision. It also allows grouping of related rules into tables, thereby providing an overview of a large number of decision rules.

The real advantage for business, however, is the ability to obtain consistency, completeness and correctness of the decision logic. Avoiding redundancy and overlapping rules is a key element in constructing and maintaining decision tables that offer true value for business.
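
As an illustration of what checking for completeness and overlap can mean in practice, here is a minimal sketch (my own, not part of DMN or of any specific methodology) that enumerates every combination of two enumerated conditions and reports combinations covered by no rule or by more than one. The discount example and its values are invented.

```python
from itertools import product

# Condition domains (invented example): each combination should be covered exactly once.
CONDITIONS = {
    "customer_type": {"new", "existing"},
    "order_size": {"small", "large"},
}

# Each rule maps condition -> set of accepted values ("*" means any), plus an outcome.
ANY = "*"
rules = [
    ({"customer_type": {"new"},      "order_size": ANY},        "no discount"),
    ({"customer_type": {"existing"}, "order_size": {"large"}},  "10% discount"),
    ({"customer_type": {"existing"}, "order_size": ANY},        "5% discount"),
]

def matches(rule_conditions, combo):
    return all(allowed == ANY or combo[c] in allowed
               for c, allowed in rule_conditions.items())

def check(rules):
    names = list(CONDITIONS)
    for values in product(*(CONDITIONS[c] for c in names)):
        combo = dict(zip(names, values))
        hits = [i for i, (conds, _) in enumerate(rules) if matches(conds, combo)]
        if not hits:
            print("GAP: no rule covers", combo)
        elif len(hits) > 1:
            print("OVERLAP:", combo, "matched by rules", hits)

check(rules)
# Reports that the existing/large combination is matched by rules 1 and 2 (an overlap to resolve).
```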

Tables for Decision Logic Modeling
DMN provides constructs for both decision requirements and decision logic modeling. The requirements level allows for modeling and managing linked decisions, abstracting from the detailed logic of each decision. The decision logic level standardizes the way to express decision logic, e.g. in the form of decision tables. DMN provides a common model and notation that is readily understandable by all business users, from the business analysts needing to create initial decision requirements and then more detailed decision logic models, e.g. in the context of business processes, to the business people who will manage and monitor those decisions, and finally, to the developers responsible for implementing the decisions.

Decision logic modeling can take many forms, depending on the decision at hand, but decision tables are an important element. Most people know what decision tables look like: a number of rows or columns containing decision rules about combinations of condition expressions with their respective outcome. Decision tables have always been known for their ability to offer a compact, readable and manageable representation of decision logic. But many do not realize how the decision table concept has been refined throughout the years into a strict and powerful modeling technique (based on consistency by construction, normalization, completeness, correctness, etc.).

Decision Table Methodology
Different forms of decision tables exist in business practice, even under different names and with different semantics. What DMN offers is a standard notation and the ability to recognize and unambiguously interpret and exchange tables of rules in different forms. The core methodology to build sound decision tables is not part of DMN, but it still holds.

The decision table methodology offers:

  • Guidelines for composing effective decision tables (form, structure, meaning, etc.).
  • An overview of advantages and disadvantages of different types of decision tables (different hit policies).
  • A simple eight-step method to construct good decision tables, starting from the description of the decision and leading to compact, normalized and optimized decision tables.
  • A sound decomposition of the decision structure.
  • Best practices on obtaining completeness, consistency, readability, maintainability.
  • A transition from the specification to design, implementation and maintenance.

Sometimes Scott Adams just nails it and late last year I saw this great strip on The Generic Graph. Work with analytics long enough and you see something akin to this – something Mychelle Mollot of Klipfolio called Building a One-size-fits-all Dashboard – one of the 6 mistakes she talks about in this article that she pithily summarizes as the “this sucks for everyone” problem. Mychelle goes on to propose that the solution is not to create a generic dashboard for the broadest possible audience but to create multiple dashboards targeted to specific roles within the organization. I would agree but go further – design dashboards to help specific roles make specific decisions.

Many roles, especially operational roles, have to make many decisions. How they make these decisions, and the actions they take as a consequence, are what determine whether they will meet their metrics. Displaying their metrics on a dashboard so they can see how well they are doing may be motivating but to actually improve those metrics you will need to help them make better, more profitable decisions. Yet most organizations, most dashboard projects, have never really thought about the decisions made by the people they are trying to help – at least not in any systematic way. In fact, instead of building a dashboard to support decision-making explicitly, most projects begin as Mychelle notes by being “data-centric” – pulling together all the data that might help. This creates a lot of visual confusion and forces people to jump around multiple tabs or pages looking for the data they need right now to make the decision in front of them.

So how can you fix this problem? Well Mychelle lays out the first two steps:

  1. Figure out who your dashboards need to serve
  2. Start with the more junior roles, those with an operational focus

Then move on to the more decision-centric steps:

  1. List the metrics or KPIs they care about
  2. Identify the decisions – the choices – they make that have an impact on these metrics
  3. Model these decisions (using the new Decision Model and Notation standard and a tool like DecisionsFirst Modeler) to understand how they make (or should make) these decisions
    This will give you a sense of the information and knowledge they need as well as how the decision-making breaks down into more granular decisions
  4. Use this model to lay out your dashboards, one per decision, looking for opportunities to automate the lower level decisions while you are at it

Decision-centric dashboards really work. Go build some.

Last year BPM expert Sandy Kemsley and I did some research on the infrastructure you need to develop excellent, smarter, mobile apps. Mobile devices have gained enough traction with consumers and employees to require mobile applications as a part of an enterprise strategy. However, these mobile apps must be more than mere information presenters – they need to be “smarter” than typical enterprise applications. In the report we discuss the requirements for excellence in enterprise mobile and the challenges of using a traditional enterprise platform for delivering modern mobile apps. While most enterprises have realized that they need to add a mobile development capability, we also identify the decision and process management capabilities enterprises need to drive excellence in their mobile applications. Only by applying process and decision management on the back end, as well as mobile app development on the front end, can enterprises deliver the next generation of smarter mobile apps.

Report and webinar recording here.