I am giving a tutorial and a presentation on putting predictive analytics to work at the forthcoming Predictive Analytics World show in Washington DC (October 20th-21st, with tutorials on the 19th). I always like to illustrate my points with real examples, and I am going to be talking about Premier Bankcard, a SAS client. As I can only touch on them in my presentation, I thought I would blog about them too. Premier Bankcard is using predictive analytic models in a number of its production administrative systems to help detect fraud, manage credit and assess the value of customers. The key scores are:
- SAS Fraud Score
Identifies which individual applications should receive additional fraud mitigation. An additional step in the credit approval/denial process scores each application for fraud risk and sends it down an additional research path based on specific score cut-points.
- SAS Credit Scores
These scores refine the standard third-party credit scores and generate a lift in credit risk management.
- SAS Good Customer Score
This is an internal score that is used to rank and segment the customer portfolio. Applied monthly, this score is used in acquisition, retention, cross-sell, and credit risk management programs.
The first two are calculated in transactional systems as needed; over 5 million applications are processed per year in these systems. The good customer score is calculated for every active customer record each month, more than 3 million records a month.
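As a hedged sketch of the monthly good-customer scoring described above (not Premier's actual implementation): ranking a portfolio by score and splitting it into segments for acquisition, retention and cross-sell programs could look like this. The field names and segment counts are illustrative assumptions.

```python
# Hypothetical sketch of monthly portfolio segmentation by a good-customer
# score. Field names ("score") and the number of segments are illustrative
# assumptions, not details from Premier's system.

def segment_portfolio(customers, num_segments=5):
    """Rank customers by score (best first) and assign equal-sized segments.

    Segment 1 holds the highest-scoring customers, segment `num_segments`
    the lowest. Returns the customers sorted by rank, each annotated with
    a "segment" key.
    """
    ranked = sorted(customers, key=lambda c: c["score"], reverse=True)
    size = max(1, len(ranked) // num_segments)
    for rank, customer in enumerate(ranked):
        customer["segment"] = min(rank // size + 1, num_segments)
    return ranked
```

Run monthly over the active customer base, a pass like this gives each program (retention, cross-sell, credit risk) a consistent view of who the best customers are.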
Each score has a different development and deployment approach. As an example, the fraud score was created using SAS Enterprise Miner and uses regression models to produce a probability score. The model is deployed as SAS code on a production server, and a custom Java application was created to integrate this service with the production applications. The credit manager system allows the business to specify the cut-offs as a decision tree, and these are used to route applications for the relevant additional fraud research.
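To make the pattern concrete, here is a minimal sketch of that shape in Python: a regression model turning application attributes into a fraud probability, with business-defined cut-points routing each application. The coefficients, attribute names and cut-point values are all hypothetical; the real model is trained SAS code behind a Java integration layer.

```python
import math

# Illustrative coefficients for a logistic regression fraud model. In
# production these would come from a trained model (e.g., built in SAS
# Enterprise Miner); every name and value here is a made-up example.
INTERCEPT = -3.0
COEFFICIENTS = {
    "num_recent_applications": 0.45,
    "address_mismatch": 1.20,
    "phone_reuse_count": 0.60,
}

def fraud_score(application: dict) -> float:
    """Return a fraud probability in (0, 1) from a logistic model."""
    z = INTERCEPT + sum(
        weight * application.get(name, 0.0)
        for name, weight in COEFFICIENTS.items()
    )
    return 1.0 / (1.0 + math.exp(-z))

def route(application: dict, low_cut: float = 0.2, high_cut: float = 0.6) -> str:
    """Route an application using business-defined score cut-points."""
    score = fraud_score(application)
    if score >= high_cut:
        return "manual_fraud_review"
    if score >= low_cut:
        return "additional_research"
    return "standard_processing"
```

The key design point the example illustrates is the separation of concerns: the model owns the probability, while the business owns the cut-points that decide what happens next.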
The creation and integration of this score was driven entirely from the business side and originally failed to bring the IT people along. There was a lot of pushback, as there were no real in-house experts on analytics and scoring; the company had always depended on a third party. Initially there were challenges, and it became clear that the business users and project team needed to do more testing and make it clearer to IT what the model would do; IT needed a better understanding of the decision process. Additionally, because IT did not understand how the models worked, they had trouble with testing. Test data that did not match the model data created a number of problems, for example. Premier learned that it was REALLY important to get everyone (business, analytics, IT) on the same page.
Impact analysis is critical in deploying and updating models. Premier does this in SAS, with analysts using the tools to assess the impact of changes. But Premier also does a business impact assessment: a written recommendation justifying the hiring or job changes that result from analytic changes. This kind of organizational impact can be missed when considering the impact of a model change, so this is a very important step.
In general, Premier found that the process needs to be led by the analytics group but must have a strong business perspective. The team must earn the respect of both IT and the business if it is to be successful, and this takes time.
Another great story about putting predictive analytics to work. Email me for a discount code to Predictive Analytics World if you want to learn more.