There was a fascinating piece in the Economist last week – “Little things that mean a lot”. The column really resonated with me – even the title sounds a lot like my mantra of “Big Data, Little Decisions” (you can see a selection of the articles and webinars I have given on this topic here). So what were the critical points the Schumpeter column made?
First, the point that constant experimentation and rapid iteration are critical when trying to get value from all this data. Experimentation is often the skill we tell clients they most need to develop, and we regularly stress the importance of putting in place the infrastructure and processes for ongoing decision results analysis. To maximize the value of big data and become a data-driven organization you really need to understand how to conduct A/B and champion/challenger testing of your decision making, not just of your website layout or marketing messages.
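To make that concrete, here is a minimal sketch of what champion/challenger testing of a decision (rather than a web page) can look like. The loan-approval decision, the two rules, the score cutoffs and the 10% traffic split are all illustrative assumptions, not anything from the column or from a specific client.

```python
import random

# Minimal champion/challenger sketch: route a small share of decisions to a
# challenger strategy, record which arm decided, and compare outcomes over time.

def champion_decision(customer):
    # Current production rule: approve if the score clears a fixed cutoff.
    return "approve" if customer["score"] >= 650 else "decline"

def challenger_decision(customer):
    # Candidate rule under test: slightly lower cutoff plus an income check.
    return "approve" if customer["score"] >= 630 and customer["income"] >= 30000 else "decline"

def decide(customer, results, challenger_share=0.10):
    # Route roughly challenger_share of the decisions to the challenger arm.
    arm = "challenger" if random.random() < challenger_share else "champion"
    decision = (challenger_decision if arm == "challenger" else champion_decision)(customer)
    results.append({"arm": arm, "decision": decision})
    return decision

if __name__ == "__main__":
    random.seed(42)
    results = []
    for _ in range(10_000):
        customer = {"score": random.randint(500, 800), "income": random.randint(15_000, 90_000)}
        decide(customer, results)
    # Ongoing decision-results analysis: compare the two arms side by side.
    for arm in ("champion", "challenger"):
        rows = [r for r in results if r["arm"] == arm]
        approvals = sum(1 for r in rows if r["decision"] == "approve")
        print(f"{arm}: {len(rows)} decisions, approval rate {approvals / len(rows):.1%}")
```

In a real deployment you would also record downstream outcomes (defaults, conversions, engagement) for each arm – exactly the kind of ongoing decision results analysis mentioned above.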
Second, and most importantly, the column highlighted the value of very small incremental improvements to something that happens a lot. I would call this a slightly better operational decision – the Facebook example being the decision “what should we put in this user’s news feed?” These “small but useful improvements” generate significant value when applied at scale to an organization’s day-to-day business. We regularly advise customers to begin with these kinds of decisions and get value from the multiplying effect of many decisions (there is a back-of-the-envelope illustration after the quote below) rather than believe in the myth of the “aha” moment. The final comment in the column says it all:
“managers may have been misled into hoping it will give them massive, instant, Holy Grail solutions. But such discoveries are rare; and if they do exist, they have probably been made already. The reality is that big data produces lots of small advances—and that is good enough.”
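To put a rough number on that multiplying effect, here is a back-of-the-envelope calculation. The decision volume, per-decision value and 2% lift are hypothetical assumptions, chosen only to show how quickly small improvements to high-volume decisions add up.

```python
# Back-of-the-envelope illustration of the "multiplying effect": a small lift
# applied to a high-volume operational decision. All figures are hypothetical.

decisions_per_day = 500_000   # e.g. offers, approvals or feed rankings made each day
value_per_decision = 0.50     # average value of getting one decision right, in dollars
improvement = 0.02            # a 2% lift from a slightly better decision

daily_gain = decisions_per_day * value_per_decision * improvement
annual_gain = daily_gain * 365

print(f"Daily gain:  ${daily_gain:,.0f}")    # $5,000
print(f"Annual gain: ${annual_gain:,.0f}")   # $1,825,000
```

No single one of those decisions is worth much on its own – the value comes entirely from the scale at which the slightly better decision is made.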