The biggest transformation big data brings to the risk community will be in interpretation.
Risk managers spend their days monitoring their company’s variation in performance – financial, operational, human, ethical – to keep that variation within acceptable limits. They have never been short of data, so they can be forgiven for not getting too excited about the prospect of big data.
Big data will, however, inevitably change their world and they will need to have the right tools in place to be able to visualise vast amounts of data: hunting for the needle in the haystack, spotting the anomalies, looking for outliers and identifying weak signals of abnormality.
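That hunt for outliers and weak signals can be illustrated with a modified z-score based on the median and the median absolute deviation (MAD) – a standard robust baseline for anomaly detection, since the median is not dragged around by the very outliers being hunted. The threshold of 3.5 is a conventional rule of thumb, and the readings below are invented numbers, not any real monitoring feed:

```python
import statistics

def flag_anomalies(values, threshold=3.5):
    """Flag points whose modified z-score (median/MAD based) exceeds the
    threshold. Robust statistics resist the outliers they are hunting."""
    med = statistics.median(values)
    mad = statistics.median(abs(x - med) for x in values)
    if mad == 0:  # all points identical up to the median: nothing to flag
        return []
    return [x for x in values if 0.6745 * abs(x - med) / mad > threshold]

readings = [10.1, 9.8, 10.0, 10.2, 9.9, 10.1, 25.0]  # one injected anomaly
print(flag_anomalies(readings))  # → [25.0]
```

Note that a plain mean-and-standard-deviation test would miss the 25.0 reading here: the single anomaly inflates the standard deviation enough to mask itself, which is why robust statistics are preferred for this kind of screening.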
So what might big data actually mean for risk management? Could it enable better pre-emptive action to minimise the chances of a weak anomaly escalating into a crisis? Might it shift the balance from crisis management post-event to preventative management before it happens? Could it give managers greater confidence to detect signal from noise, or to understand the cost-benefits of making decisions probabilistically? Could it change the very nature of risk management itself?
Risk and uncertainty go together. Today, analytics is about getting good data to reduce uncertainty: “garbage in, garbage out”. Data theorists are now telling us that lots more data – even of much lower quality – can reduce estimation uncertainty. Forget trying to clean up your modelling data. Just find more indicators of similar things and let your algorithms do the heavy lifting.
The first impact of big data for risk managers is the abundance of items they can monitor in greater detail: activities, interactions, signals, movement of goods, financial fluctuations. The need to sample subsets of information or take snapshots of partial views will be replaced by having data on all of these items – all of the time. More importantly, risk managers will change the way they analyse and interpret data by using ever more powerful computational resources. Risk management is about envisioning and contingency planning for possible future problems. Perhaps the risk managers of the future will generate billions of stochastic projections of possible futures. They may compare thousands of different opinions, or apply every competing scientific interpretation to bound model uncertainty.
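Generating stochastic projections of possible futures can be sketched with a toy Monte Carlo loss model. Everything here is an illustrative assumption – a Gaussian loss distribution, invented parameters – and not the methodology of any real catastrophe model, but it shows how tail measures such as value at risk fall out of a large scenario set:

```python
import random

random.seed(7)

def project_losses(n_scenarios=100_000, mean=0.0, sd=10.0):
    """Generate one-period loss scenarios from a toy Gaussian model
    (real models draw from fitted event distributions)."""
    return sorted(random.gauss(mean, sd) for _ in range(n_scenarios))

def value_at_risk(losses_sorted, level=0.99):
    """The loss exceeded in only (1 - level) of the simulated scenarios."""
    idx = int(level * len(losses_sorted))
    return losses_sorted[idx]

losses = project_losses()
var99 = value_at_risk(losses, 0.99)  # ≈ 2.33 × sd ≈ 23.3 for this toy model
```

The same structure scales from a hundred thousand scenarios on a laptop to the billions of projections the article envisages; only the event model and the compute budget change.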
With the right tools they can look at connections between things, view the world as networks, detect time progressions and evolving patterns, and learn to construct and read data maps the way that early explorers invented cartography. One such tool is MongoDB, chosen for its agile and scalable approach to data management; common uses include operational and analytical big data. RMS is using MongoDB to develop enterprise-wide access to comprehensive risk information and to gain new big data insights, enabling our clients to benefit from greater efficiency in underwriting and portfolio management.
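To give a flavour of the analytical side, MongoDB lets you group and summarise documents server-side with an aggregation pipeline. The collection layout below – fields `region`, `peril` and `insured_value` in an `exposures` collection – is a hypothetical example for illustration, not RMS’s actual schema; with the `pymongo` driver the pipeline would be passed to `collection.aggregate(...)`:

```python
def exposure_by_peril(region):
    """Build a MongoDB aggregation pipeline that totals insured value
    by peril for one region, largest exposure first."""
    return [
        {"$match": {"region": region}},          # keep only this region's documents
        {"$group": {                              # sum insured value per peril
            "_id": "$peril",
            "total": {"$sum": "$insured_value"},
        }},
        {"$sort": {"total": -1}},                 # rank perils by total exposure
    ]

pipeline = exposure_by_peril("EU")
# with a live connection: db.exposures.aggregate(pipeline)
```

Because the pipeline is plain data, it can be built, versioned and tested independently of any database connection, then shipped to the server where the heavy lifting happens next to the data.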
The dynamics of risk management are set to change significantly. Thinking through the opportunities that might offset cost is a vital part of sharing expertise between specialists. One group exploring what the information explosion will mean for the world of risk is the team of leading risk management experts at Cambridge University’s Centre for Risk Studies. They are working with big data specialists to understand how the democratisation of information access has created enormous opportunities for individuals and organisations, while prompting growing debate about its consequential use.
Dr Andrew Coburn is senior vice president for Liferisks at RMS.