Exponential Analytics

20 March 2026

Following last week’s successful InsTech event in London, where Cameron Rye chaired a panel on day two, the Willis Re data and analytics team, along with colleagues from the Willis Research Network, have reflected on the use of analytics in conversations with clients.

Our ability to understand and quantify risk is greater than ever before. Computing power allows millions of simultaneous calculations, fuelling AI that can interpret the numbers much faster than people could possibly hope to achieve by conventional means. It’s all underpinned by a massive increase in the availability and transferability of data.

Advances made have driven a risk-analytics arms race. Teams of data scientists sometimes now produce documents full of model outputs that run into the hundreds of pages. Brokers, keen to show that their analytics team has outperformed the competition and looked more closely than anyone else, can swamp cedants with a cascade of detail.

Analytics is good, but it is genuinely possible to have too much of a good thing. Even with all the necessary skills to interpret them, increasingly comprehensive, ever-thicker reports about analytical findings do not necessarily get us any closer to a solid understanding of risk. They might look impressive, but they might not provide higher-quality, actionable insights which would help cedants meet their risk transfer goals.

Data bias

Consider, for example, the problem of understanding hail risk. Surely, one might think, more data that has been better scrutinized should result in more accurate risk quantification. But that assumption is not necessarily correct.

Kelsey Malloy, an assistant professor at the University of Delaware, has spent a lot of time looking at reports of tornadoes and hailstorms. She mapped hail reports as part of a study supported by the Willis Research Network. Malloy found that hailstorms are more likely to occur along roads and in towns. At least, that is what all the data showed.

It is an incorrect conclusion, of course. As Assistant Professor Malloy eloquently points out, and then demonstrates, the finding is the result of a data bias. To be reported, a hailstorm must be seen. Since sightings naturally occur where people are, along roads and in towns, while storms over great expanses of open territory go unobserved, the data is biased towards events in well-travelled locations. Clearly, having more of this biased data does not improve risk quantification.
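The reporting-bias mechanism can be illustrated with a toy simulation (all numbers here are hypothetical, not taken from the study): hailstorms fall uniformly across a region, but only storms near a road are likely to be seen, so the reported events cluster around the road even though the true hazard is uniform.

```python
import random

random.seed(42)

REGION = 100.0        # region spans x in [0, 100]; a road runs along x = 50
ROAD_X = 50.0
DETECT_RADIUS = 10.0  # storms within 10 units of the road are likely seen

def simulate(n_storms=10_000):
    """Simulate uniformly distributed hailstorms; keep only reported ones."""
    reported = []
    for _ in range(n_storms):
        x = random.uniform(0.0, REGION)        # true storm location
        near_road = abs(x - ROAD_X) < DETECT_RADIUS
        p_report = 0.9 if near_road else 0.05  # hypothetical sighting rates
        if random.random() < p_report:
            reported.append(x)
    return reported

reports = simulate()
near = sum(abs(x - ROAD_X) < DETECT_RADIUS for x in reports)
# Only 20% of storms actually fall near the road, yet most *reports* do:
print(f"{near / len(reports):.0%} of reports lie within 10 units of the road")
```

Feeding more of this data into a hazard model would only reinforce the spurious conclusion that roads attract hail; the fix is to model the observation process, not to collect more reports.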

Instead of simply heaping more data onto our clients and partners, we should think carefully about data bias and consider ways to overcome it. We can then ensure we provide the right data to do what the client needs. And we should question the need to deliver three decimal places of precision, a level of exactness that belies the uncertainty of our craft.

Escape from model-land

Bias comes in another form: our tendency to conflate modelled outputs with reality. This has been highlighted by Willis Research Network research partner Erica Thompson, Associate Professor of Modelling for Decision Making at University College London, whose book Escape from Model Land shows how model creators inject their biases, perspectives and expectations into the tools. Worse, users may then begin to believe that the model output is reality, when in fact it is only one abstraction among many possible realities. To use models well, she argues, understanding their limits is vital.

Catastrophe (cat) models are incredibly powerful tools that deliver outputs based on many variables. These outputs can be a huge asset when we must make decisions. However, before relying on outputs, and even before generating them, we really ought to consider what decisions need to be made, and why.

Unfortunately, those critical details are often buried beneath reams of modelled scenarios. But the answer is not more data. It is better, more relevant data, and that is what will help us to make better decisions.

Reduce output waste

Instead of piling on the variations and permutations, we should constrain the volume of output delivered to clients to include only exactly what they need. To understand the risk, the client may require only a handful of metrics presented in a few graphs, augmented by an explanation of the findings. If that is sufficient to support their own, fully informed decision, giving them more is simply wasteful and costly for everyone.

Critical to the evaluation of model outputs is an understanding of the values and expert judgements that went into building the models that generated them. When a client wishes to engage with the variations and permutations, we must be sure to provide all the information necessary to understand the differences between them. Our explanations will underpin the client’s informed decision. Decisions cannot be informed when based solely on the pages of tables many brokers are so keen to deliver.

Willis Re does not believe we should fuel the analytics arms race by sending potential clients a massive submission pack with the highest possible number of impressive graphs and tables. Instead, we provide the transparency necessary to make decisions aligned to each client’s unique risk appetite. Critically, that includes explaining the possible implications when model assumptions are proved wrong.

Analytics is a powerful tool for solving clients’ articulated problems. It is not a form of sales collateral. Willis Re perhaps runs more analytics than most (and they are there if you want them), but we prefer to think about what you are asking, understand what you need, then present it crisply.

Presentation is important. Studies show that a slide pack or a report engages the brain more than a spreadsheet does. When we present risk in an immersive way, for example through a realistic disaster scenario, audiences engage more than they do with a simple numerical presentation, let alone with 46 slides showing various tail values at risk.

The narrative has power. Combined with the more cerebral approach of a loss exceedance curve, it becomes highly persuasive, and together they help us to make better decisions. So when you meet with Willis Re, we may turn the tables on analytics and spread the sheets of data thinly. Expect a conversation, with just enough analytics to get it right.
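For readers less familiar with the metrics mentioned above, a loss exceedance curve and a tail value at risk (TVaR) can both be read directly off a set of simulated annual losses. This is a generic sketch with made-up lognormal losses, not Willis Re’s methodology:

```python
import random

random.seed(1)

# Hypothetical simulated annual losses (as a cat model might produce), $m.
losses = sorted((random.lognormvariate(2.0, 1.0) for _ in range(10_000)),
                reverse=True)

def exceedance_prob(threshold):
    """Fraction of simulated years whose loss exceeds the threshold."""
    return sum(l > threshold for l in losses) / len(losses)

def value_at_risk(p):
    """Loss exceeded with annual probability p (the 1-in-1/p-year loss)."""
    return losses[int(p * len(losses))]

def tail_value_at_risk(p):
    """Average loss across the worst p fraction of years (TVaR)."""
    k = max(1, int(p * len(losses)))
    return sum(losses[:k]) / k

var_1pct = value_at_risk(0.01)    # 1-in-100-year loss
tvar_1pct = tail_value_at_risk(0.01)
print(f"1-in-100 VaR:  {var_1pct:.1f}")
print(f"1-in-100 TVaR: {tvar_1pct:.1f}")
# TVaR always sits at or above VaR, because it averages the tail beyond it.
```

A handful of points from such a curve, with a plain-language explanation of what drives the tail, is often all a cedant needs; the remaining pages of permutations add little.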

Cameron Rye, Director of Natural Catastrophe Analytics

If you’d like to discuss this topic in more detail, please get in touch with a member of our team at communications@willisre.com.