Expose the flaws in exposure data
In September 2007, I wrote an article entitled ‘Exposure Management: Back to the Future’. At the time, modelling was the primary focus of re/insurance analytics and companies were investing considerable sums in building catastrophe modelling teams. It was an exciting time to be in the cat modelling space.
The article was written in response to various discussions regarding over-reliance on modelling. Models were already sophisticated, but high-profile events had highlighted that they were never intended to predict event losses precisely, and that the market still needed to focus on the fundamentals of exposure management.
Of course, while many companies never lost sight of this, some were over-reliant on modelled loss estimates being ‘right’ to maintain or grow the P&L and, of utmost importance, to protect their balance sheet.
One natural response to model dominance was to ensure good old-fashioned exposure management remained at the heart of risk operations and all key metrics, and to avoid being lured too far by the magic of the models. Stay true to the fundamentals: what business am I writing, and where is it?
A major shift in focus
Two huge and previously unconsidered catastrophes, 9/11 and Hurricane Katrina, are both seminal events in the development of exposure management; not just because of their devastating human and financial impact, but also because they both led to a much greater focus on the fundamentals of risk management.
The common denominator underpinning these two events is exposure analytics. Exposures for both events were catastrophically misunderstood. In the case of 9/11, addresses were incorrectly placed nowhere near the actual buildings; in Katrina, the key vulnerability characteristics were completely wrong, with hundreds of thousands of risks incorrectly coded.
Localised events, such as 9/11, resulted in a range of analytical responses that would be embedded in key re/insurance workflows over the ensuing years. New tech companies popped up, the big brokers created exposure analytics tools, and as the variety and resolution of the models increased there was a sense that exposure management was back in vogue.
The longer-term impact of Katrina was a considerable increase in interest in flood exposure. Such spikes are always evident following major losses. I received my first call about terrorism modelling on 12 September 2001, while for Katrina the phone was ringing with similar flood enquiries before landfall. I was one of those in the market (metaphorically and in many cases literally) explaining what was and, critically, was not, included in the models used to assess Katrina losses.
A clear gap in understanding
Why was this important? Well, to all intents and purposes, the different forms of flood modelling had been widely misunderstood up to this point. This was different from 9/11, where the issue wasn’t the lack of terrorism modelling per se but the lack of understanding of exposure correlation; previously misunderstood concentrations were putting companies out of business.
The problem was clear: two of the three core components of all ground-up modelled loss estimates were simply not known. The hazard was fairly well understood, but the same could not be said for property locations – the geocoding – or the property characteristics – the vulnerability.
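Those three components can be sketched in a few lines of purely illustrative Python; the curve and numbers below are invented for the example, not any vendor’s model:

```python
# Illustrative sketch: a ground-up modelled loss estimate combines
# hazard at the location, the property's vulnerability to that hazard,
# and the exposure value.

def ground_up_loss(hazard_intensity: float,
                   vulnerability_curve,
                   exposure_value: float) -> float:
    """Damage ratio from the vulnerability curve, scaled by the exposure value."""
    damage_ratio = vulnerability_curve(hazard_intensity)
    return damage_ratio * exposure_value

# Hypothetical vulnerability curve: damage ratio rises with intensity,
# capped at total loss (1.0) and floored at zero.
def example_curve(intensity: float) -> float:
    return min(1.0, max(0.0, 0.1 * intensity))

# If the geocode is wrong, the hazard intensity looked up for the
# location is wrong -- and the loss estimate is wrong, however
# sophisticated the model.
loss = ground_up_loss(hazard_intensity=3.0,
                      vulnerability_curve=example_curve,
                      exposure_value=500_000)
print(round(loss))  # 150000
```

The point of the sketch is structural: a precise hazard model multiplied by a wrong damage ratio, or evaluated at the wrong location, still produces a wrong loss.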
Today, have we filled these data gaps? No, definitely not. In fact, the problem is getting worse. As models – or the hazard element – and analytics have become increasingly high-resolution, the gaps in the data have become more exposed than ever.
There’s an assumption that high-resolution models are matched with equally high-resolution exposure data. That is not the case. I want to be perfectly clear on this – the industry is exposed to multiple balance sheet impairment issues due to a continued reliance on models and exposure data that are seriously imprecise and mismatched.
We started Insurdata in 2017 because we believed some of the fundamentals of re/insurance risk analytics were flawed. We knew we had quite a task ahead of us, as the flawed analytics are central to the reserving and pricing for most re/insurers, including publicly traded companies. Given the magnitude and sensitivity of the problem, central to building the right technology was a process of education and change management.
This process is very similar to the change-management activities related to significant model changes. Historically, many of the big modelling changes have been likened to a catastrophic event, given the impact on key metrics that some companies have experienced. At Insurdata we knew our technology could have a similar impact. In fact, we anticipated it was very likely.
Fast forward three years and numerous analyses of live exposure data have proved the impact of flawed exposure data beyond any doubt. Rather than poring over endless details, consider a snapshot of a recently completed project spanning a portfolio subset of ~1,000 properties located in Florida and California:
- The average geocoding error was 385 metres. In other words, the insurer believed the properties were, on average, almost 400 metres from the actual location.
- Only 8 percent of the geocodes for the properties were correct.
- Analyses with a commercial model showed that for ~12% of properties the annual average loss moved from no apparent risk at the point of underwriting to high risk, or vice versa.
- In real terms, the impact for this portfolio subset was a change in premium of $21m.
Bear in mind that the above analysis was conducted on live policies; this isn’t hypothetical and this portfolio subset is most definitely not the worst we’ve seen.
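For readers who want a feel for how displacement figures like these are derived, here is a hedged sketch: the haversine formula gives the great-circle distance between a recorded geocode and a corrected one. The coordinates below are invented for illustration.

```python
# Sketch: geocoding displacement as the great-circle distance between
# a recorded geocode and a corrected rooftop geocode.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6_371_000  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical pair: a recorded geocode versus a rooftop-corrected one.
recorded  = (25.7617, -80.1918)
corrected = (25.7650, -80.1918)
print(round(haversine_m(*recorded, *corrected)))  # 367
```

A portfolio-level average displacement is then just the mean of this distance over every (recorded, corrected) pair.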
Facing up to the data problem
Companies can no longer simply blame inherent volatility in modelled loss estimates, or hope that the next event will not uncover a systemic exposure management issue. As an industry, we must face up to the deficiencies in the exposure data we rely upon and grasp the technologies which now exist to bridge the gap between data quality and model capabilities.
This is a fundamental problem that simply won’t go away without action, and its potential market impact if not addressed head on is the equivalent of multiple cat 5 hurricanes about to make landfall.
This article was published in Insurance Day on Wednesday 9 September
IRB Brasil Re selects Insurdata platform
- Cat modelling team subscribes to exposure data portal
- Focus on enhancing exposure accuracy, operational efficiency and reducing model volatility
- Insurdata Portal seamlessly integrated into day-to-day workflow
London, 19 August 2020 – Insurdata, the award-winning insurtech firm which specialises in the augmentation of peril-specific exposure and risk data via its exposure platform, has today announced that IRB Brasil Re (IRB) has licensed its Insurdata Portal.
IRB, the largest reinsurance company in Latin America and with an expanding presence in the international market, will use the platform to generate precise property geocode information across its global exposures. The company’s catastrophe modelling team will integrate the Portal into its daily risk assessment workflow, boosting the resolution of existing data in real time using Insurdata’s wide range of proprietary technologies.
Capitalising on the enhanced exposure information across both its Latin American and international portfolios, IRB aims to use the technology to increase operational efficiency significantly and to benefit from more precise exposure data, driving increased accuracy of catastrophe modelling outputs and exposure accumulation information. The increased data precision will also serve to reduce overall model volatility.
Commenting on the announcement, Luis Brito, Catastrophe Modelling Manager at IRB, said: “We are very excited to be working with Insurdata as we look to enhance the resolution of the global exposure data which supports our underwriting, pricing and portfolio management decisions. That data quality will enable us to maximise the analytical potential of our catastrophe modelling, reduce the associated volatility of modelled outputs and boost our overall underwriting efficiency.”
Jason Futers, CEO, Insurdata, added: “The decision by a company of the scale and sophistication of IRB to adopt our platform is a fantastic endorsement of our technology and exposure methodologies. By enhancing exposure data, our aim is to give underwriters greater confidence in their modelled loss estimates leading to improved risk selection and portfolio management, and ultimately more accurately priced products and stronger balance sheets. We’re delighted to work closely with IRB to integrate our Portal into their data processes, ensuring they fully capitalise on the potential it creates.”
Mercer joins Insurdata Advisory Board
• Mercer joins Nagel and Drattell on the Insurdata Advisory Board
• Mercer previously co-founder of Validus Re and founder of Ventus Risk Management
London, 06 August 2020 – Insurdata, the award-winning insurtech firm which specialises in the correction, creation and augmentation of peril-specific exposure and risk data via the Insurdata Platform, has today announced that Stuart Mercer has joined the company’s Advisory Board.
Mercer brings decades of industry experience to the Board with a career that spans both the re/insurance and ILS arenas. He has extensive knowledge of the value of precise, high-resolution data in the underwriting cycle having served as CEO and Founder of Ventus Risk Management, a technology and analytics-focused provider of coastal property insurance, and having previously co-founded Validus Re.
Mercer commented: “The re/insurance industry is changing rapidly and having had the opportunity to look at the Insurdata technology in detail, it’s clear to me that Insurdata is at the forefront of that change. The data platform creates the foundational layer upon which the industry can reduce volatility and ensure their key metrics for underwriting, pricing and risk transfer are precise. I’m really looking forward to working with Jason and the team.”
Mercer joins fellow Board members Eric Drattell and Marcus Nagel.
Drattell is an accomplished business leader, with extensive legal, business and compliance expertise across privacy, technology transactions, licensing and cloud/SaaS computing. He is currently General Counsel and Chief Compliance Officer for cloud-based FinTech company Roostify, and during his career was also Senior Vice President and General Counsel at risk modelling firm RMS.
Nagel is a hugely experienced insurance industry figure, having held numerous senior positions at Zurich, including CEO of Zurich Germany, Chairman of Zurich Deutscher Herold and Global Head of IFA and broker business at Zurich Global Life.
Jason Futers, CEO of Insurdata, said: “I am delighted to welcome Stuart onto the Insurdata Advisory Board. He has been at the forefront of the evolution of underwriting data and analytics throughout his career and brings an incredible depth of market insight. With Eric, Marcus and Stuart we have a truly outstanding team guiding our development as we grow Insurdata’s market presence and boost our ability to enable our clients to optimise pricing, underwriting and risk assessment decisions.”
Insurdata wins “Insurtech Initiative of the Year” Award
· Award presented at Insurance Day London Market Awards 2019
· Entries judged by independent panel of industry leaders
· Award recognises initiative predicted to have greatest impact on London market
We are delighted to announce that Insurdata has been awarded the “Insurtech Initiative of the Year” award at the Insurance Day London Market Awards 2019.
The shortlist for this award boasted some outstanding initiatives spanning critical aspects of the London insurance market. However, the judging panel, which was composed of leading industry figures from organisations including the LMA, Aon, BCG and the MGAA, believed that Insurdata stood out as the initiative which demonstrated the clearest innovation, provided the most significant opportunity to enhance business practices, and had the greatest potential to impact the London Market.
Commenting on the award success, Jason Futers, CEO, Insurdata, said: “This is a fantastic achievement for a company which only launched in 2017 and recognises the huge opportunity which Insurdata provides to address head-on the fundamental data resolution challenge which the insurance industry faces.
“The last 24 months have been a very busy period for Insurdata. We have secured strong backing for the platform, analysed, created and augmented live risk data for over 30 re/insurers, and made it through the stringent process of the Lloyd’s Lab as part of the Cohort 3 programme. It’s been an incredible two years.”
Jeremy Sterns, Chief Technology Officer, added: “This award recognises the immense effort the Insurdata team has made to turn the original idea of a platform for creating high-resolution data at the individual exposure level into a fully functioning reality. We’re excited about what the future has in store and are looking forward to continuing to roll out our API to syndicates, (re)insurers, brokers and modellers working to improve their exposure data.”
Insurdata study reveals scale of exposure data challenge
Insurdata, the award-winning insurtech firm which specialises in the augmentation of peril-specific exposure and risk data via its Exposure Engine, has today released the findings of an exposure data resolution study for a sample portfolio of US-based insured properties.
- Over 10% of properties in study displaced by 500m-1,000m
- First-floor elevation model assumptions incorrect for 90% of locations
- Study based on sample of 4,500 US-based insured property assets
The findings revealed that of the 4,500 locations assessed in the report, over 40 percent (over 1,800 locations) were displaced by more than 10 metres, with over 10 percent of the location data inaccurate by between 500 and 1,000 metres. The combined results showed an average displacement of 183 metres for the entire data set, with the largest displacement being 33 kilometres.
Additional analysis conducted by Insurdata at the individual company level has revealed that correcting location latitudes and longitudes can have a material effect on Average Annualised Loss estimates.
Commenting on the degree of displacement evident in the data, Paul Burgess, Head of Client Development at Insurdata, said: “It is important to consider this level of displacement in the context of standard accumulation zones. If a (re)insurer is applying a 200 metre accumulation zone for its terrorism exposure, for example, our study reveals that at a minimum 6 percent of the risks will fall outside of the accumulation zones they are believed to be in, while locations with smaller displacements could also be outside of the zone perimeters.”
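Burgess’s accumulation-zone point can be illustrated with a small, hypothetical calculation; the zone radius follows his 200 metre example, while the per-risk displacements below are invented:

```python
# Sketch: a risk displaced by more than the accumulation-zone radius
# may sit entirely outside the zone it was accumulated in.

ZONE_RADIUS_M = 200  # e.g. a terrorism accumulation zone

displacements_m = [12, 45, 180, 260, 510, 950]  # hypothetical per-risk errors

# A displacement larger than the zone radius means the risk can fall
# outside its assumed zone; smaller displacements may still do so if
# the risk sits near the zone perimeter.
definitely_suspect = [d for d in displacements_m if d > ZONE_RADIUS_M]
share = len(definitely_suspect) / len(displacements_m)
print(f"{share:.0%} of risks displaced beyond the zone radius")  # 50%
```

This is the same logic, at toy scale, behind the “at a minimum 6 percent of the risks” figure in the quote: the displacements beyond the radius set a floor on the error, not a ceiling.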
The study also analysed the data sets to assess first-floor elevation (FFE) information against standard model assumptions regularly applied to assets where no FFE data is available. The findings showed that the FFE measurements for the properties were much lower than default modelled assumptions for over 90 percent of locations in the study.
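The mechanics behind this finding can be sketched simply: flood damage is driven by water depth above the first floor, so an FFE assumption that is too high understates the depth the building actually experiences. The figures below are invented for illustration.

```python
# Sketch: why first-floor elevation (FFE) matters for flood losses.
# Damage depends on water depth *above* the first floor.

def depth_above_first_floor(flood_depth_m: float, ffe_m: float) -> float:
    """Water depth above the first floor (zero if water stays below it)."""
    return max(0.0, flood_depth_m - ffe_m)

flood_depth  = 1.5   # hypothetical flood depth at the site, metres
assumed_ffe  = 1.0   # hypothetical default model assumption
measured_ffe = 0.25  # hypothetical surveyed value -- lower, as in the study

print(depth_above_first_floor(flood_depth, assumed_ffe))   # 0.5
print(depth_above_first_floor(flood_depth, measured_ffe))  # 1.25
```

With the default assumption the model sees half a metre of water above the floor; with the measured, lower FFE the same flood puts well over a metre above it, which is why the study’s findings point to underestimated flood losses.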
According to Jason Futers, CEO of Insurdata: “The results from this particular data set show that there is a strong likelihood that the modelled loss estimates for flood will be underestimated for these particular locations. Flood exposure is one of the most data-intensive risks in the (re)insurance market. Without key attribute data such as FFE, the value of applying high-resolution models to such perils will be significantly reduced.”
He continued: “The misrepresentation of location data and a lack of detailed building parameter information can have a significant detrimental impact on the ability of (re)insurance companies to accurately assess exposure levels from the individual risk through to the property portfolio. Advances in technology now mean we can generate property-specific data at a much more granular level for every asset in a portfolio in a quick, efficient and accurate manner. Companies must therefore take steps to capitalise on this to improve risk selection and pricing and reduce portfolio volatility.”
The analysis was based on a sample set of approximately 4,500 insured property assets. The location data, which was sourced from a variety of re/insurance entities, was for US-wide risks. The geocoded property data, which included building and zip-code level information, was augmented using Insurdata’s Exposure Engine with updated building centroid information and additional building attribute data, including FFE and building perimeter dimensions.
Insurdata raises $3m in seed funding
London, 15 April 2019 – Insurdata, the award-winning insurtech firm which specialises in the augmentation of peril-specific exposure and risk data via its Exposure Engine Platform, has today announced that it has secured $3 million from a group of investors, led by Anthemis and Menlo Ventures.
- Funding led by Anthemis and Menlo Ventures
- Five new VC firms participate
- Capital will support next stage in Insurdata development
The investor group, a mix of venture capital firms and angels, also includes Alma Mundi Fund, Talis Capital, InsurTech Gateway, Ascend, Prototype Capital and the Baloise Group. Insurdata secured an initial tranche of funding of $1 million in October 2017.
Insurdata was launched in 2017 to address the lack of property-specific data available to the re/insurance market. The firm’s platform enables re/insurers to generate high-resolution, accurate, risk-specific data globally in real-time at all points in the underwriting workflow. This includes accurate geocode information, building attributes and first-floor elevation data.
By providing access to precise data, Insurdata aims to give underwriters greater confidence in modelled loss estimates and accumulation analyses, resulting in better risk selection and improved portfolio management. This in turn supports better, more accurately priced products, more resilient balance sheets and ultimately helps reduce volatility.
Commenting on the funding, Jason Futers, CEO, Insurdata, said: “We are delighted to have secured the backing of a fantastic range of investors who fully understand the criticality of the service that Insurdata provides to the re/insurance market. Our work to date has exposed material deficiencies in the quality and scope of information which underwriters are reliant upon, which in turn have a detrimental effect on their ability to accurately price risk and manage portfolios effectively.”
“Moving forward, we aim to capitalise on the fact that our Exposure Engine can be applied to any peril by working with re/insurers to introduce more refined data sets for a broad range of exposures, including flood, windstorm, earthquake, terrorism and cyber. We are also evolving the Insurdata Customer Portal to make access to high-resolution data as straightforward and speedy as possible.”
Ruth Foxe-Blader, Managing Director, Anthemis, said: “We are delighted to continue our support of Insurdata. As the re/insurance industry undergoes a period of readjustment in the aftermath of two major loss years, the quality and granularity of data that supports their underwriting decisions will be vital to their ability to manage volatility. The data consistency that Insurdata provides I believe will become an industry standard.”
Javier Santiso, Founder & CEO, Mundi, said: “Insurdata provides an excellent opportunity to invest in a deep technology company that refines and enhances the lifeblood of the re/insurance industry – data. Few other markets are as reliant upon such incredibly complex and comprehensive data sets. Ensuring that such information is of the highest possible quality and resolution to enable underwriters to make risk decisions with confidence should be a top priority for every organisation.”
Richard Chattock, CEO, InsurTech Gateway, said: “InsurTech is making major inroads into virtually every phase of the insurance process. Insurdata targets a core component of an insurer’s process – exposure data. In our view, the firm brings together the right expertise, the right technology and the right capabilities at an opportune time to make a crucial difference to the insurance market.”
Tom Williams, Principal, Talis Capital, said: “We are very excited to be partnering with Jason and his impressive team as they continue to provide a pioneering approach to risk data capture for insurers worldwide. With over 90% of modelled loss estimates for flood exposures currently underestimated by traditional methods due to a lack of accurate first-floor elevation data, this solution has the potential to vastly improve insurer analysis accuracy and significantly reduce their costs. We look forward to working with Insurdata as they boost their sales power and refine the technology to meet the ever-growing demand in the market.”
Source: Insurdata