hx welcomes Insurdata as a Renew Connect partner

For insurers in need of detailed geographical insights, Insurdata offers highly accurate geo-coding without the need for manual surveys or online research. By partnering with hx, customers can integrate this data into their pricing models with a few clicks.

It can be incredibly challenging for the insurance industry to collect high-quality position and risk attribute data without spending hours manually searching online or performing on-site surveys. However, this information is essential for accurate, reliable pricing. Take flood pricing, where a difference of a couple of inches in site height or slope can make the difference between a claim and no claim following a flood-related event. For this reason, hx is excited to announce that we will be partnering with Insurdata as part of our Renew Connect program to give our customers access to this information.

Insurdata delivers high-resolution, accurate property and risk attribute data to optimise pricing, underwriting and risk assessment decisions.  The Insurdata Exposure Engine has been designed for full peril coverage globally. The system generates specific attribute data for any type of exposure, including flood, earthquake, wildfire, hurricane, tornado and terrorism. 

hx and Insurdata were both members of cohort 3 of the Lloyd’s Lab which was “focussed fully on finding solutions with the potential to contribute to the ecosystem of services as part of the future at Lloyd’s vision, including ways to enhance data sharing and provide new sources of risk insight; pricing and risk models to help Lloyd’s market participants better understand threat scenarios; and ways to reduce the cost of processing claims as well as the burden of compliance and regulation”. 

We have continued our collaboration while growing rapidly since the conclusion of our time in the Lloyd’s Lab.

The integration between hx and Insurdata has been designed to be streamlined, combining Insurdata’s expertise and data capabilities with Renew’s API-first architecture and rapid model deployment. Our customers can enhance their underwriting decision-making in one centralised place, with no need for re-keying between systems.

“We are delighted to be working with the hx team as a Renew Connect partner. Through this partnership, organisations will now be able to capitalise on our high-resolution property and risk attribute data to ensure data-driven decision making. We’re very much looking forward to helping hx customers to improve their knowledge of locations with a simple yet powerful integration of our technology.” Rosina Smith, Head of Client Success, Insurdata.

For our Renew clients, the Renew Connect section of our documentation now includes a complete annotated model which can be imported into your Renew environment to get you up and running with address search, geo-location and risk attribute querying in minutes. The Insurdata APIs are all fully documented here, and if you have any queries about anything in the Renew documentation, please contact our support team. You will need an Insurdata API key, which is available on request.
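As a purely illustrative sketch of what an integration call might look like, the snippet below assembles an authenticated address-search request. The base URL, path, parameter names and response shape are hypothetical placeholders, not the documented Insurdata API; the Renew Connect documentation defines the real contract.

```python
from urllib.parse import urlencode

# HYPOTHETICAL values: host, path and parameters are placeholders for
# illustration only -- consult the Renew Connect docs for the real API.
BASE_URL = "https://api.insurdata.example/v1"
API_KEY = "YOUR-INSURDATA-API-KEY"  # available from Insurdata on request

def build_location_query(address: str, perils=("flood",)):
    """Assemble a request URL and auth headers for an address search (sketch)."""
    params = urlencode({"address": address, "perils": ",".join(perils)})
    url = f"{BASE_URL}/locations?{params}"
    headers = {"Authorization": f"Bearer {API_KEY}"}
    return url, headers

url, headers = build_location_query("1 Main St, Houston, TX")
```

The same request shape (one authenticated GET per location) is what lets Renew fetch geocoding and risk attributes without any re-keying between systems.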

If you are not a Renew client and you are interested in a demo of how Insurdata in Renew can assist your business, please contact hello@hyperexponential.com and we will arrange a session.

Insurdata and Canopius carry out exposure data resolution study

• Data modelled through Fathom’s flood model on the ModEx cat modelling platform
• Study reveals range of geocoding displacement
• Augmented data sets have marked impact on loss estimates

London, 26 February 2019 – Insurdata has today released the findings of a flood data study conducted with global speciality re/insurer Canopius, flood modelling firm Fathom and ModEx, a catastrophe modelling platform delivered by Simplitium.

The study focused on how augmented property exposure data altered the risk profile of an asset when modelled, and how the revised data sets would impact annual average loss forecasts and maximum event losses for a series of return periods.

“The main aim of the study,” explained Jason Futers, CEO, Insurdata, “was to establish the potential impact on loss estimates at both the individual property level and portfolio level when exposure data is modelled at a much higher resolution than currently available to many re/insurers.”

The sample data provided by Canopius represented a typical raw data set as available in the re/insurance market. The unaugmented address data was originally geocoded to a variety of resolutions from building to zip-code level.

The portfolio included data for approximately 1,000 properties in the Houston, Texas region, with a total insured value of $9.9bn spanning residential, commercial and industrial locations. Some locations had been impacted by the 2017 inland flooding resulting from Hurricane Harvey.

The initial phase of the study involved the creation of four data sets: the initial raw location data (Set 1) and three augmented sets, which successively added updated geocode information, building perimeter points, and building attribute data, including first-floor elevation (FFE). These data sets were then modelled through Fathom’s high-resolution flood model, with average annual loss (AAL) and exceedance probability analysis conducted for all locations.

The geocode corrections saw a displacement of five metres or less for 384 exposures. Twenty percent of the properties saw a geocode displacement of 50 metres or more, while 75 locations were displaced by one kilometre or more, resulting in an overall location displacement of 276km across the portfolio. It should be noted, however, that not all of the original geocode data supplied was at street level.
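Displacement figures like these are typically computed as the great-circle distance between the original and corrected coordinates. A minimal sketch using the standard haversine formula; the coordinate pair below is illustrative, not taken from the study:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    R = 6_371_000  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# Illustrative example: zip-level centroid vs building-level geocode in Houston
original = (29.7604, -95.3698)
corrected = (29.7621, -95.3710)
displacement = haversine_m(*original, *corrected)  # roughly 220 m here
```

Summing this distance over every corrected exposure gives the portfolio-level displacement figure reported above.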

Modelling the augmented property data sets had a significant impact on the annual modelled loss figures:

• The revised geocodes (Set 2) resulted in an 84% increase in the loss estimate
• The addition of the building perimeter data (Set 3) reduced the increase to 76% compared to Set 1, or a 4% decline compared to Set 2
• The addition of the building attribute data, including FFE, (Set 4) saw a 2% decrease in the loss figure compared to Set 1, or a 44% decline compared to Set 3
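The two baselines used in these bullets (change versus Set 1 and change versus the preceding set) can be reconciled with simple ratios. A short check, indexing the Set 1 loss estimate to 1.00:

```python
# Loss estimates indexed to Set 1 = 1.00, per the AAL figures above
set1, set2, set3, set4 = 1.00, 1.84, 1.76, 0.98

def pct(new, old):
    """Percentage change of one set's estimate relative to another."""
    return round((new / old - 1) * 100)

assert pct(set3, set2) == -4    # Set 3 is ~4% below Set 2
assert pct(set4, set3) == -44   # Set 4 is ~44% below Set 3
```

So the "4% decline" and "44% decline" figures are measured against the preceding set, while the 84%, 76% and -2% figures are all measured against the raw Set 1 baseline.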

In terms of the impact on the modelled estimates for the return period loss, for a 1-in-200-year event, the analysis revealed:

• Set 2 analysis resulted in a 42% increase in the estimated loss compared to Set 1
• Set 3 analysis saw an 11% decline compared to Set 2
• Set 4 analysis continued the downward trajectory with a further decline of 37% compared to Set 3

Commenting on the findings, Jason Futers, CEO, Insurdata, said: “As the analysis demonstrates, augmenting property location information with building perimeter data and first-floor elevation measurements, rather than relying upon a single geocode and a default FFE assumption, has a material impact on the modelled loss estimates.”

His comments were echoed by Dr Andrew Smith, Chief Operating Officer, Fathom, who said: “Flooding is one of the most spatially complex phenomena that insurers have to tackle, which makes the ability to map exposures accurately critical to their ability to robustly model such losses.” He continued: “There is no reason why re/insurers should not be using accurate geocode data in their risk analysis.”

Paul Wilkinson, Head of Catastrophe Management, Canopius, concluded: “Current market conditions and narrow profit margins are driving greater focus on risk differentiation and portfolio optimisation. The augmented data metrics generated in this study can help insurers improve risk selection and enhance portfolio makeup, support decision-making around risk quantification and policy conditions, and facilitate the development of new products and more flood-specific coverage. Canopius is fully committed to taking a leading position in efforts to enhance the resolution of the data which underpins our industry.”

A report on the findings, entitled “Breaking down the flood data”, is available here.

“The industry suffers from a sense of institutional amnesia: we operate a renewal business, we see the same risks year after year, but tend to forget all of the enhancements we’ve made to the risk information each time we look at the risks.”
Paul Nunn — Head of Catastrophe Modelling at SCOR