IRB Brasil Re selects Insurdata platform
- Cat modelling team subscribes to exposure data portal
- Focus on enhancing exposure accuracy, operational efficiency and reducing model volatility
- Insurdata Portal seamlessly integrated into day-to-day workflow
London, 19 August 2020 – Insurdata, the award-winning insurtech firm which specialises in the augmentation of peril-specific exposure and risk data via its exposure platform, has today announced that IRB Brasil Re (IRB) has licensed its Insurdata Portal.
IRB, the largest reinsurance company in Latin America and with an expanding presence in the international market, will use the platform to generate precise property geocode information across its global exposures. The company’s catastrophe modelling team will integrate the Portal into its daily risk assessment workflow, boosting the resolution of existing data in real time using Insurdata’s wide range of proprietary technologies.
Capitalising on the enhanced exposure information across both its Latin American and international portfolios, IRB aims to use the technology to significantly increase operational efficiency and to leverage more precise exposure data, driving greater accuracy in catastrophe modelling outputs and exposure accumulation information. The increased data precision will also reduce overall model volatility.
Commenting on the announcement, Luis Brito, Catastrophe Modelling Manager at IRB, said: “We are very excited to be working with Insurdata as we look to enhance the resolution of the global exposure data which supports our underwriting, pricing and portfolio management decisions. That data quality will enable us to maximise the analytical potential of our catastrophe modelling, reduce the associated volatility of modelled outputs and boost our overall underwriting efficiency.”
Jason Futers, CEO, Insurdata, added: “The decision by a company of the scale and sophistication of IRB to adopt our platform is a fantastic endorsement of our technology and exposure methodologies. By enhancing exposure data, our aim is to give underwriters greater confidence in their modelled loss estimates leading to improved risk selection and portfolio management, and ultimately more accurately priced products and stronger balance sheets. We’re delighted to work closely with IRB to integrate our Portal into their data processes, ensuring they fully capitalise on the potential it creates.”
Mercer joins Insurdata Advisory Board
• Mercer joins Nagel and Drattell on the Insurdata Advisory Board
• Mercer previously co-founder of Validus Re and founder of Ventus Risk Management
London, 06 August 2020 – Insurdata, the award-winning insurtech firm which specialises in the correction, creation and augmentation of peril-specific exposure and risk data via the Insurdata Platform, has today announced that Stuart Mercer has joined the company’s Advisory Board.
Mercer brings decades of industry experience to the Board with a career that spans both the re/insurance and ILS arenas. He has extensive knowledge of the value of precise, high-resolution data in the underwriting cycle having served as CEO and Founder of Ventus Risk Management, a technology and analytics-focused provider of coastal property insurance, and having previously co-founded Validus Re.
Mercer commented: “The re/insurance industry is changing rapidly and having had the opportunity to look at the Insurdata technology in detail, it’s clear to me that Insurdata is at the forefront of that change. The data platform creates the foundational layer upon which the industry can reduce volatility and ensure their key metrics for underwriting, pricing and risk transfer are precise. I’m really looking forward to working with Jason and the team.”
Mercer joins fellow Board members Eric Drattell and Marcus Nagel.
Drattell is an accomplished business leader, with extensive legal, business and compliance expertise across privacy, technology transactions, licensing and cloud/SaaS computing. He is currently General Counsel and Chief Compliance Officer for cloud-based FinTech company Roostify, and during his career was also Senior Vice President and General Counsel at risk modelling firm RMS.
Nagel is a hugely experienced insurance industry figure having held numerous senior positions at Zurich, including CEO of Zurich Germany, Chairman of Zurich Deutscher Herold and Global Head of IFA and broker business at Zurich Global Life.
Jason Futers, CEO of Insurdata, said: “I am delighted to welcome Stuart onto the Insurdata Advisory Board. He has been at the forefront of the evolution of underwriting data and analytics throughout his career and brings an incredible depth of market insight. With Eric, Marcus and Stuart we have a truly outstanding team guiding our development as we grow Insurdata’s market presence and boost our ability to enable our clients to optimise pricing, underwriting and risk assessment decisions.”
Insurdata study reveals scale of exposure data challenge
Insurdata, the award-winning insurtech firm which specialises in the augmentation of peril-specific exposure and risk data via its Exposure Engine, has today released the findings of an exposure data resolution study for a sample portfolio of US-based insured properties.
- Over 10% of properties in study displaced by 500m-1,000m
- First-floor elevation model assumptions incorrect for 90% of locations
- Study based on sample of 4,500 US-based insured property assets
The findings revealed that of the 4,500 locations assessed in the report, over 40 percent (more than 1,800 locations) were displaced by more than 10 metres, with 10 percent of the location data inaccurate by between 500 and 1,000 metres. The combined results showed an average displacement of 183 metres across the entire data set, with the largest displacement being 33 kilometres.
Additional analysis conducted by Insurdata at the individual company level has revealed that correcting location latitudes and longitudes can have a material effect on Average Annualised Loss estimates.
Commenting on the degree of displacement evident in the data, Paul Burgess, Head of Client Development at Insurdata, said: “It is important to consider this level of displacement in the context of standard accumulation zones. If a (re)insurer is applying a 200-metre accumulation zone for its terrorism exposure, for example, our study reveals that at a minimum 6 percent of the risks will fall outside of the accumulation zones they are believed to be in, while locations with smaller displacements could also be outside of the zone perimeters.”
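The zone test described above is simple to reproduce: measure the great-circle distance between each reported location and its corrected position, and count the risks whose displacement exceeds the accumulation radius. A minimal sketch, using hypothetical coordinate pairs (not data from the study) and the standard haversine formula:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6371000.0  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical (reported, corrected) coordinate pairs for a handful of risks.
locations = [
    ((25.7617, -80.1918), (25.7617, -80.1918)),  # no displacement
    ((25.7617, -80.1918), (25.7650, -80.1918)),  # roughly 370 m north
    ((25.7617, -80.1918), (25.7700, -80.1850)),  # roughly 1 km away
]

ZONE_RADIUS_M = 200  # example terrorism accumulation zone radius

displaced_out = [
    haversine_m(*reported, *corrected) > ZONE_RADIUS_M
    for reported, corrected in locations
]
share_outside = sum(displaced_out) / len(locations)
print(f"{share_outside:.0%} of sample risks sit outside their assumed 200 m zone")
```

Any risk whose corrected position is further than the zone radius from its reported position may sit in a different accumulation zone entirely, which is what drives the "at a minimum 6 percent" figure quoted above.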
The study also analysed the data sets to assess first-floor elevation (FFE) information against standard model assumptions regularly applied to assets where no FFE data is available. The findings showed that the FFE measurements for the properties were much lower than default modelled assumptions for over 90 percent of locations in the study.
According to Jason Futers, CEO of Insurdata: “The results from this particular data set show that there is a strong likelihood that the modelled loss estimates for flood will be underestimated for these particular locations. Flood exposure is one of the most data-intensive risks in the (re)insurance market. Without key attribute data such as FFE, the value of applying high-resolution models to such perils will be significantly reduced.”
He continued: “The misrepresentation of location data and a lack of detailed building parameter information can have a significant detrimental impact on the ability of (re)insurance companies to accurately assess exposure levels from the individual risk through to the property portfolio. Advances in technology now mean we can generate property-specific data at a much more granular level for every asset in a portfolio in a quick, efficient and accurate manner. Companies must therefore take steps to capitalise on this to improve risk selection and pricing and reduce portfolio volatility.”
The analysis was based on a sample set of approximately 4,500 insured property assets. The location data, which was sourced from a variety of re/insurance entities, was for US-wide risks. The geocoded property data, which included building and zip-code level information, was augmented using Insurdata’s Exposure Engine with updated building centroid information and additional building attribute data, including FFE and building perimeter dimensions.
A New Face and a New Customer Portal at Insurdata
We’re very excited to announce that we recently expanded the Insurdata team as well as our offering for clients…
Paul Burgess, Head of Client Development
As our work to improve the integrity of property insurance data continues apace, we are delighted that Paul Burgess, long-time catastrophe risk specialist and lately responsible for client development throughout Asia for RMS, has come on board to lead our client-facing activities.
Paul says: “This issue of inaccurate data is a big problem that has existed in the insurance industry for 20+ years. I think the approach and the technology being adopted by Insurdata solves that problem using an innovative marriage of human expertise and technology support and I’m excited to help spread the message about what it can do.”
New Customer Portal
We also recently launched our Customer Portal allowing users to visualise and interact with their data as they apply our proprietary technology to augment exposure location information and property attributes, all in a peril-relevant way.
The portal complements our existing API technology launched last year. We have continued to demonstrate the material impact that exposure data correction and augmentation can have on accumulation zones and modelled loss estimates (see Canopius case study).
If you’d like to understand what this could mean for your exposures, please email firstname.lastname@example.org.
BITESIZE IMPACT 25: INSURDATA REVIEW
Insurdata is the youngest company on the Oxbow Partners Impact 25 list. In the firm’s monthly blog, Chris Sandilands, Partner at Oxbow Partners, highlights the benefits Insurdata offers (re)insurers and brokers in improving the insights used in underwriting, pricing, risk assessment and portfolio management. Below is an extract from Chris’s article. The full piece is available to read here.
Insurdata recently concluded a pilot with reinsurer SCOR in Florida which revealed that 44% of properties (equating to 50% of total insured losses) were incorrectly geocoded in the existing dataset. Expected losses per location changed by up to 80% as a result of Insurdata repositioning. The pilot also revealed significant changes in both annualised and return period losses. Another pilot with a different company revealed that over 90% of locations were inaccurately geocoded at an average displacement in excess of 50m.
The technology scales to any portfolio size, but founder Jason Futers says Insurdata is most effective for addressing high-impact issues (such as poor geocoding and the absence of first-floor elevation data) through a ‘data on demand’ service. He says the platform has the greatest perceived value at the point of underwriting.
Insurdata can be integrated via API, which Jason says will significantly reduce the time it takes for data to move through the value chain: some processes can currently take months of manual re-keying.
The platform is currently being used by clients based in the US and Europe. Jason says dozens of conversations with potential partners – including brokers, MGAs, insurers and reinsurers – are ongoing.
Insurdata says that for every dollar spent on its technology, users generate $13 of corrected technical premium for residential property, or $50 for commercial property.
The company received $1.3m in funding from partners including Anthemis and Plug and Play in October last year and is currently in advanced discussions for its next round. Headcount could as much as double to around 20 in the next 12 months to drive growth in multiple markets.
While the focus of the business is initially flood, the platform applications can be widened to other catastrophe events (e.g. terror) and to other impacts including infrastructure. Jason adds that Insurdata could be used for other coverages including parametric triggers in the future, and could support community responses to catastrophic events by advising on how and where new properties are built.
THE OXBOW PARTNERS VIEW
Insurdata was founded in 2017 and is the youngest company on the Impact 25. We included the company because the team is strong (Jason is ex-RMS) and emerging results were impressive. This is a business that has come out of the gates quickly.
Exposure modelling is big business. Revenues of RMS, the leading loss modelling platform, were £233m in 2017, and Jason estimates that the total exposure-based (re)insurance analytics market is close to $1bn. (Re)insurers’ investment in modelling infrastructure and capabilities, for example the Oasis framework, suggests that investment in improved risk insight will continue for years to come.
The full article is available to read here
Thanks to Chris Sandilands and the team at Oxbow Partners for publishing their article in Oxbow Partners’ June 2018 blog.
For further information contact Jason Futers on email@example.com
Don’t become shackled by the ball and (block)chain
Jason Futers’ comment piece in the Oasis article series warns against a ‘big picture’ blockchain approach and encourages re/insurers to deconstruct the potential benefits to identify the key use-case for their individual business.
This is a great time to be in re/insurance and an even greater time to be in Insurtech. There are many challenges ahead, but the excitement and potential fulfilment considerably outweigh them. There’s an almost palpable feeling that we’re surrounded by an extraordinary energy pulling together people, ideas and capital. More energy than at any other time in the industry’s history. Energy that will change re/insurance… and that’s a good thing.
The technology options and the magnitude of potential change can seem overwhelming. There’s a lot of ‘noise’, and it makes it difficult for many to see the signal and take action. This noise is important to create interest, attract capital and build momentum. However, the interest, capital and momentum have arrived now, and I believe it’s vital that we focus on creating a cycle of identifying and implementing small changes that can make a real difference. We need to move from the big picture and zoom in on the smaller details that will make a genuine difference to a key area of your business: focus on one small change, then another, and another, and so on.
Let’s take blockchain as just one example. Along with the likes of AI, AR, VR and other new technologies, there’s certainly considerable noise around blockchain, and of course recognition of its considerable benefits. These include immutability (data can’t be changed), decentralisation (the blockchain is operated independently of any single entity), reliability (decentralisation makes it very resilient, so it doesn’t, or at least shouldn’t, crash), transparency (by definition, a public blockchain is transparent to everyone, and a permissioned blockchain is transparent to all its members) and security (data is encrypted).
When considering blockchain for your organisation, there are now many potential use cases to help you identify the benefit most relevant to your business: smart contracts for claims processing, the purchase and sale of data or other ‘policy assets’, business workflows and compliance, and the economic efficiency of cryptocurrency for transferring value instead of fiat currencies, to name but a few.
And as you explore your use case, don’t be shy in asking apparently basic questions. Last summer I was asked a great question: what is the benefit of blockchain compared to technology built on a well-structured, secure database? My answer at the time was the MVA (Minimum Viable Answer), so I went away and thought about it. Really thought about it. At the time we were developing our blockchain strategy using both Hyperledger and Ethereum, so we already had at least some real-world experience. After this period of reflection, I concluded that the core advantage of blockchain to our users is the independent verification of data. This could change, but for now that’s the primary blockchain benefit for us: the independent verification of data.
And that’s not to be dismissive of other benefits. Independent verification of data is incredibly powerful. If you take exposure information today, regardless of the data type and use, in most if not all cases it has been produced or manipulated by someone with a vested interest and a focus on their role in the lifecycle of that data. As schedules of exposure information are created and transferred from the insured all the way through the carrier, reinsurer and into the capital markets, it could be touched by 20 stakeholders or more. Each has a purpose for the data, and anything not required is not cared for or is potentially left behind.
There’s nothing sinister in the processing or editing of the data; it’s actually rather pragmatic in many cases… take a look at a schedule of values and try to avoid wondering what all those columns are for! When the data eventually arrives at its destination, it’s incomplete and in many cases not fit for purpose. The power of blockchain in this use case is independently verifying what data was available when. As simple as that.
Of course, one must bear in mind that independent verification of the data is very different from verification that the data is correct… or useful! It’s just verification that the data is what the creator says it is. And how is that useful in practical terms? In lots of ways. For example, to retrospectively understand what data was known when a claim is being assessed is very important… and to have that independently verified could make all the difference in settling a claim. Today, the independent verification is missing and blockchain can provide that. Currently, blockchain is the only way to provide that.
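The mechanics behind “what data was available when” can be illustrated without any blockchain stack at all: the verifiable artefact is a content hash of the data snapshot, anchored at a point in time. A blockchain’s contribution is that the anchor is held by no single interested party. A minimal sketch, with a plain Python list standing in for the independent, append-only ledger a blockchain would provide:

```python
import hashlib
import json
from datetime import datetime, timezone

def fingerprint(exposure_records):
    """Deterministic digest of an exposure schedule snapshot."""
    canonical = json.dumps(exposure_records, sort_keys=True).encode()
    return hashlib.sha256(canonical).hexdigest()

# Stand-in for the independent append-only store (a blockchain in practice).
ledger = []

def anchor(exposure_records):
    """Record the snapshot's digest and the time it was anchored."""
    entry = {
        "digest": fingerprint(exposure_records),
        "anchored_at": datetime.now(timezone.utc).isoformat(),
    }
    ledger.append(entry)
    return entry

def was_available(exposure_records):
    """True if this exact snapshot was anchored at some earlier point."""
    digest = fingerprint(exposure_records)
    return any(entry["digest"] == digest for entry in ledger)

schedule = [{"location_id": "A1", "lat": 25.7617, "lon": -80.1918, "ffe_m": 0.3}]
anchor(schedule)

assert was_available(schedule)   # the snapshot is verifiable as it was
schedule[0]["ffe_m"] = 1.5       # a later edit changes the digest
assert not was_available(schedule)  # the edited data no longer matches
```

Note that the ledger proves only that a given snapshot existed at a given time, exactly as the next paragraph cautions: it verifies what the data was, not that the data was correct.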
Continuing to focus on all that blockchain is, or all that it can become, goes against one of the core beliefs everyone in Insurtech should hold dear… move fast and break stuff. Waiting to implement blockchain for an entire process is unlikely to be successful. Deconstructing the potential benefits leads to the identification of the key use case for your business; use this as the basis for embracing the technology, allowing you to deliver your blockchain MVP, learn about the benefits and challenges, and then go again… and again. Realise the small changes with the right technology at the micro-level and before you know it you’ll have created positive change across parts of your business at the macro-level.
If you’d like to chat about this then drop me a note; I’d be delighted to discuss anything blockchain, API, ML, AR, or even AI if there’s coffee on offer.
The full article is available to download here
To contact Jason Futers please email firstname.lastname@example.org
Thanks to Dickie Whitaker and the team at Oasis for publishing the article in their May 2018 newsletter.