Geospatial (Territorial) Smoothing Algorithm


Hi all,

I wanted to share my new algorithm for zip code geospatial (territory) smoothing with the actuarial community! If you have data (I'm using a form of 'relativity', i.e. an actuarial rating factor, in my example) that look scattered all over the map, and you want them to progressively cluster nicely around high-exposure areas, this might help you. It's an iterative, algorithmic approach, as opposed to model-based approaches, which run into serious trouble with sparse data at the zip code level (small insurance carriers and/or small coverages). Code, data, and the white paper are on GitHub (link below).

I hope it helps someone out there!

WHITE PAPER ABSTRACT
Jump Smoothing Algorithm
A geospatial territory rating algorithm for smoothing relativities (actuarial factors) at the zip code level is demonstrated with code in R. The inputs are the relativities and exposure at the zip code level; the output is a smoothed relativity at the zip code level. The method is algorithmic, not model-based, which makes it easy to implement, interpret, and explain to Departments of Insurance. It is particularly suitable for small to medium-sized insurance carriers with thin, non-credible data at the zip code level, because the smoothing is not based on Pure Premium (Loss Cost) or Loss Ratio predictions, which are unreliable with very thin data, especially for small coverages; instead, each zip code relativity is smoothed toward the exposure-weighted average relativity of all its neighboring zip codes. R code and data are available on GitHub.

GITHUB LINK
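
To give a quick feel for the core idea, here is a simplified one-pass sketch in base R. It is not the actual code from the repo: the data layout (a data frame `zips` with columns `zip`, `relativity`, `exposure`, and an adjacency table `nbrs` with columns `zip`, `neighbor`) and the credibility-style blend with constant `k` are illustrative assumptions only; see the GitHub code and white paper for the real update rule.

## One pass of exposure-weighted neighbor smoothing (illustrative sketch).
## zips: data frame with columns zip, relativity, exposure
## nbrs: adjacency table with columns zip, neighbor (one row per pair)
smooth_pass <- function(zips, nbrs, k = 1000) {
  ## Attach each neighbor's current relativity and exposure
  m <- merge(nbrs, zips, by.x = "neighbor", by.y = "zip")

  ## Exposure-weighted average relativity over each zip's neighbors
  agg <- aggregate(cbind(wrel = relativity * exposure, exposure) ~ zip,
                   data = m, FUN = sum)
  agg$nbr_rel <- agg$wrel / agg$exposure

  ## Pull each zip toward its neighborhood average; thin zips move more.
  ## The credibility-style weight z and constant k are assumptions made
  ## for this sketch, not necessarily the paper's formula.
  out <- merge(zips, agg[, c("zip", "nbr_rel")], by = "zip", all.x = TRUE)
  out$nbr_rel[is.na(out$nbr_rel)] <- out$relativity[is.na(out$nbr_rel)]  # isolated zips keep their own value
  z <- out$exposure / (out$exposure + k)
  out$relativity <- z * out$relativity + (1 - z) * out$nbr_rel
  out[, c("zip", "relativity", "exposure")]
}

## Iterate until the map settles down (fixed pass count for simplicity):
## for (i in 1:10) zips <- smooth_pass(zips, nbrs)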