Bayesian Additive Regression Trees: A Bayesian approach to machine learning in non-life insurance

Thesis defence by Nikolai Klit Selch Jensen

Title: Bayesian Additive Regression Trees - A Bayesian approach to machine learning in non-life insurance

  

Abstract: This thesis explores a Bayesian approach to machine learning through Bayesian Additive Regression Trees (BART), a Bayesian formulation of a sum-of-trees model in which each tree is a mean-shift CART model. The BART model is defined through a set of prior distributions, which specify the distributions of the key parameters of the model and assign a prior probability to each entire tree. The BART algorithm combines two Markov Chain Monte Carlo samplers: the Gibbs sampler and the Metropolis-Hastings algorithm. All trees are initialized as root trees. In each step of the Gibbs sampler, the Metropolis-Hastings algorithm updates the previous trees by applying one of four possible moves: grow, prune, swap or change.
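The Metropolis-Hastings update described above can be sketched as follows. This is a simplified illustration, not the thesis implementation: the move probabilities and the acceptance step (which depends on the likelihood and tree prior) are placeholders chosen here for brevity.

```python
import random

# The four tree moves named in the abstract.
MOVES = ("grow", "prune", "swap", "change")

def mh_update_tree(tree_size, rng):
    """One Metropolis-Hastings step on a single tree, sketched.

    A move is proposed uniformly at random (an assumption; Chipman et al.
    use unequal move probabilities), and the acceptance decision is
    replaced by a coin flip standing in for the true MH ratio.
    """
    move = rng.choice(MOVES)
    accepted = rng.random() < 0.5  # placeholder for the real MH acceptance ratio
    if accepted:
        if move == "grow":
            tree_size += 1       # a terminal node is split into two children
        elif move == "prune" and tree_size > 1:
            tree_size -= 1       # a pair of terminal nodes is collapsed
        # "swap" and "change" alter split rules but not the tree size
    return tree_size, move

def gibbs_sweep(tree_sizes, rng):
    """One Gibbs iteration: every tree in the ensemble gets an MH update."""
    return [mh_update_tree(size, rng)[0] for size in tree_sizes]

rng = random.Random(0)
sizes = gibbs_sweep([1] * 200, rng)  # 200 trees, each initialized as a root tree
```

The Gibbs sampler cycles through the trees one at a time, so each tree is updated conditional on the current state of all the others.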
The BART model is tested in a numerical setup where almost 6,000 unique configurations of the BART model are evaluated by cross-validation. The most apparent trend is that performance improves with the number of trees, up to a certain level. A second observed trend is that the best-performing configurations are those that allow large trees through a weak depth penalization in the tree prior distribution. It is furthermore found that the BART model does not require any significant burn-in in the MCMC sampler, as the tree sizes stabilize quickly.
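The depth penalization mentioned above can be made concrete with the standard BART tree prior of Chipman et al. (2010), in which a node at depth d splits with probability alpha(1 + d)^(-beta). The sketch below shows how a smaller beta keeps deep splits probable, favoring larger trees; the parameter values are illustrative, not those tuned in the thesis.

```python
def split_probability(depth, alpha=0.95, beta=2.0):
    """Prior probability that a node at `depth` is non-terminal:
    p(split) = alpha * (1 + depth)^(-beta), the Chipman et al. (2010)
    tree prior. Smaller beta means weaker depth penalization, so
    deeper (larger) trees are more likely a priori."""
    return alpha * (1.0 + depth) ** (-beta)

# Weak penalization (beta = 0.5) vs. the default (beta = 2.0):
weak = [split_probability(d, beta=0.5) for d in range(4)]
strong = [split_probability(d, beta=2.0) for d in range(4)]
```

At depth 3, for example, a node still splits with probability around 0.48 under beta = 0.5, but only around 0.06 under beta = 2.0.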
The “optimal” BART configuration is compared to CART, bagging, boosting and random forests, which, like the BART model, are tree-based but non-Bayesian methods. In this setup of marine non-life claim size analysis, the BART model performed better than all four competitors. Even the worst-performing BART configuration outperformed the optimized CART model.

 

 

Supervisor: Jostein Paulsen
External examiner: Mette M. Havning