# Parameter estimation of Bayesian quantile regression

D. Dichandra, I. Fithriani, S. Nurrohmah

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

## Abstract

Quantile regression is a regression method that models the relationship between a quantile of the response variable and one or more predictor variables. It has advantages that linear regression lacks: it is robust to outliers and can model heteroscedastic data. The parameters of a quantile regression model can be estimated with the Bayesian method, a data analysis approach based on the principle of Bayesian inference, that is, learning from data inductively through Bayes' theorem. Estimating regression parameters by Bayesian inference requires the posterior distribution of the parameters, which is proportional to the product of the prior distribution and the likelihood function. Because the posterior distribution becomes difficult to compute analytically as the number of estimated parameters grows, the Markov chain Monte Carlo (MCMC) method is used. Applying the Bayesian method to quantile regression has the advantage that MCMC yields samples of the parameters from a posterior distribution of unknown form, is computationally efficient, and is easy to implement. Yu and Moyeed (2001) introduced Bayesian quantile regression using a likelihood in which the errors follow an Asymmetric Laplace Distribution (ALD), and showed that minimizing the parameter estimation criterion of quantile regression is equivalent to maximizing the likelihood of errors with an ALD. The method used here to estimate the quantile regression parameters is Gibbs sampling based on a representation of the ALD as a mixture of exponential and normal distributions; the parameters of the regression model are found by sampling from the posterior distribution derived in this paper. Gibbs sampling produces a sequence of sampled parameter values, which are then averaged to obtain the estimated regression parameters. The case study discusses the effect of risk factors of motor vehicle insurance customers on the size of the claims they submit.
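The equivalence noted in the abstract (between minimizing the quantile regression criterion and maximizing the ALD likelihood) can be illustrated with a short numerical sketch. This is not code from the paper: the location-only model, the grid search, and all variable names are illustrative, and the scale parameter is fixed at 1 for simplicity.

```python
import numpy as np

# Check function (pinball loss) for quantile level tau:
# rho_tau(u) = u * (tau - I(u < 0))
def check_loss(u, tau):
    return u * (tau - (u < 0).astype(float))

# Log-likelihood of an Asymmetric Laplace Distribution with location mu and
# scale sigma: each observation contributes
#   log(tau * (1 - tau) / sigma) - rho_tau((y_i - mu) / sigma),
# i.e. the ALD log-likelihood is a constant minus the total check loss.
def ald_loglik(y, mu, tau, sigma=1.0):
    u = (y - mu) / sigma
    return y.size * np.log(tau * (1 - tau) / sigma) - check_loss(u, tau).sum()

rng = np.random.default_rng(42)
y = rng.normal(loc=2.0, scale=1.0, size=200)
tau = 0.75

# Grid search over a location-only model: the mu that minimizes the total
# check loss is exactly the mu that maximizes the ALD likelihood.
grid = np.linspace(0.0, 4.0, 401)
losses = np.array([check_loss(y - m, tau).sum() for m in grid])
logliks = np.array([ald_loglik(y, m, tau) for m in grid])
assert grid[losses.argmin()] == grid[logliks.argmax()]
```

Because the ALD log-likelihood equals a constant minus the summed check loss (for fixed scale), the two criteria pick out the same estimate; in the paper this equivalence is what justifies placing the ALD on the errors and then sampling the regression parameters by Gibbs sampling.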

- Original language: English
- Title of host publication: Proceedings of the 6th International Symposium on Current Progress in Mathematics and Sciences 2020, ISCPMS 2020
- Editors: Tribidasari A. Ivandini, David G. Churchill, Youngil Lee, Yatimah Binti Alias, Chris Margules
- Publisher: American Institute of Physics Inc.
- ISBN: 9780735441132
- DOI: https://doi.org/10.1063/5.0059103
- Publication status: Published - 23 Jul 2021
- Event: 6th International Symposium on Current Progress in Mathematics and Sciences 2020, ISCPMS 2020 - Depok, Indonesia. Duration: 27 Oct 2020 → 28 Oct 2020

### Publication series

- Name: AIP Conference Proceedings
- Volume: 2374
- ISSN (Print): 0094-243X
- ISSN (Electronic): 1551-7616

### Conference

- Conference: 6th International Symposium on Current Progress in Mathematics and Sciences 2020, ISCPMS 2020
- Country/Territory: Indonesia
- City: Depok
- Period: 27/10/20 → 28/10/20

## Keywords

• Asymmetric Laplace distribution
• Bayesian method
• Gibbs sampling
• Markov chain Monte Carlo
• quantile regression model
