Bayesian Gaussian finite mixture model

J. Mirra, S. Abdullah

Research output: Contribution to journal › Conference article › peer-review

2 Citations (Scopus)

Abstract

In data analysis it is common to assume that the data are generated from a single population. However, due to underlying factors, the data may actually come from several sources and should then be treated as several sub-populations. Partitioning methods such as k-means clustering and hierarchical clustering are two commonly used methods for identifying groups in data; both assign observations to groups based on the distances between them. We discuss an alternative approach to grouping, based on the data distribution, known as the finite mixture model. Parameter estimation is carried out using the Bayesian approach: a prior distribution on the parameters of interest is combined with the information in the sample, yielding updated knowledge in the form of the posterior distribution. A Gaussian distribution is assumed for the sampling model, and Markov Chain Monte Carlo with the Gibbs sampling algorithm is implemented to sample from the posterior distribution. Data on wave sensitivity in monkeys' eyes are used to illustrate the method.
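As a rough illustration of the approach described in the abstract (not the authors' code), the sketch below implements a Gibbs sampler for a univariate Bayesian Gaussian mixture with conjugate priors: a Dirichlet prior on the mixture weights, a Normal prior on the component means, and an Inverse-Gamma prior on the component variances. All hyperparameter values, function names, and the synthetic data at the end are assumptions made for the example.

import numpy as np

def gibbs_gaussian_mixture(x, K, n_iter=2000, seed=0):
    """Gibbs sampler for a K-component univariate Gaussian mixture (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    n = len(x)
    # Prior hyperparameters (assumed for illustration)
    alpha = np.ones(K)                        # Dirichlet prior on mixture weights
    mu0, tau2_0 = x.mean(), 10.0 * x.var()    # Normal prior on component means
    a0, b0 = 2.0, x.var()                     # Inverse-Gamma prior on component variances

    # Initialise parameters
    pi = np.full(K, 1.0 / K)
    mu = rng.choice(x, size=K, replace=False)
    sigma2 = np.full(K, x.var())
    draws = {"pi": [], "mu": [], "sigma2": []}

    for _ in range(n_iter):
        # 1. Sample the latent allocations z_i from their full conditional
        logp = (np.log(pi)
                - 0.5 * np.log(2 * np.pi * sigma2)
                - 0.5 * (x[:, None] - mu) ** 2 / sigma2)
        p = np.exp(logp - logp.max(axis=1, keepdims=True))
        p /= p.sum(axis=1, keepdims=True)
        z = np.array([rng.choice(K, p=p[i]) for i in range(n)])

        # 2. Sample the mixture weights from Dirichlet(alpha + component counts)
        counts = np.bincount(z, minlength=K)
        pi = rng.dirichlet(alpha + counts)

        # 3. Sample each component's mean and variance from their full conditionals
        for k in range(K):
            xk = x[z == k]
            nk = len(xk)
            # Normal full conditional for the mean
            v_k = 1.0 / (1.0 / tau2_0 + nk / sigma2[k])
            m_k = v_k * (mu0 / tau2_0 + xk.sum() / sigma2[k])
            mu[k] = rng.normal(m_k, np.sqrt(v_k))
            # Inverse-Gamma full conditional for the variance
            a_n = a0 + nk / 2.0
            b_n = b0 + 0.5 * np.sum((xk - mu[k]) ** 2)
            sigma2[k] = 1.0 / rng.gamma(a_n, 1.0 / b_n)

        draws["pi"].append(pi.copy())
        draws["mu"].append(mu.copy())
        draws["sigma2"].append(sigma2.copy())
    return {key: np.array(val) for key, val in draws.items()}

# Usage on synthetic data with two Gaussian sub-populations
rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(-2, 1, 150), rng.normal(3, 0.5, 100)])
out = gibbs_gaussian_mixture(data, K=2)
print("posterior mean of component means:", out["mu"][500:].mean(axis=0))

The posterior summary discards the first 500 iterations as burn-in; in practice one would also check convergence of the chain and address label switching before interpreting the components.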

Original language: English
Article number: 012084
Journal: Journal of Physics: Conference Series
Volume: 1725
Issue number: 1
DOIs
Publication status: Published - 12 Jan 2021
Event: 2nd Basic and Applied Sciences Interdisciplinary Conference 2018, BASIC 2018 - Depok, Indonesia
Duration: 3 Aug 2018 – 4 Aug 2018

Keywords

  • Monte Carlo
  • Posterior distribution
  • Prior distribution
  • Sampling model
  • Simulation
