Simultaneous Bayesian Clustering and Feature Selection Through Student's t Mixtures Model

  • Jianyong Sun*
  • Aimin Zhou
  • Simeon Keates
  • Shengbin Liao

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

20 Scopus citations

Abstract

In this paper, we propose a generative model for feature selection in the unsupervised learning context. The model assumes that data are independently and identically sampled from a finite mixture of Student's t distributions, which reduces sensitivity to outliers. Latent random variables representing feature saliency are included in the model to indicate the relevance of each feature. As a result, the model is expected to simultaneously realize clustering, feature selection, and outlier detection. Inference is carried out by a tree-structured variational Bayes algorithm. A full Bayesian treatment is adopted in the model to realize automatic model selection. Controlled experimental studies showed that the developed model is capable of accurately modeling data sets containing outliers. Furthermore, experimental results showed that the developed algorithm compares favorably against existing unsupervised probabilistic model-based Bayesian feature selection algorithms on artificial and real data sets. Moreover, the application of the developed algorithm to real leukemia gene expression data indicated that it successfully identifies the discriminating genes.
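The robustness to outliers attributed to the Student's t mixture comes from its scale-mixture representation: each observation carries a latent Gamma weight that shrinks for points far from a component mean, so outliers are down-weighted automatically. As an illustration only, the sketch below fits a one-dimensional two-component t mixture by plain EM with fixed degrees of freedom; it is not the paper's tree-structured variational Bayes algorithm and omits the feature-saliency variables, and all names and settings here are assumptions for the demo.

```python
import numpy as np
from scipy.stats import t as student_t

rng = np.random.default_rng(0)
# Two 1-D clusters plus a few gross outliers
X = np.concatenate([rng.normal(-3, 1, 150), rng.normal(3, 1, 150),
                    [15.0, -14.0, 20.0]])

K, nu = 2, 3.0                      # components, fixed degrees of freedom
pi = np.full(K, 1.0 / K)            # mixing weights
mu = np.array([-1.0, 1.0])          # initial means
sig2 = np.ones(K)                   # initial variances

for _ in range(100):
    # E-step: responsibilities under the t density of each component
    dens = np.stack([pi[k] * student_t.pdf(X, df=nu, loc=mu[k],
                                           scale=np.sqrt(sig2[k]))
                     for k in range(K)])            # shape (K, n)
    r = dens / dens.sum(axis=0)                     # responsibilities
    # Latent Gamma scale weights: small for outliers, down-weighting them
    delta = (X[None, :] - mu[:, None]) ** 2 / sig2[:, None]
    u = (nu + 1.0) / (nu + delta)
    # M-step: weighted updates (maximum likelihood with known nu)
    pi = r.mean(axis=1)
    mu = (r * u * X).sum(axis=1) / (r * u).sum(axis=1)
    sig2 = (r * u * (X[None, :] - mu[:, None]) ** 2).sum(axis=1) / r.sum(axis=1)

print(np.sort(mu))  # means recovered near -3 and 3 despite the outliers
```

A Gaussian mixture fit to the same data would drag its means and inflate its variances toward the three outliers; the Gamma weights `u` are what keep the t-mixture estimates stable.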

Original language: English
Pages (from-to): 1187-1199
Number of pages: 13
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 29
Issue number: 4
DOIs
State: Published - Apr 2018

Keywords

  • Bayesian inference
  • feature selection
  • robust clustering
  • tree-structured variational Bayes (VB)
