Probabilistic inference of Bayesian neural networks with generalized expectation propagation

Jing Zhao, Xiao Liu, Shaojie He, Shiliang Sun

Research output: Contribution to journal › Article › peer-review

18 Scopus citations

Abstract

Deep learning plays an important role in the field of machine learning. However, deterministic methods such as standard neural networks cannot capture model uncertainty. Bayesian neural networks (BNNs) have recently attracted attention because Bayesian models provide a theoretical framework for inferring model uncertainty. Since an analytical solution for BNNs is often intractable, an effective and efficient approximate inference method is essential for model training and prediction. Generalized expectation propagation (GEP) was recently proposed as a powerful approximate inference method based on minimizing the Kullback–Leibler (KL) divergence between the true posterior and the approximate distribution. In this paper, we instantiate GEP to provide an effective and efficient approximate inference method for BNNs. We evaluate this method on BNNs, including fully connected and convolutional neural networks, on multiple benchmark datasets, and show better performance than several state-of-the-art approximate inference methods.
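The KL-minimization view mentioned in the abstract can be made concrete with a toy example: when the approximating family is Gaussian, minimizing KL(p‖q) over q reduces to matching the mean and variance of p, which is the moment-matching step at the heart of EP-style methods. The sketch below is only an illustration of that idea, not the paper's GEP algorithm; the one-weight model, logistic factor, and grid constants are made-up assumptions. It numerically computes the moments of a non-Gaussian posterior formed by a Gaussian prior times a single logistic "likelihood" factor, yielding the KL-optimal Gaussian approximation.

```python
import numpy as np


def sigmoid(z):
    # Logistic function, used here as a single likelihood factor.
    return 1.0 / (1.0 + np.exp(-z))


# Dense uniform grid over the single weight w; uniform spacing lets us
# integrate with simple Riemann sums (no quadrature library needed).
w = np.linspace(-10.0, 10.0, 20001)
dw = w[1] - w[0]

# Unnormalized target: N(w; 0, 1) prior times one logistic factor.
# This posterior is non-Gaussian, so an approximation is needed.
unnorm = np.exp(-0.5 * w**2) * sigmoid(3.0 * w)
Z = unnorm.sum() * dw          # normalizing constant
p = unnorm / Z                 # normalized target density on the grid

# Moment matching: over Gaussians q, argmin_q KL(p || q) is the Gaussian
# with the same mean and variance as p.
mean = (w * p).sum() * dw              # E_p[w]
var = ((w - mean) ** 2 * p).sum() * dw  # Var_p[w]

# q = N(mean, var) is the KL(p||q)-optimal Gaussian approximation.
print(f"q = N({mean:.3f}, {var:.3f})")
```

The logistic factor pulls probability mass toward positive w, so the matched mean is positive and the matched variance shrinks below the prior variance of 1; EP-style methods repeat this moment-matching step factor by factor on each tilted distribution.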

Original language: English
Pages (from-to): 392-398
Number of pages: 7
Journal: Neurocomputing
Volume: 412
DOIs
State: Published - 28 Oct 2020

Keywords

  • Approximate inference
  • Bayesian neural networks
  • Generalized expectation propagation
