Approximate Bayesian Inference

Extremely popular for statistical inference, Bayesian methods are also becoming popular in machine learning and artificial intelligence problems. Bayesian estimators are often implemented by Monte Carlo methods, such as the Metropolis-Hastings algorithm or the Gibbs sampler. These algorithms target...
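The abstract above mentions the Metropolis-Hastings algorithm as a standard way to implement Bayesian estimators. As a minimal illustrative sketch (not taken from the book), a random-walk Metropolis-Hastings sampler targeting a standard normal distribution might look like this:

```python
import math
import random

def metropolis_hastings(log_target, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis-Hastings.

    Draws approximate samples from the distribution whose (unnormalized)
    log-density is `log_target`, using a symmetric Gaussian proposal.
    """
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)
        # Accept with probability min(1, target(proposal) / target(x));
        # comparing logs avoids overflow and the normalizing constant cancels.
        if math.log(rng.random()) < log_target(proposal) - log_target(x):
            x = proposal
        samples.append(x)
    return samples

# Target: standard normal, log-density up to an additive constant.
samples = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000)
mean = sum(samples) / len(samples)
```

This is exactly the kind of exact-posterior sampler the abstract contrasts with approximate methods: each iteration needs a fresh evaluation of the target, which is what becomes prohibitive on large datasets.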


Bibliographic Details
Other Authors: Alquier, Pierre (Editor)
Format: Electronic Book Chapter
Language:English
Published: Basel: MDPI - Multidisciplinary Digital Publishing Institute, 2022
Subjects:
Online Access:DOAB: download the publication
DOAB: description of the publication

MARC

LEADER 00000naaaa2200000uu 4500
001 doab_20_500_12854_84560
005 20220621
003 oapen
006 m o d
007 cr|mn|---annan
008 20220621s2022 xx |||||o ||| 0|eng d
020 |a books978-3-0365-3790-0 
020 |a 9783036537894 
020 |a 9783036537900 
040 |a oapen  |c oapen 
024 7 |a 10.3390/books978-3-0365-3790-0  |c doi 
041 0 |a eng 
042 |a dc 
072 7 |a GP  |2 bicssc 
072 7 |a P  |2 bicssc 
100 1 |a Alquier, Pierre  |4 edt 
700 1 |a Alquier, Pierre  |4 oth 
245 1 0 |a Approximate Bayesian Inference 
260 |a Basel  |b MDPI - Multidisciplinary Digital Publishing Institute  |c 2022 
300 |a 1 electronic resource (508 p.) 
336 |a text  |b txt  |2 rdacontent 
337 |a computer  |b c  |2 rdamedia 
338 |a online resource  |b cr  |2 rdacarrier 
506 0 |a Open Access  |2 star  |f Unrestricted online access 
520 |a Extremely popular for statistical inference, Bayesian methods are also becoming popular in machine learning and artificial intelligence problems. Bayesian estimators are often implemented by Monte Carlo methods, such as the Metropolis-Hastings algorithm or the Gibbs sampler. These algorithms target the exact posterior distribution. However, many of the modern models in statistics are simply too complex to use such methodologies. In machine learning, the volume of the data used in practice makes Monte Carlo methods too slow to be useful. On the other hand, these applications often do not require an exact knowledge of the posterior. This has motivated the development of a new generation of algorithms that are fast enough to handle huge datasets but that often target an approximation of the posterior. This book gathers 18 research papers written by Approximate Bayesian Inference specialists and provides an overview of the recent advances in these algorithms. This includes optimization-based methods (such as variational approximations) and simulation-based methods (such as ABC or Monte Carlo algorithms). The theoretical aspects of Approximate Bayesian Inference are covered, specifically the PAC-Bayes bounds and regret analysis. Applications to challenging computational problems in astrophysics, finance, medical data analysis, and computer vision are also presented. 
540 |a Creative Commons  |f https://creativecommons.org/licenses/by/4.0/  |2 cc  |4 https://creativecommons.org/licenses/by/4.0/ 
546 |a English 
650 7 |a Research & information: general  |2 bicssc 
650 7 |a Mathematics & science  |2 bicssc 
653 |a bifurcation 
653 |a dynamical systems 
653 |a Edward-Sokal coupling 
653 |a mean-field 
653 |a Kullback-Leibler divergence 
653 |a variational inference 
653 |a Bayesian statistics 
653 |a machine learning 
653 |a variational approximations 
653 |a PAC-Bayes 
653 |a expectation-propagation 
653 |a Markov chain Monte Carlo 
653 |a Langevin Monte Carlo 
653 |a sequential Monte Carlo 
653 |a Laplace approximations 
653 |a approximate Bayesian computation 
653 |a Gibbs posterior 
653 |a MCMC 
653 |a stochastic gradients 
653 |a neural networks 
653 |a Approximate Bayesian Computation 
653 |a differential evolution 
653 |a Markov kernels 
653 |a discrete state space 
653 |a ergodicity 
653 |a Markov chain 
653 |a probably approximately correct 
653 |a variational Bayes 
653 |a Bayesian inference 
653 |a Markov Chain Monte Carlo 
653 |a Sequential Monte Carlo 
653 |a Riemann Manifold Hamiltonian Monte Carlo 
653 |a integrated nested Laplace approximation 
653 |a fixed-form variational Bayes 
653 |a stochastic volatility 
653 |a network modeling 
653 |a network variability 
653 |a Stiefel manifold 
653 |a MCMC-SAEM 
653 |a data imputation 
653 |a Bethe free energy 
653 |a factor graphs 
653 |a message passing 
653 |a variational free energy 
653 |a variational message passing 
653 |a approximate Bayesian computation (ABC) 
653 |a differential privacy (DP) 
653 |a sparse vector technique (SVT) 
653 |a Gaussian 
653 |a particle flow 
653 |a variable flow 
653 |a Langevin dynamics 
653 |a Hamilton Monte Carlo 
653 |a non-reversible dynamics 
653 |a control variates 
653 |a thinning 
653 |a meta-learning 
653 |a hyperparameters 
653 |a priors 
653 |a online learning 
653 |a online optimization 
653 |a gradient descent 
653 |a statistical learning theory 
653 |a PAC-Bayes theory 
653 |a deep learning 
653 |a generalisation bounds 
653 |a Bayesian sampling 
653 |a Monte Carlo integration 
653 |a no free lunch theorems 
653 |a sequential learning 
653 |a principal curves 
653 |a data streams 
653 |a regret bounds 
653 |a greedy algorithm 
653 |a sleeping experts 
653 |a entropy 
653 |a robustness 
653 |a statistical mechanics 
653 |a complex systems 
856 4 0 |a www.oapen.org  |u https://mdpi.com/books/pdfview/book/5544  |7 0  |z DOAB: download the publication 
856 4 0 |a www.oapen.org  |u https://directory.doabooks.org/handle/20.500.12854/84560  |7 0  |z DOAB: description of the publication