Information Bottleneck Theory and Applications in Deep Learning

The celebrated information bottleneck (IB) principle of Tishby et al. has recently enjoyed renewed attention due to its application in the area of deep learning. This collection investigates the IB principle in this new context. The individual chapters in this collection: • provide novel insights into the functional properties of the IB; • discuss the IB principle (and its derivates) as an objective for training multi-layer machine learning structures such as neural networks and decision trees; and • offer a new perspective on neural network learning via the lens of the IB framework. Our collection thus contributes to a better understanding of the IB principle specifically for deep learning and, more generally, of information-theoretic cost functions in machine learning. This paves the way toward explainable artificial intelligence.


Bibliographic Details
Other Authors: Geiger, Bernhard (Editor), Kubin, Gernot (Editor)
Format: Electronic Book Chapter
Language: English
Published: Basel, Switzerland: MDPI - Multidisciplinary Digital Publishing Institute, 2021
Subjects:
Online Access: DOAB: download the publication
DOAB: description of the publication

MARC

LEADER 00000naaaa2200000uu 4500
001 doab_20_500_12854_76429
005 20220111
003 oapen
006 m o d
007 cr|mn|---annan
008 20220111s2021 xx |||||o ||| 0|eng d
020 |a books978-3-0365-0803-0 
020 |a 9783036508023 
020 |a 9783036508030 
040 |a oapen  |c oapen 
024 7 |a 10.3390/books978-3-0365-0803-0  |c doi 
041 0 |a eng 
042 |a dc 
072 7 |a KNTX  |2 bicssc 
100 1 |a Geiger, Bernhard  |4 edt 
700 1 |a Kubin, Gernot  |4 edt 
700 1 |a Geiger, Bernhard  |4 oth 
700 1 |a Kubin, Gernot  |4 oth 
245 1 0 |a Information Bottleneck  |b Theory and Applications in Deep Learning 
260 |a Basel, Switzerland  |b MDPI - Multidisciplinary Digital Publishing Institute  |c 2021 
300 |a 1 electronic resource (274 p.) 
336 |a text  |b txt  |2 rdacontent 
337 |a computer  |b c  |2 rdamedia 
338 |a online resource  |b cr  |2 rdacarrier 
506 0 |a Open Access  |2 star  |f Unrestricted online access 
520 |a The celebrated information bottleneck (IB) principle of Tishby et al. has recently enjoyed renewed attention due to its application in the area of deep learning. This collection investigates the IB principle in this new context. The individual chapters in this collection: • provide novel insights into the functional properties of the IB; • discuss the IB principle (and its derivates) as an objective for training multi-layer machine learning structures such as neural networks and decision trees; and • offer a new perspective on neural network learning via the lens of the IB framework. Our collection thus contributes to a better understanding of the IB principle specifically for deep learning and, more generally, of information-theoretic cost functions in machine learning. This paves the way toward explainable artificial intelligence. 
540 |a Creative Commons  |f https://creativecommons.org/licenses/by/4.0/  |2 cc  |4 https://creativecommons.org/licenses/by/4.0/ 
546 |a English 
650 7 |a Information technology industries  |2 bicssc 
653 |a information theory 
653 |a variational inference 
653 |a machine learning 
653 |a learnability 
653 |a information bottleneck 
653 |a representation learning 
653 |a conspicuous subset 
653 |a stochastic neural networks 
653 |a mutual information 
653 |a neural networks 
653 |a information 
653 |a bottleneck 
653 |a compression 
653 |a classification 
653 |a optimization 
653 |a classifier 
653 |a decision tree 
653 |a ensemble 
653 |a deep neural networks 
653 |a regularization methods 
653 |a information bottleneck principle 
653 |a deep networks 
653 |a semi-supervised classification 
653 |a latent space representation 
653 |a hand crafted priors 
653 |a learnable priors 
653 |a regularization 
653 |a deep learning 
856 4 0 |a www.oapen.org  |u https://mdpi.com/books/pdfview/book/3864  |7 0  |z DOAB: download the publication 
856 4 0 |a www.oapen.org  |u https://directory.doabooks.org/handle/20.500.12854/76429  |7 0  |z DOAB: description of the publication