Divergence Measures Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems


Bibliographic Details
Other Authors: Sason, Igal (Editor)
Format: Electronic Book Chapter
Language: English
Published: Basel MDPI - Multidisciplinary Digital Publishing Institute 2022
Subjects:
Online Access: DOAB: download the publication
DOAB: description of the publication

MARC

LEADER 00000naaaa2200000uu 4500
001 doab_20_500_12854_84568
005 20220621
003 oapen
006 m o d
007 cr|mn|---annan
008 20220621s2022 xx |||||o ||| 0|eng d
020 |a books978-3-0365-4331-4 
020 |a 9783036543321 
020 |a 9783036543314 
040 |a oapen  |c oapen 
024 7 |a 10.3390/books978-3-0365-4331-4  |c doi 
041 0 |a eng 
042 |a dc 
072 7 |a GP  |2 bicssc 
072 7 |a P  |2 bicssc 
100 1 |a Sason, Igal  |4 edt 
700 1 |a Sason, Igal  |4 oth 
245 1 0 |a Divergence Measures  |b Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems 
260 |a Basel  |b MDPI - Multidisciplinary Digital Publishing Institute  |c 2022 
300 |a 1 electronic resource (256 p.) 
336 |a text  |b txt  |2 rdacontent 
337 |a computer  |b c  |2 rdamedia 
338 |a online resource  |b cr  |2 rdacarrier 
506 0 |a Open Access  |2 star  |f Unrestricted online access 
520 |a Data science, information theory, probability theory, statistical learning and other related disciplines greatly benefit from non-negative measures of dissimilarity between pairs of probability measures. These are known as divergence measures, and exploring their mathematical foundations and diverse applications is of significant interest. The present Special Issue, entitled "Divergence Measures: Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems", includes eight original contributions, and it is focused on the study of the mathematical properties and applications of classical and generalized divergence measures from an information-theoretic perspective. It mainly deals with two key generalizations of the relative entropy: namely, the Rényi divergence and the important class of f-divergences. It is our hope that the readers will find interest in this Special Issue, which will stimulate further research in the study of the mathematical foundations and applications of divergence measures. 
540 |a Creative Commons  |f https://creativecommons.org/licenses/by/4.0/  |2 cc  |4 https://creativecommons.org/licenses/by/4.0/ 
546 |a English 
650 7 |a Research & information: general  |2 bicssc 
650 7 |a Mathematics & science  |2 bicssc 
653 |a Bregman divergence 
653 |a f-divergence 
653 |a Jensen-Bregman divergence 
653 |a Jensen diversity 
653 |a Jensen-Shannon divergence 
653 |a capacitory discrimination 
653 |a Jensen-Shannon centroid 
653 |a mixture family 
653 |a information geometry 
653 |a difference of convex (DC) programming 
653 |a conditional Rényi divergence 
653 |a horse betting 
653 |a Kelly gambling 
653 |a Rényi divergence 
653 |a Rényi mutual information 
653 |a relative entropy 
653 |a chi-squared divergence 
653 |a f-divergences 
653 |a method of types 
653 |a large deviations 
653 |a strong data-processing inequalities 
653 |a information contraction 
653 |a maximal correlation 
653 |a Markov chains 
653 |a information inequalities 
653 |a mutual information 
653 |a Rényi entropy 
653 |a Carlson-Levin inequality 
653 |a information measures 
653 |a hypothesis testing 
653 |a total variation 
653 |a skew-divergence 
653 |a convexity 
653 |a Pinsker's inequality 
653 |a Bayes risk 
653 |a statistical divergences 
653 |a minimum divergence estimator 
653 |a maximum likelihood 
653 |a bootstrap 
653 |a conditional limit theorem 
653 |a Bahadur efficiency 
653 |a α-mutual information 
653 |a Augustin-Csiszár mutual information 
653 |a data transmission 
653 |a error exponents 
653 |a dimensionality reduction 
653 |a discriminant analysis 
653 |a statistical inference 
856 4 0 |a www.oapen.org  |u https://mdpi.com/books/pdfview/book/5550  |7 0  |z DOAB: download the publication 
856 4 0 |a www.oapen.org  |u https://directory.doabooks.org/handle/20.500.12854/84568  |7 0  |z DOAB: description of the publication