Information and Divergence Measures

The concept of distance is important for establishing the degree of similarity and/or closeness between functions, populations, or distributions. As a result, distances are related to inferential statistics, including problems related to both estimation and hypothesis testing, as well as modelling with applications in regression analysis, multivariate analysis, actuarial science, portfolio optimization, survival analysis, reliability theory, and many other areas.


Bibliographic Details
Other Authors: Karagrigoriou, Alex (Editor), Makrides, Andreas (Editor)
Format: Electronic Book Chapter
Language: English
Published: MDPI - Multidisciplinary Digital Publishing Institute 2023
Subjects:
GOS
LPI
Online Access: DOAB: download the publication
DOAB: description of the publication

MARC

LEADER 00000naaaa2200000uu 4500
001 doab_20_500_12854_113909
005 20230911
003 oapen
006 m o d
007 cr|mn|---annan
008 20230911s2023 xx |||||o ||| 0|eng d
020 |a books978-3-0365-8387-7 
020 |a 9783036583860 
020 |a 9783036583877 
040 |a oapen  |c oapen 
024 7 |a 10.3390/books978-3-0365-8387-7  |c doi 
041 0 |a eng 
042 |a dc 
072 7 |a GP  |2 bicssc 
072 7 |a PH  |2 bicssc 
100 1 |a Karagrigoriou, Alex  |4 edt 
700 1 |a Makrides, Andreas  |4 edt 
700 1 |a Karagrigoriou, Alex  |4 oth 
700 1 |a Makrides, Andreas  |4 oth 
245 1 0 |a Information and Divergence Measures 
260 |b MDPI - Multidisciplinary Digital Publishing Institute  |c 2023 
300 |a 1 electronic resource (282 p.) 
336 |a text  |b txt  |2 rdacontent 
337 |a computer  |b c  |2 rdamedia 
338 |a online resource  |b cr  |2 rdacarrier 
506 0 |a Open Access  |2 star  |f Unrestricted online access 
520 |a The concept of distance is important for establishing the degree of similarity and/or closeness between functions, populations, or distributions. As a result, distances are related to inferential statistics, including problems related to both estimation and hypothesis testing, as well as modelling with applications in regression analysis, multivariate analysis, actuarial science, portfolio optimization, survival analysis, reliability theory, and many other areas. Thus, entropy and divergence measures are always a central concern for scientists, researchers, medical experts, engineers, industrial managers, computer experts, data analysts, and other professionals. This reprint focuses on recent developments in information and divergence measures and presents new theoretical issues as well as solutions to important practical problems and case studies illustrating the great applicability of these innovative techniques and methods. The contributions in this reprint highlight the diversity of topics in this scientific field. 
540 |a Creative Commons  |f https://creativecommons.org/licenses/by/4.0/  |2 cc  |4 https://creativecommons.org/licenses/by/4.0/ 
546 |a English 
650 7 |a Research & information: general  |2 bicssc 
650 7 |a Physics  |2 bicssc 
653 |a exponential family 
653 |a statistical divergence 
653 |a truncated exponential family 
653 |a truncated normal distributions 
653 |a double index divergence test statistic 
653 |a multivariate data analysis 
653 |a conditional independence 
653 |a cross tabulations 
653 |a extremal combinatorics 
653 |a graphs 
653 |a Han's inequality 
653 |a information inequalities 
653 |a polymatroid 
653 |a rank function 
653 |a set function 
653 |a Shearer's lemma 
653 |a submodularity 
653 |a empirical survival Jensen-Shannon divergence 
653 |a Kolmogorov-Smirnov two-sample test 
653 |a skew logistic distribution 
653 |a bi-logistic growth 
653 |a epidemic waves 
653 |a COVID-19 data 
653 |a Rényi's pseudodistance 
653 |a minimum Rényi's pseudodistance estimators 
653 |a restricted minimum Rényi's pseudodistance estimators 
653 |a Rao-type tests 
653 |a divergence-based tests 
653 |a Multivariate Cauchy distribution (MCD) 
653 |a Kullback-Leibler divergence (KLD) 
653 |a multiple power series 
653 |a Lauricella D-hypergeometric series 
653 |a statistical K-means 
653 |a academic evaluation 
653 |a statistical manifold 
653 |a clustering 
653 |a concomitants 
653 |a GOS 
653 |a FGM family 
653 |a Shannon entropy 
653 |a Tsallis entropy 
653 |a Awad entropy 
653 |a residual entropy 
653 |a past entropy 
653 |a Fisher-Tsallis information number 
653 |a Tsallis divergence 
653 |a bootstrap discrepancy comparison probability (BDCP) 
653 |a discrepancy comparison probability (DCP) 
653 |a likelihood ratio test (LRT) 
653 |a model selection 
653 |a p-value 
653 |a LPI 
653 |a radar waveform 
653 |a passive interception systems 
653 |a Kullback-Leibler divergence 
653 |a joint entropy 
653 |a Tsallis logarithm 
653 |a Kaniadakis logarithm 
653 |a weighted Tsallis divergence 
653 |a weighted Kaniadakis divergence 
653 |a geodesic 
653 |a Fisher information 
653 |a differential geometry 
653 |a transversality 
653 |a multivariate Gaussian 
653 |a moment condition models 
653 |a divergences 
653 |a robustness 
856 4 0 |a www.oapen.org  |u https://mdpi.com/books/pdfview/book/7750  |7 0  |z DOAB: download the publication 
856 4 0 |a www.oapen.org  |u https://directory.doabooks.org/handle/20.500.12854/113909  |7 0  |z DOAB: description of the publication