Generalization of entropy based divergence measures for symbolic sequence analysis

Saved in:
Bibliographic Details
Main Authors: Ré, Miguel Ángel, Azad, Rajeev K.
Format: article
Language: English
Published: 2021
Subjects: Entropic distance
Online Access: http://hdl.handle.net/11086/20315
https://doi.org/10.1371/journal.pone.0093532
Contributed by:
id I10-R141-11086-20315
record_format dspace
institution Universidad Nacional de Córdoba
institution_str I-10
repository_str R-141
collection Repositorio Digital Universitario (UNC)
language English
topic Entropic distance
spellingShingle Entropic distance
Ré, Miguel Ángel
Azad, Rajeev K.
Generalization of entropy based divergence measures for symbolic sequence analysis
topic_facet Entropic distance
description Entropy-based measures have been frequently used in symbolic sequence analysis. A symmetrized and smoothed form of Kullback-Leibler divergence or relative entropy, the Jensen-Shannon divergence (JSD), is of particular interest because of its sharing properties with families of other divergence measures and its interpretability in different domains, including statistical physics, information theory and mathematical statistics. The uniqueness and versatility of this measure arise from a number of attributes, including its generalization to any number of probability distributions and the association of weights with the distributions. Furthermore, its entropic formulation allows its generalization in different statistical frameworks, such as non-extensive Tsallis statistics and higher-order Markovian statistics. We revisit these generalizations and propose a new generalization of JSD in the integrated Tsallis and Markovian statistical framework. We show that this generalization can be interpreted in terms of mutual information. We also investigate the performance of different JSD generalizations in deconstructing chimeric DNA sequences assembled from bacterial genomes, including those of E. coli, S. enterica typhi, Y. pestis and H. influenzae. Our results show that the JSD generalizations bring more pronounced improvements when the sequences being compared are from phylogenetically proximal organisms, which are often difficult to distinguish because of their compositional similarity. While small but noticeable improvements were observed with the Tsallis statistical JSD generalization, relatively large improvements were observed with the Markovian generalization. Notably, the proposed Tsallis-Markovian generalization yielded still more pronounced improvements relative to the Tsallis and Markovian generalizations, specifically when the sequences being compared arose from phylogenetically proximal organisms.
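
The description above turns on the weighted JSD and its Tsallis generalization. As a minimal illustrative sketch (not the authors' implementation), the Python code below assumes the standard construction JSD_q = H_q(sum_i w_i p_i) - sum_i w_i H_q(p_i), where H_q is Shannon entropy for q = 1 and Tsallis entropy of order q otherwise; the function names, the DNA alphabet, the toy sequences and the length-proportional weights are all assumptions made for this example.

import math
from collections import Counter

ALPHABET = "ACGT"

def symbol_distribution(seq):
    # Empirical symbol probabilities of a sequence over the DNA alphabet.
    counts = Counter(seq)
    total = sum(counts[a] for a in ALPHABET)
    return [counts[a] / total for a in ALPHABET]

def shannon_entropy(p):
    # Shannon entropy in bits; 0 * log 0 terms are dropped.
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def tsallis_entropy(p, q):
    # Tsallis entropy of order q; recovers Shannon entropy as q -> 1.
    if abs(q - 1.0) < 1e-9:
        return shannon_entropy(p)
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

def jsd(dists, weights, q=1.0):
    # Weighted JSD: entropy of the weighted mixture minus the weighted
    # mean of the individual entropies. Works for any number of
    # distributions; q != 1 gives the Tsallis (non-extensive) variant.
    def h(p):
        return tsallis_entropy(p, q)
    mixture = [sum(w * p[k] for w, p in zip(weights, dists))
               for k in range(len(ALPHABET))]
    return h(mixture) - sum(w * h(p) for w, p in zip(weights, dists))

# Two toy segments compared under length-proportional weights.
s1, s2 = "ACGTACGTAACCGGTT", "GGGGCCCCGGGTACGG"
total = len(s1) + len(s2)
weights = [len(s1) / total, len(s2) / total]
dists = [symbol_distribution(s1), symbol_distribution(s2)]
print(jsd(dists, weights))         # standard (Shannon, q = 1) JSD
print(jsd(dists, weights, q=1.5))  # Tsallis generalization at q = 1.5

For the Markovian generalization mentioned in the description, the per-symbol distributions would be replaced by conditional distributions of each symbol given its preceding k-mer context, with the entropies taken accordingly; the integrated Tsallis-Markovian variant combines both modifications.
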
format article
author Ré, Miguel Ángel
Azad, Rajeev K.
author_facet Ré, Miguel Ángel
Azad, Rajeev K.
author_sort Ré, Miguel Ángel
title Generalization of entropy based divergence measures for symbolic sequence analysis
title_short Generalization of entropy based divergence measures for symbolic sequence analysis
title_full Generalization of entropy based divergence measures for symbolic sequence analysis
title_fullStr Generalization of entropy based divergence measures for symbolic sequence analysis
title_full_unstemmed Generalization of entropy based divergence measures for symbolic sequence analysis
title_sort generalization of entropy based divergence measures for symbolic sequence analysis
publishDate 2021
url http://hdl.handle.net/11086/20315
https://doi.org/10.1371/journal.pone.0093532
work_keys_str_mv AT remiguelangel generalizationofentropybaseddivergencemeasuresforsymbolicsequenceanalysis
AT azadrajeevk generalizationofentropybaseddivergencemeasuresforsymbolicsequenceanalysis
bdutipo_str Repositories
_version_ 1764820391911489536