Generalization of entropy based divergence measures for symbolic sequence analysis
Entropy-based measures have been used frequently in symbolic sequence analysis. A symmetrized and smoothed form of the Kullback-Leibler divergence, or relative entropy, the Jensen-Shannon divergence (JSD), is of particular interest because of its sharing properties with families of other divergence measu...
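The JSD described in the abstract can be sketched as follows. This is a minimal illustration of the standard definition (JSD(P, Q) = ½·KL(P‖M) + ½·KL(Q‖M), with mixture M = (P + Q)/2), not the paper's implementation; the function names and example distributions are ours:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P || Q) in nats.

    Assumes q[i] > 0 wherever p[i] > 0; terms with p[i] == 0
    contribute nothing by the convention 0·log(0) = 0.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def jensen_shannon_divergence(p, q):
    """JSD(P, Q): KL divergence symmetrized and smoothed via M = (P + Q)/2.

    Always finite, symmetric in its arguments, and bounded by log(2) in nats.
    """
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

# Example: two symbol distributions over the same three-letter alphabet
p = [0.5, 0.3, 0.2]
q = [0.2, 0.3, 0.5]
```

Smoothing with the mixture M is what keeps the JSD finite even when one distribution assigns zero probability to a symbol the other uses, which plain KL divergence does not tolerate.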
Saved in:
| Main Authors: | Ré, Miguel Ángel, Azad, Rajeev K. |
|---|---|
| Format: | article |
| Language: | English |
| Published: | 2021 |
| Subjects: | |
| Online Access: | http://hdl.handle.net/11086/20315 https://doi.org/10.1371/journal.pone.0093532 |
| Contributed by: | |
Similar Items
- Generalization of entropy based divergence measures for symbolic sequence analysis
  by: Ré, Miguel Ángel, et al.
  Published: (2021)
- Entropy for smart kids and their curious parents /
  by: Ben-Naim, Arieh, 1934-
  Published: (2019)
- A measure of dependence between discrete and continuous variables
  by: Ré, Miguel Ángel, et al.
  Published: (2025)
- General entropy-like uncertainty relations in finite dimensions
  by: Zozor, Steeve, et al.
  Published: (2014)
- Maximally entangled mixed states and conditional entropies
  by: Batle, Josep, et al.
  Published: (2005)