Generalization of Entropy Based Divergence Measures for Symbolic Sequence Analysis

Creation Information

Ré, Miguel A. & Azad, Rajeev K. April 11, 2014.

Context

This article is part of the collection entitled UNT Scholarly Works and was provided by the UNT College of Arts and Sciences to the UNT Digital Library, a digital repository hosted by the UNT Libraries.

Who

People and organizations associated with either the creation of this article or its content.

Authors

  • Ré, Miguel A.
  • Azad, Rajeev K.

Publisher

  • Public Library of Science (San Francisco, California)

Provided By

UNT College of Arts and Sciences

The UNT College of Arts and Sciences educates students in traditional liberal arts, performing arts, sciences, professional, and technical academic programs. In addition to its departments, the college includes academic centers, institutes, programs, and offices providing diverse courses of study.


What

Descriptive information to help identify this article.

Description

Article on the generalization of entropy based divergence measures for symbolic sequence analysis.

Physical Description

11 p.

Notes

Abstract: Entropy based measures have been frequently used in symbolic sequence analysis. A symmetrized and smoothed form of Kullback-Leibler divergence or relative entropy, the Jensen-Shannon divergence (JSD), is of particular interest because of the properties it shares with families of other divergence measures and its interpretability in different domains including statistical physics, information theory and mathematical statistics. The uniqueness and versatility of this measure arise from a number of attributes, including generalization to any number of probability distributions and association of weights with the distributions. Furthermore, its entropic formulation allows its generalization in different statistical frameworks, such as non-extensive Tsallis statistics and higher order Markovian statistics. We revisit these generalizations and propose a new generalization of JSD in the integrated Tsallis and Markovian statistical framework. We show that this generalization can be interpreted in terms of mutual information. We also investigate the performance of different JSD generalizations in deconstructing chimeric DNA sequences assembled from bacterial genomes including those of E. coli, S. enterica typhi, Y. pestis and H. influenzae. Our results show that the JSD generalizations bring in more pronounced improvements when the sequences being compared are from phylogenetically proximal organisms, which are often difficult to distinguish because of their compositional similarity. While small but noticeable improvements were observed with the Tsallis statistical JSD generalization, relatively large improvements were observed with the Markovian generalization. Furthermore, the proposed Tsallis-Markovian generalization yielded even more pronounced improvements relative to the Tsallis and Markovian generalizations, specifically when the sequences being compared arose from phylogenetically proximal organisms.
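
To make the entropic formulation concrete, the following is a minimal Python sketch (not code from the article) of the weighted Jensen-Shannon divergence, JSD = H(sum_i w_i P_i) - sum_i w_i H(P_i), together with its Tsallis generalization obtained by swapping Shannon entropy for Tsallis entropy of order q. The nucleotide frequencies, the equal weights, and the value q = 1.5 are illustrative assumptions, not values reported by the authors.

    import numpy as np

    def shannon_entropy(p):
        # Shannon entropy in nats; zero-probability symbols contribute nothing.
        p = np.asarray(p, dtype=float)
        nz = p > 0
        return -np.sum(p[nz] * np.log(p[nz]))

    def tsallis_entropy(p, q):
        # Tsallis entropy of order q; recovers Shannon entropy as q -> 1.
        p = np.asarray(p, dtype=float)
        if np.isclose(q, 1.0):
            return shannon_entropy(p)
        return (1.0 - np.sum(p ** q)) / (q - 1.0)

    def jsd(dists, weights, entropy=shannon_entropy):
        # Weighted JSD over any number of distributions:
        #   JSD = H(sum_i w_i * P_i) - sum_i w_i * H(P_i)
        dists = [np.asarray(d, dtype=float) for d in dists]
        mixture = sum(w * d for w, d in zip(weights, dists))
        return entropy(mixture) - sum(w * entropy(d) for w, d in zip(weights, dists))

    # Illustrative nucleotide frequencies (A, C, G, T) for two sequence segments.
    p1 = [0.30, 0.20, 0.20, 0.30]
    p2 = [0.25, 0.25, 0.25, 0.25]
    w = [0.5, 0.5]

    print(jsd([p1, p2], w))                        # classical (Shannon) JSD
    print(jsd([p1, p2], w,                         # Tsallis JSD with q = 1.5
              entropy=lambda d: tsallis_entropy(d, 1.5)))

In a segmentation setting of the kind the abstract describes, such a divergence would typically be evaluated at every candidate split point of a sequence, with the maximizing position taken as a putative boundary; the Markovian generalizations replace the symbol frequencies above with higher order transition statistics.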

Source

  • PLoS One, 2014, San Francisco: Public Library of Science

Language

  • English

Item Type

  • Article

Identifier

Unique identifying numbers for this article in the Digital Library or other systems.

  • Archival Resource Key (ARK): ark:/67531/metadc335313

Publication Information

  • Publication Title: PLoS One
  • Volume: 9
  • Issue: 4
  • Pages: 11
  • Peer Reviewed: Yes

Collections

This article is part of the following collection of related materials.

UNT Scholarly Works

Materials from the UNT community's research, creative, and scholarly activities and UNT's Open Access Repository. Access to some items in this collection may be restricted.

When

Dates and time periods associated with this article.

Submitted Date

  • September 17, 2013

Accepted Date

  • March 4, 2014

Creation Date

  • April 11, 2014

Added to The UNT Digital Library

  • September 12, 2014

Citation

Ré, Miguel A. & Azad, Rajeev K. Generalization of Entropy Based Divergence Measures for Symbolic Sequence Analysis, article, April 11, 2014; [San Francisco, California]. (https://digital.library.unt.edu/ark:/67531/metadc335313/: accessed March 15, 2025), University of North Texas Libraries, UNT Digital Library, https://digital.library.unt.edu; crediting UNT College of Arts and Sciences.
