Identification of functional information subgraphs in cultured neural networks: Metadata

Metadata describes a digital item, providing (when known) information such as creator, publisher, contents, size, and relationships to other resources. Metadata may also contain "preservation" components that help maintain the integrity of digital files over time.

Title

  • Main Title Identification of functional information subgraphs in cultured neural networks

Creator

  • Author: Gintautas, Vadas
    Creator Type: Personal
    Creator Info: Los Alamos National Laboratory
  • Author: Bettencourt, Luis
    Creator Type: Personal
    Creator Info: Los Alamos National Laboratory
  • Author: Ham, Michael I.
    Creator Type: Personal
    Creator Info: University of North Texas

Publisher

  • Name: BioMed Central Ltd.
    Place of Publication: [London, United Kingdom]

Date

  • Creation: 2009-07-13

Language

  • English

Description

  • Content Description: This paper accompanies an oral presentation on the identification of functional information subgraphs in cultured neural networks.
  • Physical Description: 1 p.

Subject

  • Keyword: neuronal networks
  • Keyword: subgraphs
  • Keyword: nodes

Source

  • Conference: Eighteenth Annual Computational Neuroscience Meeting: CNS*2009, Berlin, Germany

Citation

  • Publication Title: BMC Neuroscience
  • Volume: 10
  • Issue: Suppl 1
  • Peer Reviewed: True

Collection

  • Name: UNT Scholarly Works
    Code: UNTSW

Institution

  • Name: UNT College of Arts and Sciences
    Code: UNTCAS

Rights

  • Rights Access: public

Resource Type

  • Paper

Format

  • Text

Identifier

  • DOI: 10.1186/1471-2202-10-S1-O12
  • Archival Resource Key: ark:/67531/metadc122146

Degree

  • Academic Department: Center for Network Neuroscience

Note

  • Display Note: Abstract: We present a general information-theoretic approach for identifying functional subgraphs in complex neuronal networks where the spiking dynamics of a subset of nodes (neurons) are observable. We show that the uncertainty in the state of each node can be written as a sum of information quantities involving a growing number of variables at other nodes. We demonstrate that each term in this sum is generated by successively conditioning mutual information on new measured variables, in a way analogous to a discrete differential calculus.
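
The decomposition described in this abstract is consistent with the standard chain rule relating entropy and conditional mutual information. The sketch below is illustrative only and uses notation chosen here rather than taken from the paper: X denotes the spiking state of a target node and Y_1, ..., Y_n the measured variables at other observed nodes.

    H(X) = \sum_{k=1}^{n} I(X; Y_k \mid Y_1, \ldots, Y_{k-1}) + H(X \mid Y_1, \ldots, Y_n)

Each successive term I(X; Y_k | Y_1, ..., Y_{k-1}) is obtained from the preceding one by conditioning on the newly measured variable Y_k, which is the sense in which the abstract likens the construction to a discrete differential calculus; under this reading, the nodes contributing the largest terms would form a candidate functional subgraph.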