Quantifying Information Modification in Developing Neural Networks via Partial Information Decomposition

2017 | Journal article (research paper). A publication affiliated with the University of Göttingen.


Cite this publication

Quantifying Information Modification in Developing Neural Networks via Partial Information Decomposition
Wibral, M.; Finn, C.; Wollstadt, P.; Lizier, J. & Priesemann, V. (2017)
Entropy, 19(9), art. 494. DOI: https://doi.org/10.3390/e19090494

Documents & Media

entropy-19-00494-v2.pdf (2.47 MB, Adobe PDF)

License

Published Version

Creative Commons Attribution 4.0 (CC BY 4.0)

Details

Authors
Wibral, Michael; Finn, Conor; Wollstadt, Patricia; Lizier, Joseph; Priesemann, Viola
Abstract
Information processing performed by any system can be conceptually decomposed into the transfer, storage and modification of information—an idea dating all the way back to the work of Alan Turing. However, formal information-theoretic definitions until very recently were only available for information transfer and storage, not for modification. This has changed with the extension of Shannon information theory via the decomposition of the mutual information between inputs to and the output of a process into unique, shared and synergistic contributions from the inputs, called a partial information decomposition (PID). The synergistic contribution in particular has been identified as the basis for a definition of information modification. Here we review the requirements for a functional definition of information modification in neuroscience, and apply a recently proposed measure of information modification to investigate the developmental trajectory of information modification in a culture of neurons in vitro, using partial information decomposition. We found that modification rose with maturation, but ultimately collapsed when redundant information among neurons took over. This indicates that this particular developing neural system initially developed intricate processing capabilities, but ultimately displayed information processing that was highly similar across neurons, possibly due to a lack of external inputs. We close by pointing out the enormous promise PID and the analysis of information modification hold for the understanding of neural systems.
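The decomposition mentioned in the abstract can be summarized as follows; this is a generic sketch of the two-input PID identity (in the spirit of Williams and Beer's framework), with notation chosen here for illustration rather than taken from the paper itself:

```latex
% Mutual information between the joint inputs (X_1, X_2) and the output S
% splits into four non-negative PID atoms:
I(S; X_1, X_2) = \mathrm{Unq}(S; X_1 \setminus X_2)
               + \mathrm{Unq}(S; X_2 \setminus X_1)
               + \mathrm{Shd}(S; X_1, X_2)
               + \mathrm{Syn}(S; X_1, X_2)
```

Here Unq denotes the unique contribution of each input, Shd the shared (redundant) contribution, and Syn the synergistic contribution, which the abstract identifies as the basis for the measure of information modification.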
Issue Date
2017
Journal
Entropy 
Organization
Bernstein Center for Computational Neuroscience Göttingen
ISSN
1099-4300
eISSN
1099-4300
Language
English
