NTRS - NASA Technical Reports Server

The Generalization of Mutual Information as the Information between a Set of Variables: The Information Correlation Function Hierarchy and the Information Structure of Multi-Agent Systems

The topic of this paper is a hierarchy of information-like functions, here named the information correlation functions, where each function of the hierarchy may be thought of as the information between the variables it depends upon. The information correlation functions are particularly suited to describing the emergence of complex behaviors arising from many-body or many-agent processes, and to quantifying how the information carried among a set of variables or agents decomposes over its subsets. In more graphical language, they provide the information-theoretic basis for understanding the synergistic and non-synergistic components of a system, and as such should serve as a powerful toolkit for analyzing the complexity structure of many-agent systems. The information correlation functions are the natural generalization, to an arbitrary number of sets of variables, of the sequence that begins with the entropy function (one set of variables) and the mutual information function (two sets). We start by describing the traditional measures of information: entropy and mutual information.
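The first two members of the hierarchy named in the abstract, entropy (one set of variables) and mutual information (two sets), are standard quantities and can be sketched directly. The following is an illustrative Python snippet, not code from the paper; the example joint distribution and the helper names (`entropy`, `marginal`) are assumptions chosen for the demonstration.

```python
import math

def entropy(p):
    """Shannon entropy in bits of a distribution given as a dict
    mapping outcomes to probabilities; zero-probability terms are skipped."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def marginal(joint, axis):
    """Marginal distribution over one coordinate of a joint distribution
    whose outcomes are tuples."""
    m = {}
    for outcome, p in joint.items():
        m[outcome[axis]] = m.get(outcome[axis], 0.0) + p
    return m

# Illustrative joint distribution of two correlated binary variables X and Y.
joint = {
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.4,
}

h_xy = entropy(joint)            # joint entropy H(X, Y)
h_x = entropy(marginal(joint, 0))  # H(X)
h_y = entropy(marginal(joint, 1))  # H(Y)

# Mutual information: I(X; Y) = H(X) + H(Y) - H(X, Y),
# the amount of information shared between the two variables.
mi = h_x + h_y - h_xy
```

For this distribution each marginal is uniform (H(X) = H(Y) = 1 bit), while the joint entropy is below 2 bits, so the mutual information is strictly positive, reflecting the correlation between X and Y. The hierarchy discussed in the paper extends this pattern to the information shared among an arbitrary number of sets of variables.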
Document ID
20040053469
Acquisition Source
Ames Research Center
Document Type
Other
Authors
Wolf, David R.
(NASA Ames Research Center, Moffett Field, CA, United States)
Date Acquired
September 7, 2013
Publication Date
January 1, 2004
Subject Category
Numerical Analysis
Funding Number(s)
CONTRACT_GRANT: NAS2-14217
Distribution Limits
Public
Copyright
Work of the US Gov. Public Use Permitted.