Transfer entropy as a log-likelihood ratio
journal contribution
posted on 2023-07-24, 09:40, authored by Lionel Barnett, Terry Bossomaier

Transfer entropy, an information-theoretic measure of time-directed information transfer between joint processes, has steadily gained popularity in the analysis of complex stochastic dynamics in diverse fields, including the neurosciences, ecology, climatology, and econometrics. We show that for a broad class of predictive models, the log-likelihood ratio test statistic for the null hypothesis of zero transfer entropy is a consistent estimator for the transfer entropy itself. For finite Markov chains, furthermore, no explicit model is required. In the general case, an asymptotic χ2 distribution is established for the transfer entropy estimator. The result generalizes the equivalence in the Gaussian case of transfer entropy and Granger causality, a statistical notion of causal influence based on prediction via vector autoregression, and establishes a fundamental connection between directed information transfer and causality in the Wiener-Granger sense. © 2012 American Physical Society.
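For the finite-Markov-chain case the abstract mentions, transfer entropy can be estimated directly from empirical frequencies, and the log-likelihood ratio statistic is then 2N times that estimate. The sketch below illustrates this for two binary sequences with history length 1; the toy coupling, variable names, and parameter choices are illustrative assumptions, not taken from the paper.

```python
# Plug-in transfer entropy TE(X -> Y) for binary sequences, history length k = l = 1.
# A minimal sketch of the model-free finite-Markov-chain estimator; the coupled
# toy process below is a hypothetical example, not the paper's data.
import math
import random
from collections import Counter

def transfer_entropy(x, y):
    """Estimate TE(X -> Y) in nats from empirical frequencies."""
    n = len(y) - 1
    triples = Counter((y[t + 1], y[t], x[t]) for t in range(n))
    pairs_yx = Counter((y[t], x[t]) for t in range(n))
    pairs_yy = Counter((y[t + 1], y[t]) for t in range(n))
    singles = Counter(y[t] for t in range(n))
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n                                   # p(y1, y0, x0)
        p_cond_full = c / pairs_yx[(y0, x0)]              # p(y1 | y0, x0)
        p_cond_self = pairs_yy[(y1, y0)] / singles[y0]    # p(y1 | y0)
        te += p_joint * math.log(p_cond_full / p_cond_self)
    return te

random.seed(0)
N = 20000
x = [random.randint(0, 1) for _ in range(N)]
# y copies x's previous value with 90% probability, so TE(X -> Y) > 0
y = [0] + [x[t - 1] if random.random() < 0.9 else 1 - x[t - 1]
           for t in range(1, N)]

te = transfer_entropy(x, y)
# Log-likelihood ratio statistic; asymptotically chi^2 under the null TE = 0
lr_stat = 2 * (N - 1) * te
print(f"TE estimate: {te:.4f} nats, LR statistic: {lr_stat:.1f}")
```

Under the null of zero coupling, `lr_stat` would follow a χ2 distribution with degrees of freedom set by the extra parameters of the full model; the large value here reflects the built-in coupling from X to Y.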
History
Publication status
- Published
File Version
- Accepted version
Journal
Physical Review Letters
ISSN
0031-9007
Publisher
American Physical Society (APS)
Publisher URL
External DOI
Issue
13
Volume
109
Article number
138105
Department affiliated with
- Informatics Publications
Peer reviewed?
- Yes