
Transfer entropy as a log-likelihood ratio

journal contribution
posted on 2023-07-24, 09:40 authored by Lionel Barnett, Terry Bossomaier
Transfer entropy, an information-theoretic measure of time-directed information transfer between joint processes, has steadily gained popularity in the analysis of complex stochastic dynamics in diverse fields, including the neurosciences, ecology, climatology, and econometrics. We show that for a broad class of predictive models, the log-likelihood ratio test statistic for the null hypothesis of zero transfer entropy is a consistent estimator for the transfer entropy itself. For finite Markov chains, furthermore, no explicit model is required. In the general case, an asymptotic χ² distribution is established for the transfer entropy estimator. The result generalizes the equivalence in the Gaussian case of transfer entropy and Granger causality, a statistical notion of causal influence based on prediction via vector autoregression, and establishes a fundamental connection between directed information transfer and causality in the Wiener-Granger sense. © 2012 American Physical Society.
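As a rough illustration of the Gaussian case described in the abstract, the sketch below estimates transfer entropy from y to x as the scaled log-likelihood ratio of a restricted autoregression (past of x only) against a full one (past of x and y), and applies the asymptotic χ² test of the zero-transfer-entropy null. This is not code from the paper: the function name, single-lag convention, and toy data are illustrative assumptions.

    import numpy as np
    from scipy import stats

    def transfer_entropy_gaussian(x, y, p=1):
        """Gaussian (linear VAR) transfer entropy from y to x, estimated as half
        the log ratio of restricted to full residual variances, i.e. the scaled
        log-likelihood ratio of two nested regressions. Illustrative sketch only."""
        n = len(x)
        # Lagged regressors: past of x alone (restricted) vs past of x and y (full).
        X_past = np.column_stack([x[p - k - 1:n - k - 1] for k in range(p)])
        Y_past = np.column_stack([y[p - k - 1:n - k - 1] for k in range(p)])
        target = x[p:]

        def residual_variance(design):
            # Ordinary least squares with an intercept; return the ML residual variance.
            design = np.column_stack([np.ones(len(target)), design])
            beta, *_ = np.linalg.lstsq(design, target, rcond=None)
            resid = target - design @ beta
            return np.mean(resid ** 2)

        var_restricted = residual_variance(X_past)                     # x past only
        var_full = residual_variance(np.hstack([X_past, Y_past]))      # x and y past
        te = 0.5 * np.log(var_restricted / var_full)                   # nats

        # Under the null of zero transfer entropy, 2 * n_eff * TE_hat is
        # asymptotically chi-squared with p degrees of freedom (one per y lag).
        n_eff = len(target)
        lr_stat = 2.0 * n_eff * te
        p_value = stats.chi2.sf(lr_stat, df=p)
        return te, lr_stat, p_value

    # Toy usage: y drives x with a one-step lag, so the null should be rejected.
    rng = np.random.default_rng(0)
    T = 5000
    y = rng.standard_normal(T)
    x = np.zeros(T)
    for t in range(1, T):
        x[t] = 0.5 * x[t - 1] + 0.4 * y[t - 1] + rng.standard_normal()
    print(transfer_entropy_gaussian(x, y, p=1))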

History

Publication status

  • Published

File Version

  • Accepted version

Journal

Physical Review Letters

ISSN

0031-9007

Publisher

American Physical Society (APS)

Issue

13

Volume

109

Article number

138105

Department affiliated with

  • Informatics Publications

Peer reviewed?

  • Yes
