Causality detection based on information-theoretic approaches in time series analysis

Katerina Hlavackova-Schindler
Commission for Scientific Visualization, Austrian Academy of Sciences, Donau-City Str. 1, A-1220 Vienna, Austria

Milan Palus, Martin Vejmelka
Institute of Computer Science, Academy of Sciences of the Czech Republic
Pod vodárenskou věží 2, 182 07 Prague 8, Czech Republic
E-mail: mp@cs.cas.cz

Joydeep Bhattacharya
Commission for Scientific Visualization, Austrian Academy of Sciences, Donau-City Str. 1, A-1220 Vienna, Austria

Abstract:

Synchronization, a basic nonlinear phenomenon, is widely observed in the diverse complex systems studied in the physical, biological and other natural sciences, as well as in the social sciences, economics and finance. When studying such complex systems, it is important not only to detect synchronized states but also to identify causal relationships (i.e. who drives whom) between the (sub)systems concerned. Information-theoretic measures (e.g. mutual information, conditional entropy) are essential for analysing the information flow between two systems, or between the constituent subsystems of a complex system. However, estimating these measures from a finite set of samples is not trivial. The extensive current literature on entropy and mutual information estimation provides a wide variety of approaches: from statistical approximation methods, which study the rate of convergence or consistency of an estimator for a general distribution, through learning algorithms operating on a partitioned data space, to heuristic approaches. The aim of this paper is to provide a detailed overview of information-theoretic approaches for measuring causal influence in multivariate time series, and to focus on the diverse approaches to entropy and mutual information estimation.
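To illustrate why estimating these measures from finite samples is non-trivial, the following sketch (not from the paper; a minimal assumption-laden example in Python/numpy) implements the simplest "plug-in" estimator of mutual information: partition the plane into equal-width bins and insert the empirical bin frequencies into the definition of I(X;Y). Even for independent data it returns a positive value, because the finite-sample bias of the plug-in estimator does not vanish.

```python
import numpy as np

def plugin_mutual_information(x, y, bins=8):
    """Naive plug-in (histogram) estimate of I(X;Y) in nats.

    Partitions the sample space into equal-width bins and plugs the
    empirical frequencies into I(X;Y) = sum p(x,y) log[p(x,y)/(p(x)p(y))].
    The estimate is biased upward for finite samples, which is one reason
    entropy/mutual-information estimation is an active research topic.
    """
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p_xy = joint / joint.sum()              # empirical joint distribution
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal of X, shape (bins, 1)
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal of Y, shape (1, bins)
    mask = p_xy > 0                          # avoid log(0) on empty bins
    return float(np.sum(p_xy[mask] * np.log(p_xy[mask] / (p_x @ p_y)[mask])))

rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y_dep = x + 0.1 * rng.normal(size=5000)   # strongly dependent on x
y_ind = rng.normal(size=5000)             # independent of x

mi_dep = plugin_mutual_information(x, y_dep)
mi_ind = plugin_mutual_information(x, y_ind)   # small but nonzero: finite-sample bias
```

The dependent pair yields a large estimate while the independent pair yields a small but strictly positive one; reducing that residual bias (or trading it against variance) is precisely what the estimation approaches surveyed in the paper address.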



Physics Reports 441 (1) (2007) 1-46 doi:10.1016/j.physrep.2006.12.004


