
Entropy rates

Entropy rates will be considered as a tool for the quantitative characterization of dynamical processes evolving in time. Let $\{x_i\}$ be a time series, i.e., a series of measurements done on a system in consecutive instants of time $t = i\Delta t$, $i = 1, 2, \dots$. The time series $\{x_i\}$ can be considered as a realization of a stochastic process $\{X_i\}$, characterized by the joint probability distribution function $p(x_1, \dots, x_n) = \Pr[X_1 = x_1, \dots, X_n = x_n]$. The entropy rate of $\{X_i\}$ is defined as [1]:

  h = \lim_{n \to \infty} \frac{1}{n} H(X_1, X_2, \ldots, X_n)   (1)

where $H(X_1, \ldots, X_n)$ is the entropy of the joint distribution $p(x_1, \dots, x_n)$:

  H(X_1, \ldots, X_n) = - \sum_{x_1, \ldots, x_n} p(x_1, \ldots, x_n) \log p(x_1, \ldots, x_n)   (2)
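
In practice, for a series over a finite alphabet, the limit in (1) can be approximated by the plug-in estimate $H(X_1, \ldots, X_n)/n$ computed from empirical block frequencies at a fixed small $n$. The following Python sketch is only an illustration of this estimator (the function names and the block length are our choices, not part of the original text); it checks the estimate on an i.i.d. fair-coin series, whose entropy rate is $\log 2$:

    import numpy as np
    from collections import Counter

    def block_entropy(symbols, n):
        # Entropy (in nats) of the empirical distribution of overlapping n-blocks.
        blocks = [tuple(symbols[i:i + n]) for i in range(len(symbols) - n + 1)]
        counts = np.array(list(Counter(blocks).values()), dtype=float)
        p = counts / counts.sum()
        return -np.sum(p * np.log(p))

    def entropy_rate_plugin(symbols, n):
        # Plug-in approximation of h = lim_n H(X_1,...,X_n)/n at finite n.
        return block_entropy(symbols, n) / n

    # i.i.d. fair coin: the entropy rate is log 2 = 0.6931...
    rng = np.random.default_rng(0)
    x = rng.integers(0, 2, size=200000)
    print(entropy_rate_plugin(x, 5))

Such plug-in estimates are downward biased for short series and long blocks, which is one aspect of the reliability problem discussed below.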

Alternatively, the time series $\{x_i\}$ can be considered as a projection of a trajectory of a dynamical system evolving in some measurable state space. The entropy rate of a dynamical system, known as the Kolmogorov-Sinai entropy (KSE) [2, 3, 4], can again be defined by equation (1); the variables $X_i$, however, should be understood as $m$-dimensional variables, according to the dimensionality of the dynamical system [5]. If the dynamical system evolves in a continuous measure space, then any entropy depends on the partition chosen to discretize the space, and the KSE is defined as a supremum over all finite partitions [2, 3, 4].

The KSE is a metric invariant of dynamical systems, suitable for the classification of dynamical systems or their states, and it is related to the sum of the system's positive Lyapunov exponents (LE) by the theorem of Pesin [6].
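
As a concrete check of this relation, consider the fully chaotic logistic map $x_{n+1} = 4 x_n (1 - x_n)$: its single Lyapunov exponent, and hence by Pesin's theorem its KSE, equals $\log 2$. A minimal Python sketch (an illustration; the parameter values are our choices, not from the original text) estimates the exponent as the trajectory average of $\log |f'(x)|$:

    import numpy as np

    def lyapunov_logistic(r=4.0, x0=0.3, n_skip=1000, n_iter=100000):
        # Lyapunov exponent of f(x) = r x (1 - x), estimated as the
        # trajectory average of log |f'(x)| = log |r (1 - 2 x)|.
        x = x0
        for _ in range(n_skip):      # discard the transient
            x = r * x * (1.0 - x)
        acc = 0.0
        for _ in range(n_iter):
            acc += np.log(abs(r * (1.0 - 2.0 * x)))
            x = r * x * (1.0 - x)
        return acc / n_iter

    print(lyapunov_logistic())       # close to log 2 = 0.6931...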

A number of algorithms (see, e.g., [7, 8, 9, 10] and references therein) have been proposed for estimating the KSE from time series. The reliability of these estimates, however, is limited [11] by the available amount of data, the finite precision of measurements, and the noise always present in experimental data. No general approach to estimating the entropy rates of stochastic processes has been established, except for simple cases such as finite-state Markov chains [1]. However, if $\{X_i\}$ is a zero-mean stationary Gaussian process with spectral density function $\Phi(\omega)$, its entropy rate $h$, apart from a constant term, can be expressed using $\Phi(\omega)$ as [12, 13, 14]:

  h = \frac{1}{4\pi} \int_{-\pi}^{\pi} \log \Phi(\omega) \, d\omega   (3)

The dynamics of a stationary Gaussian process is fully described by its spectrum; the connection (3) between the entropy rate of such a process and its spectral density $\Phi(\omega)$ is therefore understandable. The estimation of the entropy rate of a Gaussian process thus reduces to the estimation of its spectrum.
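
Formula (3) can be checked on a process with a known spectrum. The following Python sketch (an illustration under one particular normalization convention for $\Phi(\omega)$, not taken from the original text) evaluates the integral for a stationary AR(1) process $X_t = a X_{t-1} + e_t$; by the Szegő/Jensen identity $\int_{-\pi}^{\pi} \log |1 - a e^{-i\omega}|^2 \, d\omega = 0$ for $|a| < 1$, the exact value is $\frac{1}{2} \log(\sigma^2 / 2\pi)$:

    import numpy as np

    # AR(1) spectral density in the convention
    # Phi(w) = s2 / (2 pi |1 - a exp(-i w)|^2), with innovation variance s2.
    a, s2 = 0.9, 1.0
    w = np.linspace(-np.pi, np.pi, 400001)
    phi = s2 / (2.0 * np.pi * np.abs(1.0 - a * np.exp(-1j * w)) ** 2)

    # Eq. (3): h = (1/4 pi) * Int_{-pi}^{pi} log Phi(w) dw; the interval has
    # length 2 pi, so the integral is approximated by mean(log Phi) / 2.
    h_num = 0.5 * np.log(phi).mean()
    h_exact = 0.5 * np.log(s2 / (2.0 * np.pi))   # Szego/Jensen closed form
    print(h_num, h_exact)

In an application the analytic $\Phi(\omega)$ would be replaced by a spectrum estimated from data, so the accuracy of the entropy rate is set by the accuracy of the spectral estimate.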

If a studied time series was generated by a nonlinear, possibly chaotic, dynamical system, its description in terms of a spectral density is not sufficient. Indeed, realizations of isospectral Gaussian processes are used in surrogate-data based tests in order to discern nonlinear (possibly chaotic) processes from colored noises [15, 16] (a minimal construction of such surrogates is sketched after this paragraph). On the other hand, there are results indicating that some characteristic properties of nonlinear dynamical systems may be ``projected'' into their ``linear properties'', i.e., into spectra or, equivalently, into autocorrelation functions: Sigeti [17] has demonstrated that there may be a relation between the sum of the positive Lyapunov exponents (the KSE) of a chaotic dynamical system and the coefficient characterizing the exponential decay, at high frequencies, of spectra estimated from time series generated by that system. The asymptotic decay of the autocorrelation functions of such time series is governed by the second eigenvalue of the Perron-Frobenius operator of the dynamical system [18, 19]. Lipton & Dabke [20] have also investigated the asymptotic decay of spectra in relation to properties of the underlying dynamical systems.
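
A standard way to realize such an isospectral Gaussian null hypothesis is the Fourier-transform (phase-randomized) surrogate: the Fourier amplitudes of the series are kept while the phases are randomized, which preserves the sample spectrum and autocorrelation function but destroys any nonlinear structure. A minimal Python sketch (our illustration, not part of the original text):

    import numpy as np

    def ft_surrogate(x, rng):
        # Keep the Fourier amplitudes of x, randomize the phases; the
        # surrogate shares the sample spectrum (and autocorrelation) of x.
        n = len(x)
        amp = np.abs(np.fft.rfft(x))
        phases = rng.uniform(0.0, 2.0 * np.pi, size=amp.shape)
        phases[0] = 0.0              # keep the mean component real
        if n % 2 == 0:
            phases[-1] = 0.0         # keep the Nyquist component real
        return np.fft.irfft(amp * np.exp(1j * phases), n=n)

    rng = np.random.default_rng(1)
    x = np.sin(0.3 * np.arange(1024)) + 0.1 * rng.standard_normal(1024)
    s = ft_surrogate(x, rng)         # isospectral surrogate of x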


