Logofet D.O. Markov Chains as Succession Models: New Perspectives of the Classic Paradigm // Forest Science (Lesovedenie). 2010. No. 2. P. 46–59 (in Russian).
Markov chains, a simple kind of discrete-state random process, serve as a pertinent tool for formally describing the course of succession when the chain states are identified with certain succession stages and the scheme of transitions between these stages is known. Data on the duration of stages and on the likelihood of alternative transitions can be converted into estimates of the transition probability matrix. An inherent property of absorbing chains, namely convergence to a final stable distribution of states, corresponds to the classic paradigm of succession theory: the regular, successive movement from pioneer stages to the stable (poly)climax stage. The Markov model yields estimates of the climax attainability times for various initial states and, when several climax states exist, of the corresponding probability distribution among them. The modern view of the forest ecosystem steady state as a dynamic mosaic of newly formed and continually overgrowing gaps in the closed forest canopy, covering the full range of species and age compositions, fits well the formalism of nonabsorbing regular chains, which suggests estimating the relative area occupied by the various stages from the model steady-state vector. A new generation of Markov succession models, time-inhomogeneous Markov chains, introduces elements of causality into the purely phenomenological description typical of their homogeneous prototypes, thus responding to the challenge that climate change poses to models of long-term forest dynamics.
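
For reference, the quantities mentioned in the abstract can be written in the standard (Kemeny–Snell) notation for finite Markov chains; the notation is conventional and is not taken from the paper itself. With the transition matrix of an absorbing chain put in canonical form

\[ P = \begin{pmatrix} Q & R \\ 0 & I \end{pmatrix}, \]

where $Q$ collects transitions among the transient (pre-climax) stages and $R$ the transitions into the absorbing (climax) states, the fundamental matrix is $N = (I - Q)^{-1}$, the expected climax attainability times from the transient stages are $\tau = N\mathbf{1}$, and the probabilities of reaching each of several climax states form the matrix $B = NR$. For a nonabsorbing regular chain, the steady-state vector $\pi$ solves $\pi P = \pi$ with $\sum_i \pi_i = 1$, and its components estimate the relative areas occupied by the corresponding stages.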
Keywords: succession scheme, Markov behavior, stage duration, transition probability, transition matrix, absorbing state, fundamental matrix, attainability time, invariance, ergodicity, non-Markov effects.