2.6.3 Series
A series is a mathematical object consisting of a sequence of numbers, variables, or observation values. When observations describe equilibrium or "steady state," emergent phenomena familiar from physical reality, we often see series phenomena that are martingale. The martingale sequence property can be seen in systems reaching equilibrium in both the physical setting and the algorithmic learning setting.
A discrete-time martingale is a stochastic process in which a sequence of random variables {X_1, …, X_n} has conditional expected value of the next observation equal to the last observation: E(X_{n+1} | X_1, …, X_n) = X_n, where E(|X_n|) < ∞. Similarly, one sequence, say {Y_1, …, Y_n}, is said to be a martingale with respect to another, say {X_1, …, X_n}, if for all n: E(Y_{n+1} | X_1, …, X_n) = Y_n, where E(|Y_n|) < ∞. Examples of martingales are rife in gambling. For our purposes, the most critical example is likelihood-ratio testing in statistics, with test statistic the "likelihood ratio": Y_n = ∏_{i=1}^{n} g(X_i)/f(X_i), where f and g are the population densities considered for the data. If the better (actual) distribution is f, then Y_n is a martingale with respect to {X_n}: E(Y_{n+1} | X_1, …, X_n) = Y_n E[g(X_{n+1})/f(X_{n+1})] = Y_n, since the expectation of g(X)/f(X) under f is ∫ g(x) dx = 1. This scenario arises throughout the hidden Markov model (HMM) Viterbi derivation when local "sensors" are used, such as with profile-HMMs or position-dependent Markov models in the vicinity of transitions between states. It also arises in HMM Viterbi recognition of regions (versus transitions out of those regions), where length-martingale side information is shown explicitly in Chapter 7, providing a pathway for incorporating any martingale-series side information (this fits naturally with the clique-HMM generalizations described in Chapter 7). Given that the core ratio of cumulant probabilities employed is itself a martingale, this provides a means for incorporating side information in general (further details in Appendix C).
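The likelihood-ratio martingale is easy to check numerically. Below is a minimal simulation sketch (the densities are illustrative choices, not from the text): the true density is taken as f = N(0, 1) and the alternative as g = N(0.2, 1). Under data drawn from f, each factor g(X_i)/f(X_i) has expectation 1, so the running product Y_n keeps mean 1 at every n, even though individual paths decay toward zero.

```python
import numpy as np

# Sketch (assumed densities, not from the text): true density f = N(0,1),
# alternative g = N(0.2,1). With data drawn from f, the likelihood ratio
# Y_n = prod_{i=1}^n g(X_i)/f(X_i) is a martingale, so E(Y_n) = 1 for all n.

rng = np.random.default_rng(0)
MU_G = 0.2  # mean of g; f has mean 0 (both unit variance)

def log_ratio(x, mu_g=MU_G):
    # log[g(x)/f(x)] for unit-variance Gaussians differing only in mean:
    # -(x - mu_g)**2 / 2 + x**2 / 2 = mu_g*x - mu_g**2 / 2
    return mu_g * x - 0.5 * mu_g ** 2

n_paths, n_steps = 200_000, 50
x = rng.standard_normal((n_paths, n_steps))   # X_i drawn from f
Y = np.exp(np.cumsum(log_ratio(x), axis=1))   # Y_n along each path

# Martingale property: the mean of Y_n stays at 1 for every n (up to
# Monte Carlo error), even though E[log Y_n] = -n*KL(f||g) < 0, so
# individual paths drift toward zero.
for n in (1, 10, 50):
    print(f"n={n:2d}  mean Y_n = {Y[:, n - 1].mean():.3f}")
```

Accumulating the log-ratios and exponentiating at the end, as above, avoids numerical underflow over long sequences, which matters in the HMM setting where n can be large.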