Published on *Computer Science Department at Princeton University* (https://www.cs.princeton.edu)

Finite growth models (FGMs) are nonnegative functionals that arise from parametrically weighted directed acyclic graphs and a tuple observation that affects these weights. The weight of a source-sink path is the product of the weights along it. The functional's value is the sum of the weights of all such paths.
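The path-sum definition above can be evaluated without enumerating paths, by dynamic programming over the DAG. The sketch below is illustrative only: the graph encoding and the name `fgm_value` are this example's conventions, not the paper's notation.

```python
# Evaluate an FGM-style functional: the value is the sum, over all
# source-to-sink paths, of the product of edge weights along the path.
# Edges are given as {node: [(successor, weight), ...]} for a DAG.

def fgm_value(edges, source, sink):
    """Sum of path-weight products from source to sink, by memoized DP."""
    cache = {}

    def total(node):
        # Total weight of all paths from `node` to the sink.
        if node == sink:
            return 1.0
        if node not in cache:
            # Each outgoing edge contributes weight * (path mass beyond it).
            cache[node] = sum(w * total(t) for t, w in edges.get(node, ()))
        return cache[node]

    return total(source)

# Two paths: s->a->t (0.5 * 0.5) and s->t (0.25), so the value is 0.5.
edges = {"s": [("a", 0.5), ("t", 0.25)], "a": [("t", 0.5)]}
print(fgm_value(edges, "s", "t"))  # 0.5
```

Memoization makes the cost linear in the number of edges even when the number of source-sink paths is exponential.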

The mathematical foundations of hidden Markov modeling (HMM) and expectation maximization (EM) are generalized to address the problem of functional maximization given an observation. Probability models such as HMMs and stochastic context-free grammars are examples that satisfy a particular constraint: that of summing or integrating to one. The FGM framework, algorithms, and data structures describe these and other similar stochastic models while providing a unified and natural way for computer scientists to learn and reason about them and their many variations.

Restricted to probabilistic form, FGMs correspond to stochastic automata that allow observations to be processed in many orderings and groupings -- not just one-by-one in sequential order. As a result, the parameters of a highly general form of stochastic transducer can be learned from examples, and the particular case of string edit distance is developed.
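In the stochastic-transducer view of string edit distance, one sums the probabilities of all edit paths (substitutions, insertions, deletions) that transform one string into another; the classical distance is the special case where the sum becomes a minimum and products of probabilities become added costs. The sketch below uses illustrative hand-set edit probabilities, not parameters learned as in the report.

```python
# Forward DP over edit paths: f[i][j] accumulates the probability of all
# edit sequences that map x[:i] to y[:j]. Substitution, insertion, and
# deletion probabilities here are illustrative assumptions, not learned.

def string_probability(x, y, p_sub, p_ins, p_del):
    """Total probability of all edit paths transforming x into y."""
    n, m = len(x), len(y)
    f = [[0.0] * (m + 1) for _ in range(n + 1)]
    f[0][0] = 1.0
    for i in range(n + 1):
        for j in range(m + 1):
            if i < n:                        # delete x[i]
                f[i + 1][j] += f[i][j] * p_del
            if j < m:                        # insert y[j]
                f[i][j + 1] += f[i][j] * p_ins
            if i < n and j < m:              # substitute (or match) x[i] -> y[j]
                f[i + 1][j + 1] += f[i][j] * p_sub(x[i], y[j])
    return f[n][m]

# Illustrative parameters: matches are likely, mismatches and indels rare.
p = string_probability("abc", "abd",
                       p_sub=lambda a, b: 0.8 if a == b else 0.02,
                       p_ins=0.05, p_del=0.05)
```

Replacing `+=`/sum with `min` over negative log-probabilities recovers the familiar Levenshtein recurrence.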

In the FGM framework one may direct learning by criteria beyond simple maximum likelihood. Maximum a posteriori (MAP) and minimum description length (MDL) estimation are discussed, along with the application of FGMs to causal-context probability models and unnormalized noncausal models.

**Links**

[1] ftp://ftp.cs.princeton.edu/techreports/1996/533.ps.gz