Case Study: Mixture of Markov Models
18 December 2003

A colleague mentioned that he was interested in mixture models over Markov models: Given a set of sequences, S=[s_{0}, s_{1},...], over a finite alphabet, each sequence is believed to come from a Markov model, but it is not known how many different components (Markov models) are involved, nor what their parameters are. The [200309/] Haskell code for inductive inference includes a basic estimator for mixture models (unsupervised classification, intrinsic classification, clustering), so it seemed that it should be easy to create a prototype estimator for mixtures of Markov models. The first requirement is for a
weighted^{[*]} estimator
for a Markov model.
A Markov model of order one conditions each symbol on the previous one, so it can be built from a model of a finite function. A weighted estimator, estFiniteFunctionWeighted, for a finite functionmodel already exists.
It can be used by feeding it the inputs
[s_{0,0}, s_{0,1}, ..., s_{0,|s_0|-2}, s_{1,0}, ...],
the corresponding outputs
[s_{0,1}, s_{0,2}, ..., s_{0,|s_0|-1}, s_{1,1}, ...],
and the corresponding weights
[w_{0}, w_{0}, ..., w_{0}, w_{1}, ...],
where w_{i} is the weight of sequence s_{i}.

The finite functionmodel, ff, leads to a predictor function for a finite list functionmodel, flf. flf is converted into a timeseries model by functionModel2timeSeries and thence into a model of sequences by timeSeries2model.
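The shape of that training data can be sketched in a few lines. The function estCondDist below is only a hypothetical stand-in for estFiniteFunctionWeighted (the real 200309 estimator returns an MML functionmodel; here fractional counts are simply normalised), and trainingTriples shows how the weighted inputs, outputs and weights above are derived from the sequences:

```haskell
import qualified Data.Map as Map
import Data.Map (Map)

-- Hypothetical stand-in for estFiniteFunctionWeighted: estimate
-- P(next | previous) from weighted (input, output, weight) triples
-- by normalising fractional counts.  Only the shape of the training
-- data is the point here, not the MML coding of the real estimator.
estCondDist :: Ord a => [(a, a, Double)] -> Map a (Map a Double)
estCondDist triples = Map.map normalise counts
  where
    counts = Map.fromListWith (Map.unionWith (+))
               [ (i, Map.singleton o w) | (i, o, w) <- triples ]
    normalise m = Map.map (/ sum (Map.elems m)) m

-- Each weighted sequence (s_i, w_i) contributes the input-output
-- pairs (s_{i,j}, s_{i,j+1}) for j = 0 .. |s_i|-2, every pair
-- carrying the sequence's weight w_i.
trainingTriples :: [([a], Double)] -> [(a, a, Double)]
trainingTriples seqs =
  [ (x, y, w) | (s, w) <- seqs, (x, y) <- zip s (tail s) ]
```

For example, trainingTriples [("abab", 1.0)] yields [('a','b',1.0), ('b','a',1.0), ('a','b',1.0)].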
The estimator can be used with estMixture to investigate mixtures of two (or more) components.
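For intuition, the membership weights that such a mixture estimator computes for each sequence can be sketched as follows. The components here are hard-coded transition functions, whereas estMixture fits them iteratively; this is an illustrative sketch, not the 200309 code:

```haskell
-- A first-order Markov component given directly by its transition
-- probabilities, P(next | previous).
type Trans = Char -> Char -> Double

-- Negative log2 probability of a sequence under one component
-- (the order-1 model conditions on the first symbol).
negLogPr :: Trans -> String -> Double
negLogPr tr s = sum [ negate (logBase 2 (tr x y)) | (x, y) <- zip s (tail s) ]

-- Posterior membership of one sequence in each of two components
-- with mixing proportions p1 and p2; these memberships are exactly
-- the weights fed back into the weighted Markov estimator.
memberships :: (Double, Trans) -> (Double, Trans) -> String
            -> (Double, Double)
memberships (p1, t1) (p2, t2) s = (l1 / z, l2 / z)
  where
    l1 = p1 * 2 ** negate (negLogPr t1 s)
    l2 = p2 * 2 ** negate (negLogPr t2 s)
    z  = l1 + l2
```

With one component favouring repeats (P(same) = 0.9) and the other favouring alternation, the sequence "aaaa" is assigned almost all of its membership to the first component.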
It can be seen that there would be no great difficulty in
considering Markov models of order k=2 or more;
indeed unweighted estimators for
finite list functionmodels and for Markov models of arbitrary
order already exist.
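Moving to order k only changes what counts as the "input" to the finite function: the k previous symbols rather than one. A minimal, unweighted sketch of the pair extraction (the helper name is made up for illustration, not taken from the 200309 code):

```haskell
-- Training pairs for an order-k Markov model: the input is the list
-- of the k previous symbols, the output the symbol that follows
-- them.  With k = 1 this reduces to the (previous, next) pairs used
-- earlier.
orderKPairs :: Int -> [a] -> [([a], a)]
orderKPairs k s =
  [ (take k (drop j s), s !! (j + k)) | j <- [0 .. length s - k - 1] ]
```

For example, orderKPairs 2 "abcd" yields [("ab",'c'), ("bc",'d')].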
Note that in the above statistical model, each sequence is
considered to have come from a single Markov model.
This is quite different from a situation where an individual sequence may be
heterogeneous, that is, where different parts of one sequence come from different models.

The exercise suggests that it might be more convenient if it were standard for an estimator for a timeseries model to work from a set (list) of dataseries, rather than from a single dataseries.
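The suggested convention amounts to pooling the transition pairs from a list of dataseries rather than concatenating the series themselves, which would invent a spurious transition across each join. A sketch, with a made-up helper name:

```haskell
-- Pool (previous, next) pairs across several dataseries; note that
-- no pair spans the boundary between one series and the next.
pairsOfMany :: [[a]] -> [(a, a)]
pairsOfMany = concatMap (\s -> zip s (tail s))
```

For example, pairsOfMany ["ab", "cd"] yields [('a','b'), ('c','d')], with no pair ('b','c') bridging the two series.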


