Haskell Time-Series

Lloyd Allison, School of Computer Science & Software Engineering, Monash University, Clayton, Victoria, Australia 3800.

Abstract: The talk concerns 'statistical models' as used in artificial intelligence, data mining, inductive inference and machine learning. Many models are naturally polymorphic. There are useful operators, some known and some yet to be discovered*, for combining models to make new models. Overfitting is a potential problem in inductive inference, but there is a natural combinational criterion, MML, to trade off model complexity against fit to the data. I suggest that statistical models, MML and functional programming make a natural threesome. Examples are given from "time"-series analysis.   www.csse.monash.edu.au/~lloyd/Seminars/200412-TimeSeries/index.shtml

To: Prog. Lang. & Sys. (PLASMA) research group, Dept. of Computer Science, York University, 12.15-13.15, room CS202J, Wed. 15 Dec. 2004.

[*] (un)known unknowns?

[Image: "p l a s m a", pic by Lloyd Allison, 10/2004]

What should this kind of programming be called?


Q: 'Function' is to 'functional programming' as

    'statistical model' is to <what>?


? Inductive programming ?

Model selection?

[Figure: overfitting and the model-selection problem in statistical inference; data-model "error" v. model complexity]

Overfitting is a problem, unless...

... MML model selection!

[Figure: minimum message length (MML) / minimum description length (MDL) curve: data + model complexity; cf. AIC, BIC, Bayesian methods]

Math can be hard with (variable numbers of) discrete and continuous parameters.

Most important abilities of a (basic) model

class Model mdl where
  pr   :: (mdl dataSpace) -> dataSpace -> Probability      -- probability of a datum
  nlPr :: (mdl dataSpace) -> dataSpace -> MessageLength    -- negative log probability
  msg  :: SuperModel (mdl dataSpace) =>
          (mdl dataSpace) -> [dataSpace] -> MessageLength  -- total two-part message: model + data
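To make the class concrete, here is a minimal, self-contained sketch. Probability, MessageLength, the MPr wrapper and bernoulli are my own illustrative stand-ins, not the library's actual definitions, and msg's SuperModel context is dropped, so msg2 below is only the second (data) part of a two-part message:

```haskell
type Probability   = Double
type MessageLength = Double    -- in nits (natural-log units)

-- a simplified Model class; the real msg also charges for the model itself
class Model mdl where
  pr   :: mdl dataSpace -> dataSpace -> Probability
  nlPr :: mdl dataSpace -> dataSpace -> MessageLength
  nlPr m = negate . log . pr m                 -- default: -log pr
  msg2 :: mdl dataSpace -> [dataSpace] -> MessageLength
  msg2 m = sum . map (nlPr m)                  -- data part: sum of -log prs

-- the simplest useful model: a wrapped probability function
newtype ModelType dataSpace = MPr (dataSpace -> Probability)

instance Model ModelType where
  pr (MPr f) = f

-- e.g. a (possibly biased) coin
bernoulli :: Probability -> ModelType Bool
bernoulli p = MPr (\b -> if b then p else 1 - p)
```

The default methods mean an instance need only supply pr; nlPr and msg2 then come for free.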

A "superclass" for Statistical Models

class (Show sMdl) => SuperModel sMdl where
  prior   :: sMdl -> Probability      -- prior probability of the model itself
  msg1    :: sMdl -> MessageLength    -- first part of the message: states the model
  mixture :: (Mixture mx, SuperModel (mx sMdl)) => mx sMdl -> sMdl   -- weighted mixture of models
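The point of msg1 is the MML trade-off: a two-part message first states the model, then the data encoded under it. A hand-rolled toy illustration (the Coin type and its fixed statement cost are my own, not the library's; real MML derives the model cost from the prior and the parameter's optimal precision):

```haskell
type Probability   = Double
type MessageLength = Double   -- nits

data Coin = Coin Double deriving Show   -- pr of heads, stated to some precision

-- first part of the message: the cost of stating the model itself
msg1Coin :: Coin -> MessageLength
msg1Coin _ = log 100            -- e.g. p stated to two decimal digits (illustrative)

-- second part: the data encoded under the model
msg2Coin :: Coin -> [Bool] -> MessageLength
msg2Coin (Coin p) = sum . map (\b -> negate (log (if b then p else 1 - p)))

-- the total two-part message length, the quantity that MML minimises
msgCoin :: Coin -> [Bool] -> MessageLength
msgCoin m bs = msg1Coin m + msg2Coin m bs
```

A more complex model lowers msg2Coin on the training data but raises msg1Coin; minimising the sum is what resists overfitting.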

Function-Model (Regression)

class FunctionModel fm where
  condModel :: (fm inSpace opSpace) -> inSpace -> ModelType opSpace       -- model of the output, given the input
  condPr    :: (fm inSpace opSpace) -> inSpace -> opSpace -> Probability  -- conditional probability of the output

I.e. Given the values of the input (exogenous) attributes (variables) make a conditional (model) prediction of the output (endogenous) attributes.
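A small self-contained instance, under the same simplified ModelType wrapper as before; FM and noisyId are hypothetical names of my own, not the library's:

```haskell
type Probability = Double
newtype ModelType opSpace = MPr (opSpace -> Probability)

class FunctionModel fm where
  condModel :: fm inSpace opSpace -> inSpace -> ModelType opSpace
  condPr    :: fm inSpace opSpace -> inSpace -> opSpace -> Probability
  condPr f i o = let MPr p = condModel f i in p o   -- default, via condModel

-- a function-model as a plain function from input to output model
newtype FM inSpace opSpace = FM (inSpace -> ModelType opSpace)

instance FunctionModel FM where
  condModel (FM f) = f

-- e.g. a noisy channel: the output copies the Bool input with pr 0.9
noisyId :: FM Bool Bool
noisyId = FM (\i -> MPr (\o -> if o == i then 0.9 else 0.1))
```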

Time-Series Model

class TimeSeries tsm where
  predictors :: (tsm dataSpace) -> [dataSpace] -> [ModelType dataSpace]  -- one model per position, given the preceding context
  prs        :: (tsm dataSpace) -> [dataSpace] -> [Probability]          -- pr of each element under its predictor

I.e. A time-series model makes a sequence of (model) predictions for what comes next given the preceding context.
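The simplest possible instance makes the relationship between predictors and prs explicit: an order-0 model ignores the context entirely. Order0 and the MPr wrapper are my own illustrative names:

```haskell
type Probability = Double
newtype ModelType dataSpace = MPr (dataSpace -> Probability)

class TimeSeries tsm where
  predictors :: tsm dataSpace -> [dataSpace] -> [ModelType dataSpace]
  prs        :: tsm dataSpace -> [dataSpace] -> [Probability]
  -- default: the i-th element is scored by the i-th predictor, i.e.
  -- by the model of element i given elements 0..i-1
  prs t ds = zipWith (\(MPr p) d -> p d) (predictors t ds) ds

-- order-0: every element is predicted by one fixed model,
-- whatever the preceding context
newtype Order0 dataSpace = Order0 (ModelType dataSpace)

instance TimeSeries Order0 where
  predictors (Order0 m) ds = map (const m) ds
```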


estMarkov k dataSeries =
  let scan (d:ds) context = ...                 -- collect each position's order-k context
      contexts = scan dataSeries []
  in functionModel2timeSeries (estFiniteListFunction k contexts dataSeries)

Many useful functions such as functionModel2timeSeries :: (FunctionModel fm) => fm [ds] ds -> TimeSeriesType ds
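A concrete reading of the Markov construction: turn the series into (context, next-element) pairs and let a conditional model score each element given its order-k context, most recent element first. The helpers below, contexts and markovPrs, are hypothetical names of mine, not the library's:

```haskell
-- the order-k context at each position, most recent element first
contexts :: Int -> [a] -> [[a]]
contexts k = go []
  where go _   []     = []
        go ctx (x:xs) = ctx : go (take k (x:ctx)) xs

-- score a series with a conditional next-element probability function,
-- in the manner of a time-series model made from a function-model
markovPrs :: Int -> ([a] -> a -> Double) -> [a] -> [Double]
markovPrs k condPr ds = zipWith condPr (contexts k ds) ds
```

For example, contexts 2 "abcd" gives ["", "a", "ba", "cb"]: the empty context for 'a', then at most two preceding characters for each later position.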

Pairs of sequences

data OrBoth a b = JustL a | JustR b | Both a b deriving ...
modelOrBoth :: model Ops -> model a -> model b -> model (a,b) -> model (OrBoth a b)
tsmOrBoth :: (model a -> model b -> model (a,b)) -> tsm Ops -> tsm a -> tsm b -> tsm (OrBoth a b)
... when it is convenient to make a model (a, b) from the (model a) and the (model b).

(NB some tortured types.)
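One plausible reading of modelOrBoth, with plain probability functions standing in for the library's model type: first pay for which constructor it is (the "Ops" model), then for the contents under the relevant sub-model. The name prOrBoth and the Int encoding of the constructors are my own:

```haskell
data OrBoth a b = JustL a | JustR b | Both a b deriving Show

prOrBoth :: (Int -> Double)      -- pr of constructor: 0=JustL, 1=JustR, 2=Both
         -> (a -> Double)        -- a model of a
         -> (b -> Double)        -- a model of b
         -> ((a, b) -> Double)   -- the joint model used for Both
         -> OrBoth a b -> Double
prOrBoth prOp prA _   _    (JustL a)  = prOp 0 * prA a
prOrBoth prOp _   prB _    (JustR b)  = prOp 1 * prB b
prOrBoth prOp _   _   prAB (Both a b) = prOp 2 * prAB (a, b)
```

The fourth parameter is where the convenience mentioned above comes in: often prAB is just the product of the two single-sequence models, but it need not be.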

e.g. optimal alignment

[Figure: an optimal alignment of two sequences, shown with match bars; reading off the columns gives steps such as JustR G (an element of the right sequence only), Both A G (elements of both), and JustL C (an element of the left sequence only)]

2-D dynamic programming algorithm.

dpa2D :: Int -> tsm (OrBoth a b) -> [a] -> [b] -> [OrBoth a b]

Note that `a' and `b' can be different types: discrete, continuous, multivariate, even non-Eq types.
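A stripped-down sketch of the 2-D dynamic programming, using fixed per-step costs (negative log probabilities) in place of a full time-series model; it returns only the cost of the cheapest [OrBoth a b] explanation, not the alignment itself. alignCost and its cost parameters are my own illustrative names:

```haskell
import Data.List (foldl')

alignCost :: Eq a
          => Double          -- cost of JustL / JustR (an unmatched element)
          -> Double          -- cost of Both with equal elements
          -> Double          -- cost of Both with unequal elements
          -> [a] -> [a] -> Double
alignCost gap match mismatch xs ys = last (foldl' step row0 xs)
  where
    -- first row: aligning the empty prefix of xs against prefixes of ys
    row0 = scanl (\c _ -> c + gap) 0 ys
    -- each further row extends the alignment by one element of xs
    step prev x = scanl nextCell (head prev + gap) (zip3 prev (tail prev) ys)
      where
        nextCell left (diag, up, y) =
          minimum [ left + gap                                   -- JustR y
                  , up   + gap                                   -- JustL x
                  , diag + if x == y then match else mismatch ]  -- Both x y
```

With gap = 1, match = 0 and mismatch = 1 this collapses to Levenshtein edit distance; in the MML setting the three costs would instead come from the tsm (OrBoth a b) model.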

The time-series type, tsm, often has a "memory".


memory of length 1

Needs time-series models with "state".

[OrBoth a b] <---> ([Ops], [a], [b])

LHS & RHS are equivalent, if the lengths match.

  [Ops] ---> (tsm Ops),
  [a] ---> (tsm a),
  [b] ---> (tsm b)
  ---> (new) tsm (OrBoth a b)
  ---> (new) [OrBoth a b] ... (EM) must converge.

Approximate Repeats

A generalised Lempel Ziv time-series model

data ApproxRepeats a = Gen a | Jump Int | Rpt (OrBoth a a) deriving...
tsmApproxRepeats :: (model a -> model a -> model (a,a)) -> tsm OpsAR -> tsm a -> tsm OpsOB -> tsm a
(and recall that tsmOrBoth has a (tsm a) parameter ... yes, you can :-)


this                         is to   that
subject of this talk         :       inductive inference
list prelude                 :       list processing
{parser combinator}          :       parsing
denotational semantics of L  :       L

... before parser combinators etc. were invented.


We have ...

(the start of) a combinator library for machine learning,

a combinational denotational semantics of statistical models,

a way of "software engineering" much of AI & machine learning,

some tiny but very general programs for e.g. time-series analysis, [clustering, (decision-) classification-trees] and [Bayesian Networks], etc.

And if you have a good answer to the [question] -- please tell me.

Reading: [more] (and on approximate repeats [here]).

© L. Aλλison, School of Computer Science and Software Engineering, Monash University, Australia 3800.
Created with "vi (Linux & Solaris)",   charset=iso-8859-1