
#Hidden Markov Model (Discrete)

##Description

This class implements a Hidden Markov Model (HMM) classifier.

The GRT HMM algorithm lets you select between using a Discrete HMM or a Continuous HMM. This page demonstrates how to use a Discrete HMM. If you require a Continuous HMM, you should read the [[HMM Continuous tutorial|hmm_continuous]] instead.

Hidden Markov Models are powerful classifiers that work well on temporal classification problems when you have a large training dataset. If you only have a few training samples, the GRT HMM Continuous may work better than the HMM Discrete; alternatively, the GRT DTW algorithm might be a better choice.

The discrete HMM requires that the input to the algorithm is a discrete integer value in the range [0, numSymbols-1]. You can convert N-dimensional continuous data into 1-dimensional discrete data using one of the GRT Quantization algorithms, such as the KMeansQuantizer.
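For example, the following is a minimal sketch of how a KMeansQuantizer could be trained and then used to map a continuous N-dimensional sample to a discrete symbol. It assumes the GRT C++ API; the number of symbols is a placeholder, and the exact method names and types (e.g. VectorFloat vs. the older VectorDouble) may vary between GRT versions.

```C++
#include <GRT/GRT.h>
#include <iostream>
using namespace GRT;
using namespace std;

// A minimal sketch: quantize continuous data into discrete symbols for the HMM.
// The number of symbols is a placeholder and must match the HMM numSymbols parameter.
bool quantizeExample( TimeSeriesClassificationData &trainingData ){

    const UINT numSymbols = 10;

    // Train the quantizer on the continuous training data
    KMeansQuantizer quantizer( numSymbols );
    if( !quantizer.train( trainingData ) ){
        cout << "Failed to train the quantizer!" << endl;
        return false;
    }

    // Quantize a single N-dimensional sample into a symbol in the range [0, numSymbols-1]
    VectorFloat continuousSample( trainingData.getNumDimensions(), 0.0 );
    UINT symbol = quantizer.quantize( continuousSample );
    cout << "Quantized symbol: " << symbol << endl;

    return true;
}
```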

The HMM algorithm is part of the GRT classification modules.

##Advantages

Hidden Markov Models are powerful classifiers that work well on temporal classification problems when you have a large training dataset. If you only have a few training samples, then the GRT DTW algorithm might work better.

##Disadvantages

The main limitation of the HMM algorithm is that the accuracy of the classifier can vary dramatically depending on the number of states and number of symbols parameters. Selecting these parameters can involve a large amount of trial and error. In addition, the choice and training of the quantization algorithm can affect the accuracy of the classifier.
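One common way to handle this trial and error is a simple grid search over the number of states and symbols. The sketch below scores each setting on a held-out partition of the training data; the parameter ranges are arbitrary placeholders, and the method names follow the GRT C++ API, which may differ slightly between versions.

```C++
#include <GRT/GRT.h>
#include <iostream>
using namespace GRT;
using namespace std;

// A sketch of a simple grid search over the numStates and numSymbols parameters.
// The parameter ranges below are placeholders, not recommendations.
void gridSearch( TimeSeriesClassificationData data ){

    // Hold back 20% of the data for scoring each parameter setting
    TimeSeriesClassificationData testData = data.partition( 80 );

    double bestAccuracy = 0;
    for( UINT numStates=5; numStates<=20; numStates+=5 ){
        for( UINT numSymbols=10; numSymbols<=40; numSymbols+=10 ){

            HMM hmm;
            hmm.setHMMType( HMM_DISCRETE );
            hmm.setNumStates( numStates );
            hmm.setNumSymbols( numSymbols );

            // The quantizer's number of clusters must match numSymbols
            GestureRecognitionPipeline pipeline;
            pipeline.addFeatureExtractionModule( KMeansQuantizer( numSymbols ) );
            pipeline.setClassifier( hmm );

            if( pipeline.train( data ) && pipeline.test( testData ) ){
                double accuracy = pipeline.getTestAccuracy();
                cout << "numStates: " << numStates << " numSymbols: " << numSymbols;
                cout << " accuracy: " << accuracy << endl;
                if( accuracy > bestAccuracy ) bestAccuracy = accuracy;
            }
        }
    }
    cout << "Best accuracy: " << bestAccuracy << endl;
}
```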

##Training Data Format

You should use the TimeSeriesClassificationData data structure to train the HMM classifier.
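As a brief illustration, the sketch below builds a small time-series dataset by hand and saves it to a file. The dimensions, class label, and filename are placeholders; older GRT versions use saveDatasetToFile/loadDatasetFromFile in place of save/load.

```C++
#include <GRT/GRT.h>
using namespace GRT;

// A sketch of building a time-series training dataset for the HMM.
// The dimensions, class label, and filename below are placeholders.
bool buildTrainingData(){

    TimeSeriesClassificationData trainingData;
    trainingData.setNumDimensions( 3 );  // e.g. 3-axis accelerometer data

    // Each training sample is a [timesteps x dimensions] matrix plus a class label
    MatrixFloat sample( 100, 3 );        // 100 timesteps of 3-dimensional data
    // ... fill the sample with recorded sensor values here ...
    trainingData.addSample( 1, sample ); // label this time series as class 1

    // Save the dataset so it can be loaded later for training
    return trainingData.save( "HMMTrainingData.grt" );
}
```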

##Example

Discrete HMM Example
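For reference, a rough end-to-end sketch of the discrete HMM workflow is shown below: load a recorded dataset, quantize it with a KMeansQuantizer, train a discrete HMM inside a GestureRecognitionPipeline, and predict the class of each test time series. The filename and parameter values are placeholders, and the method names follow the GRT C++ API, which may differ slightly between versions.

```C++
#include <GRT/GRT.h>
#include <iostream>
using namespace GRT;
using namespace std;

// A rough end-to-end sketch of the discrete HMM workflow.
// The filename and parameter values are placeholders.
int main(){

    // Load some pre-recorded time-series training data
    // (older GRT versions use loadDatasetFromFile instead of load)
    TimeSeriesClassificationData trainingData;
    if( !trainingData.load( "HMMTrainingData.grt" ) ){
        cout << "Failed to load training data!" << endl;
        return 1;
    }

    // Hold back 20% of the data for testing
    TimeSeriesClassificationData testData = trainingData.partition( 80 );

    // Configure the discrete HMM
    HMM hmm;
    hmm.setHMMType( HMM_DISCRETE );
    hmm.setModelType( HMM_LEFTRIGHT );
    hmm.setNumStates( 10 );
    hmm.setNumSymbols( 20 );

    // Build a pipeline that quantizes the continuous data before the HMM classifies it
    GestureRecognitionPipeline pipeline;
    pipeline.addFeatureExtractionModule( KMeansQuantizer( 20 ) ); // 20 clusters == numSymbols
    pipeline.setClassifier( hmm );

    if( !pipeline.train( trainingData ) ){
        cout << "Failed to train the pipeline!" << endl;
        return 1;
    }

    // Predict the class label of each test time series
    for( UINT i=0; i<testData.getNumSamples(); i++ ){
        if( pipeline.predict( testData[i].getData() ) ){
            cout << "Label: " << testData[i].getClassLabel();
            cout << " Predicted: " << pipeline.getPredictedClassLabel() << endl;
        }
    }

    return 0;
}
```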