cvMapGauss.m
function [Class, g] = cvMapGauss(X, Mu, Sigma, P)
% cvMapGauss - Bayesian Decision for multi-class Multivariate Gaussian case
%
% Synopsis
% [Class, g] = cvMapGauss(X, Mu, Sigma, P)
%
% Description
% Bayesian decision or MAP (Maximum a posteriori) decision
% in the multivariate Gaussian case for multi-class classification
% problem.
%
% Inputs ([]s are optional)
% (matrix) X D x N matrix representing column classifiee vectors
% where D is the number of dimensions and N is the
% number of vectors.
% (matrix) Mu D x K array representing the mean vectors of
% K classes.
% (matrix) Sigma D x D x K array representing the covariance matrices
% of K classes.
% (vector) [P = uniform]
%                 K x 1 vector representing the prior probabilities for
%                 K classes (e.g., the proportion of feature vectors
%                 associated with each class cluster).
%                 The default is a uniform distribution, in which case
%                 the MAP decision reduces to the Maximum Likelihood
%                 decision.
%
% Outputs ([]s are optional)
% (vector) Class 1 x N vector containing integers indicating the
% class labels for X. Class(n) is the class id for
% X(:,n).
% (matrix) [g] K x N matrix containing returned values by
% discriminant functions.
% The following are computed only by the quadratic-form implementation
% (currently commented out) and are not returned by this version:
% (matrix) [W] D x D x K array used in the discriminant functions.
% (matrix) [w] D x K array used in the discriminant functions.
% (vector) [w0] K x 1 vector used in the discriminant functions.
%
% The decision rule is argmax_i gi(x), where the discriminant function is
%   gi(x) = x' W(i) x + w(i)' x + w0(i)
%
% Example
% demo/cvMapGaussDemo.m
%
% See also
% cvMeanCov
%
% References
% [1] R. O. Duda, P. E. Hart, and D. G. Stork, "Chapter 2.1. Bayes
% Decision Theory," Pattern Classification, John Wiley & Sons, 2nd ed.,
% 2001.
%
% Authors
% Naotoshi Seo <sonots(at)sonots.com>
%
% License
% The program is free to use for non-commercial academic purposes,
% but for coursework you must understand what goes on inside it before
% using it. The program may be used, modified, or re-distributed for
% any purpose provided that you or someone in your group understands
% the code (that person must appear in court if court cases occur).
% Please contact the authors if you are interested in using the
% program without meeting the above conditions.
%
% Changes
% 11/01/2007 First Edition
[D, N] = size(X);
K = size(Mu, 2);
if ~exist('P', 'var') || isempty(P)
P = ones(K,1) ./ K;
end
%ClassLabel = 1:length(P);
%% Log posterior probability without the normalization term
g = zeros(K, N); % preallocate
for i = 1:K
    g(i,:) = cvGaussPdf(X, Mu(:,i), Sigma(:,:,i), 'nonorm', 'logp') + log(P(i));
end
%% The discriminant function takes a quadratic form in the Gaussian case [1].
%% Equivalent alternative implementation. (The original version had two bugs:
%% W, already -0.5 * InvSigma, was scaled by -0.5 a second time, and a stray
%% ';' split the g(i,n) statement in two.)
% W = zeros(D, D, K); w = zeros(D, K); w0 = zeros(K, 1); g = zeros(K, N);
% for i = 1:K
%     InvSigma = inv(Sigma(:,:,i));
%     W(:,:,i) = -0.5 * InvSigma;                    %% [1] (67)
%     w(:,i)   = InvSigma * Mu(:,i);                 %% [1] (68)
%     w0(i)    = -0.5 * Mu(:,i).' * InvSigma * Mu(:,i) ...
%                - 0.5 * log(det(Sigma(:,:,i))) + log(P(i)); %% [1] (69)
% end
% for n = 1:N
%     for i = 1:K
%         g(i,n) = X(:,n).' * W(:,:,i) * X(:,n) ...
%                  + w(:,i).' * X(:,n) + w0(i);      %% [1] (66)
%     end
% end
%% MAP
[~, Class] = max(g, [], 1);
%Class = ClassLabel(Class);
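A minimal usage sketch follows. The data here is hypothetical (the toolbox's own example is `demo/cvMapGaussDemo.m`), and `cvGaussPdf` from the same toolbox must be on the path:

```matlab
% Two 2-D Gaussian classes with identity covariances and equal priors.
Mu    = [0 4; 0 4];              % D x K class mean vectors
Sigma = cat(3, eye(2), eye(2));  % D x D x K class covariance matrices
P     = [0.5; 0.5];              % K x 1 prior probabilities
X     = [0.1 3.9; -0.2 4.2];     % D x N vectors to classify
[Class, g] = cvMapGauss(X, Mu, Sigma, P);
% With these values, Class should be [1 2]: each point goes to the
% nearer class mean, since the covariances and priors are identical.
```

With identical covariances and equal priors, the MAP rule reduces to a nearest-mean classifier, which makes the expected labels easy to check by eye.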