Likelihood
The Likelihood class represents a likelihood function of the form $$f_k(y_k \,|\, x_k) \enspace,$$ where \(x_k \in \mathbb{R}^N\) denotes the system state and \(y_k \in \mathbb{R}^{M_k}\) a measurement. More precisely, the logarithm of \(f_k(y_k \,|\, x_k)\), i.e., the \(\log\)-likelihood \(\log(f_k(y_k \,|\, x_k))\), has to be implemented for numerical reasons: likelihood values can easily underflow when many state samples are evaluated.
For a measurement model $$y_k = h_k(x_k, v_k)$$ with measurement noise \(v_k \in \mathbb{R}^{V_k} \) and PDF \(p(v_k)\), the corresponding likelihood function is given by $$f_k(y_k \,|\, x_k) = \int_{\mathbb{R}^{V_k}} \delta(y_k - h_k(x_k, v_k)) p(v_k) \operatorname{d}v_k \enspace,$$ where \(\delta\) denotes the Dirac-\(\delta\) distribution. For the special case of additive measurement noise, i.e., $$y_k = h_k(x_k) + v_k \enspace,$$ the corresponding likelihood function is given by $$f_k(y_k \,|\, x_k) = p(y_k - h_k(x_k)) \enspace.$$
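For instance, for zero-mean Gaussian noise \(v_k \sim \mathcal{N}(0, \mathbf{R})\), as used in the example below, this reduces to the familiar quadratic form $$\log(f_k(y_k \,|\, x_k)) = -\tfrac{1}{2}\,(y_k - h_k(x_k))^\top \mathbf{R}^{-1} (y_k - h_k(x_k)) - \tfrac{1}{2}\log\bigl((2\pi)^{M_k}\det(\mathbf{R})\bigr) \enspace,$$ which is exactly what evaluating \(\log(p(y_k - h_k(x_k)))\) amounts to.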
The likelihood function for additive measurement noise is already implemented by the AdditiveNoiseMeasurementModel.
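As a rough sketch of that route for a planar position measured in polar coordinates: note that the class name PolarMeasModelSketch is a placeholder and that the methods setNoise() and measurementEquation() are assumptions about the AdditiveNoiseMeasurementModel interface; consult its documentation for the exact hooks.

classdef PolarMeasModelSketch < AdditiveNoiseMeasurementModel
    methods
        function obj = PolarMeasModelSketch()
            % Assumed hook: register the additive, zero-mean Gaussian
            % measurement noise with variances 1e-2 and 1e-4
            obj.setNoise(Gaussian(zeros(2, 1), [1e-2, 1e-4]));
        end

        function measurements = measurementEquation(obj, stateSamples)
            % Assumed hook: evaluate the deterministic part h_k(x_k)
            % for all passed state samples at once
            px = stateSamples(1, :);
            py = stateSamples(2, :);

            measurements = [sqrt(px.^2 + py.^2)
                            atan2(py, px)      ];
        end
    end
end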
The example code shown in the Example section below can also be found in the toolbox's examples.
Usage
In order to write a specific likelihood function, you have to
create a new subclass of Likelihood and
implement the computation of the logarithm of \(f_k(y_k \,|\, x_k)\), i.e., \(\log(f_k(y_k \,|\, x_k))\), by implementing the abstract logLikelihood() method (a minimal skeleton is sketched below).
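Before the full worked example in the next section, a minimal do-nothing skeleton may clarify the required shape of the method. The class name MyLikelihood is a placeholder, and the returned constant values merely illustrate the expected 1 x L output:

classdef MyLikelihood < Likelihood
    methods
        function logValues = logLikelihood(obj, stateSamples, measurement)
            % stateSamples is an N x L matrix of state samples.
            % Return a 1 x L row vector of log-likelihood values,
            % one per state sample (placeholder: uniform likelihood).
            numSamples = size(stateSamples, 2);
            logValues  = zeros(1, numSamples);
        end
    end
end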
Example
We consider a 2D target described by the system state $$x_k = [p^x_k, p^y_k, \dot{p}^x_k, \dot{p}^y_k, \ddot{p}^x_k, \ddot{p}^y_k]^\top$$ and measure its position in polar coordinates according to $$y_k = h_k(x_k) + v_k = \begin{bmatrix} \sqrt{(p^x_k)^2 + (p^y_k)^2} \\ \operatorname{atan2}(p^y_k, p^x_k) \end{bmatrix} + v_k \enspace,$$ where \(v_k\) is zero-mean white Gaussian noise with covariance matrix \(\mathbf{R} = \operatorname{diag}(10^{-2}, 10^{-4})\).
The logLikelihood() method has to be implemented such that it can process an arbitrary number \(L\) of passed state samples \([x_k^{(1)}, \ldots, x_k^{(L)}]\), as is done in the following implementation. That is, logLikelihood() has to compute the \(\log\)-likelihood value \(l_k^{(i)} = \log(f_k(y_k \,|\, x_k^{(i)}))\) for each state sample \(x_k^{(i)}\) and return all values as the row vector \([l_k^{(1)}, \ldots, l_k^{(L)}]\).
classdef LikelihoodExample < Likelihood
    methods
        function obj = LikelihoodExample()
            obj.noise = Gaussian(zeros(2, 1), [1e-2, 1e-4]);
        end

        function logValues = logLikelihood(obj, stateSamples, measurement)
            x = stateSamples(1, :);
            y = stateSamples(2, :);

            % Evaluate deterministic part of the measurement model
            h = [sqrt(x.^2 + y.^2)
                 atan2(y, x)      ];

            % Compute differences y - h(x)
            diffs = bsxfun(@minus, measurement, h);

            % Evaluate the measurement noise logarithmic pdf
            logValues = obj.noise.logPdf(diffs);
        end
    end

    properties (Access = 'private')
        noise;
    end
end
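The following snippet sketches how logLikelihood() could be called directly, e.g., for a quick test; the state samples and the measurement are arbitrary illustrative values (within the toolbox, a sample-based filter would normally invoke this method itself):

likelihood   = LikelihoodExample();

stateSamples = randn(6, 100);   % L = 100 random samples of the 6D state
measurement  = [5; pi / 4];     % a single [range; bearing] measurement

% Returns a 1 x 100 row vector of log-likelihood values
logValues = likelihood.logLikelihood(stateSamples, measurement);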