# Recursive Bayesian estimation


Recursive Bayesian estimation is a general probabilistic approach for estimating an unknown probability density function recursively over time using incoming measurements and a mathematical process model.

## Model

The true state $x$ is assumed to be an unobserved Markov process, and the measurements $z$ are the observed states of a Hidden Markov Model (HMM).

Because of the Markov assumption, the current true state is conditionally independent of all earlier states given the immediately previous state.

$$p(\mathbf{x}_k \mid \mathbf{x}_{k-1}, \mathbf{x}_{k-2}, \dots, \mathbf{x}_0) = p(\mathbf{x}_k \mid \mathbf{x}_{k-1})$$

Similarly, the measurement at the $k$-th timestep depends only on the current state, so it is conditionally independent of all other states given the current state.

$$p(\mathbf{z}_k \mid \mathbf{x}_k, \mathbf{x}_{k-1}, \dots, \mathbf{x}_0) = p(\mathbf{z}_k \mid \mathbf{x}_k)$$

Using these assumptions, the probability distribution over all states of the HMM can be written simply as:

$$p(\mathbf{x}_0, \dots, \mathbf{x}_k, \mathbf{z}_1, \dots, \mathbf{z}_k) = p(\mathbf{x}_0) \prod_{i=1}^{k} p(\mathbf{z}_i \mid \mathbf{x}_i)\, p(\mathbf{x}_i \mid \mathbf{x}_{i-1})$$
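To make the factorisation concrete, here is a minimal Python sketch for a hypothetical two-state HMM; the prior, transition, and emission probabilities below are made-up illustrative numbers, not values from the text:

```python
prior = [0.6, 0.4]                     # p(x_0); illustrative numbers only
transition = [[0.7, 0.3],
              [0.2, 0.8]]              # transition[i][j] = p(x_k = j | x_{k-1} = i)
emission = [[0.9, 0.1],
            [0.3, 0.7]]                # emission[i][z]   = p(z_k = z | x_k = i)

def joint_probability(states, observations):
    """p(x_0, ..., x_k, z_1, ..., z_k) = p(x_0) * prod_i p(z_i|x_i) * p(x_i|x_{i-1})."""
    p = prior[states[0]]
    for i in range(1, len(states)):
        p *= transition[states[i - 1]][states[i]] * emission[states[i]][observations[i - 1]]
    return p

# Sanity check: summing the joint distribution over every possible state and
# observation sequence of length k = 2 must give exactly 1.
total = sum(
    joint_probability([x0, x1, x2], [z1, z2])
    for x0 in (0, 1) for x1 in (0, 1) for x2 in (0, 1)
    for z1 in (0, 1) for z2 in (0, 1)
)
```

The nested sum collapsing to 1 is a quick check that the factorisation really defines a probability distribution over the whole trajectory.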

However, when using the Kalman filter to estimate the state $x$, the probability distribution of interest is that of the current state conditioned on the measurements up to the current timestep, $\mathbf{Z}_k = \{\mathbf{z}_1, \dots, \mathbf{z}_k\}$. (This is obtained by marginalising out the previous states and dividing by the probability of the measurement set.)

This leads to the "predict" and "update" steps of the Kalman filter written probabilistically. The probability distribution associated with the predicted state is the product of the probability distribution associated with the transition from the $(k-1)$-th timestep to the $k$-th and the probability distribution associated with the previous state, with the true state at $(k-1)$ integrated out.

$$p(\mathbf{x}_k \mid \mathbf{Z}_{k-1}) = \int p(\mathbf{x}_k \mid \mathbf{x}_{k-1})\, p(\mathbf{x}_{k-1} \mid \mathbf{Z}_{k-1})\, d\mathbf{x}_{k-1}$$

The probability distribution of the updated state is proportional to the product of the measurement likelihood and the predicted state:

$$p(\mathbf{x}_k \mid \mathbf{Z}_k) = \frac{p(\mathbf{z}_k \mid \mathbf{x}_k)\, p(\mathbf{x}_k \mid \mathbf{Z}_{k-1})}{p(\mathbf{z}_k \mid \mathbf{Z}_{k-1})} = \alpha\, p(\mathbf{z}_k \mid \mathbf{x}_k)\, p(\mathbf{x}_k \mid \mathbf{Z}_{k-1})$$

The denominator

$$p(\mathbf{z}_k \mid \mathbf{Z}_{k-1}) = \int p(\mathbf{z}_k \mid \mathbf{x}_k)\, p(\mathbf{x}_k \mid \mathbf{Z}_{k-1})\, d\mathbf{x}_k$$

is constant with respect to $\mathbf{x}$, so it can always be replaced by a normalising coefficient $\alpha$, which can usually be ignored in practice. The numerator can be calculated and then simply normalised, since its integral must equal one.
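The predict/update recursion above can be sketched for a discrete (grid-based) state space, where the integrals become sums and explicit normalisation takes the place of $\alpha$; all model numbers below are illustrative assumptions, not values from the text:

```python
transition = [[0.7, 0.3],
              [0.2, 0.8]]   # transition[i][j] = p(x_k = j | x_{k-1} = i)
likelihood = [[0.9, 0.1],
              [0.3, 0.7]]   # likelihood[i][z] = p(z_k = z | x_k = i)

def predict(posterior):
    """p(x_k | Z_{k-1}) = sum over x_{k-1} of p(x_k | x_{k-1}) p(x_{k-1} | Z_{k-1})."""
    n = len(posterior)
    return [sum(transition[i][j] * posterior[i] for i in range(n)) for j in range(n)]

def update(predicted, z):
    """p(x_k | Z_k) is proportional to p(z_k | x_k) p(x_k | Z_{k-1})."""
    unnormalised = [likelihood[i][z] * predicted[i] for i in range(len(predicted))]
    evidence = sum(unnormalised)      # this is p(z_k | Z_{k-1}), the denominator
    return [u / evidence for u in unnormalised]

belief = [0.5, 0.5]                   # p(x_0): uniform initial belief
for z in [0, 0, 1]:                   # a made-up measurement sequence
    belief = update(predict(belief), z)
```

Dividing by the evidence inside `update` is exactly the "calculate the numerator and normalise" shortcut described above.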

## Applications

* Kalman filter, a recursive Bayesian filter for multivariate normal distributions
* Particle filter, a sequential Monte Carlo (SMC) based technique, which models the PDF using a set of discrete points
* Grid-based estimators, which subdivide the PDF into a discrete grid
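As a sketch of the particle-filter idea listed above, here is a toy bootstrap particle filter for a hypothetical one-dimensional random-walk state with Gaussian measurement noise; every parameter and measurement value is an illustrative assumption:

```python
import math
import random

def gaussian_pdf(x, mean, std):
    """Gaussian density, used here as the measurement likelihood p(z_k | x_k)."""
    return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2.0 * math.pi))

def particle_filter_step(particles, z, process_std=1.0, meas_std=0.5):
    # Predict: propagate each particle through the random-walk process model.
    particles = [p + random.gauss(0.0, process_std) for p in particles]
    # Update: weight each particle by the measurement likelihood.
    weights = [gaussian_pdf(z, p, meas_std) for p in particles]
    total = sum(weights)
    # Resample: draw a new, equally weighted particle set in proportion to
    # the weights (the "bootstrap" resampling step).
    return random.choices(particles, weights=[w / total for w in weights],
                          k=len(particles))

random.seed(0)                                            # repeatable run
particles = [random.gauss(0.0, 2.0) for _ in range(500)]  # samples from p(x_0)
for z in [0.5, 1.0, 1.5]:                                 # made-up measurements
    particles = particle_filter_step(particles, z)
estimate = sum(particles) / len(particles)                # posterior mean
```

The particle cloud is the discrete-point representation of the PDF mentioned above: prediction moves the points, the update reweights them, and resampling keeps the approximation from degenerating.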

## Sequential Bayesian filtering

Sequential Bayesian filtering is the extension of Bayesian estimation to the case where the observed value changes over time. It is a method for estimating the true value of an observed variable that evolves in time. The method is called filtering when estimating the current value given past observations, smoothing when estimating past values given present and past measurements, and prediction when estimating a probable future value.

The notion of Sequential Bayesian filtering is extensively used in control and robotics.

## External links

* [A Tutorial on Particle Filters for On-line Non-linear/Non-Gaussian Bayesian Tracking](http://citeseer.ist.psu.edu/504843.html), IEEE Transactions on Signal Processing (2001)
* [A survey of probabilistic models, using the Bayesian Programming methodology as a unifying framework](http://julien.diard.free.fr/articles/CIRAS03.pdf)

