Marginal likelihood

In statistics, a marginal likelihood function, or integrated likelihood, is a likelihood function in which some parameter variables have been marginalised. It may also be referred to as evidence, but this usage is somewhat idiosyncratic.

Given a parameter θ=(ψ,λ), where ψ is the parameter of interest and λ is a nuisance parameter, it is often desirable to consider the likelihood function in terms of ψ alone. If λ has a probability distribution conditional on ψ, then it may be possible to marginalise, or integrate out, λ:

\mathcal{L}(\psi;x) = p(x|\psi) = \int_\Lambda p(x|\psi,\lambda) p(\lambda|\psi) \ \operatorname{d} \lambda

Unfortunately, marginal likelihoods are generally difficult to compute. Exact solutions are known only for a small class of distributions, notably conjugate prior–likelihood pairs. In general, some kind of numerical integration method is needed: either a general method such as Gaussian quadrature or a Monte Carlo method, or a method specialized to statistical problems such as the Laplace approximation, Gibbs sampling or the EM algorithm.
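The simplest Monte Carlo approach to the integral above is to draw λ from its prior and average the resulting likelihood values. The following sketch uses a hypothetical model chosen for illustration (not taken from the article): x ~ Normal(λ, 1) with prior λ ~ Normal(0, 1), for which the exact marginal is Normal(x; 0, 2) and can serve as a check.

```python
import math
import random

def normal_pdf(x, mean, var):
    """Density of a Normal(mean, var) distribution at x."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def mc_marginal_likelihood(x, n_samples=200_000, seed=0):
    """Estimate p(x) = E_{lambda ~ prior}[p(x | lambda)] by simple Monte Carlo.

    Assumed (illustrative) model: x ~ Normal(lambda, 1), lambda ~ Normal(0, 1).
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        lam = rng.gauss(0.0, 1.0)          # draw lambda from the prior N(0, 1)
        total += normal_pdf(x, lam, 1.0)   # likelihood of x given this lambda
    return total / n_samples

x_obs = 1.5
estimate = mc_marginal_likelihood(x_obs)
exact = normal_pdf(x_obs, 0.0, 2.0)  # analytic marginal: variance 1 + 1 = 2
```

This naive estimator works here because the prior and the likelihood overlap; when they do not, the variance of the estimate becomes large, which is one reason specialized methods such as the Laplace approximation are used in practice.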

Applications

Bayesian model comparison

In Bayesian model comparison, the marginalized variables are the parameters of a particular type of model, and the remaining variable is the identity of the model itself. In this case, the marginal likelihood is the probability of the data given the model type, averaged over all parameter values rather than conditioned on any particular ones. Writing θ for the model parameters, the marginal likelihood for the model M is

 p(x|M) = \int p(x|\theta, M) \, p(\theta|M) \, d\theta
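For conjugate models this integral has a closed form. As an illustrative sketch (the specific model is an assumption, not from the article): for a binomial likelihood with a Beta(a, b) prior on the success probability θ, integrating out θ yields the Beta function expression below.

```python
from math import comb, exp, lgamma

def log_beta(a, b):
    """log of the Beta function B(a, b), computed via log-gamma for stability."""
    return lgamma(a) + lgamma(b) - lgamma(a + b)

def marginal_likelihood(x, n, a, b):
    """p(x | M) for x successes in n binomial trials, theta ~ Beta(a, b).

    Closed form: C(n, x) * B(x + a, n - x + b) / B(a, b).
    """
    return comb(n, x) * exp(log_beta(x + a, n - x + b) - log_beta(a, b))

# With a uniform Beta(1, 1) prior, every outcome x in 0..n turns out to be
# equally likely under the marginal: p(x | M) = 1 / (n + 1).
p = marginal_likelihood(3, 10, 1.0, 1.0)
```

Note that the marginal likelihood depends on the prior p(θ|M), not only on the likelihood, which is why it rewards models whose priors concentrate mass near parameter values that fit the data.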

This quantity is important because the posterior odds ratio for a model M1 against another model M2 involves a ratio of marginal likelihoods, the so-called Bayes factor:

 \frac{p(M_1|x)}{p(M_2|x)} = \frac{p(M_1)}{p(M_2)} \, \frac{p(x|M_1)}{p(x|M_2)}

which can be stated schematically as

posterior odds = prior odds × Bayes factor
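The schematic relation above can be made concrete with a small worked example (the data and models are hypothetical, chosen for illustration): observing x = 9 heads in n = 12 coin flips, compare M1, a fair coin with θ fixed at 0.5, against M2, in which θ has a uniform Beta(1, 1) prior and is integrated out.

```python
from math import comb, exp, lgamma

def log_beta(a, b):
    return lgamma(a) + lgamma(b) - lgamma(a + b)

def binomial_pmf(x, n, theta):
    """p(x | M1): binomial probability with theta fixed, no free parameters."""
    return comb(n, x) * theta ** x * (1 - theta) ** (n - x)

def beta_binomial_marginal(x, n, a, b):
    """p(x | M2): binomial likelihood with theta ~ Beta(a, b) integrated out."""
    return comb(n, x) * exp(log_beta(x + a, n - x + b) - log_beta(a, b))

x, n = 9, 12
m1 = binomial_pmf(x, n, 0.5)             # marginal likelihood of M1
m2 = beta_binomial_marginal(x, n, 1, 1)  # marginal likelihood of M2

bayes_factor = m1 / m2                   # p(x | M1) / p(x | M2)
prior_odds = 1.0                         # equal prior model probabilities
posterior_odds = prior_odds * bayes_factor
```

Here m1 = 220/4096 ≈ 0.054 and m2 = 1/13 ≈ 0.077, so the Bayes factor is below 1 and the data mildly favour the flexible model M2 over the fair coin.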

See also

  • Empirical Bayes methods
  • Marginal probability


Wikimedia Foundation. 2010.
