Method of moments (statistics)

In statistics, the method of moments is a method of estimating population parameters such as the mean, variance, or median (quantities that need not themselves be moments), by equating sample moments with the corresponding unobservable population moments and then solving those equations for the quantities to be estimated.

Methodology

Suppose that the problem is to estimate p unknown parameters θ1, θ2, ..., θp characterizing a distribution f_W(w; θ). Suppose p of the moments of the true distribution can be expressed as functions of the θs:

\mu_1 \equiv \operatorname{E}[W^1] = g_1(\theta_1, \theta_2, \ldots, \theta_p),
\mu_2 \equiv \operatorname{E}[W^2] = g_2(\theta_1, \theta_2, \ldots, \theta_p),
\vdots
\mu_p \equiv \operatorname{E}[W^p] = g_p(\theta_1, \theta_2, \ldots, \theta_p).

Let \hat{\mu}_j = \frac{1}{n} \sum_{i=1}^{n} w_i^j be the j-th sample moment corresponding to the population moment μj. The method of moments estimator for θ1, θ2, ..., θp, denoted by \hat{\theta}_1, \hat{\theta}_2, \ldots, \hat{\theta}_p, is defined as the solution (if one exists) to the equations:

\hat{\mu}_1 = g_1(\hat{\theta}_1, \hat{\theta}_2, \ldots, \hat{\theta}_p),
\hat{\mu}_2 = g_2(\hat{\theta}_1, \hat{\theta}_2, \ldots, \hat{\theta}_p),
\vdots
\hat{\mu}_p = g_p(\hat{\theta}_1, \hat{\theta}_2, \ldots, \hat{\theta}_p).
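
As a minimal numerical sketch of this recipe (not part of the original article), the moment equations can be handed to a root finder. The helper name method_of_moments below is ours; the code assumes NumPy and SciPy and uses scipy.optimize.fsolve to solve the p equations:

import numpy as np
from scipy.optimize import fsolve

def method_of_moments(sample, g, p, theta0=None):
    # j-th sample moment: (1/n) * sum_i w_i^j, for j = 1..p
    mu_hat = np.array([np.mean(sample ** j) for j in range(1, p + 1)])
    # Solve mu_hat_j = g_j(theta); fsolve needs a starting guess.
    if theta0 is None:
        theta0 = np.ones(p)  # crude default; a problem-specific guess is better
    return fsolve(lambda theta: np.asarray(g(theta)) - mu_hat, theta0)

For the gamma example worked out below, g(θ) = (θ1 θ2, θ2² θ1 (θ1 + 1)), and the numerical solution reproduces the closed-form estimates.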

Example

Suppose X1, ..., Xn are independent identically distributed random variables with a gamma distribution with probability density function

\frac{x^{\alpha-1} e^{-x/\beta}}{\beta^\alpha \Gamma(\alpha)}

for x > 0, and 0 for x ≤ 0.

The first moment, i.e., the expected value, of a random variable with this probability distribution is

\operatorname{E}(X_1) = \alpha\beta

and the second moment, i.e., the expected value of its square, is

\operatorname{E}(X_1^2) = \beta^2\alpha(\alpha+1).

These are the "population moments".
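
Both follow from the gamma distribution's mean αβ and variance αβ²: since the variance of any random variable equals E(X²) − (E(X))²,

\operatorname{E}(X_1^2) = \operatorname{Var}(X_1) + (\operatorname{E}(X_1))^2 = \alpha\beta^2 + \alpha^2\beta^2 = \beta^2\alpha(\alpha+1).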

The first and second "sample moments" m1 and m2 are respectively

m_1 = \frac{X_1 + \cdots + X_n}{n}

and

m_2 = \frac{X_1^2 + \cdots + X_n^2}{n}.

Equating the population moments with the sample moments, we get

\alpha\beta = m_1

and

\beta^2\alpha(\alpha+1) = m_2.
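
Subtracting the square of the first equation from the second isolates a single product of the parameters, which makes the system easy to solve:

\beta^2\alpha(\alpha+1) - (\alpha\beta)^2 = \alpha\beta^2 = m_2 - m_1^2.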

Solving these two equations for α and β, we get

\alpha = \frac{m_1^2}{m_2 - m_1^2}

and

\beta = \frac{m_2 - m_1^2}{m_1}.

We then use these two quantities as estimates, based on the sample, of the two unobservable population parameters α and β.
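
As an illustrative sketch (the true values α = 2 and β = 3 below are chosen arbitrarily for the demonstration), the whole computation takes a few lines of NumPy:

import numpy as np

rng = np.random.default_rng(0)
x = rng.gamma(shape=2.0, scale=3.0, size=100_000)  # synthetic sample: alpha=2, beta=3

m1 = x.mean()          # first sample moment
m2 = np.mean(x ** 2)   # second sample moment

alpha_hat = m1 ** 2 / (m2 - m1 ** 2)
beta_hat = (m2 - m1 ** 2) / m1

print(alpha_hat, beta_hat)  # should land near (2.0, 3.0) for a sample this large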

Advantages and disadvantages of this method

In some respects, when estimating parameters of a known family of probability distributions, this method was superseded by Fisher's method of maximum likelihood, because maximum likelihood estimators have higher probability of being close to the quantities to be estimated.

However, in some cases, as in the above example of the gamma distribution, the likelihood equations may be intractable without computers, whereas the method-of-moments estimators can be quickly and easily calculated by hand as shown above.

Estimates by the method of moments may be used as the first approximation to the solutions of the likelihood equations, and successive improved approximations may then be found by the Newton–Raphson method. In this way the method of moments and the method of maximum likelihood are symbiotic.
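
A sketch of that symbiosis for the gamma example, under the standard result that the gamma likelihood equations reduce to log α − ψ(α) = log m1 − mean(log x), with ψ the digamma function: a Newton–Raphson iteration started at the method-of-moments estimate typically converges in a few steps. The helper name gamma_mle_newton is ours; the code assumes NumPy and SciPy.

import numpy as np
from scipy.special import digamma, polygamma

def gamma_mle_newton(x, tol=1e-10, max_iter=100):
    m1 = x.mean()
    m2 = np.mean(x ** 2)
    alpha = m1 ** 2 / (m2 - m1 ** 2)     # method-of-moments starting value
    s = np.log(m1) - np.mean(np.log(x))  # target: log(alpha) - digamma(alpha) = s
    for _ in range(max_iter):
        f = np.log(alpha) - digamma(alpha) - s
        step = f / (1.0 / alpha - polygamma(1, alpha))  # Newton step: f / f'
        alpha -= step
        if abs(step) < tol:
            break
    beta = m1 / alpha  # scale recovered from the first likelihood equation
    return alpha, beta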

In some cases, infrequent with large samples but not so infrequent with small samples, the estimates given by the method of moments fall outside of the parameter space, in which case it does not make sense to rely on them. That problem never arises in the method of maximum likelihood. Also, estimates by the method of moments are not necessarily sufficient statistics, i.e., they sometimes fail to take into account all relevant information in the sample.

When estimating other structural parameters (e.g., parameters of a utility function, instead of parameters of a known probability distribution), appropriate probability distributions may not be known, and moment-based estimates may be preferred to MLE.
