A method of moments [[estimator]] (MME) uses the [[moments]] of a distribution to estimate population parameters.
In method of moments estimation, we equate the population and sample moments and solve for the unknown parameters.
For $k = 1, 2, 3, \dots$ the $k$th population moment is given by
$\mu_k = E[X^k]$
and the kth sample moment is given by
$M_k = \frac1n \sum_{i=1}^n X^k_i$
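For concreteness, here is a minimal sketch (in Python with NumPy; the `sample_moment` helper and the `data` array are hypothetical, just for illustration) of computing sample moments:

```python
import numpy as np

def sample_moment(x, k):
    """Return the kth sample moment M_k = (1/n) * sum(x_i ** k)."""
    return np.mean(np.asarray(x, dtype=float) ** k)

data = np.array([1.2, 0.7, 2.5, 1.9, 0.4])  # made-up observations
m1 = sample_moment(data, 1)  # first sample moment: the sample mean, X-bar
m2 = sample_moment(data, 2)  # second sample moment: mean of the squares
```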
When there are multiple unknown parameters, include higher moments until you have as many equations as unknowns.
The MME is not guaranteed to be an unbiased estimator. To check, calculate the expected value of the estimator. If it is biased, the MME can often be rescaled to make it unbiased based on that calculation (e.g., the MME of the rate of an [[exponential distribution]] satisfies $E[\hat \lambda] = \frac{n}{n-1} \lambda$, so multiplying it by $\frac{n-1}{n}$ makes it unbiased; a simulation check appears after the one-parameter example below).
# Example with one parameter
Let's find the rate parameter of the [[exponential distribution]] given some data. Let $X_1, \dots, X_n \overset{iid}{\sim} \text{Exp}(\text{rate} = \lambda)$. Find $\hat \lambda$.
The first population moment is the mean (by definition)
$\mu = E(X) = \frac1\lambda$
Its counterpart from the sample, the first sample moment, is the mean of the sample points
$M_1 = \frac1n \sum_{i=1}^n X_i = \bar X$
Set them equal to each other and solve to get the MME.
$\frac{1}{\hat \lambda} = \bar X \implies \hat \lambda = \frac{1}{\bar X}$
> [!NOTE]
> Note that a "hat" symbol indicates the estimator of the parameter, rather than the parameter itself. This is often left out but is important as you would never equate a true population parameter with an estimator from a sample.
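As a quick sanity check of both the estimator and the bias claim above, here is a hedged simulation sketch (assuming NumPy; the true rate `LAM`, sample size `n`, and replication count are arbitrary choices). It averages $\hat \lambda = 1 / \bar X$ over many samples and applies the $\frac{n-1}{n}$ correction:

```python
import numpy as np

rng = np.random.default_rng(0)
LAM, n, reps = 2.0, 10, 100_000  # true rate, sample size, replications

# Each row is one iid exponential sample; NumPy's scale is 1/rate.
samples = rng.exponential(scale=1 / LAM, size=(reps, n))
lam_hat = 1 / samples.mean(axis=1)  # MME: 1 / X-bar, once per sample

print(lam_hat.mean())                  # near n/(n-1) * LAM = 20/9, i.e. biased
print(((n - 1) / n * lam_hat).mean())  # near LAM = 2.0 after the correction
```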
# Example with two parameters
Let's find the parameters of the [[gamma distribution]] given some data. Let $X_1, \dots, X_n \overset{iid}{\sim} \Gamma(\alpha, \beta)$, with shape $\alpha$ and rate $\beta$. Find $\hat \alpha$ and $\hat \beta$.
Set the first population moment and sample moment (mean in both cases) equal to each other (look up the mean for the gamma distribution in a table if needed).
$\frac{\hat \alpha}{\hat \beta} = \bar X$
Set the second population moment and sample moment equal to each other. The trick here is to use the computational formula for [[variance]] to extract the second moment of the distribution (or do it the hard way and integrate $x^2$ against the pdf of the distribution).
The computational formula for variance is
$V(X) = E(X^2) - (E(X))^2$
Rearranging terms we get
$E(X^2) = V(X) + (E(X))^2$
For the gamma distribution, we know $E(X) = \frac{\alpha}{\beta}$ and $V(X) = \frac{\alpha}{\beta ^ 2}$. Plugging those values into the computational formula for variance we get the following. (Note that we are leaving off the "hats" that indicate we're working with parameter estimators for simplicity.)
$E(X^2) = \frac{\alpha}{\beta ^ 2} + (\frac{\alpha}{\beta})^2$
Now set the second moments of the population and the sample equal to each other.
$\frac{\alpha}{\beta ^ 2} + (\frac{\alpha}{\beta})^2 = \frac1n \sum_{i=1}^n X_i^2$
We can simplify the left side of this equation by replacing any $\frac{\alpha}{\beta}$ with $\bar X$.
$\frac1\beta \frac{\alpha}{\beta} + (\frac{\alpha}{\beta})^2 = \frac1n \sum_{i=1}^n X_i^2$
$\frac1\beta \bar X + \bar X^2 = \frac1n \sum_{i=1}^n X_i^2$
Solving for $\beta$ (and restoring the hats) gives
$\hat \beta = \frac{\bar X}{\frac1n \sum_{i=1}^n X_i^2 - \bar X^2}$
Plug the estimator for $\beta$ into the first equation, $\hat \alpha = \hat \beta \bar X$, to solve for $\hat \alpha$.
$\hat \alpha = \frac{\bar X^2}{\frac1n \sum_{i=1}^n X_i^2 - \bar X^2}$
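Putting the two estimators together, a minimal numerical sketch (again assuming NumPy; the true shape `ALPHA`, true rate `BETA`, and sample size are arbitrary) might look like:

```python
import numpy as np

rng = np.random.default_rng(1)
ALPHA, BETA, n = 3.0, 1.5, 10_000  # true shape, true rate, sample size

# NumPy parameterizes the gamma by shape and scale, where scale = 1/rate.
x = rng.gamma(shape=ALPHA, scale=1 / BETA, size=n)

m1 = x.mean()         # first sample moment, X-bar
m2 = np.mean(x ** 2)  # second sample moment

beta_hat = m1 / (m2 - m1 ** 2)  # beta-hat = X-bar / (M_2 - X-bar^2)
alpha_hat = beta_hat * m1       # alpha-hat = X-bar^2 / (M_2 - X-bar^2)

print(alpha_hat, beta_hat)  # should land near (3.0, 1.5) for large n
```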