In this post I will talk a little about Minimum Mean Squared Error (MMSE) estimation. This is an estimation method in which the mean of the squared error between the actual and the estimated parameter is minimized.

Assume that we want to estimate some parameter $$\theta$$ from an available set of observations, $$x = [x[0], x[1], \ldots , x[N-1]]$$. For example, suppose the observed data comes from the following process,

$$x[n] = A + w[n]$$, where $$w[n]$$ is additive noise with a normal distribution. In this example we would be trying to estimate the parameter $$A$$ (in real scenarios the actual value of $$A$$ will not be available) from the $$N$$ observations. The simplest method for estimating $$A$$ (this is the $$\theta$$ for this problem), as you have probably guessed, is the sample mean of the observed data.

$\hat{A} = \frac{x[0] + x[1] + \cdots + x[N-1]}{N}$
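To make this concrete, here is a small Python/NumPy sketch of the model above. The true value of $$A$$, the noise variance, and $$N$$ are arbitrary values chosen for illustration, not anything prescribed by the problem.

```python
import numpy as np

# Illustrative assumptions: true A = 5.0, unit-variance Gaussian noise, N = 1000.
rng = np.random.default_rng(0)
A_true = 5.0
N = 1000

w = rng.normal(0.0, 1.0, N)   # additive noise w[n] ~ N(0, 1)
x = A_true + w                # observations x[n] = A + w[n]

A_hat = x.mean()              # sample-mean estimate of A
print(A_hat)                  # close to 5.0 for large N
```

With 1000 samples, the sample mean typically lands within a few hundredths of the true value, since its standard deviation is $$\sigma/\sqrt{N}$$.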

But then the question arises: is this the best estimate of $$A$$ we can have? Well, that depends on how we judge an estimator function,

$$\hat{A} = f(x[0], x[1], \ldots , x[N-1])$$.

Now, to decide whether an estimate is the best one, we need an objective function. There are quite a few objective functions that are used in different estimation problems.

• Mean Squared Error
• Root Mean Squared Logarithmic Error
• Hit or Miss
• Sum of Absolute Differences, etc.
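As a rough sketch, the objective functions above could be written as follows. These are illustrative definitions, not taken from the post; in particular, the threshold `delta` in the hit-or-miss cost is an assumed parameter, and RMSLE as written only applies to non-negative values.

```python
import numpy as np

def mse(a, a_hat):
    """Mean Squared Error: average of squared differences."""
    return np.mean((np.asarray(a) - np.asarray(a_hat)) ** 2)

def rmsle(a, a_hat):
    """Root Mean Squared Logarithmic Error (non-negative inputs assumed)."""
    return np.sqrt(np.mean((np.log1p(a) - np.log1p(a_hat)) ** 2))

def hit_or_miss(a, a_hat, delta=0.5):
    """Hit-or-miss cost: fraction of errors larger than delta (assumed form)."""
    return np.mean(np.abs(np.asarray(a) - np.asarray(a_hat)) > delta)

def sad(a, a_hat):
    """Sum of Absolute Differences."""
    return np.sum(np.abs(np.asarray(a) - np.asarray(a_hat)))
```

Each of these penalizes errors differently: MSE punishes large deviations quadratically, SAD linearly, and hit-or-miss only counts whether an error exceeded the threshold.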

Here we would be talking about the first one, MSE or Mean Squared Error.

$MSE(\hat{A}) = E[(A - \hat{A})^2]$
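The expectation in the MSE definition can be approximated by simulation: repeat the experiment many times and average the squared errors. In the sketch below, the true $$A$$, noise variance, $$N$$, and trial count are assumed values for illustration. For the sample mean with unit-variance noise, theory gives $$MSE(\hat{A}) = \sigma^2/N = 1/N$$, which the Monte Carlo average should approach.

```python
import numpy as np

# Monte Carlo approximation of MSE(A_hat) = E[(A - A_hat)^2]
# for the sample-mean estimator. Assumed setup: A = 5.0,
# unit-variance Gaussian noise, N = 100, 20000 trials.
rng = np.random.default_rng(1)
A_true, N, trials = 5.0, 100, 20000

# One sample-mean estimate per trial (each row is one experiment).
A_hats = (A_true + rng.normal(0.0, 1.0, (trials, N))).mean(axis=1)

mse_mc = np.mean((A_true - A_hats) ** 2)
print(mse_mc)  # should be close to the theoretical value 1/N = 0.01
```

The empirical average converges to the theoretical $$1/N$$ as the number of trials grows, which is exactly the quantity an MMSE estimator tries to minimize.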

Thus a Minimum Mean Squared Error or MMSE estimate would be that $$\hat{A}$$ which minimizes the value of $$MSE(\hat{A})$$.

Now we verify the performance of the MMSE estimator using the following MATLAB code.