3.3 Plug-in principle to define an estimator

Consider an r.v. $\mathbf z$ and a dataset $D_N$ of samples drawn from the parametric distribution $F_{\mathbf z}(z,\theta)$. The main issue of estimation is how to define an estimate of $\theta$. A possible solution is given by the plug-in principle, a simple method for estimating parameters from samples. Suppose that the parameter can be written as a functional $\theta = t(F_{\mathbf z}(z))$ of the distribution. The plug-in estimate of $\theta$ is then defined to be:

\begin{equation} \hat{\theta }=t(\hat{F}(z)) \end{equation}

obtained by replacing the distribution function $F$ with the empirical distribution $\hat{F}$ in the analytical expression of the parameter.
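As an illustrative sketch (not taken from the text), the Python snippet below builds the empirical distribution function of a sample, i.e. the step function that assigns probability mass $1/N$ to each observed value; any functional of the distribution can then be evaluated on $\hat{F}$ instead of $F$ to obtain a plug-in estimate. The names (e.g. \texttt{empirical\_cdf}, \texttt{F\_hat}) and the sample used are hypothetical.

\begin{verbatim}
import numpy as np

def empirical_cdf(sample):
    """Empirical distribution function F_hat of a sample:
    F_hat(z) is the fraction of observations not larger than z."""
    sorted_sample = np.sort(np.asarray(sample, dtype=float))
    n = len(sorted_sample)
    def F_hat(z):
        # number of sample points <= z, divided by N
        return np.searchsorted(sorted_sample, z, side="right") / n
    return F_hat

# hypothetical sample of size N = 200 from a Normal(2, 1.5^2) distribution
rng = np.random.default_rng(0)
z = rng.normal(loc=2.0, scale=1.5, size=200)

F_hat = empirical_cdf(z)
print(F_hat(2.0))   # fraction of observations <= 2, close to 0.5 here
\end{verbatim}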

If $\theta$ is the mean of $\mathbf z$, the sample average is an example of a plug-in estimate:

\[ \hat{\mu} = \int z \, d\hat{F}(z) = \frac{1}{N} \sum_{i=1}^{N} z_i \]
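A minimal sketch of this special case, reusing the hypothetical sample \texttt{z} from the snippet above: replacing $F$ by $\hat{F}$ in the integral turns it into an average that weights each observation by $1/N$.

\begin{verbatim}
import numpy as np

# hypothetical sample, as in the previous sketch
rng = np.random.default_rng(0)
z = rng.normal(loc=2.0, scale=1.5, size=200)

# plug-in estimate of the mean: the integral of z dF_hat(z) puts mass 1/N
# on each observed value, which is exactly the sample average
mu_hat = z.sum() / len(z)
print(mu_hat)   # identical to np.mean(z), close to the true mean 2.0
\end{verbatim}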

The following section will discuss the plug-in estimators of the first two moments of a probability distribution.