Demo: Efficient Estimators and Vector Cramer-Rao Bounds

In a two-parameter Gaussian location model, use the covariance ellipse to connect Fisher information with the matrix form of the Cramer-Rao lower bound.

Mathematical setup

Let $X_i\sim N_2(\theta,\Sigma)$ iid, where

\[\Sigma= \begin{bmatrix} \sigma_1^2 & \rho\sigma_1\sigma_2\\ \rho\sigma_1\sigma_2 & \sigma_2^2 \end{bmatrix}.\]

For the vector mean parameter $\theta=(\theta_1,\theta_2)^T$, the Fisher information is

\[I_n(\theta)=n\Sigma^{-1}.\]
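A minimal numerical sketch of this formula, using illustrative values for $\sigma_1$, $\sigma_2$, $\rho$, and $n$ (these numbers are assumptions, not values from the demo):

```python
import numpy as np

# Illustrative parameter values (assumptions, not from the demo).
sigma1, sigma2, rho, n = 1.5, 1.0, 0.6, 25

Sigma = np.array([[sigma1**2,             rho * sigma1 * sigma2],
                  [rho * sigma1 * sigma2, sigma2**2            ]])

# Fisher information for the mean of n iid N_2(theta, Sigma) observations.
I_n = n * np.linalg.inv(Sigma)

# Its inverse, the CRLB matrix, is exactly Sigma / n for this model.
crlb = np.linalg.inv(I_n)
assert np.allclose(crlb, Sigma / n)
```

The assertion at the end checks the identity $I_n(\theta)^{-1}=\Sigma/n$ numerically.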

The vector CRLB says that any unbiased estimator $\hat\theta$ must satisfy

\[\operatorname{Cov}(\hat\theta)-I_n(\theta)^{-1} \quad\text{is positive semidefinite}.\]
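One way to see the matrix ordering in action is to compare the CRLB with a less efficient unbiased estimator, e.g. the mean of only the first $m<n$ observations, whose covariance is $\Sigma/m$. The gap $\Sigma/m-\Sigma/n$ should be positive semidefinite, which we can check through its eigenvalues (the specific $\Sigma$, $n$, $m$ below are assumptions for illustration):

```python
import numpy as np

# Assumed illustrative covariance; any positive definite Sigma works.
Sigma = np.array([[2.25, 0.9],
                  [0.9,  1.0]])
n, m = 25, 10  # full sample vs. an unbiased estimator using only m < n observations

crlb = Sigma / n         # inverse Fisher information, I_n(theta)^{-1}
cov_partial = Sigma / m  # covariance of the mean of the first m observations

# Positive semidefiniteness of the gap: all eigenvalues >= 0 (up to roundoff).
gap = cov_partial - crlb
eigs = np.linalg.eigvalsh(gap)
assert np.all(eigs >= -1e-12)
```

Throwing away observations inflates the covariance in every direction at once, which is exactly what the positive-semidefinite ordering expresses.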

The sample mean attains the bound because $\operatorname{Cov}(\bar X)=\Sigma/n$. The plotted ellipse is the covariance contour

\[u^T(\Sigma/n)^{-1}u=4,\]

so its semiaxes point along the eigenvectors of $\Sigma$ and have lengths $2\sqrt{\lambda_i/n}$, where $\lambda_i$ are the eigenvalues of $\Sigma$: two standard deviations along each principal direction. It is not the same object as a fixed-probability confidence ellipse unless the radius is chosen from a chi-squared quantile; at Mahalanobis radius 2 the bivariate normal coverage is $1-e^{-2}\approx 0.865$.
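The ellipse geometry can be computed directly from an eigendecomposition of $\Sigma/n$; the parameter values below are assumptions chosen for illustration:

```python
import numpy as np

Sigma = np.array([[2.25, 0.9],
                  [0.9,  1.0]])  # assumed illustrative covariance
n = 25

# Contour u^T (Sigma/n)^{-1} u = 4: eigenvectors of Sigma/n give the
# principal directions, and the semiaxis lengths are 2 * sqrt(eigenvalues).
lam, vecs = np.linalg.eigh(Sigma / n)
semiaxes = 2.0 * np.sqrt(lam)

# Coverage of the radius-2 Mahalanobis contour for a bivariate normal:
# P(chi^2_2 <= 4) = 1 - exp(-2), about 0.865 -- not a round confidence level.
coverage = 1.0 - np.exp(-2.0)
```

Choosing the squared radius from a chi-squared quantile instead (e.g. `scipy.stats.chi2.ppf(0.95, df=2)`) would turn the same construction into a fixed-probability confidence ellipse.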

What to try

  • Increase $n$. The whole ellipse contracts, reflecting the $1/n$ covariance scaling.
  • Change $\rho$. Correlation rotates the ellipse and changes the off-diagonal covariance term while leaving the marginal variances fixed.
  • Make $\sigma_1$ much larger than $\sigma_2$. The ellipse stretches in the less precise coordinate.
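The experiments above can be reproduced numerically; the helper below and its parameter values are illustrative assumptions:

```python
import numpy as np

def ellipse_axes(sigma1, sigma2, rho, n):
    """Semiaxis lengths of the radius-2 covariance ellipse for Sigma/n."""
    Sigma = np.array([[sigma1**2,             rho * sigma1 * sigma2],
                      [rho * sigma1 * sigma2, sigma2**2            ]])
    lam = np.linalg.eigvalsh(Sigma / n)  # eigenvalues in ascending order
    return 2.0 * np.sqrt(lam)

# Quadrupling n halves every semiaxis: the 1/n covariance scaling.
a_small = ellipse_axes(1.5, 1.0, 0.6, n=10)
a_large = ellipse_axes(1.5, 1.0, 0.6, n=40)
assert np.allclose(a_large, a_small / 2)

# Changing rho stretches and rotates the principal axes even though the
# marginal variances (the diagonal of Sigma) stay the same.
print(ellipse_axes(1.5, 1.0, 0.0, n=10))
print(ellipse_axes(1.5, 1.0, 0.9, n=10))
```

The first check confirms the contraction described in the first bullet; printing the axes for different $\rho$ shows the rotation-and-stretch effect from the second.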

The ellipse shows the Mahalanobis-radius-2 contour for the inverse-information covariance $\Sigma/n$.
