Demo: MLE Properties and Numerical Optimization
For a normal mean with known variance, inspect the likelihood curvature that connects optimization, Fisher information, and standard error.
Mathematical setup
Let $X_i\sim N(\theta,\sigma^2)$ iid with known $\sigma$. Up to constants that do not depend on $\theta$,
\[\ell(\theta)=-\frac{1}{2\sigma^2}\sum_{i=1}^n(X_i-\theta)^2.\]
The score equation gives $\hat\theta_{\mathrm{MLE}}=\bar X$. The observed curvature is exact:
\[-\ell''(\theta)=\frac{n}{\sigma^2}=I_n(\theta), \qquad \operatorname{SE}(\hat\theta)=\frac{1}{\sqrt{I_n(\theta)}}=\frac{\sigma}{\sqrt n}.\]
In this model the asymptotic normal approximation is also the exact sampling distribution of $\bar X$.
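The curvature identity above can be checked numerically: maximize nothing, just evaluate a second difference of $\ell$ at $\hat\theta=\bar x$ and compare it to $n/\sigma^2$. This is a minimal sketch, not part of the demo; the parameter values and variable names are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true, sigma, n = 2.0, 1.5, 200          # illustrative values
x = rng.normal(theta_true, sigma, size=n)

def loglik(theta):
    # log-likelihood up to a constant that does not depend on theta
    return -np.sum((x - theta) ** 2) / (2 * sigma**2)

theta_hat = x.mean()                          # MLE from the score equation
info = n / sigma**2                           # exact observed information

# Central second difference approximates -l''(theta_hat);
# for this quadratic log-likelihood it is exact up to rounding.
h = 1e-3
curvature = -(loglik(theta_hat + h) - 2 * loglik(theta_hat)
              + loglik(theta_hat - h)) / h**2

print(curvature, info)                        # the two agree
print(1 / np.sqrt(info), sigma / np.sqrt(n))  # SE from curvature vs. closed form
```

Because $\ell$ is exactly quadratic in $\theta$, the finite-difference curvature matches $n/\sigma^2$ to floating-point precision regardless of where $\bar x$ lands.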
What to try
- Increase $n$ while holding $\sigma$ fixed. The likelihood curve becomes sharper because information grows linearly in $n$.
- Increase $\sigma$. The same observed mean becomes less precisely estimated because the curvature decreases.
- Move $\bar x$. The location of the maximum shifts, but the curvature remains $n/\sigma^2$.
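The three experiments above can be sketched directly from the closed forms $I_n=n/\sigma^2$ and $\operatorname{SE}=\sigma/\sqrt n$; all numeric values below are illustrative assumptions, not part of the demo.

```python
import numpy as np

def info(n, sigma):
    # observed Fisher information for the normal mean, known sigma
    return n / sigma**2

def se(n, sigma):
    # exact standard error of the MLE (the sample mean)
    return sigma / np.sqrt(n)

# 1) Information grows linearly in n (sigma fixed at 2.0):
print([info(n, 2.0) for n in (25, 100, 400)])

# 2) Larger sigma flattens the likelihood (n fixed at 100):
print([se(100, s) for s in (1.0, 2.0, 4.0)])

# 3) Moving xbar shifts the maximum but not the curvature:
#    neither formula involves xbar, so there is nothing to recompute.
```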
Since $I_n(\theta)=n/\sigma^2$ does not depend on $\theta$, $\operatorname{SE}(\hat\theta)=\sigma/\sqrt n$ is the exact standard error of $\bar X$ in this normal mean model, not merely an asymptotic approximation.
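This exactness claim lends itself to a quick Monte Carlo check: simulate many datasets, record $\bar x$ for each, and compare the empirical spread of the MLE to $\sigma/\sqrt n$. The replication count and parameter values here are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
theta, sigma, n, reps = 0.0, 2.0, 50, 100_000   # illustrative values

# Draw many datasets and record the MLE (the sample mean) of each
means = rng.normal(theta, sigma, size=(reps, n)).mean(axis=1)

# Empirical SD of the MLE vs. the exact standard error sigma / sqrt(n)
print(means.std(ddof=1), sigma / np.sqrt(n))
```

With $10^5$ replications the empirical standard deviation lands within Monte Carlo error of $\sigma/\sqrt n$, consistent with $\bar X$ being exactly $N(\theta,\sigma^2/n)$.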
