ESE 531: Statistical Learning and Inference

Graduate, Stony Brook University, Department of Electrical and Computer Engineering, 2024

Course Description

ESE 531 is a graduate-level course on statistical estimation, inference, and detection. The course develops the tools used to infer unknown quantities from data: random samples, point estimation, estimator evaluation, Bayesian inference, linear models, and hypothesis testing.

The notes below are web-native course materials organized by topic. The sequence is meant to move from the behavior of random samples, through estimators and their limits, into Bayesian computation, linear models, and detection.

Topic Sequence

  1. Random Samples and Sample Statistics: Population models, iid samples, sample means, sample variances, and why statistics are random variables.
  2. Probability Inequalities and Limit Theorems: Markov, Chebyshev, Chernoff, convergence modes, laws of large numbers, the central limit theorem, Slutsky, and the delta method.
  3. Point Estimation: Identifiability, method of moments, likelihood construction, maximum likelihood estimation, constraints, and boundary cases.
  4. MLE Properties and Numerical Optimization: Consistency, Fisher information, asymptotic normality, invariance, convexity, gradient methods, Newton updates, and exponential families.
  5. Expectation-Maximization: Latent-variable likelihoods, EM monotonicity, and Gaussian mixture model updates.
  6. Estimator Risk and Scalar Cramér-Rao Bounds: MSE, bias-variance decomposition, risk, MVUEs, Fisher information benchmarks, and scalar efficiency.
  7. Efficient Estimators and Vector Cramér-Rao Bounds: Equality in the CRLB, transformed parameters, nuisance parameters, and matrix lower bounds.
  8. Bayesian Estimation: Bayes' theorem, conjugacy, MAP, MMSE, beta-Bernoulli updating, Gaussian priors, and improper flat priors.
  9. Prior Design and Predictive Checks: Prior strength, reparameterization, Jeffreys prior, calibration by moment matching, and posterior predictive diagnostics.
  10. Approximate Bayesian Inference: Predictive accuracy, cross-validation, Laplace approximation, variational inference, and the role of approximation error.
  11. Monte Carlo Methods: Monte Carlo error, rejection sampling, importance sampling, effective sample size, and Metropolis-Hastings.
  12. Linear Models and Least Squares: Normal equations, projection geometry, covariance, Gauss-Markov intuition, generalized least squares, and whitening.
  13. Detection Theory: Binary hypothesis tests, likelihood ratios, Neyman-Pearson testing, Gaussian mean and variance detection, and Bayes risk.
  14. Detection Theory Examples: Worked mean-shift, variance-change, threshold, and change-point calculations.
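As a flavor of what the topic notes cover, the beta-Bernoulli update from topic 8 is simple enough to sketch in a few lines. The snippet below is a standalone illustration (not taken from the course materials; the parameter values and variable names are ours): with a Beta(a, b) prior on a success probability p, observing k successes in n trials gives a Beta(a + k, b + n - k) posterior in closed form.

```python
import random

# Beta-Bernoulli conjugate updating: Beta(a, b) prior on the success
# probability p, Bernoulli data, closed-form Beta posterior.
random.seed(0)
p_true = 0.7
n = 200
k = sum(1 for _ in range(n) if random.random() < p_true)  # successes

a, b = 2.0, 2.0                      # weakly informative Beta(2, 2) prior
a_post, b_post = a + k, b + (n - k)  # conjugate posterior parameters

posterior_mean = a_post / (a_post + b_post)
mle = k / n
print(f"MLE = {mle:.3f}, posterior mean = {posterior_mean:.3f}")
```

With a nearly flat prior and 200 observations, the posterior mean lands very close to the MLE, which is exactly the prior-versus-data tradeoff the Bayesian estimation notes develop.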

Interactive Demos

These static browser demos are practical companions to the topic notes. Each demo opens with a short mathematical setup and a few guided mini-cases before the browser widget, so students can connect the lecture formulas with what the controls change. The demo pages also include copyable Python snippets for students who want to reproduce or extend the calculations in a notebook.
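In the same spirit as those notebook snippets, here is a minimal plain Monte Carlo example (our own sketch, not copied from the demo pages): estimate E[g(X)] for X ~ Uniform(0, 1) by a sample mean, and report the standard error, which shrinks like 1/sqrt(N).

```python
import math
import random

# Plain Monte Carlo: estimate E[sin(U)] for U ~ Uniform(0, 1).
# The exact value is the integral of sin(x) on [0, 1] = 1 - cos(1).
random.seed(1)
N = 10_000
draws = [math.sin(random.random()) for _ in range(N)]

est = sum(draws) / N
# Standard error of the sample mean: sample std dev / sqrt(N).
se = math.sqrt(sum((d - est) ** 2 for d in draws) / (N - 1) / N)
print(f"estimate = {est:.4f} +/- {2 * se:.4f} (true = {1 - math.cos(1):.4f})")
```

Doubling the accuracy requires quadrupling N, which is the Monte Carlo error tradeoff that topic 11 quantifies.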

Practice

Topics

  • Properties of random samples: sampling distributions, sample mean, sample variance, probability inequalities, convergence, laws of large numbers, central limit theorem, and delta method.
  • Point estimation: method of moments, maximum likelihood, sufficient statistics, exponential families, constrained estimation, numerical optimization, and expectation-maximization.
  • Evaluation of estimators: mean squared error, bias-variance tradeoff, MVUEs, Fisher information, Cramér-Rao bounds, efficiency, and predictive performance.
  • Bayesian estimation: priors, Bayes’ theorem, conjugacy, MAP, MMSE, Jeffreys prior, posterior predictive checks, Laplace approximation, Monte Carlo, and MCMC.
  • Linear models and detection theory: least squares, generalized least squares, Gaussian noise models, likelihood-ratio tests, Neyman-Pearson testing, ROC curves, and Bayes risk.
Textbooks

  • S. M. Kay, Fundamentals of Statistical Signal Processing: Estimation Theory, Prentice Hall, 1993.
  • S. M. Kay, Fundamentals of Statistical Signal Processing: Detection Theory, Prentice Hall, 1998.
  • G. Casella and R. Berger, Statistical Inference, 2nd ed., Duxbury Press, 2002.
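As a companion to the least-squares material listed above, the two-parameter case of the normal equations has a familiar closed form. The sketch below is our own illustration (synthetic data, made-up coefficients): fit y = b0 + b1*x from noisy samples using the textbook slope and intercept formulas.

```python
import random

# Ordinary least squares for a line y = b0 + b1*x + noise,
# solved via the closed-form normal equations for the 2-parameter case.
random.seed(2)
b0_true, b1_true = 1.0, 2.5
xs = [i / 50 for i in range(100)]
ys = [b0_true + b1_true * x + random.gauss(0, 0.3) for x in xs]

n = len(xs)
xbar = sum(xs) / n
ybar = sum(ys) / n
# Slope: Sxy / Sxx; intercept: chosen so the fit passes through (xbar, ybar).
sxx = sum((x - xbar) ** 2 for x in xs)
sxy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
b1 = sxy / sxx
b0 = ybar - b1 * xbar
print(f"b0 = {b0:.3f}, b1 = {b1:.3f}")  # should land near (1.0, 2.5)
```

The course notes generalize this to arbitrary design matrices, projection geometry, and generalized least squares with correlated noise.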

Prerequisites

  • Calculus
  • Linear algebra
  • Probability theory