This is a graduate-level course in Statistics aimed at acquainting students with the classical concepts of the subject. Familiarity with elementary probability and advanced calculus is assumed.
The following topics will be covered:
- Probability review
- Fundamentals of statistics: populations and samples.
- Unbiased estimation – sufficiency, completeness, U-statistics.
- Parametric estimation – Bayes decision, minimaxity, admissibility, efficiency, MLE.
- Non-parametric estimation – Empirical distribution, bootstrap, jack-knife.
- Hypothesis testing and confidence intervals: Neyman-Pearson Lemma, Likelihood Ratio Test.
If time permits, regression techniques and linear models will be touched upon.
Through this course, the students will:
- Understand classical probability theory,
- Appreciate important statistical paradigms, and
- Apply useful learning methodologies to real-world problems.
By the end of the course, the students will be able to:
- Define basic statistical concepts such as model, estimator, inference, bias, and consistency,
- State differences between parametric and non-parametric models,
- State differences between frequentist and Bayesian inference,
- Estimate the CDF and statistical functionals using the empirical distribution,
- Estimate standard errors and confidence intervals using the bootstrap,
- Derive the maximum likelihood estimator for important models,
- Apply suitable tests for different problems in hypothesis testing,
- Outline typical strategies in Bayesian inference,
- Outline typical strategies in statistical decision theory,
- Derive important estimators used in linear regression,
- Define the multinomial and multivariate normal distributions,
- Estimate integrals using importance sampling and Markov Chain Monte Carlo methods, and
- Review learning methodologies described in state-of-the-art research papers.
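As a flavor of the kind of method covered, the bootstrap outcome above can be sketched in a few lines of plain Python. This is an illustrative example only, not course material: the function name `bootstrap_se`, the sample size, and the number of resamples `B` are all arbitrary choices for the sketch.

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible
data = [random.gauss(0, 1) for _ in range(100)]  # toy sample from N(0, 1)

def bootstrap_se(sample, stat, B=1000):
    """Estimate the standard error of `stat` by resampling with replacement."""
    n = len(sample)
    reps = [stat([random.choice(sample) for _ in range(n)]) for _ in range(B)]
    m = sum(reps) / B
    # sample standard deviation of the bootstrap replicates
    return (sum((r - m) ** 2 for r in reps) / (B - 1)) ** 0.5

mean = lambda xs: sum(xs) / len(xs)
se = bootstrap_se(data, mean)
print(se)  # should be close to sigma / sqrt(n) = 0.1 for this toy sample
```

Since the true standard error of the sample mean here is sigma/sqrt(n) = 0.1, the bootstrap estimate should land near that value, which is the kind of check students will perform in the non-parametric estimation unit.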
Prerequisites: Calculus, Linear Algebra, Multivariate Calculus, Elementary Probability.