## Teaching Notes

- The notes below are written in a concise form that is less readable than a textbook. They may be useful as condensed summaries but are not meant to replace a more complete explanation.
- Some of the notes were written as part of course material. As such, they are not written in full generality and are sometimes not completely accurate. For example, measure-theoretic arguments are omitted and some assumptions, such as differentiability, are made implicitly.
- Please notify me if you find any mistakes or typos.

## Probability (Most of these notes are superseded by my probability book)

- Sample space, events, event space, probability as a function and its axioms. Discrete and continuous sample spaces
- Probability distributions on finite sample spaces, classical model for finite and continuous sample spaces
- Conditional probability, independence of events, Bayes theorem
- Random variables: basic definitions, discrete and continuous
- Important random variables: Bernoulli, geometric, binomial, Poisson, uniform, exponential, normal
- Basic Combinatorics for Probability
- Functions of a Random Variable
- Expectations and Variances
- Vectors of Random Variables - basic definitions
- Functions of vectors of random variables
- Conditional probability and vector random variables
- Expectation and vector random variables
- The multinomial and the multivariate Gaussian distributions
- The weak law of large numbers and central limit theorem
- The moment generating function
- The Poisson process
- Statistics
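As a small illustration of one of the topics above (the weak law of large numbers), here is a minimal Python sketch; the sample size and the Bernoulli parameter `p = 0.5` are chosen for illustration only.

```python
import random

random.seed(0)

def sample_mean(n, p=0.5):
    """Mean of n independent Bernoulli(p) draws."""
    return sum(random.random() < p for _ in range(n)) / n

# By the weak law of large numbers, the sample mean
# concentrates around p as n grows.
for n in (10, 1000, 100000):
    print(n, sample_mean(n))
```

Rerunning with different seeds shows the fluctuation around `p` shrinking at the rate suggested by the central limit theorem.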

## Statistics

- Sampling Distributions
- The bias, variance and MSE of estimators
- Confidence intervals
- Relative efficiency, efficiency and the Fisher information
- Consistency of estimators
- Sufficient statistics
- The Rao-Blackwell theorem and the UMVUE
- The method of moments
- Maximum likelihood estimation
- Exponential family of distributions and logistic regression
- Hypothesis tests
- p-Values, power and the Neyman-Pearson lemma
- Pearson’s Chi-square
- Linear regression
- Inference in High Dimensions and Regularization
- Consistency of the maximum likelihood estimator
- Asymptotic Efficiency of the maximum likelihood estimator
- Linear Classifiers and Logistic Regression
- Basic sampling methods
- Markov chain Monte Carlo and Metropolis-Hastings
- Missing data and the EM algorithm
- Hidden Markov models
- Statistical Classifiers - the Generative Story
- Support vector machines
- Inference in Linear Regression
- M-estimators and Z-estimators
- Stein’s Unbiased Risk Estimator
- Shrinkage and the James-Stein Estimator for Normal Means
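As a small illustration of maximum likelihood estimation from the list above, the sketch below estimates the rate of an exponential distribution, where the MLE is the reciprocal of the sample mean; the true rate and sample size are illustrative assumptions.

```python
import random

random.seed(1)

# Simulated sample from Exponential(rate = 2.0).
true_rate = 2.0
data = [random.expovariate(true_rate) for _ in range(10000)]

# For the exponential distribution, maximizing the likelihood
# gives rate_hat = n / sum(x_i), i.e. 1 / sample mean.
mle_rate = len(data) / sum(data)
print(mle_rate)  # close to 2.0
```

The same note topics (consistency and asymptotic efficiency of the MLE) describe why this estimate concentrates around the true rate as the sample grows.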

## Random Processes and LTI filtering (Most of these notes are superseded by my probability book)

- Random Processes - Basic Definitions
- Discrete-time discrete-valued random processes (the iid process and counting processes)
- Continuous-time discrete-valued random processes (Poisson process)
- Continuous-time continuous-valued random processes (Gaussian processes and the Wiener process)
- Stationary and wide sense stationary (WSS) processes
- Response of LTI systems to WSS processes
- Markov chains
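As a small illustration of the Markov chain topic above, this sketch simulates a two-state chain and checks that the long-run fraction of time in each state matches the stationary distribution; the transition matrix is an arbitrary illustrative choice.

```python
import random

random.seed(2)

# Two-state Markov chain with transition matrix
#   P = [[0.9, 0.1],
#        [0.5, 0.5]]
# Solving pi P = pi gives the stationary distribution pi = (5/6, 1/6).
P = [[0.9, 0.1], [0.5, 0.5]]

state, counts, n = 0, [0, 0], 200000
for _ in range(n):
    counts[state] += 1
    # Move to state 0 with probability P[state][0], else to state 1.
    state = 0 if random.random() < P[state][0] else 1

print(counts[0] / n)  # close to 5/6
```

The empirical occupation frequencies converge to the stationary distribution, which is the ergodic behavior these notes develop for general Markov chains.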