Topics covered:

1. Basic probability: random variables, cumulative distribution function (CDF), probability density function (PDF), marginal, conditional, and joint densities, Bayes' rule (all illustrated with a simple 1D two-class classification problem with Gaussian class-conditional densities).

2. Expectations and conditional expectations of random variables, iterated expectation (E[E[x|y]] = E[x]), mean, mean-square value, variance, covariance of a pair of random variables, correlation (E[xy]).

3. The notion of i.i.d. samples and statistics calculated from them: sample mean, sample variance, mean and variance of the sample mean, biased vs. unbiased estimates, rate of convergence of the sample mean to the true mean (i.e., the variance of the sample mean goes as 1/n, where n is the number of i.i.d. variables in the sample).

4. Extension of the random-variable concept to random vectors: CDF and PDF defined for vectors, expectations of scalar- and vector-valued functions of random vectors (e.g., the mean of a random vector, the covariance matrix of a random vector).

5. Topics related to matrices: invertible matrices, symmetric matrices, positive definite matrices, and the connections between them; eigenvalue decomposition; a symmetric matrix has real eigenvalues and orthonormal eigenvectors, etc. The lecture primarily provided a set of pointers to results that students should know (e.g., a symmetric positive definite matrix has positive eigenvalues; a symmetric matrix with no zero eigenvalues is invertible) rather than proving these in class.
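The 1D two-class setup from topic 1 can be sketched in a few lines: each class has a Gaussian class-conditional density, and Bayes' rule combines these with the class priors to give a posterior. The means, variances, and priors below are hypothetical values chosen only for illustration.

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Gaussian class-conditional density p(x | class)."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def posterior_class1(x, mu0, sigma0, mu1, sigma1, prior1):
    """P(class 1 | x) via Bayes' rule: p(x|c) * P(c) / p(x)."""
    p0 = gaussian_pdf(x, mu0, sigma0) * (1 - prior1)
    p1 = gaussian_pdf(x, mu1, sigma1) * prior1
    return p1 / (p0 + p1)

# Hypothetical classes centered at -1 and +1 with unit variance, equal priors:
# at x = 0 the two likelihoods match, so the posterior is exactly 0.5.
p = posterior_class1(0.0, -1.0, 1.0, 1.0, 1.0, 0.5)
```

Classifying by thresholding this posterior at 0.5 is the Bayes-optimal decision rule for this toy problem.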
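The iterated-expectation identity in topic 2 is easy to verify numerically on a small discrete joint distribution: computing E[x] directly and computing E[E[x|y]] (the conditional mean of x given each y, averaged over the marginal of y) give the same number. The joint pmf below is a made-up example.

```python
# Hypothetical joint pmf p(x, y) over binary x and y.
pmf = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

# Direct expectation E[x].
E_x = sum(x * p for (x, y), p in pmf.items())

# Iterated expectation: E[E[x|y]] = sum_y E[x | y] * P(y).
E_iterated = 0.0
for y0 in {y for (_, y) in pmf}:
    p_y = sum(p for (x, y), p in pmf.items() if y == y0)          # marginal P(y)
    E_x_given_y = sum(x * p for (x, y), p in pmf.items() if y == y0) / p_y
    E_iterated += E_x_given_y * p_y
```

The two sums agree by the law of iterated expectation, which holds for any joint distribution.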
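The 1/n rate in topic 3 can be seen in a quick Monte Carlo experiment: estimate the variance of the sample mean empirically for two sample sizes and compare against the theoretical value Var(x)/n. The Uniform(0,1) distribution and the trial counts are arbitrary choices for illustration.

```python
import random
import statistics

random.seed(0)

def sample_mean_variance(n, trials=20000):
    """Empirical variance of the mean of n i.i.d. Uniform(0,1) draws."""
    means = [statistics.fmean(random.random() for _ in range(n))
             for _ in range(trials)]
    return statistics.pvariance(means)

# Uniform(0,1) has variance 1/12, so theory predicts Var(sample mean) = 1/(12n).
v10 = sample_mean_variance(10)   # theory: 1/120
v40 = sample_mean_variance(40)   # theory: 1/480, i.e. a quarter of v10
```

Quadrupling n cuts the variance of the sample mean by roughly a factor of four, which is exactly the 1/n behavior stated above.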
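For topic 4, the mean vector and covariance matrix of a random vector are the vector/matrix analogues of mean and variance: E[x] and E[(x - E[x])(x - E[x])^T]. A minimal sketch using NumPy, with a hypothetical 2D Gaussian whose parameters are chosen arbitrarily:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2D Gaussian with mean mu and covariance matrix Sigma.
mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])

# Draw many i.i.d. samples of the random vector.
X = rng.multivariate_normal(mu, Sigma, size=100_000)

# Sample estimates of E[x] and Cov(x) = E[(x - mu)(x - mu)^T].
mean_hat = X.mean(axis=0)
cov_hat = np.cov(X, rowvar=False)
```

With enough samples, `mean_hat` and `cov_hat` converge to `mu` and `Sigma`, tying this back to the sample-statistics discussion in topic 3.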
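The matrix facts listed in topic 5 can be checked numerically. The sketch below builds a symmetric positive definite matrix (as A^T A + I for an arbitrary random A, a standard construction), takes its eigenvalue decomposition, and verifies that the eigenvalues are real and positive and the eigenvectors are orthonormal:

```python
import numpy as np

rng = np.random.default_rng(1)

# A^T A is symmetric positive semidefinite; adding I makes it positive definite.
A = rng.standard_normal((3, 3))
S = A.T @ A + np.eye(3)

# np.linalg.eigh is the eigendecomposition routine for symmetric matrices;
# it returns real eigenvalues and an orthonormal matrix of eigenvectors.
eigvals, eigvecs = np.linalg.eigh(S)

# symmetric positive definite => all eigenvalues > 0;
# no zero eigenvalues => S is invertible (det = product of eigenvalues != 0).
```

Reconstructing S as `eigvecs @ np.diag(eigvals) @ eigvecs.T` recovers the original matrix, which is the eigenvalue decomposition for symmetric matrices.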