Ermin presents the material through an interactive whiteboard presentation.
The course starts with Linear Algebra.
We start by defining what a linear equation is, examine the forms a linear equation can take, define systems of linear equations and their notation, solve systems of equations via Row Echelon Form (REF) and Reduced Row Echelon Form (RREF), and perform matrix-vector multiplication. We then explore the concept of mathematical structures to better understand the idea of a vector space, before covering subspaces, bases of vector spaces, the dimension of a vector space or subspace, linear maps, and orthogonal projection, and how the latter relates to least-squares approximation.
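As a small taste of the projection/least-squares connection described above, here is a minimal NumPy sketch (the matrix and numbers are illustrative, not taken from the course): the least-squares solution of an overdetermined system is exactly the point whose image is the orthogonal projection of the right-hand side onto the column space.

```python
import numpy as np

# Overdetermined system A x ≈ b: three equations, two unknowns (illustrative data).
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

# Least-squares solution minimizes ||A x - b||; A @ x_hat is then the
# orthogonal projection of b onto the column space of A.
x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)

# The same answer via the normal equations A^T A x = A^T b.
x_normal = np.linalg.solve(A.T @ A, A.T @ b)

print(np.allclose(x_hat, x_normal))  # the two methods agree
```

Both routes give the same answer here; the course's treatment of orthogonal projection explains why the normal equations characterize that minimizer.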
The next section is an introduction to probability. You will first explore probability models and axioms and simple counting, before considering the discrete cases of marginal probability, conditional probability, and Bayesian probability. You will also discover the concepts of independence, permutations, and combinations. Next, the idea of a random variable is illustrated, along with probability mass and density functions, the cumulative distribution function, covariance and correlation, the law of large numbers, and the central limit theorem. In the final part, you will discover statistical inference and see how the Bayesian estimator works.
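To give a flavor of the conditional and Bayesian probability material, here is a minimal sketch of Bayes' rule in the classic diagnostic-test setting (all the numbers are illustrative assumptions, not figures from the course):

```python
# Illustrative inputs: prior P(D), sensitivity P(+|D), false-positive rate P(+|~D).
p_d = 0.01
p_pos_given_d = 0.95
p_pos_given_not_d = 0.05

# Total probability of a positive test (law of total probability).
p_pos = p_pos_given_d * p_d + p_pos_given_not_d * (1 - p_d)

# Posterior P(D|+) by Bayes' rule.
p_d_given_pos = p_pos_given_d * p_d / p_pos
print(round(p_d_given_pos, 3))  # roughly 0.16, despite the accurate-looking test
```

The surprisingly low posterior, driven by the small prior, is exactly the kind of intuition the Bayesian part of the course builds.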