We will look first into Linear Regression, where we will learn to predict continuous variables. This will cover Simple and Multiple Linear Regression, Ordinary Least Squares, testing your model, R-Squared, and Adjusted R-Squared.
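To make this concrete, here is a minimal sketch using scikit-learn on synthetic data (the feature values and coefficients are purely illustrative): fit an ordinary least squares model, then compute R-Squared and Adjusted R-Squared on held-out data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Illustrative synthetic data: two features and a continuous target
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 2))
y = 3.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(scale=0.5, size=200)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = LinearRegression()            # fitted by ordinary least squares
model.fit(X_train, y_train)
y_pred = model.predict(X_test)

r2 = r2_score(y_test, y_pred)
n, p = X_test.shape
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - p - 1)   # adjusted R-squared penalises extra features
print(f"R-squared: {r2:.3f}, Adjusted R-squared: {adj_r2:.3f}")
```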
We will get full details of Logistic Regression, one of the most popular models for classification. We will learn all about Maximum Likelihood, Feature Scaling, the Confusion Matrix, and accuracy ratios, and you will build your very first Logistic Regression model.
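A minimal sketch of that workflow, assuming scikit-learn and a synthetic classification dataset: scale the features, fit a Logistic Regression (which is estimated by maximising the likelihood), and evaluate it with a confusion matrix and accuracy score.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix, accuracy_score

# Illustrative synthetic binary classification data
X, y = make_classification(n_samples=300, n_features=4, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# Feature scaling: fit the scaler on training data only, then apply to the test set
scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)

clf = LogisticRegression()            # coefficients found by maximum likelihood
clf.fit(X_train, y_train)
y_pred = clf.predict(X_test)

print(confusion_matrix(y_test, y_pred))
print("Accuracy:", accuracy_score(y_test, y_pred))
```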
We will look into the Naïve Bayes classifier, with full details of Bayes' Theorem and the implementation of Naïve Bayes in machine learning. It can be used for spam filtering, text analysis, and recommendation systems.
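As a small sketch of the spam-filtering use case, here is a Multinomial Naïve Bayes model trained on a tiny hypothetical corpus (the messages and labels are made up for illustration):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# Hypothetical toy corpus: 1 = spam, 0 = not spam
texts = ["win a free prize now", "meeting at noon tomorrow",
         "free offer claim prize", "project update attached"]
labels = [1, 0, 1, 0]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(texts)   # word-count features

nb = MultinomialNB()                  # applies Bayes' theorem with a word-independence assumption
nb.fit(X, labels)

# Should classify this message as spam (1) given the toy training data
print(nb.predict(vectorizer.transform(["claim your free prize"])))
```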
The Random Forest algorithm can be used for both regression and classification problems, and it often maintains good accuracy even when the data is incomplete or noisy.
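A minimal classification sketch with scikit-learn, using the built-in Iris dataset as an illustrative stand-in for real data:

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# An ensemble of decision trees, each trained on a bootstrap sample of the data
rf = RandomForestClassifier(n_estimators=100, random_state=0)
rf.fit(X_train, y_train)
print("Accuracy:", accuracy_score(y_test, rf.predict(X_test)))
```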
A Decision Tree is a supervised learning technique that can be used for both classification and regression problems, though it is most often preferred for classification.
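A short sketch showing how a shallow tree is fitted and its learned splits printed (again on the Iris dataset, purely for illustration):

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = load_iris(return_X_y=True)

tree = DecisionTreeClassifier(max_depth=2, random_state=0)  # shallow tree, easy to inspect
tree.fit(X, y)
print(export_text(tree))   # human-readable if/else splits learned by the tree
```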
We will look into the KNN algorithm, covering how KNN works, how to compute the KNN distance matrix, the Minkowski distance, and live examples of KNN implementations in industry.
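A minimal sketch of KNN with scikit-learn, where the metric is Minkowski distance (with p=2 it reduces to Euclidean distance); the Iris dataset is used purely as an example:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Scale features so no single feature dominates the distance computation
scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)

# Minkowski metric with p=2 is the Euclidean distance
knn = KNeighborsClassifier(n_neighbors=5, metric="minkowski", p=2)
knn.fit(X_train, y_train)
print("Test accuracy:", knn.score(X_test, y_test))
```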
We will look into PCA, K-Means clustering, and Agglomerative clustering, which are part of unsupervised learning.
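A compact sketch combining the three: reduce the data to two principal components with PCA, then cluster the projection with K-Means and Agglomerative clustering (the Iris features are used only as illustrative unlabeled data):

```python
from sklearn.datasets import load_iris
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans, AgglomerativeClustering

X, _ = load_iris(return_X_y=True)          # ignore the labels: unsupervised setting
X = StandardScaler().fit_transform(X)

# PCA: project the data onto its two leading principal components
X_2d = PCA(n_components=2).fit_transform(X)

kmeans_labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_2d)
agglo_labels = AgglomerativeClustering(n_clusters=3).fit_predict(X_2d)

print("K-Means labels:      ", kmeans_labels[:10])
print("Agglomerative labels:", agglo_labels[:10])
```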
Across all parts of machine learning, supervised and unsupervised, we will follow a common workflow: data reading, data preprocessing, EDA, data scaling, and preparation of training and testing data, followed by model selection, implementation, and prediction.
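A minimal end-to-end sketch of that workflow, assuming a hypothetical CSV file with a binary "target" column (the file name and column name are illustrative, not fixed by the course):

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# Hypothetical dataset with a binary "target" column
df = pd.read_csv("data.csv")
print(df.describe())                      # quick EDA: summary statistics
df = df.dropna()                          # simple preprocessing: drop rows with missing values

X = df.drop(columns=["target"])
y = df["target"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Chain scaling and the chosen model so both are fitted on the training data only
pipe = Pipeline([("scale", StandardScaler()), ("model", LogisticRegression())])
pipe.fit(X_train, y_train)
print("Accuracy:", accuracy_score(y_test, pipe.predict(X_test)))
```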