Interested in the field of Machine Learning? Then this course is for you!
This course has been designed by two professional Data Scientists so that we can share our knowledge and help you learn complex theory, algorithms, and coding libraries in a simple way.
We will walk you step-by-step into the World of Machine Learning. With every tutorial, you will develop new skills and improve your understanding of this challenging yet lucrative sub-field of Data Science.
This course is fun and exciting, but at the same time, we dive deep into Machine Learning. It is structured the following way:
Part 1 – Data Preprocessing
Part 2 – Regression: Simple Linear Regression, Multiple Linear Regression, Polynomial Regression, SVR, Decision Tree Regression, Random Forest Regression
Part 3 – Classification: Logistic Regression, K-NN, SVM, Kernel SVM, Naive Bayes, Decision Tree Classification, Random Forest Classification
Part 4 – Clustering: K-Means, Hierarchical Clustering
Part 5 – Association Rule Learning: Apriori, Eclat
Part 6 – Reinforcement Learning: Upper Confidence Bound, Thompson Sampling
Part 7 – Natural Language Processing: Bag-of-words model and algorithms for NLP
Part 8 – Deep Learning: Artificial Neural Networks, Convolutional Neural Networks
Part 9 – Dimensionality Reduction: PCA, LDA, Kernel PCA
Part 10 – Model Selection & Boosting: k-fold Cross Validation, Parameter Tuning, Grid Search, XGBoost
Moreover, the course is packed with practical exercises that are based on real-life examples. So not only will you learn the theory, but you will also get some hands-on practice building your own models.
And as a bonus, this course includes both Python and R code templates which you can download and use on your own projects.
Important updates (June 2020):
ALL CODE UP TO DATE
DEEP LEARNING CODED IN TENSORFLOW 2.0
TOP GRADIENT BOOSTING MODELS INCLUDING XGBOOST AND EVEN CATBOOST!
Real-life examples of Machine Learning applications.
Greetings from the instructors, and an SDS podcast episode covering some machine learning concepts and an overview of popular machine learning algorithms.
The course introduction, the instructors, and the importance of Machine Learning.
Important notes, tips & tricks for the Machine Learning A-Z course.
An important PDF. It contains the whole structure of the Machine Learning A-Z course and the answers to important questions.
In this video, Kirill explains in detail how to install the R programming language and RStudio on your computer so you can swiftly work through the rest of the course.
A short written summary of what you need to know about object-oriented programming, e.g. classes, objects, and methods.
What is regression? 6 types of regression models are taught in this course.
The math behind Simple Linear Regression.
Finding the best-fitting line with the Ordinary Least Squares (OLS) method to model the linear relationship between the independent variable and the dependent variable.
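For reference, the simple linear regression model and the OLS criterion can be written as follows (generic notation, not necessarily the exact notation used in the lectures):

    \hat{y} = b_0 + b_1 x, \qquad \min_{b_0,\, b_1} \sum_{i=1}^{n} \big( y_i - (b_0 + b_1 x_i) \big)^2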
Data preprocessing for Simple Linear Regression in R.
Fitting the Simple Linear Regression (SLR) model to the training set using the R function ‘lm’.
Predicting the test set results with the SLR model using the R function ‘predict’.
Visualizing the training set results and test set results with the R package ‘ggplot2’.
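As a rough illustration of the four steps above, here is a minimal R sketch; the file name 'Salary_Data.csv' and the columns 'YearsExperience' and 'Salary' are placeholders, not necessarily the course's dataset:

    # Minimal sketch of the Simple Linear Regression workflow in R
    # (file and column names are placeholders).
    library(caTools)
    library(ggplot2)

    dataset <- read.csv('Salary_Data.csv')

    # Split the data into a training set and a test set
    set.seed(123)
    split <- sample.split(dataset$Salary, SplitRatio = 2/3)
    training_set <- subset(dataset, split == TRUE)
    test_set <- subset(dataset, split == FALSE)

    # Fit Simple Linear Regression to the training set
    regressor <- lm(formula = Salary ~ YearsExperience, data = training_set)

    # Predict the test set results
    y_pred <- predict(regressor, newdata = test_set)

    # Visualise the training set results
    ggplot() +
      geom_point(aes(x = training_set$YearsExperience, y = training_set$Salary),
                 colour = 'red') +
      geom_line(aes(x = training_set$YearsExperience,
                    y = predict(regressor, newdata = training_set)),
                colour = 'blue') +
      ggtitle('Salary vs Experience (Training set)') +
      xlab('Years of experience') +
      ylab('Salary')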
An application of Multiple Linear Regression: profit prediction for Startups.
The math behind Multiple Linear Regression: modelling the linear relationship between the independent (explanatory) variables and the dependent (response) variable.
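In generic notation (not necessarily the course's), the multiple linear regression model is:

    \hat{y} = b_0 + b_1 x_1 + b_2 x_2 + \dots + b_n x_n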
The 5 assumptions associated with a linear regression model: linearity, homoscedasticity, multivariate normality, independence of errors, and lack of multicollinearity.
Coding categorical variables in regression with dummy variables.
Dummy variable trap and how to avoid it.
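As a hedged illustration (the 'State' and 'Profit' column names are hypothetical), R's factor type together with 'lm' handles the dummy encoding and automatically omits one reference level, which is exactly what avoids the dummy variable trap:

    # Hypothetical example: encode a categorical 'State' column as a factor.
    dataset$State <- factor(dataset$State,
                            levels = c('New York', 'California', 'Florida'),
                            labels = c(1, 2, 3))

    # lm() builds the dummy variables internally and drops one reference level,
    # so the remaining dummies are not perfectly collinear (no dummy variable trap).
    regressor <- lm(formula = Profit ~ ., data = dataset)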
An intuitive guide to 5 methods of building multiple linear regression models, including the Stepwise Regression approaches: All-in, Backward Elimination, Forward Selection, Bidirectional Elimination, and Score Comparison.
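As one example of these methods, here is a minimal Backward Elimination sketch in R, assuming a significance level of 0.05 and purely illustrative predictor names:

    # Start with all predictors ('all-in'), then repeatedly remove the predictor
    # with the highest p-value above the chosen significance level (0.05 here).
    regressor <- lm(formula = Profit ~ R.D.Spend + Administration + Marketing.Spend + State,
                    data = dataset)
    summary(regressor)   # inspect the p-values

    # Refit after removing the least significant predictors, and repeat
    # until every remaining predictor is below the significance level.
    regressor <- lm(formula = Profit ~ R.D.Spend + Marketing.Spend, data = dataset)
    summary(regressor)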
The math behind Polynomial Regression: modelling the non-linear relationship between the independent variables and the dependent variable.
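In generic notation, a polynomial regression model of degree n in a single feature x is:

    \hat{y} = b_0 + b_1 x + b_2 x^2 + \dots + b_n x^n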
Data preprocessing for Polynomial Regression in R.
Fitting a Polynomial Regression model and a Linear Regression model to the dataset in R.
Visualizing the Linear Regression results and the Polynomial Regression results and comparing the two models' performance.
Predicting new results with the Linear Regression model and the Polynomial Regression model.
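Putting these steps together, here is a minimal R sketch comparing the two models; the 'Position_Salaries.csv' file and the 'Level'/'Salary' columns are placeholders:

    # Minimal sketch: Linear vs Polynomial Regression in R
    # (file and column names are placeholders).
    library(ggplot2)

    dataset <- read.csv('Position_Salaries.csv')
    dataset <- dataset[, c('Level', 'Salary')]   # keep only the feature and the target

    # Linear Regression
    lin_reg <- lm(formula = Salary ~ Level, data = dataset)

    # Polynomial Regression: add polynomial terms up to degree 4
    dataset$Level2 <- dataset$Level^2
    dataset$Level3 <- dataset$Level^3
    dataset$Level4 <- dataset$Level^4
    poly_reg <- lm(formula = Salary ~ ., data = dataset)

    # Predict a new result (e.g. Level = 6.5) with both models
    predict(lin_reg,  data.frame(Level = 6.5))
    predict(poly_reg, data.frame(Level = 6.5, Level2 = 6.5^2,
                                 Level3 = 6.5^3, Level4 = 6.5^4))

    # Visualise and compare the two fits
    ggplot() +
      geom_point(aes(x = dataset$Level, y = dataset$Salary), colour = 'red') +
      geom_line(aes(x = dataset$Level, y = predict(lin_reg,  newdata = dataset)),
                colour = 'blue') +
      geom_line(aes(x = dataset$Level, y = predict(poly_reg, newdata = dataset)),
                colour = 'darkgreen') +
      ggtitle('Linear vs Polynomial Regression') +
      xlab('Level') +
      ylab('Salary')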
Template for regression modelling in R.
Understanding the intuition behind Support Vector Regression (SVR) for the linear case. Concepts like the epsilon-insensitive tube and slack variables are explained in this tutorial.
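In standard notation (my phrasing, not necessarily the course's), the linear epsilon-SVR problem is:

    \min_{w,\, b,\, \xi,\, \xi^*} \ \tfrac{1}{2}\lVert w \rVert^2 + C \sum_{i=1}^{n} (\xi_i + \xi_i^*)
    \text{subject to } \ y_i - (w^\top x_i + b) \le \varepsilon + \xi_i, \quad (w^\top x_i + b) - y_i \le \varepsilon + \xi_i^*, \quad \xi_i, \xi_i^* \ge 0

Points inside the epsilon-insensitive tube cost nothing; the slack variables measure how far a point lies outside the tube.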
Some info about upcoming tutorials on Support Vector Machines (SVM), kernel functions, and non-linear Support Vector Regression (SVR).
Salary prediction with Support Vector Regression using the R package ‘e1071’: data preprocessing, fitting, predicting, and visualizing the SVR results.
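A minimal sketch of that workflow, assuming placeholder file and column names:

    # SVR with the e1071 package (file and column names are placeholders).
    library(e1071)
    dataset <- read.csv('Position_Salaries.csv')

    # type = 'eps-regression' selects epsilon-SVR; svm() scales features by default.
    regressor <- svm(formula = Salary ~ Level,
                     data = dataset,
                     type = 'eps-regression',
                     kernel = 'radial')

    # Predict a new result, e.g. for Level = 6.5
    y_pred <- predict(regressor, data.frame(Level = 6.5))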
An intuitive guide to understanding Decision Tree Regression algorithms.
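If you want to peek ahead at what the R implementation looks like, here is a minimal sketch with the ‘rpart’ package (placeholder column names):

    # Decision Tree Regression with rpart (column names are placeholders).
    library(rpart)
    regressor <- rpart(formula = Salary ~ Level,
                       data = dataset,
                       control = rpart.control(minsplit = 1))
    y_pred <- predict(regressor, data.frame(Level = 6.5))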