New use cases for machine learning appear every day, and it is a great field to get into.

We just released a 10-hour machine learning course for beginners on the freeCodeCamp.org YouTube channel.

Ayush Singh developed this course. He is a young data scientist and machine learning engineer.

Here are the sections covered in this course:

Section 1: Basics of Machine Learning

• What is Machine Learning? The way I like to think about it!
• Cool Applications of Machine Learning
• Types of Machine Learning
• Workflow of a Basic ML Problem
• Main Challenges of Machine Learning
• Dividing the Data
• Two Famous Problems of Machine Learning: Underfitting and Overfitting
• Solutions to Overfitting and Underfitting
• Supervised Learning and Unsupervised Learning In Depth
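
To give a taste of one topic from this section, here is what "Dividing the Data" into training and test sets looks like in practice. This is not code from the course, just a minimal NumPy sketch:

```python
import numpy as np

def train_test_split(X, y, test_ratio=0.2, seed=0):
    """Shuffle the rows, then hold out a test_ratio fraction as a test set."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    n_test = int(len(X) * test_ratio)
    test_idx, train_idx = idx[:n_test], idx[n_test:]
    return X[train_idx], X[test_idx], y[train_idx], y[test_idx]

X = np.arange(20).reshape(10, 2)   # 10 samples, 2 features
y = np.arange(10)
X_train, X_test, y_train, y_test = train_test_split(X, y)
print(X_train.shape, X_test.shape)  # (8, 2) (2, 2)
```

Holding out data the model never sees during training is the standard way to detect overfitting, which the course covers next.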

Section 2: Linear Regression & Regularization

• What is Linear Regression? Visual Understanding
• Hypothesis Function Or Prediction Function
• Closed Form Solution aka Normal Equation
• Coding Normal Equation
• Cost Function
• Assumptions & Pros and Cons of Linear Regression
• Regularized Linear Models
• Ridge Regression
• Lasso Regression
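
The course codes the normal equation from scratch; roughly, that looks like the NumPy sketch below (my illustration, not the course's code), with the closed-form ridge variant from the regularization topics included:

```python
import numpy as np

# Toy data: y = 4 + 3x plus a little noise
rng = np.random.default_rng(42)
x = 2 * rng.random(100)
y = 4 + 3 * x + 0.1 * rng.standard_normal(100)

Xb = np.c_[np.ones(len(x)), x]                # add a bias column
theta = np.linalg.solve(Xb.T @ Xb, Xb.T @ y)  # normal equation: (X^T X)^-1 X^T y

lam = 1.0
A = np.eye(2)
A[0, 0] = 0                                   # don't penalize the bias term
theta_ridge = np.linalg.solve(Xb.T @ Xb + lam * A, Xb.T @ y)  # ridge, closed form

print(theta)  # close to [4, 3]
```

Lasso has no closed-form solution like this, which is one reason the course treats the regularized models separately.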

Section 3: Logistic Regression & Performance Metrics

• Logistic Regression
• Hypothesis Function
• Cost Function
• Assumptions and Pros and Cons
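
The hypothesis and cost functions named above fit in a few lines of NumPy. A quick sketch of the idea (mine, not the course's code):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def hypothesis(theta, X):
    """h(x) = sigmoid(theta . x): the predicted probability of class 1."""
    return sigmoid(X @ theta)

def cost(theta, X, y, eps=1e-12):
    """Binary cross-entropy (log loss); eps guards against log(0)."""
    h = hypothesis(theta, X)
    return -np.mean(y * np.log(h + eps) + (1 - y) * np.log(1 - h + eps))

X = np.c_[np.ones(4), [1.0, 2.0, 3.0, 4.0]]   # bias column + one feature
y = np.array([0.0, 0.0, 1.0, 1.0])
print(cost(np.zeros(2), X, y))  # ln(2) ≈ 0.693, since every prediction is 0.5
```

Minimizing this cost with gradient descent is what "training" logistic regression means.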

Section 4: Support Vector Machine

• Support Vector Machines
• Linear SVM Classification
• Hard/Soft Margin Classification
• Non-Linear SVM Classification
• Polynomial Kernel [Homogeneous & Inhomogeneous]
• RBF Kernel
• Computing SVM Classifier
• Primal and Dual Problem
• Coordinate Descent
• Transductive SVM
• SVR (Support Vector Regression)
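
To give a flavor of the soft-margin idea before the course's deeper treatment: a linear SVM can be trained by sub-gradient descent on the hinge loss. A bare-bones sketch (my toy example, not the course's code):

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=500):
    """Full-batch sub-gradient descent on lam/2 * ||w||^2 + mean hinge loss.
    Labels y must be in {-1, +1}."""
    n = len(X)
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        viol = margins < 1                      # points inside or beyond the margin
        grad_w = lam * w - (y[viol, None] * X[viol]).sum(axis=0) / n
        grad_b = -y[viol].sum() / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

X = np.array([[0, 0], [0, 1], [1, 0], [3, 3], [3, 4], [4, 3]], dtype=float)
y = np.array([-1, -1, -1, 1, 1, 1])
w, b = train_linear_svm(X, y)
print(np.sign(X @ w + b))  # matches y
```

Kernels (polynomial, RBF) extend this to non-linear boundaries by replacing dot products with kernel evaluations, which is where the primal/dual material comes in.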

Section 5: PCA

• Review of Linear Transformations, Eigenvectors, and Eigenvalues
• The Need for Dimensionality Reduction
• Basic Intuition Behind PCA
• Data Preprocessing [Data Standardization]
• Compute the Covariance Matrix
• Compute the cumulative energy content for each eigenvector
• Select a subset of the eigenvectors as basis vectors
• Projecting Back
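
The PCA steps listed above map almost one-to-one onto NumPy. Again, this is a sketch on toy data, not the course's code:

```python
import numpy as np

# Toy data: features 0 and 1 are almost perfectly correlated, feature 2 is independent
rng = np.random.default_rng(0)
t = rng.standard_normal(200)
X = np.c_[t, 2 * t + 0.1 * rng.standard_normal(200), rng.standard_normal(200)]

Xs = (X - X.mean(axis=0)) / X.std(axis=0)    # 1. standardize
C = np.cov(Xs, rowvar=False)                 # 2. covariance matrix
vals, vecs = np.linalg.eigh(C)               # 3. eigendecomposition (C is symmetric)
order = np.argsort(vals)[::-1]               #    sort by decreasing eigenvalue
vals, vecs = vals[order], vecs[:, order]

energy = np.cumsum(vals) / vals.sum()        # 4. cumulative energy content
k = int(np.searchsorted(energy, 0.95)) + 1   # 5. smallest basis keeping 95% variance

Z = Xs @ vecs[:, :k]                         # 6. project onto the k basis vectors
X_back = Z @ vecs[:, :k].T                   #    ... and project back
print(k)  # 2: the correlated pair collapses onto a single axis
```

The reconstruction X_back is close to Xs because the discarded eigenvector carries almost no variance, which is exactly the point of PCA.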

Section 6: Learning Theory

• Approximation and Estimation Error
• Empirical Risk Minimization
• Problem Set Releases
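
Empirical risk minimization sounds abstract, but the core idea fits in a few lines: from some hypothesis class, pick the hypothesis with the lowest average loss on the training data. A toy illustration with threshold classifiers (my example, not from the course):

```python
import numpy as np

# Hypothesis class: threshold classifiers h_t(x) = 1 if x >= t else 0
x = np.array([0.1, 0.3, 0.4, 0.6, 0.8, 0.9])
y = np.array([0, 0, 0, 1, 1, 1])

thresholds = np.linspace(0, 1, 101)
risks = [np.mean((x >= t).astype(int) != y) for t in thresholds]  # 0-1 loss
t_hat = thresholds[int(np.argmin(risks))]   # the empirical risk minimizer
print(t_hat, min(risks))  # a threshold between the classes, zero training error
```

Learning theory then asks how far this empirical minimizer can be from the best hypothesis on unseen data.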

Section 7: Decision Trees & Random Forest

• Decision Trees
• Training of Decision Trees
• Prediction in Decision Trees
• Entropy
• Information Gain
• Gini Impurity
• Hyperparameter Tuning
• Project Proposal
• Decision Trees Assignment
• Ensemble Learning
• Bagging
• Random Forest
• Boosting
• XGBoost
• Stacking
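
Entropy, information gain, and Gini impurity are the criteria a decision tree uses to choose splits. A small NumPy sketch of the three (not the course's code):

```python
import numpy as np

def entropy(y):
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def gini(y):
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def information_gain(parent, left, right):
    """Entropy of the parent minus the weighted entropy of the children."""
    n = len(parent)
    return entropy(parent) - len(left) / n * entropy(left) - len(right) / n * entropy(right)

y = np.array([0, 0, 0, 0, 1, 1, 1, 1])
print(entropy(y))                          # 1.0 — a 50/50 node is maximally impure
print(gini(y))                             # 0.5
print(information_gain(y, y[:4], y[4:]))   # 1.0 — this split separates the classes perfectly
```

A tree greedily picks the split with the highest gain (or lowest impurity) at each node; bagging many such trees gives a random forest.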

Section 7.5: Learning More Algorithms and Building More Projects

• Naive Bayes
• K-Nearest Neighbors
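
K-Nearest Neighbors is simple enough to sketch in full: classify a point by majority vote among its k closest training points. My toy version, not the course's:

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=3):
    """Majority vote among the k nearest training points (Euclidean distance)."""
    dists = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(dists)[:k]
    return Counter(y_train[nearest]).most_common(1)[0][0]

X = np.array([[0, 0], [0, 1], [1, 0], [5, 5], [5, 6], [6, 5]], dtype=float)
y = np.array([0, 0, 0, 1, 1, 1])
print(knn_predict(X, y, np.array([0.5, 0.5])))  # 0
print(knn_predict(X, y, np.array([5.5, 5.5])))  # 1
```

There is no training step at all: the "model" is the data, which makes KNN a nice contrast to everything that came before.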

Section 8: Unsupervised Learning Algorithms

• Unsupervised Learning and Clustering Intro and Types
• Cluster Analysis
• Unsupervised Learning Algorithms
• K-Means with K-Means++
• Hierarchical Clustering Techniques
• Unsupervised Learning Problem Set and Project Releases
• K-Means Programming Assignment
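
As a preview of the K-Means assignment's core loop: assign each point to its nearest center, recompute the centers, repeat. A plain random-init sketch (k-means++, which the course covers, only refines the initialization); this is my illustration, not the assignment solution:

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain k-means: assign points to nearest centers, then recompute the centers."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]  # init from random points
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        new_centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                                else centers[j]             # keep an empty cluster's center
                                for j in range(k)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return centers, labels

X = np.array([[0, 0], [0, 1], [1, 0], [8, 8], [8, 9], [9, 8]], dtype=float)
centers, labels = kmeans(X, k=2)
print(labels)  # the first three points share one label, the last three the other
```

Random initialization can land both starting centers in the same cluster and converge slowly or badly, which is exactly the failure mode k-means++ is designed to avoid.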

Section 9: Building Applications

• Building a Heart Failure Detection System with Deployment
• Building a Fake News Detection System
• Building an Email Spam Detection System

Watch the full course below or on the freeCodeCamp.org YouTube channel (10-hour watch).