ML Basics
Classical Machine Learning
14 tutorials
What is Machine Learning?
The big picture — what ML is, why it matters, and where it shows up in the real world.
ML Foundations
Supervised, unsupervised, and reinforcement learning — the three ways a machine can learn.
Linear Regression
Predict numbers using a straight line — the simplest and most widely used idea in all of ML.
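A quick taste of the idea in pure Python — a closed-form least-squares fit of a straight line (the toy numbers here are invented for this sketch, not from the tutorial):

```python
# Least-squares fit of y = a*x + b: the "straight line" of linear regression.

def fit_line(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope = covariance(x, y) / variance(x); intercept makes the line
    # pass through the point of means.
    a = sum((x - mx) * (y - my) for x, y, mx, my in
            ((x, y, mean_x, mean_y) for x, y in zip(xs, ys))) \
        / sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - a * mean_x
    return a, b

xs = [1, 2, 3, 4, 5]
ys = [2.1, 3.9, 6.0, 8.1, 9.9]   # roughly y = 2x, with a little noise
a, b = fit_line(xs, ys)
print(round(a, 2), round(b, 2))   # → 1.98 0.06
```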
Logistic Regression
Sort things into two buckets — spam or not, sick or healthy — using a surprisingly simple trick.
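The "surprisingly simple trick" is squashing a line through a sigmoid to get a probability. A minimal sketch, trained by stochastic gradient descent on invented toy data (negatives below zero, positives above):

```python
import math

# One-feature logistic regression: predict P(class 1) = sigmoid(w*x + b).

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(xs, ys, lr=0.5, epochs=2000):
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            p = sigmoid(w * x + b)    # predicted probability of class 1
            w -= lr * (p - y) * x     # gradient step on the log-loss
            b -= lr * (p - y)
    return w, b

xs = [-2.0, -1.0, -0.5, 0.5, 1.0, 2.0]
ys = [0, 0, 0, 1, 1, 1]
w, b = train(xs, ys)
print([1 if sigmoid(w * x + b) >= 0.5 else 0 for x in xs])
```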
Decision Trees
Make decisions step by step — like a flowchart your computer can learn from data.
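The smallest possible learned flowchart is a depth-1 tree (a "stump"): try every threshold, keep the question that misclassifies the fewest points. A pure-Python sketch on toy data:

```python
# A decision stump: one learned yes/no question ("is x < threshold?").

def fit_stump(xs, ys):
    best = None
    for t in sorted(set(xs)):                      # every candidate question
        left = [y for x, y in zip(xs, ys) if x < t]
        right = [y for x, y in zip(xs, ys) if x >= t]
        for l_lab in (0, 1):                       # try both label choices
            errors = left.count(1 - l_lab) + right.count(l_lab)
            if best is None or errors < best[0]:
                best = (errors, t, l_lab)
    _, t, l_lab = best
    return lambda x: l_lab if x < t else 1 - l_lab

xs = [1, 2, 3, 10, 11, 12]
ys = [0, 0, 0, 1, 1, 1]
tree = fit_stump(xs, ys)
print([tree(x) for x in xs])   # → [0, 0, 0, 1, 1, 1]
```

A real decision tree just asks this recursively inside each branch.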
Random Forests
Build hundreds of trees and let them vote — together they're far smarter than any one tree alone.
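The voting idea fits in a few lines: grow many tiny trees, each on a bootstrap resample, and take the majority. A pure-Python sketch (each "tree" here is a one-split stump, and real forests also subsample features at each split):

```python
import random

# A miniature random forest: bootstrap + one-split trees + majority vote.

def fit_stump(xs, ys):
    best = None
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x < t]
        right = [y for x, y in zip(xs, ys) if x >= t]
        for l_lab in (0, 1):
            errors = left.count(1 - l_lab) + right.count(l_lab)
            if best is None or errors < best[0]:
                best = (errors, t, l_lab)
    _, t, l_lab = best
    return lambda x: l_lab if x < t else 1 - l_lab

def fit_forest(xs, ys, n_trees=25, seed=0):
    rng = random.Random(seed)
    trees = []
    for _ in range(n_trees):
        idx = [rng.randrange(len(xs)) for _ in range(len(xs))]   # bootstrap
        trees.append(fit_stump([xs[i] for i in idx], [ys[i] for i in idx]))
    # Majority vote across all trees
    return lambda x: 1 if sum(t(x) for t in trees) * 2 > n_trees else 0

xs = [1, 2, 3, 10, 11, 12]
ys = [0, 0, 0, 1, 1, 1]
forest = fit_forest(xs, ys)
print([forest(x) for x in xs])
```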
Gradient Boosting
Build a powerful model by training each new tree to fix what the last one got wrong.
XGBoost
A faster, smarter take on Gradient Boosting — a long-time favourite behind winning Kaggle entries.
Support Vector Machines
Find the widest possible boundary between two groups — works even when data isn't linearly separable.
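The wide-boundary idea can be sketched with sub-gradient descent on the hinge loss plus L2 regularisation (a Pegasos-style linear SVM; the kernel trick that handles non-linearly-separable data is beyond this snippet). Toy 1-D data with ±1 labels, invented here:

```python
# A linear SVM trained by stochastic sub-gradient descent (Pegasos-style).

def train_svm(xs, ys, lam=0.01, epochs=200):
    w, b, t = 0.0, 0.0, 0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            t += 1
            lr = 1.0 / (lam * t)             # decaying step size
            if y * (w * x + b) < 1:          # point inside the margin
                w += lr * (y * x - lam * w)
                b += lr * y
            else:                            # outside: only shrink w
                w -= lr * lam * w
    return w, b

xs = [-3, -2, -1, 1, 2, 3]
ys = [-1, -1, -1, 1, 1, 1]
w, b = train_svm(xs, ys)
print([1 if w * x + b > 0 else -1 for x in xs])
```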
K-Nearest Neighbors
Classify anything by asking: what do its closest neighbours look like?
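The whole algorithm really is just that question. A pure-Python sketch with k=3 on invented 2-D points:

```python
from collections import Counter

# k-nearest-neighbours: label a query by majority vote among the k
# closest training points (squared Euclidean distance).

def knn_predict(points, query, k=3):
    # points: list of ((x, y), label) pairs
    nearest = sorted(points, key=lambda p: (p[0][0] - query[0]) ** 2
                                         + (p[0][1] - query[1]) ** 2)[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

points = [((0, 0), "a"), ((0, 1), "a"), ((1, 0), "a"),
          ((5, 5), "b"), ((5, 6), "b"), ((6, 5), "b")]
print(knn_predict(points, (1, 1)))   # → a
print(knn_predict(points, (5, 4)))   # → b
```

No training step at all — the "model" is just the stored data, which is both KNN's charm and its cost at prediction time.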
Naive Bayes
Make smart predictions using probability — fast, simple, and surprisingly good at text.
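A taste of why it's good at text: count words per class, then multiply per-word probabilities (with Laplace smoothing so unseen words don't zero everything out). The tiny "spam" corpus here is invented for this sketch:

```python
import math
from collections import Counter

# Word-level naive Bayes with Laplace (+1) smoothing.

def train_nb(docs):
    counts, priors, vocab = {}, Counter(), set()
    for words, label in docs:                 # docs: (word list, label)
        priors[label] += 1
        counts.setdefault(label, Counter()).update(words)
        vocab.update(words)
    return counts, priors, vocab

def predict(model, words):
    counts, priors, vocab = model
    total_docs = sum(priors.values())
    best_label, best_score = None, None
    for label in priors:
        total = sum(counts[label].values())
        score = math.log(priors[label] / total_docs)     # class prior
        for w in words:                                  # "naive" step:
            score += math.log((counts[label][w] + 1)     # words treated as
                              / (total + len(vocab)))    # independent
        if best_score is None or score > best_score:
            best_label, best_score = label, score
    return best_label

docs = [("win money now".split(), "spam"),
        ("free money win".split(), "spam"),
        ("lunch at noon".split(), "ham"),
        ("see you at lunch".split(), "ham")]
model = train_nb(docs)
print(predict(model, "free money".split()))   # → spam
print(predict(model, "lunch at".split()))     # → ham
```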
PCA
Squash high-dimensional data down to fewer dimensions while keeping the most important patterns.
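In two dimensions the "most important pattern" is the top eigenvector of the covariance matrix, which we can find with plain power iteration. A pure-Python sketch on toy points scattered along the line y ≈ x:

```python
# First principal component of 2-D points via power iteration.

def first_pc(points, iters=100):
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    centred = [(x - mx, y - my) for x, y in points]   # centre the data
    # 2x2 covariance matrix entries
    cxx = sum(x * x for x, _ in centred) / n
    cyy = sum(y * y for _, y in centred) / n
    cxy = sum(x * y for x, y in centred) / n
    v = (1.0, 0.0)
    for _ in range(iters):          # power iteration → top eigenvector
        w = (cxx * v[0] + cxy * v[1], cxy * v[0] + cyy * v[1])
        norm = (w[0] ** 2 + w[1] ** 2) ** 0.5
        v = (w[0] / norm, w[1] / norm)
    return v

pts = [(1, 1.1), (2, 1.9), (3, 3.2), (4, 3.9), (5, 5.0)]
v = first_pc(pts)
print(round(abs(v[0] / v[1]), 1))   # ≈ 1.0: the data mostly varies along y = x
```

Projecting each point onto this direction squashes 2-D down to 1-D while keeping most of the spread.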
K-Means Clustering
Group similar data points together without any labels — the machine figures out the clusters itself.
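Lloyd's algorithm, the classic way to do this, alternates two steps: assign each point to its nearest centre, then move each centre to the mean of its group. A 1-D pure-Python sketch with k=2 and fixed starting centres (toy data invented here):

```python
# K-means (Lloyd's algorithm) in 1-D with k=2.

def kmeans(points, c1, c2, iters=10):
    for _ in range(iters):
        # Step 1: assign every point to its nearest centre
        g1 = [p for p in points if abs(p - c1) <= abs(p - c2)]
        g2 = [p for p in points if abs(p - c1) > abs(p - c2)]
        # Step 2: move each centre to the mean of its group
        c1 = sum(g1) / len(g1)
        c2 = sum(g2) / len(g2)
    return sorted([c1, c2])

pts = [1, 2, 3, 10, 11, 12]
print(kmeans(pts, 1.0, 12.0))   # → [2.0, 11.0]
```

No labels anywhere — the two clusters fall out of the data itself.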
What's Next?
A signpost page — where you are in the ML journey and what comes next.
Deep Learning
Neural Networks
6 tutorials
What are Neural Networks?
The idea that changed everything — loosely inspired by the brain, but really just clever maths.
MLP
Your first real neural network — stack some layers, train it on data, watch it learn.
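To see the layered shape before any training, here is a two-layer perceptron whose weights are hand-picked so it computes XOR — the classic function no single layer can learn. (Training would normally find weights like these by backpropagation; the values below are chosen by hand for this sketch.)

```python
# An MLP in miniature: inputs → hidden layer → output.

def step(z):                        # a hard threshold activation
    return 1 if z > 0 else 0

def mlp_xor(x1, x2):
    h1 = step(x1 + x2 - 0.5)        # hidden unit ≈ OR
    h2 = step(x1 + x2 - 1.5)        # hidden unit ≈ AND
    return step(h1 - h2 - 0.5)      # output: OR and not AND = XOR

print([mlp_xor(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]])
# → [0, 1, 1, 0]
```

The hidden layer is the whole point: it builds intermediate features (OR, AND) that make the final decision linearly easy.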
What is Deep Learning?
What makes a network "deep" — and why depth unlocks capabilities shallower models simply can't reach.
CNN
Neural networks that see — how computers learn to recognise cats, faces, and X-rays.
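The operation at the heart of every CNN is convolution: slide a small kernel across the image and take dot products. A pure-Python sketch with a hand-picked edge-detecting kernel on a tiny made-up "image" (a trained CNN learns its kernels from data):

```python
# "Valid" 2-D convolution: kernel dot-products at every position.

def conv2d(image, kernel):
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            row.append(sum(image[i + di][j + dj] * kernel[di][dj]
                           for di in range(kh) for dj in range(kw)))
        out.append(row)
    return out

image = [[0, 0, 1, 1],          # dark on the left, bright on the right
         [0, 0, 1, 1],
         [0, 0, 1, 1]]
kernel = [[1, -1],              # responds strongly where brightness
          [1, -1]]              # changes from left to right
print(conv2d(image, kernel))    # → [[0, -2, 0], [0, -2, 0]]
```

The big response sits exactly on the vertical edge — one learned kernel, one detected feature.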
RNN
Networks with memory — built for sequences like text, speech, and time-series data.
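The "memory" is a hidden state that is carried from one time step to the next, with the same weights reused at every step. A scalar sketch with hand-picked toy weights:

```python
import math

# One recurrent unit: h mixes the current input with the previous state.

def rnn(inputs, wx=0.5, wh=0.8, b=0.0):
    h = 0.0
    states = []
    for x in inputs:
        h = math.tanh(wx * x + wh * h + b)   # new state = f(input, memory)
        states.append(round(h, 3))
    return states

# A single spike at t=0, then silence: watch the memory of it fade.
print(rnn([1, 0, 0, 0]))
```

That steady fade is the vanishing-memory problem in action — exactly what the LSTM below this entry was designed to fix.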
LSTM
A smarter RNN whose gates let it hold on to information from much earlier in a sequence.
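The trick is a separate cell state guarded by gates. In this scalar sketch the gate weights are hand-picked so the forget gate stays almost fully open, letting the cell remember an early spike across many quiet steps (a trained LSTM learns these weights):

```python
import math

# A scalar LSTM cell: forget gate f, input gate i, candidate g, cell c.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def lstm(inputs):
    h, c = 0.0, 0.0
    for x in inputs:
        f = sigmoid(4.0)              # forget gate ≈ 0.98: keep old memory
        i = sigmoid(4.0 * x - 2.0)    # input gate: open only for large x
        g = math.tanh(x)              # candidate value to store
        c = f * c + i * g             # cell state: old memory + new info
        h = math.tanh(c)              # hidden output read from the cell
    return c

# A spike at t=0 followed by ten quiet steps: the cell still remembers it.
print(round(lstm([2] + [0] * 10), 2))
```

Compare with the plain RNN sketch above, where the same signal decays every step — the near-1 forget gate is what keeps the memory alive.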