We just posted a course on the freeCodeCamp.org YouTube channel that will teach you how to become an AI Researcher.
This course will guide you step by step, starting with the foundational mathematics essential for understanding modern AI before diving into PyTorch fundamentals. You will then learn about the building blocks of AI, from simple neural networks to multi-layer architectures. The course ends with an in-depth module on Transformers, the critical technology underpinning today's Large Language Models (LLMs) and generative AI.
Here are the sections in this course, along with a short illustrative code sketch after each module to give you a feel for the material:
Introduction & Course Overview
Welcome & Course Overview
Requirements & Setup for the Course
Module 1: Foundational Mathematics for AI Research
Math Lesson: Functions (Linear, Quadratic, Cubic, Square Root)
Math Lesson: Derivatives (Rate of Change)
Math Lesson: Vectors (Magnitude, Dot Product, Normalization)
Math Lesson: Gradients (Steepest Ascent/Descent, Partial Derivatives)
Math Lesson: Matrices (Multiplication, Transpose, Identity)
Math Lesson: Probability (Expected Value, Conditional Probability)
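To give a flavor of Module 1, here is a minimal sketch (my own illustration, not code from the course) of the gradient lesson: it estimates the partial derivatives of f(x, y) = x² + y² with central differences, then takes one step of steepest descent.

```python
# Illustrative sketch of gradients (Module 1), not the course's code.
# The gradient collects the partial derivatives of f; its negative points
# in the direction of steepest descent.

def f(x, y):
    return x**2 + y**2  # bowl-shaped function with its minimum at (0, 0)

def numerical_gradient(f, x, y, h=1e-5):
    # Central-difference estimates of df/dx and df/dy.
    df_dx = (f(x + h, y) - f(x - h, y)) / (2 * h)
    df_dy = (f(x, y + h) - f(x, y - h)) / (2 * h)
    return df_dx, df_dy

x, y = 3.0, 4.0
gx, gy = numerical_gradient(f, x, y)   # analytically (2x, 2y) = (6, 8)
lr = 0.1                               # step size
x, y = x - lr * gx, y - lr * gy        # one steepest-descent step
print((gx, gy), (x, y))                # the point moves toward the minimum
```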
Module 2: PyTorch Fundamentals
START: PyTorch Fundamentals & Creating Tensors
PyTorch Lesson: Reshaping and Viewing Tensors
PyTorch Lesson: Squeezing and Unsqueezing Dimensions
PyTorch Lesson: Indexing and Slicing Tensors
PyTorch Lesson: Special Tensors (Zero, Ones, Linspace)
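Here is a similar sketch for Module 2 (again illustrative, not the course's exact code) showing the kinds of tensor operations it covers: special tensors, reshaping, indexing and slicing, and squeezing/unsqueezing.

```python
import torch

t = torch.linspace(0, 9, steps=10)      # special tensor: 10 evenly spaced values
m = t.reshape(2, 5)                     # reshape into a 2x5 matrix
print(m[0, 1:4])                        # indexing and slicing: row 0, columns 1-3

u = torch.zeros(1, 3)                   # tensor of zeros with a size-1 dimension
print(u.squeeze(0).shape)               # squeeze: (1, 3) -> (3,)
print(u.squeeze(0).unsqueeze(1).shape)  # unsqueeze: (3,) -> (3, 1)
```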
Module 3: Neural Networks
START: Coding Neural Networks from Scratch
Neural Networks Lesson: Single Neuron (Weights, Bias, Weighted Sum)
Neural Networks Lesson: Activation Functions (Sigmoid, ReLU, tanh)
Neural Networks Lesson: Multi-Layer Networks & Backpropagation
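For Module 3, a single neuron is just a weighted sum of its inputs plus a bias, passed through an activation function. Here is a minimal from-scratch sketch (mine, not the course's code) using a sigmoid activation:

```python
import math

def sigmoid(z):
    # Squashes any real number into the interval (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

def neuron(inputs, weights, bias):
    z = sum(w * x for w, x in zip(weights, inputs)) + bias  # weighted sum
    return sigmoid(z)                                       # activation

print(neuron([0.5, -1.0, 2.0], weights=[0.8, 0.2, -0.5], bias=0.1))
```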
Module 4: Transformers (for Large Language Models)
START: Understanding Transformers for LLMs
Transformers Lesson: Attention Mechanism (Query, Key, Value)
Transformers Lesson: Self-Attention & Causal Self-Attention
Transformers Lesson: Rotary Positional Embeddings (RoPE)
Transformers Lesson: Multi-Head Attention
Transformers Lesson: Transformer Block (Feed-Forward, Add & Norm)
Tokenization (for GPT Architecture)
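And for Module 4, here is a compact sketch of scaled dot-product attention (the function name and shapes are my own illustration, not the course's code): each query is scored against every key, the scores are normalized with softmax, and the resulting weights mix the values. Passing the same tensor as query, key, and value gives self-attention.

```python
import torch
import torch.nn.functional as F

def attention(q, k, v):
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d_k**0.5  # query-key similarity, scaled
    weights = F.softmax(scores, dim=-1)          # each row sums to 1
    return weights @ v                           # weighted sum of the values

seq_len, d_model = 4, 8
x = torch.randn(seq_len, d_model)  # toy "token embeddings"
print(attention(x, x, x).shape)    # self-attention: q, k, v all come from x
```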
Conclusion
Conclusion & Next Steps
Watch the full course on the freeCodeCamp.org YouTube channel (3-hour watch).