Schoolhouse.world: peer tutoring, for free.
Introduction to Machine Learning | The Math Behind Neural Nets | Then We Build One

10 sessions

About

A hands-on, math-first class in machine and deep learning. We'll start with supervised learning, loss functions, gradient descent, and the chain rule, then work through the forward and backward passes by hand for a tiny model. We'll compare layer types (linear and convolutional, with a quick tour of transformers/RNNs) and training heuristics (initialization, normalization, regularization, early stopping, and SGD vs. Adam).

Hands-on time: you'll build a toy net in NumPy, recreate it in PyTorch, and get it to learn on a very small real dataset. Expect short whiteboarding blocks, live coding, and frequent check-ins where you try things yourself; Colab links will be provided. Python familiarity is preferred but not required. Basic calculus is required, and we'll work through the math carefully so it sticks.

Tutored by

Alex S 🇸🇲

Hi everyone! I'm a rising senior from Italy, enrolled in the IB program. I'm heavily STEM-oriented, doing research in neuroscience and machine learning. I'm here first and foremost to help you out in maths and computer science, but also to get to know you as a person!

✋ ATTENDANCE POLICY

Please don't miss more than two sessions in a row, and do let me know beforehand when you can't attend.

SESSION 1

Artificial Intelligence

Mon, Aug 25, 11:00 PM - Tue, Aug 26, 12:00 AM UTC

What we'll do: agree on goals, scope, and how the course will progress. We'll set up Colab/PyTorch, walk through the starter repo, and load some tiny data. Quick demo: fit a line and examine under/overfitting. You'll leave with working tools and a baseline notebook.
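The line-fitting demo might look something like this sketch (the data and polynomial degrees are illustrative, not from the starter repo):

```python
import numpy as np

# Tiny synthetic dataset: y = 2x + 1 plus a little noise.
rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 20)
y = 2 * x + 1 + 0.1 * rng.standard_normal(20)

# Fit polynomials of increasing degree with least squares.
underfit = np.polyfit(x, y, 0)  # degree 0: just a constant -> underfits
good = np.polyfit(x, y, 1)      # degree 1: matches the true model
overfit = np.polyfit(x, y, 9)   # degree 9: starts chasing the noise

def mse(coeffs):
    """Training-set mean squared error of a fitted polynomial."""
    return float(np.mean((np.polyval(coeffs, x) - y) ** 2))

# Training error keeps dropping as capacity grows, even past the "right" model;
# that gap is what a validation set would expose.
print(mse(underfit), mse(good), mse(overfit))
```

The punchline: training loss alone can't distinguish the honest degree-1 fit from the noise-chasing degree-9 one.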

SESSION 2

Artificial Intelligence

Thu, Aug 28, 11:00 PM - Fri, Aug 29, 12:30 AM UTC

We'll derive MSE and its gradient step by step, then implement vanilla gradient descent in NumPy. You'll experiment with different learning rates, inspect loss curves, and see why batching matters. Concise take-home points about step size, convergence, and why the math counts.
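A minimal sketch of the vanilla gradient-descent exercise (the dataset and learning rate here are made up for illustration):

```python
import numpy as np

# Fit y = w*x + b by minimizing MSE with vanilla gradient descent.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 100)
y = 3.0 * x - 0.5 + 0.05 * rng.standard_normal(100)

w, b, lr = 0.0, 0.0, 0.1
for _ in range(500):
    err = w * x + b - y
    # From MSE = mean(err^2): d/dw = 2*mean(err*x), d/db = 2*mean(err)
    w -= lr * 2 * np.mean(err * x)
    b -= lr * 2 * np.mean(err)

print(w, b)  # close to the true 3.0 and -0.5
```

Raising `lr` past a threshold makes the loss diverge instead of shrink, which is the step-size lesson in one knob.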

SESSION 3

Artificial Intelligence

Mon, Sep 1, 11:00 PM - Tue, Sep 2, 12:30 AM UTC

We switch to classification: sigmoid, logits, and cross-entropy. We'll build decision boundaries on a 2-D toy dataset and use L2 regularization to fight overfitting. You'll also get some exposure to precision/recall and class imbalance, and how they affect your model.
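One possible shape for this exercise: logistic regression with cross-entropy and an L2 term on a made-up 2-D dataset (the data and hyperparameters are assumptions, not the class's actual materials):

```python
import numpy as np

# Toy 2-D dataset, linearly separable by construction.
rng = np.random.default_rng(0)
n = 200
X = rng.standard_normal((n, 2))
labels = (X @ np.array([2.0, -1.0]) > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = np.zeros(2)
lr, lam = 0.5, 1e-3
for _ in range(300):
    p = sigmoid(X @ w)                       # predicted probabilities
    # Gradient of mean cross-entropy, plus the L2 regularization term lam*w.
    grad = X.T @ (p - labels) / n + lam * w
    w -= lr * grad

acc = np.mean((sigmoid(X @ w) > 0.5) == labels)
print(acc)
```

The decision boundary is the line `X @ w = 0`; shrinking `w` with L2 doesn't move that line here, but on noisier data it keeps the boundary from contorting around outliers.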

SESSION 4

Artificial Intelligence

Thu, Sep 4, 11:00 PM - Fri, Sep 5, 12:30 AM UTC

We'll build a tiny two-layer net in NumPy and compute backprop by hand using the chain rule. You'll implement gradient checking to verify that your math is correct, then swap activations (ReLU vs. tanh) and explain how they differ.
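The backprop-by-hand plus gradient-checking idea, sketched on arbitrary shapes (which weight entry gets checked is also arbitrary):

```python
import numpy as np

# Tiny two-layer net: out = ReLU(x @ W1) @ W2, with squared-error loss.
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 3))
y = rng.standard_normal((4, 1))
W1 = 0.5 * rng.standard_normal((3, 5))
W2 = 0.5 * rng.standard_normal((5, 1))

def loss(W1, W2):
    h = np.maximum(0, x @ W1)
    return 0.5 * np.sum((h @ W2 - y) ** 2)

# Backprop via the chain rule.
h = np.maximum(0, x @ W1)
d_out = h @ W2 - y              # dL/d(out)
dW2 = h.T @ d_out               # chain through the second linear layer
d_h = d_out @ W2.T              # back into the hidden activations
d_pre = d_h * (x @ W1 > 0)      # ReLU gate: gradient flows only where pre-act > 0
dW1 = x.T @ d_pre

# Gradient check: central finite difference on one entry of W1.
eps = 1e-5
W1p, W1m = W1.copy(), W1.copy()
W1p[0, 0] += eps
W1m[0, 0] -= eps
numerical = (loss(W1p, W2) - loss(W1m, W2)) / (2 * eps)
print(abs(numerical - dW1[0, 0]))  # should be tiny
```

A real implementation loops the check over several random entries of every parameter, since one lucky match proves little.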

SESSION 5

Artificial Intelligence

Sun, Sep 7, 11:00 PM - Mon, Sep 8, 12:30 AM UTC

We'll turn the math into a clean training loop: mini-batches, initialization, normalization, a validation split, and early stopping. We'll track accuracy/F1 and confusion matrices so results mean more than "the loss went down."
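The loop's pieces might fit together like this (a regression stand-in with invented data; the class uses its own dataset and metrics):

```python
import numpy as np

# Training-loop skeleton: shuffled mini-batches, validation split, early stopping.
rng = np.random.default_rng(0)
X = rng.standard_normal((300, 2))
y = X @ np.array([1.5, -2.0]) + 0.1 * rng.standard_normal(300)

# Hold out the last 20% as a validation split.
X_tr, y_tr, X_val, y_val = X[:240], y[:240], X[240:], y[240:]

w = np.zeros(2)
lr, batch, patience = 0.05, 32, 5
best_val, bad_epochs = np.inf, 0
for epoch in range(200):
    idx = rng.permutation(len(X_tr))          # reshuffle every epoch
    for start in range(0, len(X_tr), batch):  # mini-batches
        sl = idx[start:start + batch]
        grad = 2 * X_tr[sl].T @ (X_tr[sl] @ w - y_tr[sl]) / len(sl)
        w -= lr * grad
    val = np.mean((X_val @ w - y_val) ** 2)
    if val < best_val - 1e-6:
        best_val, bad_epochs = val, 0
    else:
        bad_epochs += 1
        if bad_epochs >= patience:            # early stopping
            break

print(epoch, best_val)
```

The validation loss, not the training loss, decides when to stop; that is the whole point of the split.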

SESSION 6

Artificial Intelligence

Wed, Sep 10, 11:00 PM - Thu, Sep 11, 12:30 AM UTC

We port to PyTorch: `Dataset`/`DataLoader` and the idiomatic `nn.Module` way of doing things. We'll match PyTorch's gradients against your NumPy solution and sanity-check everything.
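The gradient matching can be illustrated on a one-layer model (shapes and data here are illustrative):

```python
import numpy as np
import torch

# Same linear model and MSE loss in NumPy and PyTorch; gradients should agree.
rng = np.random.default_rng(0)
X = rng.standard_normal((8, 3)).astype(np.float32)
y = rng.standard_normal((8, 1)).astype(np.float32)
W = rng.standard_normal((3, 1)).astype(np.float32)

# Hand-derived NumPy gradient of mean((XW - y)^2) w.r.t. W.
grad_np = 2 * X.T @ (X @ W - y) / len(X)

# PyTorch: identical computation, gradient via autograd.
W_t = torch.tensor(W, requires_grad=True)
loss = ((torch.tensor(X) @ W_t - torch.tensor(y)) ** 2).mean()
loss.backward()

diff = np.abs(W_t.grad.numpy() - grad_np).max()
print(diff)  # tiny: only float32 rounding separates the two
```

If the two disagree by more than rounding error, either the hand derivation or the model wiring is wrong, which is exactly the sanity check we want.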

SESSION 7

Artificial Intelligence

Mon, Sep 22, 11:00 PM - Tue, Sep 23, 12:30 AM UTC

What convolutions compute: stride/padding, receptive fields, and pooling. We'll train a tiny CNN on Fashion-MNIST and inspect its feature maps, then compare CNN vs. MLP and see why locality and weight sharing win.
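What a convolution computes, as a naive NumPy sketch (the image and kernel are toy examples; real frameworks vectorize this):

```python
import numpy as np

def conv2d(img, kernel, stride=1, pad=0):
    """Naive 2-D cross-correlation: slide one shared kernel over the image."""
    img = np.pad(img, pad)
    kh, kw = kernel.shape
    oh = (img.shape[0] - kh) // stride + 1   # output height formula
    ow = (img.shape[1] - kw) // stride + 1   # output width formula
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            patch = img[i*stride:i*stride+kh, j*stride:j*stride+kw]
            out[i, j] = np.sum(patch * kernel)  # same weights at every location
    return out

img = np.arange(25.0).reshape(5, 5)
edge = np.array([[1.0, -1.0]])   # tiny horizontal edge detector
out = conv2d(img, edge)
print(out.shape)  # (5, 4), from (H - kh)//stride + 1 by (W - kw)//stride + 1
```

That "same weights at every location" line is the weight sharing the session highlights: one 2-parameter kernel covers the whole image, where an MLP would need a weight per pixel pair.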

SESSION 8

Artificial Intelligence

Mon, Sep 29, 11:00 PM - Tue, Sep 30, 12:30 AM UTC

Weight decay (vs. plain L2), dropout, data augmentation, and batch norm—what each fixes and when it bites. You’ll run a simple ablation (±dropout/aug/wd) and pick a best recipe with evidence.
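The weight-decay vs. plain-L2 distinction fits in one update step; here `scale` is a stand-in for the gradient rescaling an adaptive optimizer like Adam applies (all constants are illustrative):

```python
# One parameter, one step. Set the data gradient to zero to isolate the decay.
w = 1.0
grad = 0.0
lr, lam = 0.1, 0.01
scale = 100.0  # pretend the optimizer rescales gradients by this factor

# L2 penalty in the loss: the decay term rides along with the rescaling.
w_l2 = w - lr * scale * (grad + lam * w)

# Decoupled weight decay: applied directly to w, unaffected by the rescaling.
w_wd = w - lr * scale * grad - lr * lam * w

print(w_l2, w_wd)  # strong decay vs. the intended mild decay
```

With plain SGD (`scale = 1`) the two updates coincide, which is why the distinction only bites once adaptive optimizers enter the picture.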

SESSION 9

Artificial Intelligence

Mon, Oct 6, 11:00 PM - Tue, Oct 7, 12:30 AM UTC

SGD + momentum vs. Adam; cosine/one-cycle schedules; gradient clipping; seeds and reproducibility. You’ll tune LR/WD, log curves, inspect gradients/activations, and practice a checklist for “why is my loss stuck?”
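Two of these tools, a cosine schedule and gradient clipping, can be sketched in a few lines (all constants are illustrative):

```python
import math

def cosine_lr(step, total, lr_max=0.1, lr_min=0.001):
    """Cosine schedule: decay smoothly from lr_max to lr_min over `total` steps."""
    return lr_min + 0.5 * (lr_max - lr_min) * (1 + math.cos(math.pi * step / total))

def clip(grad, max_norm):
    """Rescale a gradient vector so its L2 norm never exceeds max_norm."""
    norm = math.sqrt(sum(g * g for g in grad))
    if norm > max_norm:
        grad = [g * max_norm / norm for g in grad]
    return grad

T = 100
print(cosine_lr(0, T), cosine_lr(T, T))  # starts at lr_max, ends at lr_min
print(clip([3.0, 4.0], 1.0))             # norm 5 scaled down to norm 1
```

Clipping caps the size of any single update, which is one of the first things to try on the "why is my loss stuck (or exploding)?" checklist.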

SESSION 10

Artificial Intelligence

Mon, Oct 13, 11:00 PM - Tue, Oct 14, 12:30 AM UTC

Choose a small image or text task. You’ll scope a baseline, train a clean PyTorch model, track metrics/plots, and write a short results note (what worked, what didn’t, next steps). Quick share-outs to close the loop.

Aug 25 - Oct 14

8 weeks

60 - 90 mins / session

Next session on August 25, 2025

SCHEDULE

Mondays

11:00 PM