Introduction to Machine Learning | The Math Behind Neural Nets | Then We Build One


10 sessions


🔥 2 spots left!

About

Hands-on, math-first class in machine and deep learning. We'll start with supervised learning, loss functions, gradient descent, and the chain rule, then work through the forward and backward passes by hand for a tiny model. We'll compare and contrast layer types (linear, convolutional; quick tour of transformers/RNNs) and training heuristics (initialization, normalization, regularization, early stopping, and SGD vs. Adam).
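
As a taste of the by-hand work described above, here is a minimal sketch (my own illustration, not course material): a one-feature linear model with an MSE loss, gradients worked out via the chain rule, and a plain gradient-descent update loop in NumPy.

```python
# Minimal sketch: gradient descent by hand for y_hat = w*x + b with MSE loss.
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])   # toy inputs
y = np.array([1.0, 3.0, 5.0, 7.0])   # toy targets (true line: y = 2x + 1)

w, b, lr = 0.0, 0.0, 0.1             # initial parameters and learning rate

for step in range(200):
    y_hat = w * x + b                 # forward pass
    loss = np.mean((y_hat - y) ** 2)  # MSE loss

    # Chain rule: dL/dw = mean(2*(y_hat - y) * x), dL/db = mean(2*(y_hat - y))
    grad_w = np.mean(2 * (y_hat - y) * x)
    grad_b = np.mean(2 * (y_hat - y))

    w -= lr * grad_w                  # gradient-descent update
    b -= lr * grad_b

print(f"w = {w:.2f}, b = {b:.2f}, loss = {loss:.4f}")
```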

Hands-on time: you'll build a toy net in NumPy, recreate it in PyTorch, and get it to learn on a very small real dataset. Expect short whiteboarding blocks, live coding, and frequent check-ins where you try things yourself; Colab links will be provided. Python familiarity is preferred but not required. The math is approachable, and we work through it so it sticks, though basic calculus is required.
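
And a sketch of what the PyTorch recreation step might look like, assuming PyTorch is installed; the dataset here is a tiny synthetic one standing in for the small real dataset used in class.

```python
# Minimal sketch: a tiny net where autograd replaces the by-hand chain rule.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy 2D dataset: label is 1 if the point lies inside the unit circle, else 0.
X = torch.randn(256, 2)
y = (X.pow(2).sum(dim=1) < 1.0).float().unsqueeze(1)

model = nn.Sequential(
    nn.Linear(2, 16),
    nn.ReLU(),
    nn.Linear(16, 1),
)
loss_fn = nn.BCEWithLogitsLoss()
opt = torch.optim.SGD(model.parameters(), lr=0.1)

for epoch in range(200):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()                   # autograd computes the gradients
    opt.step()

acc = ((model(X) > 0).float() == y).float().mean()
print(f"final loss {loss.item():.3f}, train accuracy {acc.item():.2f}")
```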

Tutored by

Alex S 🇸🇲

View Profile

Hi everyone, I'm a rising senior from Italy, enrolled in the IB program. I'm heavily STEM-oriented, doing research in neuroscience and machine learning. I'm here first and foremost to help you out with Maths & Computer Science, but also to meet you and get to know you as a person!

✋ ATTENDANCE POLICY

Please do not miss more than two sessions in a row; do let me know beforehand when you can't attend.

SESSION 7

Sep 22

Artificial Intelligence

Mon, Sep 22, 11:00 PM - Tue, Sep 23, 12:30 AM UTC

What convolutions calculate; stride/padding, receptive field, and pooling. We'll train a tiny CNN on Fashion-MNIST and inspect feature maps, then compare CNN vs. MLP and look at why locality/weight sharing wins.
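
For a flavor of this, here is a minimal sketch (not the class notebook) of a tiny CNN in PyTorch; the shape printout shows how padding, stride, and pooling change feature-map sizes, with a random tensor standing in for a 28x28 Fashion-MNIST image.

```python
# Minimal sketch: a tiny CNN and the feature-map shapes it produces.
import torch
import torch.nn as nn

tiny_cnn = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1),   # 28x28 -> 28x28 ("same" padding)
    nn.ReLU(),
    nn.MaxPool2d(2),                             # 28x28 -> 14x14
    nn.Conv2d(8, 16, kernel_size=3, stride=2),   # 14x14 -> 6x6 (stride 2, no padding)
    nn.ReLU(),
    nn.Flatten(),
    nn.Linear(16 * 6 * 6, 10),                   # 10 Fashion-MNIST classes
)

x = torch.randn(1, 1, 28, 28)                    # one fake grayscale image
for layer in tiny_cnn:
    x = layer(x)
    print(layer.__class__.__name__, tuple(x.shape))  # inspect feature-map shapes
```
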
SESSION 8

Sep 29

Artificial Intelligence

Mon, Sep 29, 11:00 PM - Tue, Sep 30, 12:30 AM UTC

Weight decay (vs. plain L2), dropout, data augmentation, and batch norm—what each fixes and when it bites. You’ll run a simple ablation (±dropout/aug/wd) and pick a best recipe with evidence.
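
A minimal sketch of such an ablation, assuming PyTorch; the training/evaluation loop is elided, and `evaluate`/`val_loader` are hypothetical names you would fill in for your own setup.

```python
# Minimal sketch: toggle dropout and weight decay independently and compare.
import itertools
import torch
import torch.nn as nn

def build_model(p_drop: float) -> nn.Module:
    # Small MLP; dropout strength is one of the knobs we ablate.
    return nn.Sequential(
        nn.Flatten(),
        nn.Linear(28 * 28, 128),
        nn.ReLU(),
        nn.Dropout(p_drop),
        nn.Linear(128, 10),
    )

results = {}
for p_drop, weight_decay in itertools.product([0.0, 0.5], [0.0, 1e-4]):
    model = build_model(p_drop)
    # AdamW applies decoupled weight decay, unlike plain L2 added to the loss.
    opt = torch.optim.AdamW(model.parameters(), lr=1e-3, weight_decay=weight_decay)
    # ...train for a few epochs, then record validation accuracy, e.g.:
    # results[(p_drop, weight_decay)] = evaluate(model, val_loader)
    print(f"dropout={p_drop}, weight_decay={weight_decay}: configured")
```
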
SESSION 9

Oct 6

Artificial Intelligence

Mon, Oct 6, 11:00 PM - Tue, Oct 7, 12:30 AM UTC

SGD + momentum vs. Adam; cosine/one-cycle schedules; gradient clipping; seeds and reproducibility. You’ll tune LR/WD, log curves, inspect gradients/activations, and practice a checklist for “why is my loss stuck?”
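
A minimal sketch of these pieces in PyTorch, on a throwaway regression problem: a fixed seed, SGD with momentum (Adam as a one-line swap), a cosine learning-rate schedule, and gradient clipping.

```python
# Minimal sketch: optimizer, schedule, clipping, and seeding in one loop.
import torch
import torch.nn as nn

torch.manual_seed(42)                                  # reproducibility

model = nn.Linear(10, 1)
X, y = torch.randn(64, 10), torch.randn(64, 1)
loss_fn = nn.MSELoss()

# Swap one line to compare optimizers:
opt = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
# opt = torch.optim.Adam(model.parameters(), lr=1e-3)

sched = torch.optim.lr_scheduler.CosineAnnealingLR(opt, T_max=100)

for step in range(100):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)  # clip gradients
    opt.step()
    sched.step()                                       # cosine-decay the learning rate

print(f"final loss {loss.item():.4f}, final lr {sched.get_last_lr()[0]:.5f}")
```
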
SESSION 10

Oct 13

Artificial Intelligence

Mon, Oct 13, 11:00 PM - Tue, Oct 14, 12:30 AM UTC

Choose a small image or text task. You’ll scope a baseline, train a clean PyTorch model, track metrics/plots, and write a short results note (what worked, what didn’t, next steps). Quick share-outs to close the loop.
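
A minimal sketch of the metric-tracking side; `train_one_epoch` and `evaluate` are hypothetical helpers you would write for your chosen task, and the placeholders just show the logging pattern behind the plots and the results note.

```python
# Minimal sketch: log per-epoch metrics to a CSV you can plot and cite.
import csv

history = []
for epoch in range(10):
    # train_loss = train_one_epoch(model, train_loader, opt)   # hypothetical helper
    # val_loss, val_acc = evaluate(model, val_loader)           # hypothetical helper
    train_loss, val_loss, val_acc = 0.0, 0.0, 0.0                # placeholders
    history.append({"epoch": epoch, "train_loss": train_loss,
                    "val_loss": val_loss, "val_acc": val_acc})

with open("results.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=history[0].keys())
    writer.writeheader()
    writer.writerows(history)   # the table behind the plots and results note
```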


Aug 25 - Oct 14

8 weeks

60 - 90 mins / session

Next session on September 22, 2025

SCHEDULE

Mondays

11:00 PM UTC