Mini project — continual learning
Forget nothing.
Learn everything.
Comparing four anti-forgetting strategies on the UCI Human Activity Recognition (HAR) dataset: watch a neural network learn three sequential tasks and see which methods remember the past.
Dataset: UCI HAR
Features: 561
Tasks: 3 binary
Architecture: 561 → 256 → 256 → 1
Subjects: 30
Samples: 10K
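As a concrete illustration, the 561 → 256 → 256 → 1 architecture can be sketched as a plain NumPy MLP. This is a minimal sketch: the class name, He-style initialization, and batch size are illustrative assumptions, and the training loop is omitted.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class MLP:
    """561 -> 256 -> 256 -> 1 binary classifier with a single shared output head."""
    def __init__(self, rng, dims=(561, 256, 256, 1)):
        # He-style initialization per layer (an assumed choice).
        self.W = [rng.normal(0.0, np.sqrt(2.0 / m), (m, n))
                  for m, n in zip(dims[:-1], dims[1:])]
        self.b = [np.zeros(n) for n in dims[1:]]

    def forward(self, x):
        h = x
        for W, b in zip(self.W[:-1], self.b[:-1]):
            h = relu(h @ W + b)          # hidden layers
        return sigmoid(h @ self.W[-1] + self.b[-1])  # P(class 1)

rng = np.random.default_rng(0)
model = MLP(rng)
probs = model.forward(rng.normal(size=(4, 561)))
print(probs.shape)  # (4, 1)
```

The single sigmoid output is shared across all three binary tasks, a detail that matters later for why EWC struggles here.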
T1: Walking vs. Walking upstairs
T2: Walking downstairs vs. Sitting
T3: Standing vs. Laying
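Assuming the standard UCI HAR activity codes (1 = WALKING through 6 = LAYING), the three binary tasks can be carved out of the label vector like this. The `make_task` helper and the synthetic demo arrays are illustrative assumptions, not the project's actual loader.

```python
import numpy as np

# Standard UCI HAR activity codes (1-6).
WALKING, WALKING_UPSTAIRS, WALKING_DOWNSTAIRS, SITTING, STANDING, LAYING = range(1, 7)

TASKS = [
    (WALKING, WALKING_UPSTAIRS),     # T1
    (WALKING_DOWNSTAIRS, SITTING),   # T2
    (STANDING, LAYING),              # T3
]

def make_task(X, y, neg, pos):
    """Select the two activities for one task and relabel them 0/1."""
    mask = (y == neg) | (y == pos)
    return X[mask], (y[mask] == pos).astype(np.float64)

# Tiny synthetic stand-in for the real 10K x 561 HAR arrays.
rng = np.random.default_rng(1)
y_demo = rng.integers(1, 7, size=60)
X_demo = rng.normal(size=(60, 561))
X1, y1 = make_task(X_demo, y_demo, *TASKS[0])
print(X1.shape, set(y1))
```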
Fine-tuning (baseline). No protection at all; demonstrates catastrophic forgetting and sets the lower bound every other method must beat.
Experience Replay (best result). Keeps a circular buffer of past samples and replays them while training on new tasks.
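A minimal sketch of such a buffer; the class name, capacity, and sampling API are assumptions, not the project's actual code.

```python
import random
from collections import deque

class ReplayBuffer:
    """Circular buffer of (x, y, task_id) samples from past tasks."""
    def __init__(self, capacity=2000, seed=0):
        self.data = deque(maxlen=capacity)  # oldest samples drop off first
        self.rng = random.Random(seed)

    def add(self, x, y, task_id):
        self.data.append((x, y, task_id))

    def sample(self, k):
        """Draw up to k stored samples uniformly at random."""
        k = min(k, len(self.data))
        return self.rng.sample(list(self.data), k)

buf = ReplayBuffer(capacity=3)
for i in range(5):                 # capacity 3: only samples 2, 3, 4 survive
    buf.add([float(i)], i % 2, task_id=0)
batch = buf.sample(2)
print(len(buf.data))  # 3
```

During training on a new task, each minibatch is mixed with a `buf.sample(k)` draw of old samples, so the loss keeps covering earlier tasks.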
LwF, Learning without Forgetting (no stored data). Uses knowledge distillation against the previous model's outputs; no raw data is stored, making it a privacy-friendly alternative to replay.
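The distillation objective can be sketched for this single-logit binary head as follows. The loss weight `lam` and temperature `T` are assumed hyperparameters; the key point is that the old model only supplies soft targets on the current task's inputs, so no past data is needed.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def bce(p, t, eps=1e-7):
    """Binary cross-entropy between predictions p and targets t."""
    p = np.clip(p, eps, 1.0 - eps)
    return -np.mean(t * np.log(p) + (1.0 - t) * np.log(1.0 - p))

def lwf_loss(logits_new, labels, logits_old, lam=1.0, T=2.0):
    """New-task loss plus distillation toward the frozen pre-task model."""
    task = bce(sigmoid(logits_new), labels)
    # Soft targets: the old model's (temperature-smoothed) predictions on
    # the current inputs stand in for the raw data of earlier tasks.
    distill = bce(sigmoid(logits_new / T), sigmoid(logits_old / T))
    return task + lam * distill

z_new = np.array([2.0, -1.0, 0.5])    # current model's logits
z_old = np.array([1.5, -0.5, 0.0])    # frozen old model's logits
labels = np.array([1.0, 0.0, 1.0])
loss = lwf_loss(z_new, labels, z_old)
print(float(loss))
```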
EWC, Elastic Weight Consolidation (structural failure). A Fisher-information penalty anchors weights that were important for earlier tasks; it fails in this setup because all tasks share a single output head.
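The penalty can be sketched as follows. The diagonal-Fisher estimate is shown for a plain logistic model for brevity (the same per-weight formula applies to the demo's MLP), and the strength `lam` is an assumed hyperparameter.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def diag_fisher(w, X, y):
    """Diagonal Fisher estimate: mean squared per-sample log-likelihood gradient."""
    p = sigmoid(X @ w)
    grads = (y - p)[:, None] * X      # d log-lik / d w, one row per sample
    return np.mean(grads ** 2, axis=0)

def ewc_penalty(w, w_star, fisher, lam=100.0):
    """(lam/2) * sum_i F_i (w_i - w*_i)^2: moving an important weight is costly."""
    return 0.5 * lam * np.sum(fisher * (w - w_star) ** 2)

rng = np.random.default_rng(2)
X = rng.normal(size=(32, 5))
y = (X[:, 0] > 0).astype(float)
w_star = rng.normal(size=5)           # weights frozen after the previous task
F = diag_fisher(w_star, X, y)
print(ewc_penalty(w_star, w_star, F))  # 0.0 at the anchor point
```

The structural problem: the penalty only discourages *moving* weights, but with one shared output unit, the new task's labels directly repurpose that unit's decision boundary, so earlier tasks degrade even with well-anchored hidden layers.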
Accuracy matrix A[i,j]: accuracy on task j after training on task i. Rows and columns run over T1–T3, cells shade from low to high accuracy, and the diagonal marks the just-trained task. The matrix fills in as training runs.
Final metrics: average accuracy (higher is better), backward transfer BWT (higher is better), forward transfer FWT (higher is better), and forgetting (lower is better). Metrics will appear after training completes.
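For reference, all four metrics derive from the accuracy matrix. This sketch uses the standard continual-learning definitions; the matrix values and the per-task random-baseline accuracies `b` (needed for FWT) are made up for illustration.

```python
import numpy as np

def final_metrics(A, b=None):
    """A[i, j] = accuracy on task j after finishing training on task i (0-indexed)."""
    T = A.shape[0]
    avg_acc = A[-1].mean()                                         # higher is better
    bwt = np.mean([A[-1, j] - A[j, j] for j in range(T - 1)])      # negative = forgot
    forgetting = np.mean([A[:-1, j].max() - A[-1, j] for j in range(T - 1)])
    fwt = None
    if b is not None:  # needs a random-init baseline accuracy per task
        fwt = np.mean([A[j - 1, j] - b[j] for j in range(1, T)])
    return avg_acc, bwt, fwt, forgetting

A = np.array([[0.95, 0.50, 0.50],      # made-up numbers shaped like a
              [0.60, 0.93, 0.55],      # fine-tuning (no protection) run
              [0.55, 0.58, 0.94]])
avg, bwt, fwt, fg = final_metrics(A, b=[0.5, 0.5, 0.5])
print(avg, bwt, fwt, fg)
```

Here the strongly negative BWT and large forgetting are exactly the catastrophic-forgetting signature the fine-tuning baseline is meant to show.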
Training log