nnayz 21 hours ago

I’ve been tinkering with a small side project called “MyTorch.” It’s a from-scratch, pure-Python re-imagining of the core ideas behind PyTorch, squeezed into a few hundred lines of code. The goal was to demystify autograd and neural-network plumbing by stripping everything down to the essentials while still being able to train a real model.

Under the hood there’s a NumPy-backed Tensor class that supports automatic differentiation, a minimal nn.Module system with Linear layers and activation functions, plus SGD and Adam optimizers. A simple three-layer MLP reaches roughly 97% accuracy on MNIST, and an evaluation script prints accuracy, per-class precision/recall/F1, and the full confusion matrix.

The project uses a modern Python workflow with a pyproject.toml file and the uv package manager, so environment setup is quick and dependency management is clean.

To give a flavor of how little machinery is actually involved, some stripped-down sketches of the core pieces follow. These are simplified illustrations of the ideas, not the project’s exact code.
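The autograd Tensor is the part that sounds scariest but is really just a pattern: every op records its parent tensors plus a closure that pushes the output gradient back to them, and backward() replays those closures in reverse topological order. A minimal sketch of that idea:

    import numpy as np

    def _unbroadcast(grad, shape):
        # Sum the gradient over axes that NumPy broadcast in the forward
        # pass, so it matches the original tensor's shape again.
        while grad.ndim > len(shape):
            grad = grad.sum(axis=0)
        for i, s in enumerate(shape):
            if s == 1 and grad.shape[i] != 1:
                grad = grad.sum(axis=i, keepdims=True)
        return grad

    class Tensor:
        def __init__(self, data, _parents=()):
            self.data = np.asarray(data, dtype=np.float64)
            self.grad = np.zeros_like(self.data)
            self._parents = _parents        # tensors this one was computed from
            self._backward = lambda: None   # pushes out.grad to the parents

        def __add__(self, other):
            out = Tensor(self.data + other.data, (self, other))
            def _backward():
                self.grad += _unbroadcast(out.grad, self.data.shape)
                other.grad += _unbroadcast(out.grad, other.data.shape)
            out._backward = _backward
            return out

        def __matmul__(self, other):
            out = Tensor(self.data @ other.data, (self, other))
            def _backward():
                self.grad += out.grad @ other.data.T
                other.grad += self.data.T @ out.grad
            out._backward = _backward
            return out

        def backward(self):
            # Topologically sort the graph, then apply the chain rule in reverse.
            topo, seen = [], set()
            def build(t):
                if id(t) not in seen:
                    seen.add(id(t))
                    for p in t._parents:
                        build(p)
                    topo.append(t)
            build(self)
            self.grad = np.ones_like(self.data)
            for t in reversed(topo):
                t._backward()

Real frameworks pile on more ops, in-place guards, dtype handling and so on, but the core loop really is this small.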
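The nn.Module layer on top is mostly bookkeeping: walk an object’s attributes, collect anything that’s a Tensor, and recurse into sub-modules. In the same simplified spirit (the tiny random init here is arbitrary, not a recommendation):

    class Module:
        def parameters(self):
            # Collect Tensors from attributes, recursing into sub-modules.
            params = []
            for v in self.__dict__.values():
                if isinstance(v, Tensor):
                    params.append(v)
                elif isinstance(v, Module):
                    params.extend(v.parameters())
            return params

    class Linear(Module):
        def __init__(self, in_features, out_features):
            # Small random init; a real version would use a proper scaling scheme.
            self.weight = Tensor(np.random.randn(in_features, out_features) * 0.01)
            self.bias = Tensor(np.zeros(out_features))

        def __call__(self, x):
            return x @ self.weight + self.bias

    def relu(x):
        out = Tensor(np.maximum(x.data, 0.0), (x,))
        def _backward():
            x.grad += (out.data > 0.0) * out.grad
        out._backward = _backward
        return out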
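The optimizers are similarly short. SGD is a couple of lines of real work; Adam only adds per-parameter moment buffers with bias correction (this is just textbook Adam, sketched against the Tensor above):

    class SGD:
        def __init__(self, params, lr=0.01):
            self.params, self.lr = list(params), lr

        def step(self):
            for p in self.params:
                p.data -= self.lr * p.grad

        def zero_grad(self):
            for p in self.params:
                p.grad = np.zeros_like(p.data)

    class Adam:
        def __init__(self, params, lr=1e-3, betas=(0.9, 0.999), eps=1e-8):
            self.params, self.lr, self.eps, self.t = list(params), lr, eps, 0
            self.b1, self.b2 = betas
            self.m = [np.zeros_like(p.data) for p in self.params]
            self.v = [np.zeros_like(p.data) for p in self.params]

        def zero_grad(self):
            for p in self.params:
                p.grad = np.zeros_like(p.data)

        def step(self):
            self.t += 1
            for i, p in enumerate(self.params):
                # Running estimates of the first and second moments of the gradient.
                self.m[i] = self.b1 * self.m[i] + (1 - self.b1) * p.grad
                self.v[i] = self.b2 * self.v[i] + (1 - self.b2) * p.grad ** 2
                m_hat = self.m[i] / (1 - self.b1 ** self.t)  # bias correction
                v_hat = self.v[i] / (1 - self.b2 ** self.t)
                p.data -= self.lr * m_hat / (np.sqrt(v_hat) + self.eps)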
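Wiring those pieces into a three-layer MLP is then pure composition. The layer widths below are illustrative, not necessarily what the repo uses:

    class MLP(Module):
        def __init__(self):
            # 784 = 28x28 flattened MNIST pixels; widths are just an example.
            self.fc1 = Linear(784, 256)
            self.fc2 = Linear(256, 64)
            self.fc3 = Linear(64, 10)

        def __call__(self, x):
            return self.fc3(relu(self.fc2(relu(self.fc1(x)))))

    model = MLP()
    opt = Adam(model.parameters(), lr=1e-3)
    x = Tensor(np.random.randn(32, 784))   # fake batch standing in for MNIST
    logits = model(x)                      # shape (32, 10)
    logits.backward()                      # seeds the output grad with ones; a smoke test

The training loop is then the usual opt.zero_grad() / loss.backward() / opt.step() dance; a cross-entropy loss needs a few more Tensor ops (log, exp, reductions) than shown here.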
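The evaluation metrics all fall out of the confusion matrix, so the eval script needs little beyond NumPy. Roughly, using the rows-are-true / columns-are-predicted convention:

    import numpy as np

    def confusion_matrix(y_true, y_pred, num_classes=10):
        cm = np.zeros((num_classes, num_classes), dtype=int)
        for t, p in zip(y_true, y_pred):
            cm[t, p] += 1                  # rows: true class, cols: predicted class
        return cm

    def per_class_metrics(cm):
        tp = np.diag(cm).astype(float)
        precision = tp / np.maximum(cm.sum(axis=0), 1)  # column sums = predicted counts
        recall = tp / np.maximum(cm.sum(axis=1), 1)     # row sums = true counts
        f1 = 2 * precision * recall / np.maximum(precision + recall, 1e-12)
        return precision, recall, f1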
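And on the uv point: day-to-day use is basically two commands (train.py here is a stand-in for whatever entry point you have):

    uv sync                 # create the venv and install locked dependencies
    uv run python train.py  # run inside that environment without activating it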