alt.hn

1/1/2026 at 2:53:50 PM

Build a Deep Learning Library

https://zekcrates.quarto.pub/deep-learning-library/

by butanyways

1/2/2026 at 1:05:32 AM

Thanks for sharing! It's inspiring to see more people "reinventing for insight" in the age of AI. This reminds me of a similar project of mine from a year ago, when I built an entire PyTorch-style machine learning library [1] from scratch using nothing but Python and NumPy. I started with a tiny autograd engine (a rough sketch of the idea follows the links below), then gradually added layer modules, optimizers, data loaders, etc. I simply wanted to learn machine learning from first principles. Along the way I attempted to reproduce classical convnets [2], all the way up to a toy GPT-2 [3], using the library I built. It definitely helped me understand how machine learning works under the hood, without all the fancy abstractions that PyTorch/TensorFlow provide. I eventually wrote a blog post [4] about the journey.

[1] https://github.com/workofart/ml-by-hand

[2] https://github.com/workofart/ml-by-hand/blob/main/examples/c...

[3] https://github.com/workofart/ml-by-hand/blob/main/examples/g...

[4] https://www.henrypan.com/blog/2025-02-06-ml-by-hand/
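
For anyone who wants the flavor of the autograd core without reading the repo, here is a minimal sketch of the pattern (illustrative only, not the actual ml-by-hand code; names and structure are my own):

    import numpy as np

    class Tensor:
        """Wraps a NumPy array and records how it was produced,
        so gradients can flow backward through the graph."""
        def __init__(self, data, parents=()):
            self.data = np.asarray(data, dtype=np.float64)
            self.grad = np.zeros_like(self.data)
            self.parents = parents        # tensors this one was computed from
            self.backward_fn = None       # propagates self.grad to parents

        def __matmul__(self, other):
            out = Tensor(self.data @ other.data, parents=(self, other))
            def backward():
                self.grad += out.grad @ other.data.T   # dL/dA = dL/dC @ B^T
                other.grad += self.data.T @ out.grad   # dL/dB = A^T @ dL/dC
            out.backward_fn = backward
            return out

        def sum(self):
            out = Tensor(self.data.sum(), parents=(self,))
            def backward():
                self.grad += np.ones_like(self.data) * out.grad
            out.backward_fn = backward
            return out

        def backward(self):
            # topological order: each node's grad is complete before use
            order, seen = [], set()
            def visit(t):
                if id(t) not in seen:
                    seen.add(id(t))
                    for p in t.parents:
                        visit(p)
                    order.append(t)
            visit(self)
            self.grad = np.ones_like(self.data)
            for t in reversed(order):
                if t.backward_fn:
                    t.backward_fn()

    # gradient of sum(A @ B) with respect to A and B
    A = Tensor(np.random.randn(2, 3))
    B = Tensor(np.random.randn(3, 2))
    loss = (A @ B).sum()
    loss.backward()
    print(A.grad.shape, B.grad.shape)   # (2, 3) (3, 2)

Layer modules, optimizers, and data loaders are then just conveniences built on top of this core.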

by megadragon9

1/2/2026 at 2:11:20 AM

During my Bachelor's, I wrote a small "immutable" algebraic machine learning library based on just NumPy. This made it easy to play around with combining weights, e.g. summing two networks, using whatever operations are supported on normal NumPy arrays (sketched below).

... turns out, this is only useful in some very specific scenarios, and it's probably not worth the extreme memory overhead.
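
The core idea is small enough to sketch (hypothetical code, not my actual library):

    import numpy as np

    class AlgebraicNet:
        """Immutable parameter bundle; elementwise NumPy ops lift to whole nets."""
        def __init__(self, params):
            # copy on construction: every operation yields a fresh network
            self.params = {k: np.array(v) for k, v in params.items()}

        def __add__(self, other):
            return AlgebraicNet({k: v + other.params[k]
                                 for k, v in self.params.items()})

        def scale(self, c):
            return AlgebraicNet({k: v * c for k, v in self.params.items()})

    net_a = AlgebraicNet({"w": np.random.randn(4, 3), "b": np.zeros(3)})
    net_b = AlgebraicNet({"w": np.random.randn(4, 3), "b": np.ones(3)})
    avg = (net_a + net_b).scale(0.5)   # weight-averaged net; originals untouched

Every operation copies every array, which is exactly where the memory overhead comes from.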

by RestartKernel

1/1/2026 at 5:26:17 PM

This is cool! This summer I made something similar, but in C++. The goal was to build an entire LLM, but I only got as far as neural networks. GitHub repo here: https://github.com/amitav-krishna/llm-from-scratch. I have a few blog posts on this project on my website (https://amitav.net/building-lists.html, https://amitav.net/building-vectors.html, https://amitav.net/building-matrices.html (incomplete)). I hope to finish that series eventually, but some other projects have stolen the spotlight! It probably would have made more sense to write it in Python, since I had no C++ experience.

by amitav1

1/1/2026 at 6:25:29 PM

Did something similar a while back [1]; it's the best way to learn neural nets and backprop. Using just NumPy also makes sure you get the math right, without having to deal with higher-level frameworks or C++ libraries (quick sketch below the link).

[1] https://github.com/santinic/claudioflow
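
For instance, even a single linear layer with an MSE loss makes you write out the chain rule by hand, and a finite-difference check keeps you honest (a quick sketch, not claudioflow code):

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(8, 4))        # batch of inputs
    Y = rng.normal(size=(8, 2))        # targets
    W = rng.normal(size=(4, 2))

    # forward
    P = X @ W                          # predictions
    loss = ((P - Y) ** 2).mean()

    # backward, by hand: dloss/dP, then chain rule through the matmul
    dP = 2 * (P - Y) / P.size          # gradient of the mean squared error
    dW = X.T @ dP                      # gradient w.r.t. the weights

    # numerical check on one entry: this is where NumPy keeps you honest
    eps = 1e-6
    W2 = W.copy(); W2[0, 0] += eps
    numeric = (((X @ W2 - Y) ** 2).mean() - loss) / eps
    assert abs(numeric - dW[0, 0]) < 1e-4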

by csantini

1/1/2026 at 7:01:31 PM

It's nice! Yeah, a lot of the heavy lifting is done by NumPy.

by butanyways

1/1/2026 at 7:48:53 PM

Isn't this what Karpathy does in the Zero to Hero lecture series on YT? I'm sure this is great as well!

by silentsea90

1/1/2026 at 8:06:56 PM

If you are asking about the "micrograd" video, then yes, a little bit. "micrograd" works on scalars, while we use tensors in the book. If you are reading the book, I would recommend first completing the series, or at least the "micrograd" video.
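
The jump from scalars to tensors mostly shows up in the backward pass: once ops broadcast, gradients have to be summed back over the broadcast axes. A tiny illustration (not the book's code):

    import numpy as np

    # forward: a (3,) bias broadcasts over a (4, 3) batch in out = x + b
    upstream = np.ones((4, 3))      # dL/d(out) flowing back
    b_grad = upstream.sum(axis=0)   # shape (3,): sum over the broadcast axis

With scalar-valued autograd like micrograd, this bookkeeping never comes up.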

by butanyways

1/1/2026 at 5:34:11 PM

It's alright, but a C version would be even better for fully grasping the implementation details of tensors, etc. Shelling out to NumPy isn't particularly exciting.

by yunnpp

1/1/2026 at 5:39:42 PM

I agree! What NumPy is doing is actually quite beautiful. I was thinking of writing a custom C++ backend for this thing. Let's see what happens this year.

by butanyways

1/1/2026 at 7:41:18 PM

If someone is interested in low-level tensor implementation details, they could benefit from a course/book like "let's build NumPy in C". No need to complicate the DL library design discussion with that stuff.
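
A taste of what such a course would cover, sketched in Python for brevity: an ndarray is just a flat buffer plus shape and strides, and a transpose is a stride swap with no data movement.

    buf = list(range(6))               # flat storage for a 2x3 "array": [0..5]
    strides = (3, 1)                   # row-major: step 3 per row, 1 per column

    def at(index, strides):
        # offset into the flat buffer = sum of index[i] * strides[i]
        return buf[sum(i * s for i, s in zip(index, strides))]

    assert at((1, 2), strides) == 5    # row 1, col 2 of the 2x3 array
    t_strides = (1, 3)                 # transposed 3x2 view: same buffer
    assert at((2, 1), t_strides) == 5  # (2, 1) of the transpose, same element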

by p1esk

1/1/2026 at 8:01:37 PM

Yes!!

by butanyways

1/1/2026 at 7:52:46 PM

This is good. It's well positioned to help software engineers understand DL beyond the frameworks.

by grandimam

1/1/2026 at 8:07:21 PM

Thanks!!

by butanyways

1/1/2026 at 6:24:42 PM

[flagged]

by yazide

1/1/2026 at 8:39:01 PM

Perhaps obvious to some, but this does not seem to be about learning in the traditional sense, nor a library in the book sense, unfortunately.

by opan