This is good. It's well positioned for software engineers who want to understand deep learning beyond the frameworks.
butanyways 13 minutes ago|
thanks!!
silentsea90 32 minutes ago||
Isn't this what Karpathy does in the Zero to Hero lecture series on YT? I'm sure this is great as well!
butanyways 14 minutes ago|
If you are asking about the "micrograd" video, then yes, a little bit. "micrograd" works on scalars, while we use tensors in the book. If you are reading the book, I would recommend first completing the series, or at least the "micrograd" video.
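To make the scalar-vs-tensor point concrete, here's a rough sketch (my own toy code, not the actual micrograd or the book's API): a micrograd-style node tracks one float and one gradient, while the tensor version applies the same chain rule to whole arrays in a single NumPy op.

    import numpy as np

    class Value:
        """micrograd-style node: one float, one gradient."""
        def __init__(self, data):
            self.data, self.grad = data, 0.0
            self._backward = lambda: None

        def __mul__(self, other):
            out = Value(self.data * other.data)
            def _backward():
                self.grad += other.data * out.grad
                other.grad += self.data * out.grad
            out._backward = _backward
            return out

    # scalar path: one multiply, one hand-written chain-rule step
    a, b = Value(2.0), Value(3.0)
    c = a * b
    c.grad = 1.0
    c._backward()
    print(a.grad, b.grad)      # 3.0 2.0

    # tensor path: the same rule applied to whole arrays at once
    A = np.random.randn(4, 3)
    B = np.random.randn(4, 3)
    C = A * B                  # elementwise forward
    dC = np.ones_like(C)       # upstream gradient
    dA, dB = B * dC, A * dC    # same chain rule, vectorized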
Did something similar a while back [1]; it's the best way to learn neural nets and backprop. Using just NumPy also makes sure you get the math right without having to deal with higher-level frameworks or C++ libraries.
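For anyone wondering what "getting the math right with just NumPy" looks like in practice, here's a minimal sketch (made-up names and shapes, not the code behind [1]): every gradient is written out by hand next to its forward pass.

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(64, 10))            # batch of inputs
    y = rng.normal(size=(64, 1))             # regression targets
    W1 = rng.normal(size=(10, 32)) * 0.1
    W2 = rng.normal(size=(32, 1)) * 0.1
    lr = 1e-2

    for step in range(100):
        # forward
        h = X @ W1
        a = np.maximum(h, 0.0)               # ReLU
        pred = a @ W2
        loss = np.mean((pred - y) ** 2)

        # backward: every gradient derived by hand
        dpred = 2.0 * (pred - y) / len(X)    # dL/dpred
        dW2 = a.T @ dpred
        da = dpred @ W2.T
        dh = da * (h > 0)                    # ReLU derivative
        dW1 = X.T @ dh

        # plain SGD update
        W1 -= lr * dW1
        W2 -= lr * dW2
        if step % 20 == 0:
            print(step, loss)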
It's nice! Yeah, a lot of the heavy lifting is done by NumPy.
yunnpp 2 hours ago||
It's alright, but a C version would be even better for fully grasping the implementation details of tensors, etc. Shelling out to NumPy isn't particularly exciting.
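For what it's worth, most of those details come down to shape/stride bookkeeping over a flat buffer. A rough sketch, in Python only for brevity (a C version would do the same pointer arithmetic explicitly; the helper name here is made up):

    import numpy as np

    a = np.arange(12, dtype=np.float32).reshape(3, 4)
    print(a.strides)                  # (16, 4): bytes to step per axis

    def element_at(arr, i, j):
        """Recompute arr[i, j] from the flat buffer, the way a C
        implementation would: base pointer + i*stride0 + j*stride1."""
        flat = arr.reshape(-1)        # ok here because `a` is contiguous
        s0 = arr.strides[0] // arr.itemsize
        s1 = arr.strides[1] // arr.itemsize
        return flat[i * s0 + j * s1]

    assert element_at(a, 2, 3) == a[2, 3]

    # A transpose is just swapped strides over the same buffer, no copy,
    # which is exactly the kind of detail a C version makes explicit.
    print(a.T.strides)                # (4, 16)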
butanyways 2 hours ago|
I agree! What NumPy is doing is actually quite beautiful. I was thinking of writing a custom C++ backend for this thing. Let's see what happens this year.
p1esk 39 minutes ago||
If someone is interested in low-level tensor implementation details, they could benefit from a course/book like “let’s build numpy in C”. No need to complicate a DL library design discussion with that stuff.