AutoGrad

This project features an autograd engine that follows patterns from Andrej Karpathy's micrograd. It has two abstractions:

  1. A minimal scalar-valued class that provides a clear understanding of basic neural network operations. (Start here!)
  2. An optimized tensor class that demonstrates improved performance techniques.

Both classes implement backpropagation through dynamically constructed computational graphs (DAGs). The project also includes neural network implementations built on top of both classes.
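
Conceptually, the scalar engine can be pictured as the minimal sketch below: every operation records its parent nodes plus a closure that applies the local chain rule, and backward() topologically sorts the resulting DAG and propagates gradients in reverse. The Value class and its method names here are illustrative assumptions modeled on micrograd, not necessarily this repository's exact API.

```python
# A minimal sketch of a scalar-valued autograd engine in the spirit of
# micrograd. The Value class and method names are illustrative, not
# necessarily this repository's exact API.
import math

class Value:
    """A scalar node in a dynamically constructed computational graph."""

    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None  # closure that pushes grad to parents
        self._prev = set(_children)    # parent nodes in the DAG

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))

        def _backward():
            self.grad += out.grad   # d(out)/d(self) = 1
            other.grad += out.grad  # d(out)/d(other) = 1
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))

        def _backward():
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def tanh(self):
        t = math.tanh(self.data)
        out = Value(t, (self,))

        def _backward():
            self.grad += (1 - t ** 2) * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the DAG, then apply the chain rule in reverse.
        topo, visited = [], set()

        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)

        build(self)
        self.grad = 1.0
        for node in reversed(topo):
            node._backward()

# Build a tiny graph and backpropagate through it.
x = Value(2.0)
y = Value(-3.0)
z = (x * y + 1.0).tanh()
z.backward()
print(z.data, x.grad, y.grad)
```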

  • Install by running pip install -e . from the repository root. The only runtime dependency is numpy.
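
Once installed, code in this style typically looks like the sketch below; the Neuron class is hypothetical, built on the Value sketch above, and is not necessarily the repository's actual neural network module.

```python
# A hypothetical neuron built on the Value class sketched above; the names
# Neuron, w, and b are illustrative, not this repository's actual API.
import random

class Neuron:
    def __init__(self, n_inputs):
        self.w = [Value(random.uniform(-1, 1)) for _ in range(n_inputs)]
        self.b = Value(0.0)

    def __call__(self, x):
        # Compute tanh(w . x + b) as a Value graph.
        act = self.b
        for wi, xi in zip(self.w, x):
            act = act + wi * xi
        return act.tanh()

# One gradient-descent step on a single neuron.
n = Neuron(3)
out = n([1.0, -2.0, 0.5])
out.backward()
for p in n.w + [n.b]:
    p.data -= 0.1 * p.grad  # simple SGD update
```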
