Test implementation of forward-mode and reverse-mode autograd algorithms

mauicv/autograd-test

Example Autograd Implementation


Example implementation of reverse mode automatic differentiation for deep neural networks.
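To illustrate the idea behind reverse-mode automatic differentiation (this is a minimal standalone sketch, not the classes this repository exports): each variable records its parents and the local derivative of its value with respect to each parent, and `backward()` walks the graph in reverse, accumulating gradients via the chain rule. Production implementations traverse a tape in topological order rather than recursing per path.

```python
class Var:
    """Scalar variable that records its computation graph."""

    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # pairs of (parent_var, local_derivative)
        self.grad = 0.0

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        # d(x*y)/dx = y, d(x*y)/dy = x
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def backward(self, seed=1.0):
        # Accumulate the incoming gradient, then push it to parents,
        # scaled by each local derivative (the chain rule).
        self.grad += seed
        for parent, local in self.parents:
            parent.backward(seed * local)


x = Var(3.0)
y = Var(2.0)
z = x * y + x
z.backward()
# x.grad == y + 1 == 3.0, y.grad == x == 3.0
```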

Training loss plot

Loss over 20000 training steps of a network trained to model the two-dimensional function shown on the left below. The plot on the right is the model output.

True output and model output

Usage:

```python
import numpy as np

from src.model import Model
from src.layer import Layer
from src.functions import Linear, LeakyRelu

# Hidden layers use LeakyReLU; the output layer is linear.
layer_dims = [2, 30, 30, 1]
model = Model()
layers = [Layer(d1, d2, activation=LeakyRelu(0.1)) for d1, d2
          in zip(layer_dims[:-2], layer_dims[1:-1])]
last_layer = Layer(layer_dims[-2], layer_dims[-1],
                   activation=Linear())
layers.append(last_layer)
model.add_layers(layers)
model.compile()

X = np.random.uniform(-1, 1, 2)
Y = np.random.uniform(-1, 1, 1)
loss, grads = model.compute_grads(X, Y)
```
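The gradients returned by a step like `compute_grads` feed a gradient-descent update loop, which is what produces the loss curve above. The following is a self-contained sketch of that loop in plain NumPy (it does not use this repository's classes; the linear model and target function are illustrative stand-ins), with the gradients written out by hand to show what autograd automates:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (256, 2))                    # 2-d inputs, as in the README
Y = X @ np.array([[0.5], [-0.3]]) + 0.1             # illustrative target function

W = rng.normal(0, 0.1, (2, 1))
b = np.zeros((1, 1))

losses = []
for step in range(2000):
    pred = X @ W + b                                # forward pass
    err = pred - Y
    losses.append(float(np.mean(err ** 2)))         # MSE loss
    # Gradients of the MSE loss w.r.t. W and b, derived by hand here;
    # reverse-mode autograd computes these automatically.
    dW = 2 * X.T @ err / len(X)
    db = 2 * err.mean(axis=0, keepdims=True)
    W -= 0.1 * dW                                   # gradient-descent update
    b -= 0.1 * db
```

Plotting `losses` against the step index gives a curve of the same kind as the training-loss plot above.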
