Agurato/goceptron
Perceptron library in Golang

Build Status Go Report Card GoDoc License: GPL v3

A library implementing a multi-layer perceptron in Golang

Quick How-to-Use

Creation of the network

3 parameters are necessary:

  • size of the input layer (int)
  • sizes of the hidden layers (slice of int)
  • size of the output layer (int)
var (
    p                 gct.Perceptron
    inputLayerSize    int
    hiddenLayersSizes []int
    outputLayerSize   int
)
inputLayerSize = 784
hiddenLayersSizes = []int{100}
outputLayerSize = 10
p.Init(inputLayerSize, hiddenLayersSizes, outputLayerSize)
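For intuition, the three sizes passed to Init determine the shapes of the weight matrices between consecutive layers. A minimal sketch in plain Go (independent of the library) showing how the example sizes above translate into matrix dimensions:

```go
package main

import "fmt"

func main() {
	// Layer sizes as in the example above: 784 inputs, one hidden
	// layer of 100 neurons, 10 outputs.
	sizes := []int{784, 100, 10}

	// Each pair of consecutive layers needs a weight matrix of
	// shape (next layer size) x (previous layer size).
	for i := 0; i < len(sizes)-1; i++ {
		fmt.Printf("weights[%d]: %d x %d\n", i, sizes[i+1], sizes[i])
	}
}
```

With these sizes, the network holds a 100×784 matrix between input and hidden layer, and a 10×100 matrix between hidden and output layer.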

Learning

For the forward propagation, you can either supply your own neuron activation function with p.ComputeFromInputCustom, or use p.ComputeFromInput, which applies a sigmoid. For the backpropagation, you pass the expected value of each output neuron and the learning rate eta; the call returns the mean squared error.

var (
    expected   []float64
    eta        float64
    activation func(input float64) float64
    mse        float64
)
activation = func(input float64) float64 {
    return 1 / (1 + math.Exp(-input))
}
expected = make([]float64, 10)
eta = 0.3

// Init input layer here
// Modify expected values

p.ComputeFromInputCustom(activation)
mse = p.Backpropagation(expected, eta)
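To make the quantities in the snippet above concrete, here is a small self-contained sketch (not using the library) of the sigmoid activation and the mean squared error that the backpropagation step reports; the helper names are illustrative only:

```go
package main

import (
	"fmt"
	"math"
)

// sigmoid is the standard logistic activation, as in the example above.
func sigmoid(x float64) float64 {
	return 1 / (1 + math.Exp(-x))
}

// mse returns the mean squared error between outputs and expected values.
func mse(output, expected []float64) float64 {
	var sum float64
	for i := range output {
		d := output[i] - expected[i]
		sum += d * d
	}
	return sum / float64(len(output))
}

func main() {
	// Two output neurons: pre-activations 2.0 and -1.0,
	// expected one-hot target {1, 0}.
	output := []float64{sigmoid(2.0), sigmoid(-1.0)}
	expected := []float64{1, 0}
	fmt.Printf("mse = %.4f\n", mse(output, expected))
}
```

A lower mse after repeated forward/backward passes indicates the network is fitting the expected outputs.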

Testing the neural network

To test the neural network and get the recognition rate, use TryRecognitionCustom (to use your own activation function) or TryRecognition (to use a sigmoid). The return value is the recognition rate (between 0 and 1).

var (
    rate float64
)
rate = p.TryRecognitionCustom(activation)
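A recognition rate like the one returned above is simply the fraction of test samples whose predicted class (the output neuron with the highest activation) matches the label. A self-contained sketch of that computation, with hypothetical outputs and labels:

```go
package main

import "fmt"

// argmax returns the index of the largest value, i.e. the predicted class.
func argmax(v []float64) int {
	best := 0
	for i, x := range v {
		if x > v[best] {
			best = i
		}
	}
	return best
}

func main() {
	// Hypothetical network outputs for 4 test samples, and their labels.
	outputs := [][]float64{
		{0.1, 0.9}, {0.8, 0.2}, {0.3, 0.7}, {0.6, 0.4},
	}
	labels := []int{1, 0, 0, 0}

	correct := 0
	for i, out := range outputs {
		if argmax(out) == labels[i] {
			correct++
		}
	}
	rate := float64(correct) / float64(len(outputs))
	fmt.Printf("recognition rate = %.2f\n", rate) // 3 of 4 predictions correct
}
```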

A full example, using the MNIST dataset, is available here.

More documentation here.

TODO

  • More GoDoc