A library implementing a multi-layer perceptron in Go.

Three parameters are necessary to initialize the perceptron:
- size of the input layer (int)
- sizes of the hidden layers (slice of int)
- size of the output layer (int)
```go
var (
	p                 gct.Perceptron
	inputLayerSize    int
	hiddenLayersSizes []int
	outputLayerSize   int
)

inputLayerSize = 784
hiddenLayersSizes = []int{100}
outputLayerSize = 10

p.Init(inputLayerSize, hiddenLayersSizes, outputLayerSize)
```
For the forward propagation, you can either implement your own neuron activation function and pass it to `p.ComputeFromInputCustom`, or use `p.ComputeFromInput`, which applies a sigmoid function.
To run the backpropagation, pass as parameters the expected value of each output neuron and the learning rate `eta`. The method returns the mean squared error.
```go
var (
	expected   []float64
	eta        float64
	activation func(input float64) float64
	mse        float64
)

activation = func(input float64) float64 {
	return 1 / (1 + math.Exp(-input))
}

expected = make([]float64, 10)
eta = 0.3

// Init input layer here
// Modify expected values

p.ComputeFromInputCustom(activation)
mse = p.Backpropagation(expected, eta)
```
To test the neural network and get the recognition rate, use `TryRecognitionCustom` (with your own activation function) or `TryRecognition` (with a sigmoid). The return value is the recognition rate, between 0 and 1.
```go
var rate float64

rate = p.TryRecognitionCustom(activation)
```
A full example using the MNIST dataset is available here.

More documentation:

- GoDoc