MultiLayer Perceptron

This is a simple multilayer perceptron neural network class

I wrote everything from scratch with a VERY BASIC understanding of neural networks, so nothing about this is optimized, efficient, or well done

It can accept any number of inputs, have any number of hidden layers (each with any number of neurons), and produce any number of outputs. It uses tanh (or sigmoid if you want) as the activation function, and it is very slow because I implemented my matrix class super badly
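As an illustration only (this is not the code in this repo, and all the names are placeholders), here is a minimal sketch in Python/NumPy of what a forward pass with arbitrary layer sizes and a tanh or sigmoid activation looks like:

```python
import numpy as np

def tanh(x):
    return np.tanh(x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(x, weights, biases, activation=tanh):
    """One feed-forward pass: each layer computes activation(W @ a + b)."""
    a = x
    for W, b in zip(weights, biases):
        a = activation(W @ a + b)
    return a

# Example: 2 inputs -> a hidden layer of 3 neurons -> 1 output.
rng = np.random.default_rng(0)
sizes = [2, 3, 1]
weights = [rng.standard_normal((m, n)) for n, m in zip(sizes[:-1], sizes[1:])]
biases = [rng.standard_normal(m) for m in sizes[1:]]
print(forward(np.array([0.0, 1.0]), weights, biases))
```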

It's a feed-forward network that 'learns' through backpropagation, using supervised learning (or possibly reinforcement learning)

It has only really been tested by learning to solve XOR, but I plan on having it do the classic handwritten digit recognition task and also having it learn to play some games
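For reference, here is a self-contained sketch (again in Python/NumPy, not this repo's code, with arbitrary hyperparameters) of training that kind of network on XOR with plain backpropagation and per-sample gradient descent:

```python
import numpy as np

rng = np.random.default_rng(1)
sizes = [2, 4, 1]          # 2 inputs, 4 hidden neurons, 1 output
weights = [rng.standard_normal((m, n)) * 0.5 for n, m in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(m) for m in sizes[1:]]

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)
lr = 0.2

for epoch in range(10000):
    for x, y in zip(X, Y):
        # Forward pass, keeping every layer's activation for the backward pass.
        activations = [x]
        for W, b in zip(weights, biases):
            activations.append(np.tanh(W @ activations[-1] + b))

        # Backward pass: output-layer delta, then propagate it towards the input.
        delta = (activations[-1] - y) * (1 - activations[-1] ** 2)
        for i in reversed(range(len(weights))):
            grad_W = np.outer(delta, activations[i])
            grad_b = delta
            if i > 0:  # compute the next delta before this layer's weights change
                delta = (weights[i].T @ delta) * (1 - activations[i] ** 2)
            weights[i] -= lr * grad_W
            biases[i] -= lr * grad_b

# After training, the outputs should be close to 0, 1, 1, 0.
for x in X:
    a = x
    for W, b in zip(weights, biases):
        a = np.tanh(W @ a + b)
    print(x, "->", round(float(a[0]), 3))
```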

Resources I used to make this: