Python code to create an infinitely extendable vanilla feed forward neural network from scratch


Neural_Network_From_Scratch

The code is written using only NumPy for calculations and matplotlib.pyplot for plotting.

The code can be run with the following steps:

```shell
git clone https://github.com/amokhvarma/Neural_Network_From_Scratch.git
cd Neural_Network_From_Scratch
python3 Neural_Network.py
```

(In a Jupyter/Colab notebook, prefix each command with `!` and use `%cd` instead of `cd` so the directory change persists.)

To install NumPy:

```shell
pip3 install numpy
```

The hyperparameters available to you are:

```python
l = 3                                    # number of layers
layer_neuron = np.array([784, 120, 10])  # neurons per layer; the first must be 784 and the last 10
alpha = 0.11                             # learning rate
lambd = 0.1                              # regularisation constant ("lambda" is a reserved word in Python)
```

Change the variables above to extend the neural network to any number of layers.
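As a sketch of how these hyperparameters determine the network's shape, the following builds one weight matrix per pair of adjacent layers and runs a forward pass (sigmoid activations are assumed here for illustration; `Neural_Network.py` may differ in detail):

```python
import numpy as np

l = 3                                    # number of layers
layer_neuron = np.array([784, 120, 10])  # neurons per layer

rng = np.random.default_rng(0)
# One weight matrix and one bias vector per pair of adjacent layers.
weights = [rng.standard_normal((layer_neuron[i + 1], layer_neuron[i])) * 0.01
           for i in range(l - 1)]
biases = [np.zeros((layer_neuron[i + 1], 1)) for i in range(l - 1)]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x):
    """Propagate a (784, 1) input through every layer."""
    a = x
    for W, b in zip(weights, biases):
        a = sigmoid(W @ a + b)
    return a  # shape (10, 1), one entry per output class

out = forward(rng.standard_normal((784, 1)))
```

Because the loop is driven entirely by `layer_neuron`, adding a layer is just a matter of inserting another entry into that array and incrementing `l`.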

Note

  • The input size must be (784, 1), not (28, 28).
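For example, a 28×28 MNIST-style image has to be flattened into a column vector before it is fed to the network:

```python
import numpy as np

img = np.arange(28 * 28, dtype=float).reshape(28, 28)  # a stand-in 28x28 image
x = img.reshape(784, 1)  # flatten to the (784, 1) column vector the network expects
```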


Improvements to this can be made by:

  1. Changing to batch gradient descent.
  2. Learning-rate scheduling.
  3. Grid/linear search for hyperparameter tuning.
  4. Weight initialisation using special methods like Xavier initialisation.

     A fairly detailed analysis and intuition for weight initialisation: https://towardsdatascience.com/weight-initialization-in-neural-networks-a-journey-from-the-basics-to-kaiming-954fb9b47c79
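On the last point, Xavier (Glorot) initialisation scales each weight matrix by the fan-in and fan-out of its layer so that activation variance stays roughly constant through the network. A minimal sketch (function names here are illustrative, not from the repository):

```python
import numpy as np

def xavier_init(fan_in, fan_out, rng=None):
    """Glorot-uniform initialisation: draw weights from U(-limit, limit)
    with limit = sqrt(6 / (fan_in + fan_out))."""
    if rng is None:
        rng = np.random.default_rng()
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_out, fan_in))

layer_neuron = [784, 120, 10]
weights = [xavier_init(layer_neuron[i], layer_neuron[i + 1])
           for i in range(len(layer_neuron) - 1)]
```

This would replace the plain random initialisation with one whose spread shrinks for wide layers, which typically speeds up early training.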
