RNN from scratch

An implementation of the Recurrent Neural Network architecture from the ground up using NumPy.

For demonstration purposes, the model is trained on a simple alphabet sequence. The backpropagation through time algorithm is implemented with a hard-coded gradient descent optimizer and negative log-likelihood loss function, since this project serves educational purposes. By no means should it be used in production.
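
The sketch below illustrates the kind of computation involved: a NumPy forward pass over a character sequence with a tanh recurrence, softmax output, and negative log-likelihood loss. The dimensions, weight names, and the "abcde" → "bcdef" next-letter task are assumptions for illustration, not the repository's actual code.

```python
import numpy as np

# Hypothetical dimensions for an alphabet task: 26 one-hot inputs, small hidden state.
vocab_size, hidden_size = 26, 32

rng = np.random.default_rng(0)
W_xh = rng.normal(0, 0.01, (hidden_size, vocab_size))   # input -> hidden
W_hh = rng.normal(0, 0.01, (hidden_size, hidden_size))  # hidden -> hidden
W_hy = rng.normal(0, 0.01, (vocab_size, hidden_size))   # hidden -> output
b_h = np.zeros(hidden_size)
b_y = np.zeros(vocab_size)

def one_hot(idx):
    v = np.zeros(vocab_size)
    v[idx] = 1.0
    return v

def forward(inputs, targets):
    """Run the sequence and accumulate the negative log-likelihood loss."""
    h = np.zeros(hidden_size)
    loss = 0.0
    for x_idx, y_idx in zip(inputs, targets):
        x = one_hot(x_idx)
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)   # recurrent hidden-state update
        logits = W_hy @ h + b_y
        probs = np.exp(logits - logits.max())
        probs /= probs.sum()                      # softmax over the alphabet
        loss += -np.log(probs[y_idx])             # negative log-likelihood
    return loss / len(inputs)

# Predict the next letter at each position: "abcde" -> "bcdef".
inputs = [ord(c) - ord('a') for c in "abcde"]
targets = [ord(c) - ord('a') for c in "bcdef"]
print(forward(inputs, targets))
```

Backpropagation through time then differentiates this loss with respect to the three weight matrices by unrolling the recurrence over the sequence and summing gradients across time steps.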

During training, the model state with the lowest loss and highest accuracy is saved and later used for inference. A pretrained model is included in the repository and can be used in the same way as the inference part of train.py.
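
A minimal sketch of that checkpointing idea follows, assuming the parameters live in a plain dict of NumPy arrays: snapshot the parameters whenever an epoch beats the best loss so far, persist the snapshot, and reload it for inference. The parameter names, file name, and stand-in loss values are illustrative, not the repository's actual API.

```python
import numpy as np

rng = np.random.default_rng(0)
params = {"W_xh": rng.normal(size=(8, 26)), "W_hh": rng.normal(size=(8, 8))}

best_loss = np.inf
best_state = None
for epoch, loss in enumerate([1.9, 1.2, 1.4, 0.9]):      # stand-in epoch losses
    if loss < best_loss:
        best_loss = loss
        best_state = {k: v.copy() for k, v in params.items()}  # snapshot best params

np.savez("best_model.npz", **best_state)                  # persist the best snapshot

restored = np.load("best_model.npz")                       # reload for inference
params = {k: restored[k] for k in restored.files}
```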
