# mynet

My implementation of an MLP (multi-layer perceptron) for self-education purposes, using raw NumPy.

Layers implemented: Linear, ReLU activation, BatchNorm. A rough sketch of the layer interface is shown below.
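A minimal sketch of how Linear and ReLU layers can be written in raw NumPy with a `forward`/`backward` interface. This is illustrative only; the class and attribute names are assumptions, not the repository's actual code, and BatchNorm is omitted for brevity.

```python
import numpy as np

class Linear:
    """Fully connected layer: y = x @ W + b."""
    def __init__(self, in_features, out_features, rng=np.random.default_rng(0)):
        # He-style initialization; the repository may use a different scheme.
        self.W = rng.normal(0.0, np.sqrt(2.0 / in_features), (in_features, out_features))
        self.b = np.zeros(out_features)

    def forward(self, x):
        self.x = x                      # cache input for the backward pass
        return x @ self.W + self.b

    def backward(self, grad_out):
        self.dW = self.x.T @ grad_out   # gradient w.r.t. weights
        self.db = grad_out.sum(axis=0)  # gradient w.r.t. bias
        return grad_out @ self.W.T      # gradient w.r.t. the layer input

class ReLU:
    """Element-wise max(0, x) activation."""
    def forward(self, x):
        self.mask = x > 0
        return x * self.mask

    def backward(self, grad_out):
        return grad_out * self.mask     # pass gradient only where input was positive
```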

Optimizer: Adam with mini-batch gradient descent.
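For reference, a minimal NumPy sketch of the Adam update rule applied inside a mini-batch loop. The `Adam` class, its parameter names, and the training-loop shape are assumptions for illustration, not the repository's implementation.

```python
import numpy as np

class Adam:
    """Adam: per-parameter adaptive learning rates from first/second moment estimates."""
    def __init__(self, params, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
        self.params = params                          # list of numpy arrays, updated in place
        self.lr, self.beta1, self.beta2, self.eps = lr, beta1, beta2, eps
        self.m = [np.zeros_like(p) for p in params]   # first-moment (mean) estimates
        self.v = [np.zeros_like(p) for p in params]   # second-moment (uncentered variance) estimates
        self.t = 0                                    # time step for bias correction

    def step(self, grads):
        self.t += 1
        for p, g, m, v in zip(self.params, grads, self.m, self.v):
            m[:] = self.beta1 * m + (1 - self.beta1) * g
            v[:] = self.beta2 * v + (1 - self.beta2) * g * g
            m_hat = m / (1 - self.beta1 ** self.t)    # bias-corrected first moment
            v_hat = v / (1 - self.beta2 ** self.t)    # bias-corrected second moment
            p -= self.lr * m_hat / (np.sqrt(v_hat) + self.eps)
```

A typical mini-batch loop shuffles the training set each epoch, slices it into batches, runs the forward and backward passes to get gradients, and then calls `step(grads)` once per batch.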