# mynet

My implementation of an MLP for self-education purposes, using raw NumPy.

Layer types implemented:
- Linear
- ReLU activation
- BatchNorm

Optimizer: Adam with mini-batches (sketches of both pieces below)
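
To illustrate the layer structure, here is a minimal sketch of a Linear and a ReLU layer in raw NumPy. The class names and the `forward`/`backward` interface are my assumption for illustration, not necessarily this repo's actual API:

```python
import numpy as np

class Linear:
    """Fully connected layer: y = x @ W + b (hypothetical interface)."""
    def __init__(self, n_in, n_out):
        # He-style initialization, a common choice before ReLU (an assumption here)
        self.W = np.random.randn(n_in, n_out) * np.sqrt(2.0 / n_in)
        self.b = np.zeros(n_out)

    def forward(self, x):
        self.x = x                      # cache input for the backward pass
        return x @ self.W + self.b

    def backward(self, grad_out):
        self.dW = self.x.T @ grad_out   # gradient w.r.t. weights
        self.db = grad_out.sum(axis=0)  # gradient w.r.t. bias
        return grad_out @ self.W.T      # gradient w.r.t. layer input

class ReLU:
    """Element-wise max(0, x) activation."""
    def forward(self, x):
        self.mask = x > 0               # remember where the input was positive
        return x * self.mask

    def backward(self, grad_out):
        return grad_out * self.mask     # gradient passes only through positive inputs
```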
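The Adam update applies the standard moment-estimate formulas to gradients computed on each mini-batch. A minimal NumPy sketch, again with a hypothetical interface rather than this repo's exact one:

```python
import numpy as np

class Adam:
    """Adam optimizer over a list of parameter arrays (hypothetical interface)."""
    def __init__(self, params, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
        self.params = params
        self.lr, self.beta1, self.beta2, self.eps = lr, beta1, beta2, eps
        self.m = [np.zeros_like(p) for p in params]  # first-moment estimates
        self.v = [np.zeros_like(p) for p in params]  # second-moment estimates
        self.t = 0                                   # step counter for bias correction

    def step(self, grads):
        self.t += 1
        for p, g, m, v in zip(self.params, grads, self.m, self.v):
            m[:] = self.beta1 * m + (1 - self.beta1) * g        # update first moment
            v[:] = self.beta2 * v + (1 - self.beta2) * g * g    # update second moment
            m_hat = m / (1 - self.beta1 ** self.t)              # bias-corrected mean
            v_hat = v / (1 - self.beta2 ** self.t)              # bias-corrected variance
            p -= self.lr * m_hat / (np.sqrt(v_hat) + self.eps)  # in-place parameter step
```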