# Dropout-and-Batch-Normalization

In this lesson, I explored two special types of layers in deep learning: the Dropout layer and the Batch Normalization layer. These layers contain no neurons themselves, but they add essential functionality to neural networks and appear in many modern architectures.
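As a quick illustration, here is a minimal sketch of how these layers might be interleaved in a Keras model (the library used in the course); the layer widths, dropout rate, and input shape are arbitrary choices for demonstration:

```python
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    layers.Dense(64, activation="relu", input_shape=(10,)),
    layers.Dropout(rate=0.3),        # randomly zeroes 30% of the inputs during training
    layers.BatchNormalization(),     # normalizes the previous layer's activations
    layers.Dense(64, activation="relu"),
    layers.Dropout(rate=0.3),
    layers.BatchNormalization(),
    layers.Dense(1),
])
```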

Feel free to check out the whole course here: https://www.kaggle.com/learn/intro-to-deep-learning/course