
Dropout-and-Batch-Normalization

In this lesson, I explored two special types of layers in deep learning: the Dropout layer and the Batch Normalization layer. These layers contain no neurons of their own, but they add functionality that modern architectures rely on: Dropout helps prevent overfitting by randomly disabling a fraction of a layer's outputs during training, and Batch Normalization rescales activations to make training more stable and faster.
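
Below is a minimal sketch of how the two layers are typically slotted between Dense layers in a Keras model (the framework used in the Kaggle course). The layer widths, dropout rate, and input shape are illustrative placeholders, not values taken from the lesson.

```python
# Minimal sketch: Dropout and BatchNormalization between Dense layers.
# Assumes TensorFlow/Keras; all sizes and rates below are placeholders.
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    layers.Dense(128, activation="relu", input_shape=[11]),
    layers.Dropout(0.3),          # randomly zero 30% of the previous layer's outputs during training
    layers.BatchNormalization(),  # normalize activations to stabilize and speed up training
    layers.Dense(64, activation="relu"),
    layers.Dropout(0.3),
    layers.BatchNormalization(),
    layers.Dense(1),              # single output, e.g. for a regression target
])

model.compile(optimizer="adam", loss="mae")
model.summary()
```

Note that neither layer adds trainable neurons in the usual sense: Dropout has no weights at all, and Batch Normalization only learns a scale and shift for each feature it normalizes.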

Feel free to check out the whole course here: https://www.kaggle.com/learn/intro-to-deep-learning/course
