Question about initialization, Unit 4.2 (Part 1-3) #50
Asked by jihobak in Lecture videos and quizzes · Answered by rasbt
Unit 4.2 Multilayer Neural Networks (Part 1-3). Suppose the activation function is ReLU. If we initialize all weights and biases to zero, then every hidden node has a value of zero after one forward pass. Here's my question: can I say that no parameters (weights or biases) change even after backpropagation? My reasoning is that backprop is a chain of multiplications, and there is an all-zero layer in the middle.
Answered by rasbt on May 19, 2023:
Yeah, that's a fair point. Since all the values are 0, even the bias units, there won't be an update, because all the gradients are 0.
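A small numerical check of this, as a sketch: a hypothetical 2-layer MLP (sizes chosen arbitrarily) with all parameters initialized to zero and a ReLU hidden layer, with gradients derived by hand for a squared-error loss. Because the hidden activations are all zero and the output weights are all zero, every weight gradient and the hidden-bias gradient come out exactly zero. One nuance: the output-layer bias can still receive a nonzero gradient directly from the loss, but since no weights ever update, the hidden layer stays dead and the network cannot learn anything beyond that single bias.

```python
import numpy as np

# Hypothetical sizes: 3 inputs, 4 hidden units (ReLU), 1 output.
x = np.array([[1.0, 2.0, 3.0]])  # one input sample
y = np.array([[1.0]])            # target

# All-zero initialization of weights and biases
W1 = np.zeros((3, 4)); b1 = np.zeros(4)
W2 = np.zeros((4, 1)); b2 = np.zeros(1)

# Forward pass
z1 = x @ W1 + b1          # all zeros
a1 = np.maximum(0.0, z1)  # ReLU(0) = 0, so the hidden layer is all zeros
out = a1 @ W2 + b2        # all zeros

# Backward pass for squared-error loss L = 0.5 * (out - y)**2
dout = out - y              # nonzero in general
dW2 = a1.T @ dout           # zero, because a1 is all zeros
db2 = dout.sum(axis=0)      # the one exception: nonzero gradient on the output bias
da1 = dout @ W2.T           # zero, because W2 is all zeros
dz1 = da1 * (z1 > 0)        # zero (taking ReLU'(0) = 0)
dW1 = x.T @ dz1             # zero
db1 = dz1.sum(axis=0)       # zero

print(np.all(dW1 == 0), np.all(db1 == 0), np.all(dW2 == 0))  # all weight/hidden-bias grads are zero
```

So the update is (almost entirely) zero, and, equally important, it stays that way: since W2 and a1 remain zero after every step, no gradient signal ever reaches the hidden layer.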
Answer selected by rasbt