
If all of a neural network's parameters are initialized to zero, no gradient signal will back-propagate through the network when the ReLU activation function is used. Note that the ReLU activation function is given by σ(x) = max(0, x).
A. True
B. False


Answer: A. True

With every weight and bias set to zero, each pre-activation is z = Wx + b = 0, so every ReLU outputs σ(0) = 0 and its derivative is taken as σ'(0) = 0 (the convention in most frameworks). During back-propagation, the gradient reaching any hidden layer is multiplied both by σ'(z) = 0 and by the zero weight matrices of the layers above it, so no gradient signal propagates backward and the hidden-layer parameters never update.
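As a quick check, here is a minimal sketch in PyTorch (an assumption, since the question names no framework) that zero-initializes a small two-layer MLP, runs one backward pass with an MSE loss, and prints the largest gradient magnitude per parameter:

```python
import torch
import torch.nn as nn

# Illustrative two-layer MLP with a ReLU hidden layer (sizes are arbitrary).
model = nn.Sequential(
    nn.Linear(4, 8),
    nn.ReLU(),
    nn.Linear(8, 1),
)

# Zero-initialize every weight and bias.
with torch.no_grad():
    for p in model.parameters():
        p.zero_()

x = torch.randn(16, 4)  # random inputs
y = torch.randn(16, 1)  # random targets
loss = nn.functional.mse_loss(model(x), y)
loss.backward()

# Hidden-layer gradients are zero (gated by ReLU'(0) = 0 and by the
# zero output weights); the output weight's gradient is also zero
# because the ReLU emits 0. Only the output bias, which sits after
# the last ReLU, receives a nonzero gradient.
for name, p in model.named_parameters():
    print(name, p.grad.abs().max().item())
```

Every parameter upstream of the ReLU reports a gradient of exactly zero, which is the point of the question: no gradient signal is back-propagated through a ReLU whose input is zero.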