Answer:
A. To scale the output of each layer
Explanation:
Batch normalization normalizes the output of the previous activation layer by subtracting the batch mean and dividing by the batch standard deviation. It then applies two trainable parameters, gamma (scale) and beta (shift), so the network can learn the appropriate scale for each layer's output, which is why option A is correct.
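
For concreteness, here is a minimal NumPy sketch of the batch norm forward pass. The function name `batch_norm`, the input shapes, and the `eps` default are illustrative assumptions, not any specific library's API:

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Normalize activations over the batch, then scale and shift."""
    mu = x.mean(axis=0)                     # per-feature batch mean
    var = x.var(axis=0)                     # per-feature batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)   # zero mean, unit variance
    return gamma * x_hat + beta             # learnable scale (gamma) and shift (beta)

# Example: a batch of 4 samples with 3 features each
x = np.random.randn(4, 3) * 10 + 5
y = batch_norm(x, gamma=np.ones(3), beta=np.zeros(3))
print(y.mean(axis=0))  # approximately 0 for each feature
print(y.std(axis=0))   # approximately 1 for each feature
```

With gamma = 1 and beta = 0 the output is simply the normalized activations; during training these parameters are learned, letting the network rescale each layer's output as needed.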