In the gradient descent technique, we choose an alpha value (the learning rate) when computing the parameters theta_0 and theta_1. What will happen if we assign a very small value to alpha?
1) The model computations may take a long time to converge.
2) The model may never converge.
3) There will be no need to iterate.
4) The speed of the computations will be very high.
Answer:
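The correct choice is 1): a very small alpha makes each update to theta_0 and theta_1 tiny, so gradient descent still moves toward the minimum of the cost function, but it needs many more iterations, meaning the computations take a long time to converge. (It is a learning rate that is too large, not too small, that can cause the model to never converge.)

As a rough illustration only, the Python sketch below fits a one-variable linear regression on made-up data and compares a moderate learning rate with a very small one. The toy dataset, the gradient_descent helper, the iteration cap, and the convergence tolerance are assumptions for this sketch, not part of the original question.

import numpy as np

# Toy dataset: y is roughly 2*x + 1 (illustrative values only).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.05, size=x.shape)

def gradient_descent(alpha, max_iters=100_000, tol=1e-6):
    """Fit h(x) = theta_0 + theta_1 * x by batch gradient descent.

    Returns (theta_0, theta_1, iterations_used); using all max_iters
    iterations means the parameters had not yet converged to tol.
    """
    theta0, theta1 = 0.0, 0.0
    for i in range(max_iters):
        err = theta0 + theta1 * x - y          # prediction error on each example
        grad0 = err.mean()                     # d(cost)/d(theta_0)
        grad1 = (err * x).mean()               # d(cost)/d(theta_1)
        step0, step1 = alpha * grad0, alpha * grad1
        theta0 -= step0
        theta1 -= step1
        if max(abs(step0), abs(step1)) < tol:  # updates have become negligible
            return theta0, theta1, i + 1
    return theta0, theta1, max_iters

for alpha in (0.5, 1e-4):
    t0, t1, iters = gradient_descent(alpha)
    print(f"alpha={alpha}: theta_0={t0:.3f}, theta_1={t1:.3f}, iterations={iters}")

With alpha = 0.5 the loop typically stops after a few hundred iterations, while with alpha = 1e-4 the parameters are still far from converged when the iteration cap is reached, which is exactly the slow-convergence behaviour described in option 1.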