Gradient Descent for Linear Regression
- Naveen
Gradient Descent is one of the most commonly used iterative optimization algorithms in machine learning, used to train machine learning and deep learning models. It helps find a local minimum of a function. The best way to define the local minima or local maxima of a function using gradient descent is as…
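
As a rough illustration of the idea behind the article's title (this sketch is not from the original post; the toy data, learning rate, and iteration count are assumptions), the snippet below fits a line y = w*x + b by repeatedly stepping the parameters against the gradient of the mean squared error:

```python
import numpy as np

# Toy data: y is roughly 3*x + 2 plus noise (illustrative values only).
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=50)
y = 3.0 * X + 2.0 + rng.normal(0, 1.0, size=50)

# Parameters of the line y_hat = w*x + b, starting from zero.
w, b = 0.0, 0.0
learning_rate = 0.01
n = len(X)

for step in range(1000):
    y_hat = w * X + b
    error = y_hat - y
    # Gradients of the mean squared error with respect to w and b.
    grad_w = (2.0 / n) * np.dot(error, X)
    grad_b = (2.0 / n) * error.sum()
    # Take a small step in the direction opposite the gradient.
    w -= learning_rate * grad_w
    b -= learning_rate * grad_b

print(f"learned w = {w:.2f}, b = {b:.2f}")  # should end up near 3 and 2
```

With these assumed settings the loop converges close to the true slope and intercept; a larger learning rate can overshoot and diverge, which is the usual trade-off when tuning gradient descent.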