Guide to Gradient Descent: Working Principle and its Variants
One of the core ideas of machine learning is that models learn via optimization. Gradient descent is an iterative optimization algorithm: the "gradient" is the slope of the loss function at the current parameters, and "descent" means moving downhill along that slope. The idea is that the error, or loss, decreases in small steps until it reaches a minimum. The …
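The update rule described above can be sketched in a few lines. This is a minimal illustration, not code from the article: the quadratic loss, starting point, learning rate, and step count are all assumptions chosen for clarity.

```python
# Minimal gradient descent sketch: minimize L(x) = (x - 3)^2.
# The loss, learning rate, and iteration count are illustrative choices.

def grad(x):
    # dL/dx for L(x) = (x - 3)^2
    return 2 * (x - 3)

x = 0.0    # starting point
lr = 0.1   # learning rate (step size)
for _ in range(100):
    x -= lr * grad(x)  # take a small step opposite the gradient

print(round(x, 4))  # converges near the minimum at x = 3
```

Each iteration moves `x` a small step against the gradient, so the loss shrinks until `x` settles near the minimizer.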