"It's about to go down! 👇"
Gradient Descent Algorithm: 5 Parts
In this series, we are going to learn one of the most well-known optimization algorithms: the Gradient Descent algorithm. We will go through the series gradually, so that readers are not overwhelmed by details, especially the mathematical notation and formulas. First, we will start with Batch Gradient Descent. Then, step by step, we will customize the algorithm until we reach one of its most advanced variants, Adam.
Here is the list of algorithms that we are going to cover:
By the end of this series, we will be able to write those optimization algorithms from scratch in Python.
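To give a taste of where the series starts, here is a minimal sketch of Batch Gradient Descent of the kind we will build up from. The function name, learning rate, and the quadratic example objective are illustrative choices, not part of the series itself.

```python
import numpy as np

def batch_gradient_descent(grad, w0, lr=0.1, n_steps=100):
    """Minimal batch gradient descent sketch.

    grad    : function returning the gradient of the objective at w
    w0      : initial parameter values
    lr      : learning rate (step size)
    n_steps : number of update steps
    """
    w = np.asarray(w0, dtype=float)
    for _ in range(n_steps):
        # Core update rule: move against the gradient.
        w = w - lr * grad(w)
    return w

# Example: minimize f(w) = (w - 3)^2, whose gradient is 2 * (w - 3).
# The minimum is at w = 3, so the iterates should converge there.
w_min = batch_gradient_descent(lambda w: 2 * (w - 3), w0=[0.0])
print(w_min)  # close to [3.0]
```

The later parts of the series modify exactly this update step: mini-batching changes what `grad` is computed on, and momentum-based methods such as Adam change how the step itself is formed.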