Optimized Gradient Descent
Putting some turbo boost into our gradient descent code
Series
It's about to go down! 👇
Bijon Setyawan Raya
January 11, 2022
2 mins
Introduction
Linear Regression
Mathematics of Gradient Descent
Batch Gradient Descent
Mini Batch Gradient Descent
Stochastic Gradient Descent
In this series, we are going to learn one of the most well-known optimization algorithms: Gradient Descent. We will go through the material gradually, so that readers do not get overwhelmed by the details, especially the mathematical notation and formulas.
In the beginning, we are going to discuss what Linear Regression is and how to use it. Even though it is a simple concept and many resources cover it, it is still a good idea to refresh your memory before delving into Gradient Descent. After that, we are going to learn the inner workings of Gradient Descent. Then, we will implement its simplest form, Batch Gradient Descent. By tweaking the formula a bit and adding complexity to the algorithm little by little, we can work our way all the way up to Adam and many of its variations.
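To give a taste of what we will build, here is a minimal sketch of Batch Gradient Descent fitting a toy linear regression model. The function name, learning rate, and toy data below are illustrative assumptions for this preview, not the exact code from the later posts:

```python
import numpy as np

def batch_gradient_descent(x, y, learning_rate=0.01, epochs=1000):
    """Fit y ≈ w * x + b by repeatedly stepping against the gradient of the MSE."""
    w, b = 0.0, 0.0
    n = len(x)
    for _ in range(epochs):
        y_pred = w * x + b
        error = y_pred - y
        # Gradients of the mean squared error with respect to w and b
        dw = (2 / n) * np.dot(error, x)
        db = (2 / n) * error.sum()
        # Move the parameters a small step in the direction that reduces the loss
        w -= learning_rate * dw
        b -= learning_rate * db
    return w, b

# Toy data drawn from y = 3x + 1
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 3 * x + 1
w, b = batch_gradient_descent(x, y)
print(w, b)  # should approach 3 and 1
```

The same update rule, with different choices of how much data to use per step, is what separates the batch, mini batch, and stochastic variants covered in this series.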
The series outline above lists the topics and algorithms that we are going to cover.
By the end of this series, we will be able to write optimization algorithms from scratch. Since this is my first attempt at scientific writing, please send me your feedback and suggestions via email. Also, if you notice any mistakes or typos in my posts, please do let me know.