
Imad Dabbura

About Me

Imad Dabbura is a Data Scientist at Baylor Scott and White Health. He has many years of experience in predictive analytics, having worked in a variety of industries such as consumer goods, real estate, marketing, and healthcare. Among other things, Imad is interested in artificial intelligence and machine learning. He writes articles on machine learning and AI and contributes to open source, including publishing his educational notebooks on GitHub.

Gradient Descent Algorithm and Its Variants

In this note, we’ll cover the gradient descent algorithm and its variants: batch gradient descent, mini-batch gradient descent, and stochastic gradient descent. Gradient descent is the most common optimization algorithm in machine learning and deep learning. It is a first-order optimization algorithm, meaning it takes only the first derivative of the loss function into account when updating the parameters. Let’s first see how gradient descent and its associated steps work on logistic regression before going into the details of its variants.
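
To make the first-order update concrete, below is a minimal sketch of batch gradient descent applied to logistic regression. It is an illustration under stated assumptions (NumPy, binary labels in {0, 1}, a fixed learning rate), not the article's exact implementation; the function and variable names are chosen for this example.

    import numpy as np

    def sigmoid(z):
        # Logistic (sigmoid) function.
        return 1.0 / (1.0 + np.exp(-z))

    def batch_gradient_descent(X, y, lr=0.1, n_iters=1000):
        # X: (n_samples, n_features) design matrix.
        # y: (n_samples,) binary labels in {0, 1}.
        n_samples, n_features = X.shape
        w = np.zeros(n_features)
        b = 0.0
        for _ in range(n_iters):
            # Forward pass: predicted probabilities.
            p = sigmoid(X @ w + b)
            # Gradients of the average cross-entropy loss w.r.t. w and b.
            grad_w = X.T @ (p - y) / n_samples
            grad_b = np.mean(p - y)
            # First-order update: step opposite the gradient direction.
            w -= lr * grad_w
            b -= lr * grad_b
        return w, b

    # Usage on a toy linearly separable dataset.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = (X[:, 0] + X[:, 1] > 0).astype(float)
    w, b = batch_gradient_descent(X, y)

Because each update uses the full dataset, this sketch corresponds to the batch variant; the mini-batch and stochastic variants differ only in how many examples they use to estimate the gradient at each step.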
