Machine Learning and Deep Learning Optimizers Implementation

In this course, you will learn:

1- How to implement the batch (vanilla) gradient descent (GD) optimizer to obtain the optimal model parameters of single- and multi-variable linear regression (LR) models.

2- How to implement mini-batch and stochastic GD for the same single- and multi-variable LR models (a minimal sketch of both loops follows this list).
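
To give a concrete picture of what the guided notebook builds up to, here is a minimal sketch of both update loops for LR, assuming NumPy and a feature matrix X whose first column of ones absorbs the intercept; the function names (batch_gradient_descent, sgd) and default values are illustrative, not the course's actual code.

```python
import numpy as np

def batch_gradient_descent(X, y, lr=0.01, n_iters=1000):
    """Vanilla GD: every update uses the gradient over ALL m samples."""
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(n_iters):
        grad = X.T @ (X @ theta - y) / m   # gradient of the MSE cost
        theta -= lr * grad
    return theta

def sgd(X, y, lr=0.01, n_epochs=50, batch_size=1):
    """Stochastic (batch_size=1) or mini-batch GD: one update per (mini-)batch."""
    m, n = X.shape
    theta = np.zeros(n)
    rng = np.random.default_rng(0)
    for _ in range(n_epochs):
        idx = rng.permutation(m)           # reshuffle the samples each epoch
        for start in range(0, m, batch_size):
            b = idx[start:start + batch_size]
            grad = X[b].T @ (X[b] @ theta - y[b]) / len(b)
            theta -= lr * grad
    return theta
```

With batch_size=1 this is stochastic GD; with 1 < batch_size < m it is mini-batch GD; with batch_size=m it reduces to batch GD.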

You will do this by following the guided steps presented in the attached notebook, along with a video series describing each step.

You will also implement the cost function and stopping conditions, and plot the learning curves.
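
As one possible shape for that part of the notebook, here is a minimal sketch of an MSE cost with a tolerance-based stop condition and a recorded cost history for the learning curve; the names (mse_cost, gd_with_stop, tol) are illustrative assumptions, not the course's code.

```python
import numpy as np

def mse_cost(X, y, theta):
    """J(theta) = (1 / 2m) * sum((X @ theta - y)**2)."""
    m = X.shape[0]
    r = X @ theta - y
    return (r @ r) / (2 * m)

def gd_with_stop(X, y, lr=0.01, max_iters=10_000, tol=1e-8):
    """GD that records the cost at every iteration and stops early
    once the cost improvement falls below tol."""
    m, n = X.shape
    theta = np.zeros(n)
    history = [mse_cost(X, y, theta)]
    for _ in range(max_iters):
        theta -= lr * (X.T @ (X @ theta - y)) / m
        history.append(mse_cost(X, y, theta))
        if abs(history[-2] - history[-1]) < tol:   # stop condition
            break
    return theta, history   # plot history vs. iteration for the learning curve
```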

You will understand the power of applying a vectorized implementation of the optimizer.
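
As an illustration (assuming NumPy arrays; not the course's exact code), compare a per-sample Python loop with the equivalent one-line vectorized gradient; both return the same values, but the vectorized form is typically orders of magnitude faster on realistic data sizes.

```python
import numpy as np

def grad_loop(X, y, theta):
    """Gradient accumulated sample by sample with an explicit Python loop."""
    m, n = X.shape
    grad = np.zeros(n)
    for i in range(m):
        grad += (X[i] @ theta - y[i]) * X[i]
    return grad / m

def grad_vectorized(X, y, theta):
    """The same gradient computed as a single matrix expression."""
    return X.T @ (X @ theta - y) / X.shape[0]

# Sanity check on random data: both implementations agree.
rng = np.random.default_rng(0)
X, y, theta = rng.normal(size=(1000, 3)), rng.normal(size=1000), np.zeros(3)
assert np.allclose(grad_loop(X, y, theta), grad_vectorized(X, y, theta))
```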

This implementation will help you solidify the concepts and build an intuition for how the optimizers behave during the training phase.

By the end of this course, you will have a balanced theoretical and practical understanding of the optimizers that are widely used in both machine learning (ML) and deep learning (DL).

In this course we will focus on the main numerical optimization concepts and techniques used in ML and DL.

Although we apply these techniques to single- and multi-variable LR, the concepts carry over to other ML and DL models.

We use LR here for simplicity, so that the focus stays on the optimizers rather than the models.

In the subsequent practical work, we will build on this foundation to implement more advanced optimizers such as the following (update rules for two of them are sketched after the list):

- Momentum-based GD.

- Nesterov accelerated gradient (NAG).

- Adaptive gradient (Adagrad).

- RMSProp.

- Adam.

- BFGS.
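
As a preview, the standard per-step update rules for two of these, momentum GD and Adam, can be sketched as follows; the hyperparameter defaults shown are the conventional textbook values, and the function names are illustrative.

```python
import numpy as np

def momentum_step(theta, grad, v, lr=0.01, beta=0.9):
    """Momentum GD: v is an exponentially decaying sum of past gradients."""
    v = beta * v + grad
    return theta - lr * v, v

def adam_step(theta, grad, m, v, t, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    """Adam: bias-corrected first (m) and second (v) moment estimates,
    with the step count t starting at 1."""
    m = b1 * m + (1 - b1) * grad
    v = b2 * v + (1 - b2) * grad**2
    m_hat = m / (1 - b1**t)            # bias correction
    v_hat = v / (1 - b2**t)
    return theta - lr * m_hat / (np.sqrt(v_hat) + eps), m, v
```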

You will be provided with the following:

- Course material (slides) for "Master numerical optimization for machine learning and deep learning in 5 days".

- Notebooks with the guided steps you should follow.

- Notebooks with the ideal solutions (reference implementations) for the practical work.

- Data files.

You should write the implementation yourself, then compare your code with the practical-session solution provided in a separate notebook.

A video series explaining the solution is also provided. However, do not look at the solution until you have finished your own implementation.