AI Campus Original course

Foundations of Deep Learning I: Basics

The first part of the course Foundations of Deep Learning

📊︎ Intermediate
15 hours
🏅︎ Record of Achievement
🎁︎ For free
© CC BY-SA 4.0
🌐︎ English

Overview

The course provides an introduction to the basic building blocks of deep learning. It begins with motivating examples and a brief history of how models evolved from logistic regression to MLPs, then presents MLPs in detail, including activation functions and how the networks are trained with the backpropagation algorithm. The final chapters introduce some of the most commonly used optimization algorithms in deep learning (SGD, Adam), as well as some of the most common regularization techniques applied to deep learning models to improve generalization (e.g. L2 regularization, dropout).
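To give a flavor of the building blocks mentioned above, here is a minimal, self-contained sketch (not course material; the toy data, network size, and hyperparameters are illustrative assumptions) of a one-hidden-layer MLP trained with backpropagation, plain full-batch SGD, and an L2 penalty:

```python
import numpy as np

# Illustrative only: a tiny one-hidden-layer MLP trained with
# backpropagation and SGD, plus L2 regularization on the weights.
rng = np.random.default_rng(0)

# Toy regression data (hypothetical): y = sin(x) with noise.
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X) + 0.1 * rng.normal(size=X.shape)

# Parameters of an MLP with one hidden layer of 16 tanh units.
W1 = rng.normal(0, 0.5, size=(1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, size=(16, 1)); b2 = np.zeros(1)

lr, lam = 0.05, 1e-4  # learning rate and L2 strength (assumed values)
for step in range(2000):
    # Forward pass
    h = np.tanh(X @ W1 + b1)        # hidden activations
    pred = h @ W2 + b2              # linear output layer
    # Mean squared error plus L2 penalty
    loss = np.mean((pred - y) ** 2) + lam * (np.sum(W1**2) + np.sum(W2**2))
    if step == 0:
        first_loss = loss           # remember the starting loss
    # Backward pass (chain rule, i.e. backpropagation)
    g_pred = 2 * (pred - y) / len(X)
    g_W2 = h.T @ g_pred + 2 * lam * W2
    g_b2 = g_pred.sum(axis=0)
    g_h = g_pred @ W2.T
    g_z1 = g_h * (1 - h**2)         # derivative of tanh
    g_W1 = X.T @ g_z1 + 2 * lam * W1
    g_b1 = g_z1.sum(axis=0)
    # SGD update (full-batch here for simplicity)
    W1 -= lr * g_W1; b1 -= lr * g_b1
    W2 -= lr * g_W2; b2 -= lr * g_b2
```

Each of these pieces (forward pass, loss, backward pass, update rule) corresponds to a chapter of the course.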

Which topics will be covered?

  • Introduction to MLPs
  • Introduction to the backpropagation algorithm
  • Introduction to optimization of weights in neural networks
  • Introduction to regularization techniques in deep learning

What will I achieve?

On completion of the course you will be able to:

  • Explain how the main components of neural networks and their training pipelines work (MLPs, activation functions, loss functions, backpropagation).
  • Describe how learning differs from optimization, and how neural network weights are optimized.
  • Describe how the most commonly used optimization algorithms work (e.g. SGD, Adam, AdamW).
  • Describe different regularization techniques and explain some regularization techniques specific to deep learning (e.g. early stopping, data augmentation, dropout).
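As an example of the kind of optimizer covered, the standard Adam update can be sketched as follows (a generic illustration with the commonly used default hyperparameters, not code from the course):

```python
import numpy as np

# Illustrative sketch of one Adam update step (standard formulation;
# hyperparameter values are the usual defaults, chosen for illustration).
def adam_step(theta, grad, m, v, t, lr=0.05, b1=0.9, b2=0.999, eps=1e-8):
    m = b1 * m + (1 - b1) * grad         # first-moment (mean) estimate
    v = b2 * v + (1 - b2) * grad**2      # second-moment estimate
    m_hat = m / (1 - b1**t)              # bias correction for m
    v_hat = v / (1 - b2**t)              # bias correction for v
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Toy usage: minimize f(x) = x^2 starting from x = 5.
theta = np.array([5.0]); m = np.zeros(1); v = np.zeros(1)
for t in range(1, 2001):
    grad = 2 * theta                     # gradient of x^2
    theta, m, v = adam_step(theta, grad, m, v, t)
```

AdamW, also discussed in the course, differs by decoupling the weight-decay term from this gradient-based update.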

Which prerequisites do I need to fulfill?

  • Good knowledge of linear algebra
  • Good knowledge of probability theory
  • Good knowledge of machine learning

This course is offered by

University of Freiburg (Automated Machine Learning)

Helpdesk

Hint: Have you already checked our FAQ for an answer to your question?