Detailed information
Original study plan: Master's programme Artificial Intelligence 2019W
Objectives
Deep learning is a machine learning technique based on artificial neural networks. In this lecture, students will learn the basic building blocks of deep learning, with a focus on the supervised setting. The lecture will discuss issues in optimization and modelling and present common applications of deep learning techniques. Students attending this course are expected to already have a solid understanding of machine learning.

Subject
- Basics of neural networks (logistic regression, multilayer perceptrons, backpropagation, vanishing gradients)
- Deep learning techniques for neural networks (stacking, dropout, layer-normalization techniques, initializations, activation functions)
- Issues in regularization and optimization (weight decay, early stopping, SGD, momentum, Adagrad & Adam, second-order methods, pruning methods)
- Convolutional neural networks and applications (object recognition, detection, semantic segmentation, super-resolution, style transfer, ...)
- Well-known CNN architectures (LeNet, AlexNet, VGG, ResNet, ...)
- Transfer and multitask learning

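The first bullet above (logistic regression trained with backpropagated gradients) can be illustrated with a minimal NumPy sketch. The two-blob toy dataset, variable names, and hyperparameters below are purely illustrative and not part of the course materials:

```python
import numpy as np

# Hypothetical toy data: two Gaussian blobs for binary classification.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 0.7, size=(50, 2)),
               rng.normal(+1.0, 0.7, size=(50, 2))])
y = np.concatenate([np.zeros(50), np.ones(50)])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = np.zeros(2)   # weights
b = 0.0           # bias
lr = 0.5          # learning rate (illustrative choice)

# Full-batch gradient descent on the cross-entropy loss; the gradient
# expressions follow from backpropagation through the sigmoid output.
for step in range(200):
    p = sigmoid(X @ w + b)            # forward pass: predicted probabilities
    grad_w = X.T @ (p - y) / len(y)   # dL/dw
    grad_b = np.mean(p - y)           # dL/db
    w -= lr * grad_w
    b -= lr * grad_b

acc = np.mean((sigmoid(X @ w + b) > 0.5) == y)
print(f"training accuracy: {acc:.2f}")
```

Replacing the full-batch gradient with gradients on random mini-batches turns this into the SGD variant listed under the optimization bullet.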
Criteria for evaluation
Exam at the end of the semester

Methods
Slide presentations, discussions, and code examples

Language
English |
Changing subject?
No |