(*)Implementing and Understanding Neural Network Basics (k4) Students can construct neural networks; understand core concepts like single-layer neural networks, loss functions, empirical and structural risk minimization, multi-layer perceptrons (MLP), and backpropagation; and identify issues such as vanishing gradients.
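The concepts named above can be illustrated with a minimal sketch: a single sigmoid unit trained by gradient descent on one example, with the squared-error loss as the empirical risk. All names and constants here are illustrative choices, not taken from the course materials.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# single-layer "network": y_hat = sigmoid(w*x + b)
w, b = 0.5, 0.0       # illustrative initial parameters
x, y = 1.0, 1.0       # one training example
lr = 0.1              # learning rate

for _ in range(100):
    z = w * x + b
    y_hat = sigmoid(z)
    loss = 0.5 * (y_hat - y) ** 2            # squared error on one sample
    # backpropagation: chain rule through loss -> sigmoid -> linear layer
    dz = (y_hat - y) * y_hat * (1.0 - y_hat) # sigmoid'(z) = s(z)(1 - s(z))
    w -= lr * dz * x                         # gradient descent step
    b -= lr * dz
```

Since the sigmoid derivative is at most 0.25, stacking many such layers multiplies small factors along the chain rule, which is the vanishing gradient problem mentioned above.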
Applying Advanced Deep Learning Techniques (k4) Students are able to enhance neural network performance using regularization and normalization techniques, and through various activation functions and initialization strategies.
Addressing Optimization and Regularization Challenges (k5) Students can apply optimization algorithms like SGD, Momentum, Adagrad, and Adam, and use regularization techniques such as weight decay, dropout, early stopping, and pruning to improve model generalization.
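As an illustration of two of the techniques named above, the following sketch runs SGD with momentum and L2 weight decay on a toy one-dimensional quadratic; the objective and all constants are arbitrary choices for the example, not a prescribed setup.

```python
# minimize f(w) = (w - 3)^2 with SGD + momentum and L2 weight decay
def grad(w):
    return 2.0 * (w - 3.0)

w, v = 0.0, 0.0
lr, beta, wd = 0.1, 0.9, 1e-4   # learning rate, momentum, weight decay
for _ in range(200):
    g = grad(w) + wd * w        # weight decay adds lambda * w to the gradient
    v = beta * v + g            # momentum accumulates past gradients
    w -= lr * v                 # parameter update

# w converges near 3, pulled very slightly toward 0 by the weight decay term
```

Adagrad and Adam follow the same update pattern but additionally rescale each step by running statistics of past squared gradients.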
Building and Using Convolutional Neural Networks (CNNs) (k4) Students are capable of designing and training CNNs for tasks such as image classification, and they understand their application potential.
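The core CNN operation can be sketched in a few lines. The following illustrative "valid" 2-D cross-correlation (commonly called convolution in deep learning) applies a hypothetical horizontal difference filter to a tiny image:

```python
# "valid" 2-D convolution (cross-correlation, as in most DL libraries)
def conv2d(image, kernel):
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(ih - kh + 1):          # slide the kernel over the image
        row = []
        for j in range(iw - kw + 1):
            s = sum(image[i + di][j + dj] * kernel[di][dj]
                    for di in range(kh) for dj in range(kw))
            row.append(s)
        out.append(row)
    return out

image = [[1, 2, 3],
         [4, 5, 6],
         [7, 8, 9]]
edge = [[1, -1]]              # horizontal difference filter
print(conv2d(image, edge))    # [[-1, -1], [-1, -1], [-1, -1]]
```

In a trained CNN, such kernels are learned parameters, and the same weights are reused at every spatial position, which is what makes CNNs efficient for image classification.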
Understanding and Implementing Transfer and Multitask Learning (k5) Students can train basic MLP and CNN architectures and apply transfer learning and multitask learning strategies to adapt models for related tasks and improve learning efficiency.
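A minimal transfer-learning sketch, assuming a hypothetical "pretrained" feature extractor that is kept frozen while only a new linear output head is trained on the target task; the extractor, data, and constants are all invented for illustration.

```python
import math

# hypothetical "pretrained" feature extractor (frozen first layer)
def features(x):
    return [math.tanh(x), math.tanh(2 * x)]

# fine-tune only a new linear head on the target task y = x
data = [(-1.0, -1.0), (-0.5, -0.5), (0.5, 0.5), (1.0, 1.0)]
w = [0.0, 0.0]        # new head weights, trained from scratch
lr = 0.1
for _ in range(2000):
    for x, y in data:
        f = features(x)
        y_hat = w[0] * f[0] + w[1] * f[1]
        err = y_hat - y
        # gradients update only the new head; the frozen layer is untouched
        w[0] -= lr * err * f[0]
        w[1] -= lr * err * f[1]
```

Multitask learning follows the same idea in reverse: several task-specific heads share one feature extractor and are trained jointly, so the shared representation benefits from all tasks.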
|
(*)Students acquire foundational knowledge of neural networks and deep learning, covering key components like multi-layer neural networks, learning and backpropagation, regularization, normalization, and initialization. They have the mathematical background to formally state and derive the forward pass of MLPs and CNNs, gradient descent, backpropagation and the backward pass, the vanishing gradient problem, and initialization and normalization techniques.
|