Detailed information
Original study plan: Master's programme Computer Science 2019W
Objectives
In this course, students will learn the basic concepts of a large and centrally important class of methods in modern Artificial Intelligence: Probabilistic Graphical Models, which are used to represent and reason about uncertain information and knowledge in complex real-world scenarios. All three core aspects of such models will be covered: model semantics, inference, and learning. Both basic concepts and specific algorithms will be taught, and all methods will be derived in a mathematically rigorous way.
Subject
Elementary Concepts: Probability Distributions, Density Functions, (Conditional) Independence, Probabilistic Reasoning and Inference.
Bayesian Networks: Representation, Semantics, Factorisation (see the illustrative formula after this list).
Inference in Bayesian Networks: Exact Inference; Approximate Inference: Stochastic Sampling, Markov Chain Monte Carlo (MCMC) Methods.
Learning Bayesian Networks: Parameter Learning, Structure Learning; Learning Generative vs. Discriminative Models (optional).
Special Types of Bayes Nets: Linear Gaussian Network Models.
Modelling and Predicting Temporal Processes: Dynamic Bayesian Networks, Hidden Markov Models, Kalman Filters, Particle Filters.
Selected Applications of Probabilistic Graphical Models.
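For orientation, a minimal worked formula (illustrative only; the notation is standard textbook usage, not taken from the course materials): the factorisation listed under Bayesian Networks above states that a network over variables X_1, ..., X_n with parent sets Pa(X_i) defines the joint distribution

P(X_1, \dots, X_n) = \prod_{i=1}^{n} P(X_i \mid \mathrm{Pa}(X_i)),

which is why a network with small parent sets requires far fewer parameters than the full joint table over all variables.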
Criteria for evaluation
Written exam at the end of the semester.
Methods
Slide presentation with case studies on the blackboard
Language
English |
Study material
Koller, D., and Friedman, N. (2009). Probabilistic Graphical Models: Principles and Techniques. Cambridge, MA: MIT Press.
Russell, S. J., and Norvig, P. (2010). Artificial Intelligence: A Modern Approach (3rd ed.). Upper Saddle River, NJ: Prentice Hall.
Changing subject?
No |