Detailed information

Original study plan
Master's programme Computer Science 2021S

Objectives
After this course, students will know and understand the basic concepts of Probabilistic Graphical Models, a large and centrally important class of methods in modern Artificial Intelligence used for representing and reasoning about uncertain information and knowledge in complex real-world scenarios. All three aspects of such models will be covered: model semantics, inference, and learning. Both basic concepts and specific algorithms will be taught, and all methods will be derived in a mathematically rigorous way. Students will understand how such models can be used for problem modelling and solving, and what their limitations are.

Subject
Elementary Concepts: Probability Distributions, Density Functions, (Conditional) Independence, Probabilistic Reasoning and Inference.
Bayesian Networks: Representation, Semantics, Factorisation.
Inference in Bayesian Networks: Exact Inference; Approximate Inference: Stochastic Sampling, Markov Chain Monte Carlo (MCMC) Methods.
Learning Bayesian Networks: Parameter Learning (maximum likelihood and Bayesian estimation), Structure Learning, Generative vs. Discriminative Models.
Special Types of Bayes Nets: Linear Gaussian Network Models.
Modelling and Predicting Temporal Processes: Dynamic Bayesian Networks, Hidden Markov Models, Kalman Filters, Particle Filters.
Selected Applications of Probabilistic Graphical Models.
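As a purely illustrative sketch (not part of the official course materials), the factorisation and exact-inference topics listed above can be demonstrated in a few lines of Python; the rain/sprinkler/wet-grass network, its probability values, and the helper function joint are invented here solely for illustration.

    # Toy Bayesian network: Rain -> Sprinkler, {Rain, Sprinkler} -> WetGrass.
    # Its joint distribution factorises as P(R, S, W) = P(R) * P(S | R) * P(W | R, S).
    # All probability values are invented for this illustration.
    P_R = {True: 0.2, False: 0.8}                     # P(Rain)
    P_S_given_R = {True:  {True: 0.01, False: 0.99},  # P(Sprinkler | Rain)
                   False: {True: 0.40, False: 0.60}}
    P_W_given_RS = {(True, True):   {True: 0.99, False: 0.01},  # P(WetGrass | Rain, Sprinkler)
                    (True, False):  {True: 0.80, False: 0.20},
                    (False, True):  {True: 0.90, False: 0.10},
                    (False, False): {True: 0.00, False: 1.00}}

    def joint(r, s, w):
        """Joint probability P(R=r, S=s, W=w) via the network factorisation."""
        return P_R[r] * P_S_given_R[r][s] * P_W_given_RS[(r, s)][w]

    # Exact inference by enumeration: P(Rain = True | WetGrass = True).
    numerator = sum(joint(True, s, True) for s in (True, False))
    evidence = sum(joint(r, s, True) for r in (True, False) for s in (True, False))
    print(numerator / evidence)  # about 0.358

Enumeration of this kind scales exponentially with the number of variables, which is one motivation for the approximate inference methods (stochastic sampling, MCMC) listed above.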

Criteria for evaluation
Written exam at the end of the semester.

Methods
Lecture series with written materials (presentation slides) provided regularly in electronic form.

Language
English

Study material
Koller, Daphne, and Friedman, Nir (2009). Probabilistic Graphical Models: Principles and Techniques. Cambridge, MA: MIT Press.
Russell, Stuart J., and Norvig, Peter (2010). Artificial Intelligence: A Modern Approach (3rd Edition). Upper Saddle River, NJ: Prentice Hall.

Changing subject?
No |